Linear Algebra Review

Probability & Information Theory

Information Theory revolves around quantifying how much information is in a signal
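As a concrete illustration (not from the original notes), the usual way to quantify the information content of a signal is Shannon entropy, H(X) = -sum over x of P(x) log2 P(x). A minimal NumPy sketch:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum_x P(x) log2 P(x), measured in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # terms with P(x) = 0 contribute 0 by convention
    return -np.sum(p * np.log2(p))

# A fair coin carries 1 bit of information per flip.
print(entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is more predictable, so each flip carries less information.
print(entropy([0.9, 0.1]))   # ~0.47
```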

Logistic Regression

Regression model where the dependent variable is categorical; the output is interpreted as the probability that an input belongs to a class
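A minimal sketch of the idea (the weights and data below are hypothetical): a linear score w . x + b is squashed through the logistic (sigmoid) function to give a class probability, and a threshold turns that probability into a categorical prediction.

```python
import numpy as np

def sigmoid(z):
    """Logistic function: maps a real-valued score to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(X, w, b):
    """P(y = 1 | x) = sigmoid(w . x + b), a linear model squashed to a probability."""
    return sigmoid(X @ w + b)

def predict(X, w, b, threshold=0.5):
    """Hard class label: 1 when the predicted probability exceeds the threshold."""
    return (predict_proba(X, w, b) >= threshold).astype(int)

# Hypothetical weights for a toy 2-feature problem.
w = np.array([1.5, -2.0])
b = 0.1
X = np.array([[0.4, 0.2], [3.0, -1.0]])
print(predict_proba(X, w, b))  # probabilities of class 1
print(predict(X, w, b))        # hard 0/1 labels
```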

Numerical Computation

Overflow and underflow are two problems that arise when representing infinitely many real numbers with a finite number of bits
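A standard illustration is the softmax function: computing exp(x) directly overflows for large inputs and underflows for very negative ones, while subtracting the maximum input first leaves the result unchanged and keeps exp in a safe range. A small NumPy sketch:

```python
import numpy as np

def softmax_naive(x):
    """Direct formula: exp(x) overflows for large x and underflows for very negative x."""
    e = np.exp(x)
    return e / np.sum(e)

def softmax_stable(x):
    """Subtracting max(x) does not change the result but avoids overflow."""
    e = np.exp(x - np.max(x))
    return e / np.sum(e)

x = np.array([1000.0, 1001.0, 1002.0])
print(softmax_naive(x))   # exp(1000) overflows to inf, so the result is nan
print(softmax_stable(x))  # well-behaved probabilities
```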

Conditioning refers to how rapidly a function changes in response to small changes in its inputs; an ill-conditioned problem amplifies small input errors (such as rounding error) into large output errors
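For matrices this is measured by the condition number, the ratio of the largest to the smallest singular value. A sketch (the matrices here are toy examples) showing how an ill-conditioned linear system magnifies a tiny perturbation of its input:

```python
import numpy as np

A_well = np.array([[2.0, 0.0],
                   [0.0, 1.0]])
A_ill  = np.array([[1.0, 1.0],
                   [1.0, 1.0001]])

print(np.linalg.cond(A_well))  # 2.0   -> well conditioned
print(np.linalg.cond(A_ill))   # ~4e4  -> ill conditioned

# Solving A x = b with an ill-conditioned A amplifies a tiny change in b.
b = np.array([2.0, 2.0001])
b_perturbed = b + np.array([0.0, 1e-4])    # input changed by 1e-4
print(np.linalg.solve(A_ill, b))            # [1., 1.]
print(np.linalg.solve(A_ill, b_perturbed))  # [0., 2.] -- output changed by order 1
```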

Most learning algorithms involve optimization, typically the minimization of an objective (cost) function with respect to the model's parameters
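The workhorse minimization method is gradient descent: repeatedly step in the direction opposite the gradient. A minimal sketch on a toy objective (the function and learning rate are illustrative, not from the notes):

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize an objective by repeatedly stepping against its gradient.

    grad : callable returning the gradient of the objective at x
    x0   : starting point
    lr   : learning rate (step size)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Toy objective f(x) = (x0 - 3)^2 + (x1 + 1)^2, minimized at (3, -1).
grad_f = lambda x: 2.0 * (x - np.array([3.0, -1.0]))
print(gradient_descent(grad_f, x0=[0.0, 0.0]))  # approaches [3., -1.]
```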

The Jacobian matrix J of a function f: R^n -> R^m contains all of its partial derivatives: J_{i,j} = ∂f(x)_i / ∂x_j, where f(x)_i denotes the ith output as a function of the input vector x (we can think of such a vector-valued function as a vector of scalar-valued functions)
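A quick way to check this definition is to approximate each entry with finite differences. The helper below is a hypothetical sketch (central differences), compared against the analytic Jacobian of a toy function:

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Approximate J[i, j] = d f(x)_i / d x_j with central differences."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(f(x))
    J = np.zeros((f0.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (np.asarray(f(x + dx)) - np.asarray(f(x - dx))) / (2 * eps)
    return J

# f maps R^2 -> R^2: f(x) = [x0 * x1, sin(x0)]
f = lambda x: np.array([x[0] * x[1], np.sin(x[0])])
x = np.array([1.0, 2.0])
print(numerical_jacobian(f, x))
# Analytic Jacobian for comparison: [[x1, x0], [cos(x0), 0]] = [[2, 1], [0.5403, 0]]
```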