Deep Learning

Back to Data-Science

See the prerequisites: math, foundational machine learning, and neural networks

Deep Learning models support inference and are robustly applicable to many problems

A model can be seen as a computation graph (a DAG) whose internal nodes represent operations, whose leaves are the inputs, and whose root is the output
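A minimal sketch of this view (hypothetical `Node` class, not a real library): leaves carry input values, internal nodes apply an operation to their children, and evaluating the root runs the whole graph.

```python
# Sketch: a model as a computation graph (DAG).
# Leaves hold inputs; internal nodes apply an operation to their
# children; the root's value is the model output.

class Node:
    def __init__(self, op=None, children=(), value=None):
        self.op = op              # operation; None marks a leaf (input)
        self.children = children  # nodes this operation consumes
        self.value = value        # set only for leaves

    def evaluate(self):
        # Recursively evaluate the graph bottom-up, leaves first.
        if self.op is None:
            return self.value
        return self.op(*(c.evaluate() for c in self.children))

# Graph for f(x, w, b) = x * w + b, a single linear unit.
x, w, b = Node(value=3.0), Node(value=2.0), Node(value=1.0)
mul = Node(op=lambda u, v: u * v, children=(x, w))
root = Node(op=lambda u, v: u + v, children=(mul, b))
print(root.evaluate())  # 7.0
```

Framework graphs (e.g. in PyTorch or TensorFlow) follow the same shape, but additionally record the operations so gradients can flow backwards through the DAG.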

Optimization and Training techniques -- source

Hyper-parameters are hand-picked constants that strongly influence the effectiveness of a learning algorithm
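A small illustration of why such constants matter (the quadratic objective and the learning-rate values are assumptions for demonstration): the learning rate is not learned by the algorithm, yet it alone decides whether gradient descent converges.

```python
# Sketch: the learning rate lr as a hyper-parameter of gradient
# descent, minimizing f(w) = (w - 4)^2 with optimum at w = 4.

def gradient_descent(lr, steps=100, w=0.0):
    for _ in range(steps):
        grad = 2 * (w - 4)  # derivative of (w - 4)^2
        w -= lr * grad      # update scaled by the hyper-parameter
    return w

print(gradient_descent(lr=0.1))  # converges close to w = 4
print(gradient_descent(lr=1.1))  # overshoots and diverges
```

With lr = 0.1 each step contracts the error by a factor of 0.8; with lr = 1.1 it expands it by 1.2, so the iterates blow up. Tuning such constants (learning rate, batch size, layer widths, dropout rate) is a search problem of its own.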

Improving neural networks by preventing co-adaptation of feature detectors (Hinton et al., 2012) introduced dropout: randomly omitting hidden units during training

Regularization in deep learning adds an extra penalty term to the cost function to discourage large weights (e.g., weight decay / L2 regularization)
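A sketch of the L2 penalty and its effect on the update (the value of `lam` is an assumed hyper-parameter): the gradient of the penalty term `lam/2 * ||w||^2` is `lam * w`, so every step shrinks the weights toward zero, which is why it is also called weight decay.

```python
import numpy as np

def regularized_cost(data_loss, w, lam=0.01):
    # Total cost = data loss + L2 penalty (weight decay term).
    return data_loss + 0.5 * lam * np.sum(w ** 2)

def sgd_step(w, data_grad, lr=0.1, lam=0.01):
    # The extra lam * w term is the gradient of the L2 penalty.
    return w - lr * (data_grad + lam * w)

w = np.array([3.0, -2.0])
w_new = sgd_step(w, data_grad=np.zeros(2))
print(w_new)  # weights shrink even when the data gradient is zero
```

Note the shrinkage is multiplicative: with zero data gradient the update is `w * (1 - lr * lam)`, pulling all weights toward zero at the same relative rate.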

Max-norm regularization works well in conjunction with dropout: it bounds the norm of each hidden unit's incoming weight vector by a clipping constant c, rescaling the vector back onto the ball of radius c after each update
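A sketch of both pieces together (the values of `c` and the dropout rate `p` are assumptions): max-norm projection rescales any hidden unit whose incoming weight vector exceeds norm c, and inverted dropout zeroes activations at random while rescaling the survivors so the expected activation is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_norm(W, c=3.0):
    # One row of W per hidden unit: rescale rows whose norm exceeds c,
    # leave rows already inside the ball untouched.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W * np.minimum(1.0, c / norms)

def dropout(a, p=0.5):
    # Inverted dropout: drop each activation with probability p and
    # scale the rest by 1/(1-p), so E[output] == input.
    mask = rng.random(a.shape) >= p
    return a * mask / (1.0 - p)

W = max_norm(rng.normal(scale=2.0, size=(4, 8)))
h = dropout(np.ones(8))
print(np.linalg.norm(W, axis=1))  # every row norm is at most c = 3
print(h)                          # entries are either 0 or 2
```

The combination lets training use a larger learning rate: dropout injects noise, and the max-norm ball keeps that noise from driving any unit's weights to extreme values.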

Deep Generative Models

Multimodal Learning & Learning with Structured Outputs