This video provides an introduction to deep learning. Topics to be covered include:
- History and MLP. We cover the history of artificial neural networks (ANNs), then describe the perceptron, the fundamental unit of an ANN, and its limitations. Finally, we introduce the multilayer perceptron (MLP) and its advantages over the single perceptron.
- Concepts and Purpose. We cover the learning rule for the multilayer perceptron using gradient descent and the backpropagation technique. We then describe the complete process for training multilayer feedforward networks, how to design the layers of an ANN, and the activation and loss functions commonly used in the output layer. We conclude by defining deep learning and the reasons for using it.
- Use Cases. We describe deep learning use cases such as colorizing black-and-white photos, medical imaging, increasing pixel resolution, and providing computer vision for self-driving cars.
- Standard Learning Process. We cover convolutional neural networks (CNNs) in detail: the motivation for CNNs, an overview of the architecture and its key components (feature maps, pooling, non-linearity, convolution, and the input image), convolution layers (convolution, stride, and zero padding), and terms and properties specific to the convolutional layer (such as shared parameters and hyperparameters).
- Neural Net Architectures. We dive deeper into neural networks, covering the calculation of parameters in the convolutional layer, the pooling layer and its benefits, and several specialized pooling variants. Apart from CNNs, we also cover RNNs (recurrent neural networks).
- Training Neural Nets. We cover GANs (Generative Adversarial Networks), including both the generative and discriminative models.
- Advice Using Neural Nets. We cover practical tips for deep learning, including train-test-validation splits, feature scaling (standardization and normalization), and the Sigmoid, tanh, and ReLU activation functions.
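The gradient-descent and backpropagation learning rule mentioned under "Concepts and Purpose" can be sketched with a tiny NumPy MLP trained on XOR (a task a single perceptron cannot solve). The hidden-layer size, learning rate, and iteration count here are illustrative choices, not values from the video:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets: not linearly separable, so a single perceptron
# fails, but a multilayer perceptron can learn it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 units (illustrative size)
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate (hyperparameter)
losses = []
for _ in range(5000):
    # Forward pass through the network
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain-rule gradients of the squared error
    # (constant factors folded into the learning rate)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent: step each parameter against its gradient
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The backward pass applies the chain rule layer by layer, which is exactly what backpropagation automates in deep learning frameworks.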
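The convolution-layer terms listed under "Standard Learning Process" (stride, zero padding) combine into the standard output-size formula, out = (W - F + 2P)/S + 1. A minimal single-channel sketch, with a made-up image and averaging kernel:

```python
import numpy as np

def conv_output_size(w, f, p, s):
    """Spatial output size of a conv layer: (W - F + 2P) / S + 1."""
    return (w - f + 2 * p) // s + 1

def conv2d(image, kernel, stride=1, pad=0):
    """Naive 2D convolution (cross-correlation) with stride and zero padding."""
    image = np.pad(image, pad)  # zero padding around the border
    f = kernel.shape[0]
    out_size = (image.shape[0] - f) // stride + 1
    out = np.zeros((out_size, out_size))
    for i in range(out_size):
        for j in range(out_size):
            # Slide the kernel over the image with the given stride
            patch = image[i * stride:i * stride + f, j * stride:j * stride + f]
            out[i, j] = np.sum(patch * kernel)
    return out

img = np.arange(25, dtype=float).reshape(5, 5)
k = np.ones((3, 3)) / 9.0  # 3x3 averaging kernel
fm = conv2d(img, k, stride=1, pad=0)  # one feature map
print(fm.shape)  # (3, 3): matches (5 - 3 + 2*0)/1 + 1
```

Each output cell is one kernel application, so the same (shared) parameters are reused at every spatial position, which is why conv layers need so few weights.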
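The parameter calculation mentioned under "Neural Net Architectures" follows a simple rule: each filter has F x F x C_in weights plus one bias, so a layer with K filters has (F*F*C_in + 1)*K parameters. A sketch with illustrative filter counts:

```python
def conv_params(f, c_in, k):
    """Parameters in a conv layer: (f*f*c_in + 1) * k; the +1 is one bias per filter."""
    return (f * f * c_in + 1) * k

# e.g. 3x3 filters over an RGB (3-channel) input, 64 filters:
print(conv_params(3, 3, 64))  # (3*3*3 + 1) * 64 = 1792
```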
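The two feature-scaling schemes named under "Advice Using Neural Nets" can be sketched in a few lines; the sample matrix is a made-up example with features on very different scales:

```python
import numpy as np

def standardize(x):
    """Standardization: zero mean, unit variance per feature."""
    return (x - x.mean(axis=0)) / x.std(axis=0)

def min_max_normalize(x):
    """Normalization: rescale each feature to the [0, 1] range."""
    return (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))

X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0]])
Xs = standardize(X)
Xn = min_max_normalize(X)
print(Xs.mean(axis=0))                  # approximately [0, 0]
print(Xn.min(axis=0), Xn.max(axis=0))   # [0, 0] and [1, 1]
```

Scaling matters because activations such as Sigmoid and tanh saturate on large inputs; bringing features to a common range keeps gradients usable during training.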