Deep Learning Foundations: algorithms with NumPy, PyTorch, and Fast.ai.
The objective is to build a state-of-the-art deep learning model from scratch.
Reference course Fast.ai.
- Matrix Multiplication (see the matmul, broadcasting, and einsum sketch after this list)
- Tensor
- Frobenius norm
- Broadcasting
- Einstein summation
- Forward Passes
  - Fully connected
  - Conv2D
- Backward Passes
  - Linear
  - ReLU
  - MSE
- Optimizers
  - SGD
  - Adam
- Loss functions
  - MSE
- Training loop (a minimal forward/backward/SGD loop is sketched after this list)
- Callbacks and event handlers
- Data Block API and generic optimizer
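To make the first few topics concrete, here is a minimal sketch (PyTorch; the shapes and tolerance are illustrative assumptions) that implements matrix multiplication three ways, checks each against `torch.matmul`, and computes the Frobenius norm by hand:

```python
import torch

# Naive matrix multiplication with three explicit loops
# (the usual "from scratch" starting point).
def matmul_loops(a, b):
    ar, ac = a.shape
    br, bc = b.shape
    assert ac == br, "inner dimensions must match"
    c = torch.zeros(ar, bc)
    for i in range(ar):
        for j in range(bc):
            for k in range(ac):
                c[i, j] += a[i, k] * b[k, j]
    return c

# Same result with broadcasting: a[i].unsqueeze(-1) * b has shape (ac, bc),
# and summing over dim 0 contracts the shared dimension.
def matmul_broadcast(a, b):
    c = torch.zeros(a.shape[0], b.shape[1])
    for i in range(a.shape[0]):
        c[i] = (a[i].unsqueeze(-1) * b).sum(dim=0)
    return c

# Einstein summation: "ik,kj->ij" contracts over the repeated index k.
def matmul_einsum(a, b):
    return torch.einsum('ik,kj->ij', a, b)

a, b = torch.randn(4, 3), torch.randn(3, 5)
for f in (matmul_loops, matmul_broadcast, matmul_einsum):
    assert torch.allclose(f(a, b), a @ b, atol=1e-6)

# Frobenius norm: square root of the sum of squared elements.
assert torch.allclose((a * a).sum().sqrt(), torch.linalg.norm(a))
```

The loop version is the slow reference implementation; broadcasting removes the two inner loops, and einsum expresses the whole contraction in one line.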
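And a minimal sketch of the forward pass, hand-written backward pass, and SGD training loop for a tiny fully connected network with an MSE loss (layer sizes, learning rate, and initialization here are illustrative assumptions, not the course's exact code):

```python
import torch

# Tiny fully connected net (Linear -> ReLU -> Linear) trained with an MSE
# loss and plain SGD; the backward pass is written out by hand with the
# chain rule instead of calling autograd.
n, n_in, n_h = 64, 10, 50
x, y = torch.randn(n, n_in), torch.randn(n, 1)

w1 = torch.randn(n_in, n_h) * (2 / n_in) ** 0.5   # Kaiming-style init for ReLU
b1 = torch.zeros(n_h)
w2 = torch.randn(n_h, 1) / n_h ** 0.5
b2 = torch.zeros(1)

lr = 0.05
for epoch in range(100):
    # Forward pass
    z1 = x @ w1 + b1                 # linear
    a1 = z1.clamp(min=0.)            # ReLU
    out = a1 @ w2 + b2               # linear
    loss = ((out - y) ** 2).mean()   # MSE

    # Backward pass (gradient of the loss w.r.t. each intermediate)
    dout = 2. * (out - y) / out.numel()
    dw2, db2 = a1.t() @ dout, dout.sum(0)
    da1 = dout @ w2.t()
    dz1 = da1 * (z1 > 0).float()     # ReLU gradient
    dw1, db1 = x.t() @ dz1, dz1.sum(0)

    # SGD step
    for p, g in ((w1, dw1), (b1, db1), (w2, dw2), (b2, db2)):
        p -= lr * g

print(loss.item())
```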
Reference book Fastbook.
- Parameters and activations
- Random initialization and transfer learning
- SGD, Momentum, Adam, and other optimizers (an SGD-with-momentum step is sketched after this list)
- CNNs (Convolutions)
- Batch normalization
- Dropout
- Data augmentation
- Weight decay
- ResNet and DenseNet architectures (a basic ResNet block is sketched after this list)
- Image classification and regression
- Embeddings
- RNNs (Recurrent neural networks)
- Segmentation
- U-Net
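As a rough illustration of the optimizer material, here is a minimal sketch of an SGD step with momentum and L2 weight decay acting on plain lists of tensors (hyperparameter values are illustrative assumptions):

```python
import torch

# Minimal SGD step with momentum and L2 weight decay, written against plain
# tensors rather than a framework optimizer class.
def sgd_momentum_step(params, grads, buffers, lr=0.1, mom=0.9, wd=1e-2):
    for p, g, buf in zip(params, grads, buffers):
        g = g + wd * p           # weight decay folded into the gradient
        buf.mul_(mom).add_(g)    # update the momentum buffer in place
        p.sub_(lr * buf)         # parameter update

params  = [torch.randn(3, 3)]
grads   = [torch.randn(3, 3)]
buffers = [torch.zeros_like(p) for p in params]
sgd_momentum_step(params, grads, buffers)
```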
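And a minimal sketch of a ResNet basic block combining convolutions, batch normalization, and a skip connection (channel counts and the input shape are illustrative assumptions):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# ResNet basic block: two conv/batchnorm pairs plus an identity
# (or strided 1x1 conv) skip connection.
class BasicBlock(nn.Module):
    def __init__(self, c_in, c_out, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(c_in, c_out, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(c_out)
        self.conv2 = nn.Conv2d(c_out, c_out, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(c_out)
        if stride == 1 and c_in == c_out:
            self.shortcut = nn.Identity()
        else:  # match shape on the skip path with a strided 1x1 conv
            self.shortcut = nn.Sequential(
                nn.Conv2d(c_in, c_out, 1, stride=stride, bias=False),
                nn.BatchNorm2d(c_out))

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + self.shortcut(x))

print(BasicBlock(64, 128, stride=2)(torch.randn(1, 64, 32, 32)).shape)
# -> torch.Size([1, 128, 16, 16])
```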
Reference papers and articles.
- Effective Approaches to Attention-based Neural Machine Translation
- ResNet - Deep Residual Learning for Image Recognition
- Bag of Tricks for Image Classification with Convolutional Neural Networks
- The Computational Limits of Deep Learning
- Understanding the difficulty of training deep feedforward neural networks
- Fixup Initialization: Residual Learning Without Normalization
- How You Should Read Research Papers According To Andrew Ng
- Convolutional Neural Networks (Course 4 of Andrew Ng's DeepLearning.ai Deep Learning Specialization)
- Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville