ghhernandes / deep-learning-foundations

Deep Learning Foundations

Deep Learning Foundations algorithms with NumPy, PyTorch, and Fast.ai.

The objective is to build a state-of-the-art deep learning model from scratch.

Foundations Algorithms Roadmap

Reference course: Fast.ai. A minimal NumPy sketch of the forward pass, backward pass, and SGD training loop follows the roadmap below.

  • Matrix Multiplication
    • Tensor
    • Frobenius norm
    • Broadcasting
    • Einstein summation
  • Forward Passes
    • Fully connected
    • Conv2D
  • Backward Passes
    • Linear
    • ReLU
    • MSE
  • Optimizers
    • SGD
    • Adam
  • Loss functions
    • MSE
  • Training loop
  • Callbacks and event handlers
  • Data Block API and generic optimizer
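
As a taste of what the roadmap builds toward, here is a minimal NumPy sketch of a single-hidden-layer network: an einsum-based fully connected forward pass, hand-derived backward passes for the Linear, ReLU, and MSE pieces, and a plain SGD training loop. This is an illustrative sketch, not the repository's notebook code; the shapes, initialization, and learning rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative shapes, not the repo's datasets).
x = rng.standard_normal((64, 10))            # 64 samples, 10 features
y = rng.standard_normal((64, 1))             # 1 target per sample

# A 10 -> 50 -> 1 network; Kaiming-style init for the ReLU layer.
w1 = rng.standard_normal((10, 50)) * np.sqrt(2 / 10)
b1 = np.zeros(50)
w2 = rng.standard_normal((50, 1)) / np.sqrt(50)
b2 = np.zeros(1)

lr = 0.1
for epoch in range(100):
    # Forward pass: Linear -> ReLU -> Linear -> MSE.
    z1 = np.einsum("ni,io->no", x, w1) + b1  # einsum spelling of x @ w1
    a1 = np.maximum(z1, 0.0)                 # ReLU
    pred = a1 @ w2 + b2
    loss = ((pred - y) ** 2).mean()          # MSE

    # Backward pass: chain rule, layer by layer.
    dpred = 2.0 * (pred - y) / y.size        # dL/dpred
    dw2 = a1.T @ dpred
    db2 = dpred.sum(axis=0)
    da1 = dpred @ w2.T
    dz1 = da1 * (z1 > 0)                     # ReLU gradient mask
    dw1 = x.T @ dz1
    db1 = dz1.sum(axis=0)

    # Plain SGD update.
    w1 -= lr * dw1; b1 -= lr * db1
    w2 -= lr * dw2; b2 -= lr * db2

print(f"final MSE: {loss:.4f}")
```

Each piece of this sketch corresponds to one of the roadmap items above.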

Table of Contents

Reference book: Fastbook. A short sketch of the basic optimizer update rules follows this list.

  • Parameters and activations
  • Random initialization and transfer learning
  • SGD, Momentum, Adam, and other optimizers
  • CNNs (Convolutions)
  • Batch normalization
  • Dropout
  • Data augmentation
  • Weight decay
  • ResNet and DenseNet architectures
  • Image classification and regression
  • Embeddings
  • RNNs (Recurrent neural networks)
  • Segmentation
  • U-Net
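
The optimizer chapters above come down to a handful of parameter-update rules. Below is a small NumPy sketch of the SGD-with-momentum and Adam update steps; the function names and default hyperparameters are illustrative assumptions, not fastai's or PyTorch's API.

```python
import numpy as np

def sgd_momentum_step(p, grad, v, lr=0.1, mom=0.9, wd=0.0):
    """One SGD step with momentum and decoupled weight decay."""
    v[:] = mom * v + grad                     # momentum buffer
    p -= lr * (v + wd * p)                    # decay applied to the weights directly
    return p, v

def adam_step(p, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step: momentum plus per-parameter adaptive scaling (t starts at 1)."""
    m[:] = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v[:] = beta2 * v + (1 - beta2) * grad ** 2  # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction for zero-initialized buffers
    v_hat = v / (1 - beta2 ** t)
    p -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return p, m, v

# Illustrative usage on a single parameter tensor.
p = np.ones(3)
v = np.zeros(3)
p, v = sgd_momentum_step(p, grad=np.full(3, 0.5), v=v)
```

Weight decay here is kept out of the momentum buffer and subtracted from the weights directly, i.e. the decoupled form.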

AI Quizzes

Fast.AI

Papers

Attention Is All You Need

Effective Approaches to Attention-based Neural Machine Translation

ResNet - Deep Residual Learning for Image Recognition

Bag of Tricks for Image Classification with Convolutional Neural Networks

The Computational Limits of Deep Learning

Understanding the difficulty of training deep feedforward neural networks

Fixup Initialization: Residual Learning Without Normalization

Articles

How You Should Read Research Papers According To Andrew Ng

Videos

Convolutional Neural Networks (Course 4, Andrew Ng, DeepLearning.ai)

Books

Deep Learning by Ian Goodfellow

Deep Learning for Coders with Fastai and PyTorch

Deep Learning with Python

Numerical Python
