This repo is a collection of implementations and notes gathered from this lecture series by Andrej Karpathy.
-
Lecture 1: Building a PyTorch-inspired computation-graph framework for automatic differentiation over a DAG. This is a step-by-step implementation of backpropagation, training entire neural networks from scratch.
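The core idea can be sketched as a minimal scalar `Value` class (names and structure here are an illustrative sketch, not the exact code from the lecture): each node records its inputs and a local backward rule, and `backward()` walks the DAG in reverse topological order accumulating gradients.

```python
class Value:
    """A scalar node in a computation DAG that tracks its gradient."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # local chain-rule step, set by each op
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Build a topological order of the DAG, then apply the chain
        # rule from the output back to the leaves.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()


a = Value(2.0)
b = Value(3.0)
c = a * b + a   # c = a*b + a
c.backward()
print(a.grad)   # dc/da = b + 1 = 4.0
print(b.grad)   # dc/db = a = 2.0
```

Gradients are accumulated with `+=` rather than assigned, so a node used in multiple places (like `a` above) sums contributions from every path through the graph.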
-
Lecture 2:
-
Lecture 3:
-
Lecture 4:
-
Lecture 5:
-
Lecture 6:
-
Lecture 7:
Vanishing gradients; stochastic gradient descent.
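The two topics above can be illustrated in a few lines (a self-contained sketch, not code from the lecture): a saturated sigmoid has a near-zero derivative, which is what makes gradients vanish through deep chains, and SGD is just a gradient step computed from a noisy estimate of the true gradient.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # sigmoid'(x) = s * (1 - s); shrinks toward 0 for large |x|
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid_grad(0.0))   # 0.25, healthy gradient
print(sigmoid_grad(10.0))  # ~4.5e-5, nearly vanished

# SGD on f(w) = (w - 3)^2: each step uses a noisy gradient estimate,
# standing in for the gradient of a randomly sampled minibatch.
random.seed(0)
w, lr = 0.0, 0.1
for _ in range(200):
    noisy_grad = 2.0 * (w - 3.0) + random.gauss(0.0, 0.1)
    w -= lr * noisy_grad
print(w)  # hovers near the minimum at w = 3.0
```

The noise keeps `w` jittering around the optimum rather than settling exactly on it, which is why learning-rate decay is often paired with SGD in practice.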