Minimal Automatic Differentiation

This project contains a minimal automatic differentiation engine, inspired by micrograd. Automatic differentiation works by building a directed acyclic graph (DAG) of mathematical operations and then applying the chain rule of differentiation (and its multivariable extension) in a backward pass over that graph to compute gradients.
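
The core idea fits in a few lines. The following `Value` class is an illustrative sketch in the spirit of micrograd, not necessarily the code in this repository: each operation records its inputs as edges in the DAG, and `backward()` walks the graph in reverse topological order, applying the local chain-rule step at every node.

```python
class Value:
    """A scalar node in the computation DAG (sketch in the spirit of micrograd)."""

    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None  # local chain-rule step, set by each op

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(out)/d(self) = 1 and d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # product rule: scale the upstream gradient by the other factor
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad

        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the DAG, then apply the chain rule in reverse
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)

        build(self)
        self.grad = 1.0  # d(out)/d(out)
        for node in reversed(topo):
            node._backward()


# gradient of f(x) = x*x + x at x = 3 is 2*3 + 1 = 7
x = Value(3.0)
y = x * x + x
y.backward()
print(x.grad)  # 7.0
```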

We use this scalar autodiff engine to define a Module class, similar to PyTorch's, that automatically registers model parameters. On top of this Module we build a quintic polynomial model and fit it to a small generated dataset with vanilla stochastic gradient descent.
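
A sketch of how this could look, reusing the `Value` class from above. The names (`Module`, `Polynomial`), the synthetic dataset, and the hyperparameters are all hypothetical choices for illustration, not the identifiers or settings used in this repository:

```python
import random


class Module:
    """Collects trainable Values, loosely following PyTorch's nn.Module."""

    def parameters(self):
        params = []
        for attr in vars(self).values():
            if isinstance(attr, Value):
                params.append(attr)
            elif isinstance(attr, Module):
                params.extend(attr.parameters())
            elif isinstance(attr, (list, tuple)):
                params.extend(a for a in attr if isinstance(a, Value))
        return params

    def zero_grad(self):
        # gradients accumulate across backward passes, so reset before each step
        for p in self.parameters():
            p.grad = 0.0


class Polynomial(Module):
    """y = c0 + c1*x + ... + c_d*x^d with coefficients as trainable Values."""

    def __init__(self, degree=5):
        self.coeffs = [Value(random.uniform(-1.0, 1.0)) for _ in range(degree + 1)]

    def __call__(self, x):
        out = Value(0.0)
        for i, c in enumerate(self.coeffs):
            out = out + c * (x ** i)  # x is a plain float, c a Value
        return out


# fit a quintic to a small noisy dataset with vanilla SGD
random.seed(0)
xs = [random.uniform(-1.0, 1.0) for _ in range(50)]
ys = [2.0 * x**3 - x + 0.1 * random.gauss(0.0, 1.0) for x in xs]

model = Polynomial(degree=5)
lr = 0.01
for epoch in range(200):
    for x, y in zip(xs, ys):
        model.zero_grad()
        pred = model(x)
        err = pred + (-y)          # the minimal Value above has no __sub__
        loss = err * err           # squared error
        loss.backward()
        for p in model.parameters():
            p.data -= lr * p.grad  # vanilla SGD update
```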

The resulting fit of the model looks like this:

(Figure: the fitted polynomial plotted against the generated data)

This minimal example illustrates how modern deep learning frameworks work. The mechanism is similar, but it is implemented for tensors instead of scalar variables and can optionally be executed on the GPU.
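
For comparison, the same gradient computation from above expressed in PyTorch, where the graph is recorded over tensors (and could be moved to the GPU with `.to('cuda')`):

```python
import torch

# same f(x) = x*x + x as before, but with a tensor that tracks gradients
x = torch.tensor(3.0, requires_grad=True)
y = x * x + x
y.backward()   # chain rule over the recorded graph
print(x.grad)  # tensor(7.)
```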

About

Minimal automatic differentiation example.

License: MIT

Languages: Python 100.0%