
smolTorch

A tiny Autograd engine inspired by Andrej Karpathy's Micrograd. Implements backpropagation over a dynamically built DAG and a small neural networks library on top of it with a PyTorch-like API. The DAG only operates over scalar values, so e.g. we chop up each neuron into all of its individual tiny adds and multiplies. However, this is enough to build up entire deep neural nets doing binary classification, as the demo notebook shows. Built for educational purposes.
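For intuition, here is a minimal sketch of how such a scalar engine can work. This is not the repo's actual code (core/engine.py supports more operations, e.g. relu and pow), but the shape is the same: every Value remembers its children and a small closure that knows how to push gradients backward, and backward() walks the DAG in reverse topological order.

class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._prev = set(_children)
        self._backward = lambda: None  # closure set by the op that created this node

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():  # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():  # d(out)/d(self) = other.data, and vice versa
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        topo, visited = [], set()
        def build(v):  # build a topological order of the DAG
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0  # seed d(self)/d(self)
        for v in reversed(topo):
            v._backward()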

Example usage

Below is a slightly contrived example showing a number of possible supported operations:

from core.engine import Value

a = Value(-4.0)
b = Value(2.0)
c = a + b
d = a * b + b**3
c += c + 1
c += 1 + c + (-a)
d += d * 2 + (b + a).relu()
d += 3 * d + (b - a).relu()
e = c - d
f = e**2
g = f / 2.0
g += 10.0 / f
print(f'{g.data:.4f}') # prints 24.7041, the outcome of this forward pass
g.backward()
print(f'{a.grad:.4f}') # prints 138.8338, i.e. the numerical value of dg/da
print(f'{b.grad:.4f}') # prints 645.5773, i.e. the numerical value of dg/db
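The analytic gradients above can be sanity-checked numerically. The snippet below is not part of the repo; it just re-runs the same forward pass as a plain function of a and compares against a centered finite difference.

def numerical_grad(f, x, h=1e-6):
    # centered difference: (f(x+h) - f(x-h)) / (2h)
    return (f(x + h) - f(x - h)) / (2 * h)

def g_of_a(a_val):
    # the same forward pass as above, with a as a free variable
    a = Value(a_val)
    b = Value(2.0)
    c = a + b
    d = a * b + b**3
    c += c + 1
    c += 1 + c + (-a)
    d += d * 2 + (b + a).relu()
    d += 3 * d + (b - a).relu()
    e = c - d
    f = e**2
    g = f / 2.0
    g += 10.0 / f
    return g.data

print(f'{numerical_grad(g_of_a, -4.0):.4f}')  # ~138.8338, matching a.grad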

Training a neural net

The notebook demo.ipynb provides a full demo of training a 2-layer neural network (MLP) binary classifier. This is done by implementing a simple SVM "max-margin" binary classification loss and using SGD for optimization. As shown in the notebook, a 2-layer neural net with two 16-node hidden layers achieves a clean decision boundary on the moon dataset (see the notebook for the plot). A sketch of the training loop follows below.
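This sketch assumes the nn module mirrors Micrograd's API (nn.MLP, parameters(), zero_grad()); the actual names in demo.ipynb may differ, and the toy data here stands in for the moon dataset.

from core.engine import Value
from core import nn

X = [[0.0, 1.0], [1.0, 0.0], [-1.0, 0.5], [0.5, -1.0]]  # toy 2D inputs
y = [1.0, 1.0, -1.0, -1.0]                               # labels in {-1, +1}

model = nn.MLP(2, [16, 16, 1])  # 2 inputs, two 16-node hidden layers, 1 output

for step in range(100):
    # forward pass: SVM "max-margin" (hinge) loss over the dataset
    scores = [model(xi) for xi in X]
    losses = [(1 + -yi * si).relu() for yi, si in zip(y, scores)]
    loss = sum(losses) * (1.0 / len(losses))

    # backward pass
    model.zero_grad()
    loss.backward()

    # SGD update
    for p in model.parameters():
        p.data -= 0.05 * p.grad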

Tracing / visualization

For added convenience, the notebook trace_graph.ipynb produces graphviz visualizations. E.g. the graph of a simple 2D neuron, arrived at by calling draw_dot on the code below, shows both the data (left number in each node) and the gradient (right number in each node).

from core.engine import Value
from core import nn

n = nn.Neuron(2)               # a neuron with two inputs
x = [Value(1.0), Value(-2.0)]  # a 2D input
y = n(x)                       # forward pass
y.backward()                   # populate gradients for the visualization
dot = draw_dot(y)              # draw_dot is defined in trace_graph.ipynb
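
Assuming draw_dot returns a graphviz.Digraph object, as in Micrograd's trace_graph notebook, the graph can also be written out to a file:

dot.render('neuron_graph', format='svg')  # writes neuron_graph.svg alongside the source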

License

MIT

Languages

Jupyter Notebook 98.8%, Python 1.2%