

Explicit Autodiff Table πŸ“š

An overview of major automatic differentiation primitive rules for explicit scalar and tensor operations, e.g., addition and multiplication.

πŸ‘‰ Here is the web version: https://ceyron.github.io/autodiff-table/.

πŸ’‘ Background

Given a unary function with one input and one output

$$ f(x) =: z $$

these autodiff rules define ways of obtaining the Jacobian-vector product (Jvp)

$$ \dot{x} \mapsto \frac{\partial f}{\partial x} \cdot \dot{x} = \dot{z} $$

and the vector-Jacobian product (vJp)

$$ \bar{z} \mapsto \bar{z}^T \frac{\partial f}{\partial x} = \bar{x}^T $$

without ever explicitly constructing the Jacobian matrix $\frac{\partial f}{\partial x}$.
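For example, JAX exposes exactly these two products as transformations. A minimal sketch (the function `f` and the input value are arbitrary choices for illustration):

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) ** 2  # an arbitrary unary example function

x = jnp.asarray(0.7)

# Jvp: push the tangent x_dot forward through f
z, z_dot = jax.jvp(f, (x,), (jnp.asarray(1.0),))

# vJp: pull the cotangent z_bar backward through f
z, f_vjp = jax.vjp(f, x)
(x_bar,) = f_vjp(jnp.asarray(1.0))

# For this scalar function, both match df/dx = 2 sin(x) cos(x)
print(z_dot, x_bar, 2.0 * jnp.sin(x) * jnp.cos(x))
```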

🧠 Modern automatic differentiation engines like JAX, TensorFlow, PyTorch, Autograd, Zygote, and many more then use these rules in different approaches (static graphs, dynamic graphs, source-code transformation, piggybacking, etc.) to compute derivatives of arbitrary programs, often in numerical computing domains like machine learning (especially deep learning) and scientific computing.
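As a rough sketch of what this looks like from the user side in JAX: `jax.grad` chains the per-primitive vJp rules of everything in the program (`sin`, `*`, `**`, `sum`) without ever materializing a Jacobian (example program chosen arbitrarily):

```python
import jax
import jax.numpy as jnp

def program(x):
    y = jnp.sin(x) * x
    return jnp.sum(y ** 2)

# Reverse-mode composition of the primitives' vJp rules
gradient = jax.grad(program)(jnp.arange(3.0))
```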

For a general function with multiple inputs and outputs, the Jvp for one specific output tangent is the sum of the contributions propagated forward from all input tangents. Conversely, the vJp for one specific input cotangent is the sum of the contributions propagated backward from all output cotangents.
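A minimal sketch of this with the multiplication primitive $z = x \cdot y$ (values chosen arbitrarily): the Jvp sums $y \dot{x} + x \dot{y}$ over both inputs, while the vJp hands each input its own cotangent.

```python
import jax

def f(x, y):
    return x * y

x, y = 2.0, 3.0

# Jvp: the output tangent is the sum over both input tangents,
# z_dot = y * x_dot + x * y_dot
_, z_dot = jax.jvp(f, (x, y), (1.0, 1.0))  # 3.0 * 1.0 + 2.0 * 1.0 = 5.0

# vJp: each input cotangent is backpropagated from the (single) output cotangent,
# x_bar = z_bar * y and y_bar = z_bar * x
_, f_vjp = jax.vjp(f, x, y)
x_bar, y_bar = f_vjp(1.0)                  # (3.0, 2.0)
```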

πŸ“– Resources

Check out the video playlist I created with in-depth derivations for most of the rules in this table. You can find the accompanying handwritten notes in the GitHub repository of the channel.
