There are 22 repositories under the autodiff topic.
Burn is a comprehensive dynamic deep learning framework built in Rust, with extreme flexibility, compute efficiency, and portability as its primary goals.
Deep learning in Rust, with shape-checked tensors and neural networks
Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
Autodifferentiation package in Rust.
An interface to various automatic differentiation backends in Julia.
[Experimental] Graph and Tensor Abstraction for Deep Learning all in Common Lisp
Automatic differentiation of implicit functions
An experimental deep learning framework for Nim based on a differentiable array programming language
Minimal deep learning library written from scratch in Python, using NumPy/CuPy.
A JIT compiler for hybrid quantum programs in PennyLane
A probabilistic programming language that combines automatic differentiation, automatic marginalization, and automatic conditioning within Monte Carlo methods.
FastAD is a C++ implementation of automatic differentiation supporting both forward and reverse mode.
A .NET library that provides fast, accurate, and automatic differentiation (computes derivatives / gradients) of mathematical functions.
Geometry processing utilities compatible with jax for autodifferentiation.
Custom Torch-style machine learning framework with automatic differentiation, implemented on NumPy; allows building GANs, VAEs, etc.
A library of C++ functions that support applications of Stan in pharmacometrics
Utilities for testing custom AD primitives.
Differentiable optical models as parameterised neural networks in Jax using Zodiax
Fazang is a Fortran library for reverse-mode automatic differentiation, inspired by the Stan Math library.
A lightweight auto-differentiation library that builds directly on NumPy. Used as homework for CMU 11785/11685/11485.
A lightweight deep learning framework made with ❤️
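Most of the libraries listed above implement some form of reverse-mode automatic differentiation: they record the operations applied to a value, then walk the resulting graph backwards to accumulate derivatives. A minimal sketch of that idea in plain Python follows; all class and function names here are illustrative and not taken from any repository in the list.

```python
class Var:
    """A scalar value that records how it was computed."""

    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent Var, local derivative)
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])


def backward(output):
    """Propagate d(output)/d(node) from the output to every input."""
    # Topologically order the graph so each node's gradient is complete
    # before it is pushed to its parents (handles shared subexpressions).
    order, seen = [], set()

    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)

    visit(output)
    output.grad = 1.0
    for node in reversed(order):
        for parent, local in node.parents:
            parent.grad += local * node.grad


x = Var(3.0)
y = Var(4.0)
z = x * y + x          # z = x*y + x = 15
backward(z)
print(x.grad, y.grad)  # dz/dx = y + 1 = 5.0, dz/dy = x = 3.0
```

Production libraries differ mainly in scale, not in kind: they operate on tensors rather than scalars, fuse or compile the recorded graph, and add forward-mode and higher-order variants on top of the same bookkeeping.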