Michael E. Sander's repositories
momentumnet
Drop-in replacement for any ResNet with a significantly reduced memory footprint and better representation capabilities
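The memory savings come from making each residual update invertible via a velocity (momentum) variable, so activations can be recomputed in the backward pass instead of stored. A minimal numpy sketch of that forward/inverse pair, with illustrative function names (not the repo's API):

```python
import numpy as np

def momentum_forward(x, residual_fns, gamma=0.9):
    """Momentum residual network forward pass (sketch).

    Each layer f updates a velocity v and the activation x:
        v <- gamma * v + (1 - gamma) * f(x)
        x <- x + v
    """
    v = np.zeros_like(x)
    for f in residual_fns:
        v = gamma * v + (1 - gamma) * f(x)
        x = x + v
    return x, v

def momentum_inverse(x, v, residual_fns, gamma=0.9):
    """Recover the layer inputs exactly by running the updates backwards.

    Because (x, v) before each layer is recoverable from (x, v) after it,
    no intermediate activations need to be stored for backprop.
    """
    for f in reversed(residual_fns):
        x = x - v
        v = (v - (1 - gamma) * f(x)) / gamma
    return x, v
```

Running the inverse on the forward output reconstructs the original input to numerical precision, which is what allows the activation memory to be dropped.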
sinkformers
Transformers with doubly stochastic attention
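"Doubly stochastic" means every row and every column of the attention matrix sums to 1, instead of rows only as in softmax. This is obtained by replacing the softmax normalization with Sinkhorn iterations; a minimal numpy sketch of that projection (illustrative, not the repo's implementation):

```python
import numpy as np

def sinkhorn_attention(scores, n_iters=200):
    """Map attention scores to a (near-)doubly stochastic matrix.

    Softmax normalizes rows only; Sinkhorn's algorithm alternates
    row and column normalization of a positive kernel so that, at
    convergence, both rows and columns sum to 1.
    """
    K = np.exp(scores - scores.max())  # positive kernel, stabilized
    for _ in range(n_iters):
        K = K / K.sum(axis=1, keepdims=True)  # normalize rows
        K = K / K.sum(axis=0, keepdims=True)  # normalize columns
    return K
```

The last step normalizes columns exactly; the rows are then only approximately normalized, with the gap shrinking as the iterations converge.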
resnet_nodes
Do ResNets discretize Neural ODEs?
implicit-regularization-resnets-nodes
Implicit Regularization of ResNets towards Neural ODEs
google-research
Google Research
jacobian_free_backprop
Implicit networks can be trained efficiently and simply by using Jacobian-free Backprop (JFB).
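An implicit network outputs a fixed point z* = f(z*, x). Exact backprop through z* requires solving a linear system involving (I - df/dz); JFB simply drops that inverse and differentiates a single application of f at the fixed point. A scalar toy sketch of the idea (the model and names here are illustrative, not the repo's API):

```python
import numpy as np

def implicit_layer_jfb(w, x, n_iter=100):
    """JFB sketch on the scalar implicit layer z = tanh(w*z + x).

    Forward: plain fixed-point iteration, storing no intermediates.
    Backward (JFB): differentiate ONE application of f at the fixed
    point, treating the incoming z as constant, instead of solving
    with (1 - df/dz)^{-1} as exact implicit differentiation would.
    """
    z = 0.0
    for _ in range(n_iter):
        z = np.tanh(w * z + x)
    # JFB gradient of z* w.r.t. w: d/dw tanh(w*z + x) with z held fixed
    dz_dw_jfb = (1 - z**2) * z
    # exact implicit gradient, for comparison:
    #   dz_dw = (1 - z**2) * z / (1 - w * (1 - z**2))
    return z, dz_dw_jfb
```

The JFB gradient is cheap (one Jacobian-vector product) and, under the conditions studied in the paper, still a descent direction for training.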
jaxopt
Hardware accelerated, batchable and differentiable optimizers in JAX.
Local-Lipschitz-Constants
[NeurIPS 2022] Code for paper "Efficiently Computing Local Lipschitz Constants of Neural Networks via Bound Propagation"
mamba-minimal
Simple, minimal implementation of the Mamba SSM in one file of PyTorch.
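At its core, an SSM layer applies a linear state-space recurrence over the sequence (Mamba additionally makes the matrices input-dependent). A plain, non-selective numpy sketch of that recurrence, not the repo's code:

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Minimal discrete state-space scan:
        h_t = A h_{t-1} + B u_t   (state update)
        y_t = C h_t               (readout)
    for a scalar input sequence u and hidden state of dim N.
    """
    h = np.zeros(A.shape[0])
    ys = []
    for u_t in u:
        h = A @ h + B * u_t
        ys.append(C @ h)
    return np.array(ys)
```

With A = 0.5·I, B = C = 1 and input [1, 0, 0], the impulse decays geometrically: the outputs are [2, 1, 0.5] for a 2-dimensional state.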
michaelsdr.github.io
Github Pages template for academic personal websites, forked from mmistakes/minimal-mistakes
T2T-ViT
ICCV2021, Tokens-to-Token ViT: Training Vision Transformers from Scratch on ImageNet
ViT-CIFAR
PyTorch implementation of the Vision Transformer [Dosovitskiy et al., ICLR'21], modified to obtain over 90% accuracy FROM SCRATCH on CIFAR-10 with a small number of parameters (6.3M, vs. 86M for the original ViT-B)
mistral-src
Reference implementation of Mistral AI 7B v0.1 model.
pytorch-image-models
PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXt, EfficientNet, NFNet, Vision Transformer (ViT), MobileNet-V3/V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more