Aditya Singh's repositories
SharpenFocus-Pytorch
An unofficial PyTorch-based implementation
convmixer-cifar
A PyTorch experiment setup for ConvMixer on the CIFAR datasets
awesome-jax
A curated list of JAX resources https://github.com/google/jax
awesome-trusty
Curation and brief overviews of reliability and trustworthiness of predictions
BigGAN-PyTorch
See https://github.com/ilyakava/gan for results on Imagenet 128. Code for a Multi-Hinge Loss with K+1 Conditional GANs
CLIP
Contrastive Language-Image Pretraining
CutMix-PyTorch
Official PyTorch implementation of the CutMix regularizer
DATM
ICLR 2024, Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching
equinox
Callable PyTrees and filtered transforms => neural networks in JAX. https://docs.kidger.site/equinox/
FTD-distillation
The code of the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR2023)
HoMM
High order Moment Models
Made-With-ML
Learn how to responsibly deliver value with ML.
mkdocs-material
Documentation that simply works
NeurIPS-2019
Repository for a submission to NeurIPS 2019
optax
Optax is a gradient processing and optimization library for JAX.
Parametric-Contrastive-Learning
Parametric Contrastive Learning (ICCV2021)
Patch-Fool
[ICLR 2022] "Patch-Fool: Are Vision Transformers Always Robust Against Adversarial Perturbations?" by Yonggan Fu, Shunyao Zhang, Shang Wu, Cheng Wan, Yingyan Lin
pytorch-lightning
Pretrain, finetune, and deploy AI models on multiple GPUs and TPUs with zero code changes.
rlcard
Reinforcement Learning / AI Bots in Card (Poker) Games - Blackjack, Leduc, Texas, DouDizhu, Mahjong, UNO.
Robust-Vision-Transformer
The implementation of our paper: Towards Robust Vision Transformer (CVPR2022)
URepDistiller
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods