TanmDL's starred repositories
vit-pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
recommenders
Best Practices on Recommendation Systems
Swin-Transformer
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
Awesome-Transformer-Attention
A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites
loss-landscape
Code for visualizing the loss landscape of neural nets
benchmark_VAE
Unifying Variational Autoencoder (VAE) implementations in PyTorch (NeurIPS 2022)
lightweight-gan
Implementation of the "lightweight" GAN proposed in ICLR 2021, in PyTorch. High-resolution image generation that can be trained within a day or two
mlp-mixer-pytorch
An All-MLP solution for Vision, from Google AI
how-do-vits-work
(ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"
sharpened-cosine-similarity
An alternative to convolution in neural networks
dl-with-bayes
Contains code for the NeurIPS 2019 paper "Practical Deep Learning with Bayesian Principles"
Awesome-Few-Shot-Class-Incremental-Learning
Awesome Few-Shot Class-Incremental Learning
pytorch-auto-augment
PyTorch implementation of AutoAugment.
pytorch-image-generation-metrics
PyTorch implementation of common image generation metrics.
Instructions-of-the-PersonX-dataset
Images of the PersonX dataset and the original 3D human models of this dataset
Elastic-Impedance-Inversion-Using-Recurrent-Neural-Networks
Code and data for the paper: M. Alfarraj and G. AlRegib, "Semi-Supervised Sequence Modeling for Elastic Impedance Inversion," in Interpretation, Aug. 2019
Transductive_ZSL_3D_Point_Cloud
Implementation of "Transductive Zero-Shot Learning for 3D Point Cloud Classification"
pure-noise
Official implementation for "Pure Noise to the Rescue of Insufficient Data: Improving Imbalanced Classification by Training on Random Noise Images" https://arxiv.org/abs/2112.08810