fire23's starred repositories

q-diffusion

[ICCV 2023] Q-Diffusion: Quantizing Diffusion Models.

Language: Python · License: MIT · Stars: 305
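
The building block behind post-training quantization schemes like this one is a uniform fake-quantizer. A minimal generic sketch follows (illustrative only; Q-Diffusion's actual contribution is its timestep-aware calibration, which this does not reproduce):

```python
import torch

def uniform_quantize(w: torch.Tensor, n_bits: int = 8) -> torch.Tensor:
    """Per-tensor symmetric uniform quantization: a generic PTQ building
    block, NOT Q-Diffusion's calibration scheme (hypothetical sketch)."""
    qmax = 2 ** (n_bits - 1) - 1
    scale = w.abs().max() / qmax                       # scale from tensor range
    q = torch.clamp(torch.round(w / scale), -qmax - 1, qmax)
    return q * scale                                   # dequantized ("fake-quant") weights

w = torch.randn(64, 64)
w_q = uniform_quantize(w, n_bits=4)
print((w - w_q).abs().max())  # quantization error shrinks as n_bits grows
```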

fast-compress-vit

Official PyTorch implementation of "A Fast Training-free Compression Framework for Vision Transformers".

License: MIT · Stars: 4

get

Generative Equilibrium Transformer

Language: Python · License: MIT · Stars: 17

T2T-ViT

[ICCV 2021] Tokens-to-Token ViT: Training Vision Transformers from Scratch on ImageNet

Language: Jupyter Notebook · License: NOASSERTION · Stars: 1135

deit

Official DeiT repository

Language: Python · License: Apache-2.0 · Stars: 3968

Awesome-Quantization-Papers

List of papers related to neural network quantization in recent AI conferences and journals.

License: MIT · Stars: 386

DiffRate

[ICCV 2023] An approach that improves Vision Transformer (ViT) efficiency by combining token pruning and token merging, with a differentiable compression rate.

Language: Jupyter Notebook · Stars: 82
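
As a rough sketch of the token-pruning half of the idea only (generic top-k pruning by an importance score; DiffRate's token merging and differentiable-rate machinery are not reproduced here):

```python
import torch

def prune_tokens(x: torch.Tensor, scores: torch.Tensor, keep_ratio: float):
    """Generic top-k token pruning (illustrative sketch, not DiffRate itself).
    x: (B, N, D) token embeddings; scores: (B, N) per-token importance."""
    B, N, D = x.shape
    k = max(1, int(N * keep_ratio))
    idx = scores.topk(k, dim=1).indices                    # (B, k) kept token ids
    return x.gather(1, idx.unsqueeze(-1).expand(B, k, D))  # (B, k, D)

x = torch.randn(2, 197, 768)   # ViT-B token sequence
scores = torch.rand(2, 197)    # e.g. attention each token receives
print(prune_tokens(x, scores, keep_ratio=0.5).shape)  # torch.Size([2, 98, 768])
```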

VisionMamba

Implementation of Vision Mamba from the paper "Vision Mamba: Efficient Visual Representation Learning with Bidirectional State Space Model". It is 2.8x faster than DeiT and saves 86.8% GPU memory during batch inference for feature extraction on high-resolution images.

Language: Python · License: MIT · Stars: 329

pytorch-OpCounter

Count the MACs / FLOPs of your PyTorch model.

Language: Python · License: MIT · Stars: 4807
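
Typical usage, assuming the package is installed from PyPI under the name thop:

```python
import torch
from torchvision.models import resnet18
from thop import profile  # pip install thop (pytorch-OpCounter's package name)

model = resnet18()
dummy = torch.randn(1, 3, 224, 224)
macs, params = profile(model, inputs=(dummy,))  # returns MAC count and parameter count
print(f"MACs: {macs / 1e9:.2f} G, Params: {params / 1e6:.2f} M")
```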

HiDiffusion

[ECCV 2024] HiDiffusion: Increases the resolution and speed of your diffusion model by only adding a single line of code!

Language: Jupyter Notebook · License: Apache-2.0 · Stars: 716
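
The advertised one-line integration looks roughly like the following; apply_hidiffusion is the entry point as I recall the README, so treat the exact name as an assumption and check the repo:

```python
import torch
from diffusers import StableDiffusionXLPipeline
from hidiffusion import apply_hidiffusion  # assumed entry point; see the repo README

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
apply_hidiffusion(pipe)  # the advertised "single line": patches the pipeline for high-res

image = pipe("a photo of a cat", height=2048, width=2048).images[0]
image.save("cat_2048.png")
```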

glow

Code for reproducing results in "Glow: Generative Flow with Invertible 1x1 Convolutions"

Language: Python · License: MIT · Stars: 3103
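
A minimal PyTorch sketch of the paper's headline component, the invertible 1x1 convolution (the actual repo is TensorFlow and additionally offers an LU-decomposed parameterization):

```python
import torch
import torch.nn as nn

class Invertible1x1Conv(nn.Module):
    """Invertible 1x1 convolution as in Glow: a learned channel mixing
    whose log-determinant enters the flow likelihood. Minimal sketch."""
    def __init__(self, channels: int):
        super().__init__()
        w, _ = torch.linalg.qr(torch.randn(channels, channels))  # random rotation: invertible
        self.weight = nn.Parameter(w)

    def forward(self, x):                                # x: (B, C, H, W)
        B, C, H, W = x.shape
        z = torch.einsum("ij,bjhw->bihw", self.weight, x)
        logdet = H * W * torch.slogdet(self.weight)[1]   # log|det W| per spatial position
        return z, logdet

    def inverse(self, z):
        return torch.einsum("ij,bjhw->bihw", torch.inverse(self.weight), z)
```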

latent-consistency-model

Latent Consistency Models: Synthesizing High-Resolution Images with Few-Step Inference

Language: Python · License: MIT · Stars: 4245
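
Few-step sampling is most easily tried through Hugging Face diffusers with a released LCM checkpoint (a convenience route, not this repo's own scripts; the checkpoint name below is the commonly used public release):

```python
import torch
from diffusers import DiffusionPipeline

# Load a pretrained latent consistency model checkpoint.
pipe = DiffusionPipeline.from_pretrained(
    "SimianLuo/LCM_Dreamshaper_v7", torch_dtype=torch.float16
).to("cuda")

# LCMs trade the usual 25-50 denoising steps for ~4 steps.
image = pipe("a scenic mountain lake at sunrise",
             num_inference_steps=4, guidance_scale=8.0).images[0]
image.save("lcm_sample.png")
```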

celeba-hq-modified

Modified h5tool.py to make it easier to obtain the CelebA-HQ dataset.

Language: Python · License: MIT · Stars: 119

vqvae

A PyTorch implementation of the vector-quantized variational autoencoder (https://arxiv.org/abs/1711.00937).

Language: Jupyter Notebook · Stars: 565
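
The core quantization step such implementations revolve around, as a minimal sketch with the straight-through gradient estimator:

```python
import torch

def vector_quantize(z_e: torch.Tensor, codebook: torch.Tensor):
    """Core VQ-VAE step (minimal sketch): snap each encoder vector to its
    nearest codebook entry, with a straight-through gradient estimator.
    z_e: (N, D) encoder outputs; codebook: (K, D) embeddings."""
    d = torch.cdist(z_e, codebook)   # (N, K) pairwise distances
    idx = d.argmin(dim=1)            # nearest code per vector
    z_q = codebook[idx]              # quantized vectors
    # Straight-through: forward uses z_q, backward passes gradients to z_e.
    return z_e + (z_q - z_e).detach(), idx

z_e = torch.randn(16, 64, requires_grad=True)
codebook = torch.randn(512, 64)
z_q, idx = vector_quantize(z_e, codebook)
z_q.sum().backward()  # gradients flow into z_e despite the discrete lookup
```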

make-CelebA-HQ

Assuming you have downloaded the CelebA & CelebA-HQ datasets, this tool extracts the HQ images from them.

Language: Python · Stars: 21

progressive_growing_of_gans

Progressive Growing of GANs for Improved Quality, Stability, and Variation

Language: Python · License: NOASSERTION · Stars: 6079

torch-fidelity

High-fidelity performance metrics for generative models in PyTorch

Language: Python · License: NOASSERTION · Stars: 952
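
Typical usage via its documented calculate_metrics entry point:

```python
import torch_fidelity

# Compare two image folders; metrics use the library's validated
# Inception features for reproducible numbers.
metrics = torch_fidelity.calculate_metrics(
    input1="path/to/generated_images",
    input2="path/to/real_images",
    cuda=True,
    isc=True,   # Inception Score
    fid=True,   # Frechet Inception Distance
    kid=True,   # Kernel Inception Distance (needs enough samples per folder)
)
print(metrics)  # dict of metric name -> value
```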

PixelCNN

PyTorch implementation of gated PixelCNN

Language: Python · Stars: 53
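
The ingredient these PixelCNN implementations share is the causal masked convolution. A minimal sketch:

```python
import torch
import torch.nn as nn

class MaskedConv2d(nn.Conv2d):
    """Causal masked convolution at the heart of PixelCNN (minimal sketch).
    Mask 'A' (first layer) also hides the center pixel; 'B' keeps it."""
    def __init__(self, mask_type: str, *args, **kwargs):
        super().__init__(*args, **kwargs)
        _, _, h, w = self.weight.shape
        mask = torch.zeros_like(self.weight)
        mask[:, :, :h // 2] = 1                                # rows strictly above center
        mask[:, :, h // 2, :w // 2 + (mask_type == "B")] = 1   # left of (or incl.) center
        self.register_buffer("mask", mask)

    def forward(self, x):
        self.weight.data *= self.mask   # zero out weights that would see "future" pixels
        return super().forward(x)

conv = MaskedConv2d("A", 1, 32, kernel_size=7, padding=3)
print(conv(torch.randn(1, 1, 28, 28)).shape)  # torch.Size([1, 32, 28, 28])
```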

PixelCNN-Pytorch

A naive implementation of PixelCNN in PyTorch, as described in van den Oord et al.

Language: Python · License: GPL-3.0 · Stars: 59

pytorchviz

A small package to create visualizations of PyTorch execution graphs

Language: Jupyter Notebook · License: MIT · Stars: 3131
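
Typical usage (requires the graphviz system package alongside pip's torchviz):

```python
import torch
from torch import nn
from torchviz import make_dot  # pip install torchviz

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
y = model(torch.randn(1, 8))

# Build a graphviz Digraph of the autograd graph and render it to a file.
dot = make_dot(y, params=dict(model.named_parameters()))
dot.render("model_graph", format="png")  # writes model_graph.png
```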

DL-Demos

Demos for deep learning

Language: Python · Stars: 328

SwissRoll-VAE-pytorch

VAE model for approximating the distribution of the Swiss roll dataset.

Language: Python · Stars: 1

PyTorch-VAE

A Collection of Variational Autoencoders (VAE) in PyTorch.

Language: Python · License: Apache-2.0 · Stars: 6361
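
Common to every model in such a collection is the reparameterization trick; a minimal sketch:

```python
import torch

def reparameterize(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """Sample z ~ N(mu, sigma^2) as a differentiable function of (mu, logvar),
    so the sampling step does not block gradient flow."""
    std = torch.exp(0.5 * logvar)
    eps = torch.randn_like(std)   # the noise itself carries no gradient
    return mu + eps * std         # gradients flow through mu and logvar

mu = torch.zeros(4, 20, requires_grad=True)
logvar = torch.zeros(4, 20, requires_grad=True)
z = reparameterize(mu, logvar)
z.sum().backward()                # works: sampling stays differentiable
```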

vae

A simple VAE and CVAE in Keras.

Language: Python · Stars: 1237

enhancing-transformers

An unofficial implementation of both ViT-VQGAN and RQ-VAE in PyTorch.

Language: Python · License: MIT · Stars: 277

gpt-2

Code for the paper "Language Models are Unsupervised Multitask Learners"

Language: Python · License: NOASSERTION · Stars: 22096

DALL-E

PyTorch package for the discrete VAE used for DALL·E.

Language: Python · License: NOASSERTION · Stars: 10765