Seunghyun Lee's repositories
KD_methods_with_TF
Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods; more will be added.)
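The core idea shared by most of these methods is Hinton-style knowledge distillation: the student is trained to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of that loss (the function name, shapes, and default temperature are illustrative, not taken from the repository):

```python
import numpy as np

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between temperature-softened teacher and student
    distributions, averaged over the batch (Hinton et al. style KD)."""
    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)  # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    p_t = softmax(teacher_logits / T)  # softened teacher targets
    p_s = softmax(student_logits / T)  # softened student predictions
    # KL(p_t || p_s), scaled by T^2 so gradients stay comparable
    # in magnitude to the hard-label cross-entropy term.
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1)
    return float(np.mean(kl) * T ** 2)
```

In practice this term is mixed with the ordinary cross-entropy on ground-truth labels via a weighting coefficient.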
Knowledge_distillation_via_TF2.0
Code and benchmark results for recent knowledge distillation algorithms, implemented with the TF2.0 low-level API
GALA_TF2.0
TensorFlow 2.0 implementation of "Symmetric Graph Convolutional Autoencoder for Unsupervised Graph Representation Learning" (ICCV 2019)
Lightweighting_Cookbook
A cookbook for neural network training and lightweighting, covering three kinds of lightweighting techniques: knowledge distillation, filter pruning, and quantization.
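Of the three techniques listed, filter pruning is the easiest to show in a few lines: rank a convolution layer's output filters by L1 norm and keep only the strongest ones. A hedged NumPy sketch (the function name, `keep_ratio` parameter, and TF-style `(kh, kw, in_ch, out_ch)` weight layout are assumptions for illustration, not code from the cookbook):

```python
import numpy as np

def prune_filters(weights, keep_ratio=0.5):
    """L1-norm filter pruning: keep the output filters with the largest
    L1 norms. `weights` has TF layout (kh, kw, in_ch, out_ch)."""
    norms = np.abs(weights).sum(axis=(0, 1, 2))   # one L1 norm per output filter
    n_keep = max(1, int(round(norms.shape[0] * keep_ratio)))
    keep = np.sort(np.argsort(norms)[-n_keep:])   # surviving filter indices, in order
    return weights[..., keep], keep
```

The returned indices are also needed to slice the input channels of the *next* layer's kernel, which is what makes pruning a whole-network transformation rather than a per-layer one.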
Variational_Information_Distillation
Reproduction of Variational Information Distillation (CVPR 2019) (work in progress)
TF2-jit-compile-on-multi-gpu
TensorFlow 2 training code with JIT compilation on multiple GPUs.
Autoslim_TF2
Implementation of AutoSlim using TensorFlow 2
awesome-knowledge-distillation
Awesome Knowledge Distillation
CNN_via_Tensorflow2_low-level
Convolutional neural network implementation using only the TensorFlow 2.0 low-level API
pytorch-cifar
95.16% on CIFAR10 with PyTorch
aingo03304.github.io
Minjae's Blog
github-readme-stats
:zap: Dynamically generated stats for your GitHub READMEs
putting-nerf-on-a-diet
Putting NeRF on a Diet: Semantically Consistent Few-Shot View Synthesis Implementation
pytorch-retinanet
PyTorch implementation of RetinaNet object detection.
slimmable_networks
Slimmable Networks, AutoSlim, and Beyond, ICLR 2019, and ICCV 2019
SPT_LSA_ViT
Implementation of "Vision Transformer for Small-Size Datasets"
tensorflow
An Open Source Machine Learning Framework for Everyone