LU's repositories
Awesome-Efficient-AI-for-Large-Scale-Models
A paper survey of efficient computation for large-scale models.
ATD
Adaptive Temperature Distillation Method for Mining Hard Sample’s Knowledge
Conference-Accepted-Paper-List
Accepted paper lists from various conferences (including AI, ML, and Robotics)
Deep-Class-Incremental-Learning
The code repository for "Deep Class-Incremental Learning: A Survey" in PyTorch.
Dipoorlet
Offline quantization tools for deployment.
Dual-Cross
Cross-Domain and Cross-Modal Knowledge Distillation in Domain Adaptation for 3D Semantic Segmentation (ACM MM 2022)
ECON
[CVPR 2023] ECON: Explicit Clothed humans Obtained from Normals
FlatTrajectoryDistillation_FTD
Code for the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023)
GaussianDistillation
Data-free knowledge distillation using Gaussian noise (NeurIPS paper)
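A minimal sketch of the idea, assuming a PyTorch teacher/student pair: Gaussian noise stands in for real images as the distillation input, and the student matches the teacher's softened logits. The function and argument names below are illustrative, not the repository's actual API.

```python
import torch
import torch.nn.functional as F

def data_free_distill_step(teacher, student, optimizer,
                           batch_size=64, image_shape=(3, 32, 32), T=4.0):
    # Sample Gaussian noise as surrogate inputs (no real data needed).
    noise = torch.randn(batch_size, *image_shape)
    with torch.no_grad():
        teacher_logits = teacher(noise)
    student_logits = student(noise)
    # KL divergence between softened distributions, scaled by T^2 as in standard KD.
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```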
GENIUS
Can GPT-4 Perform Neural Architecture Search?
HieraSeg
Deep Hierarchical Semantic Segmentation (CVPR 2022): a structured, pixel-wise description of visual scenes in terms of the class hierarchy.
KDSR
Official implementation of 'Knowledge Distillation based Degradation Estimation for Blind Super-Resolution' (ICLR 2023)
Multi-Level-Logit-Distillation
Code for 'Multi-level Logit Distillation' (CVPR 2023)
OKDPH
Generalization Matters: Loss Minima Flattening via Parameter Hybridization for Efficient Online Knowledge Distillation
online-hyperparameter-optimization
PyTorch implementation of "Online Hyperparameter Optimization for Class-Incremental Learning" (AAAI 2023)
PreNAS
The official implementation of the paper "PreNAS: Preferred One-Shot Learning Towards Efficient Neural Architecture Search"
prompt-in-context-learning
Awesome resources for in-context learning and prompt engineering: mastering LLMs such as ChatGPT, GPT-3, and FlanT5, kept up to date with cutting-edge developments.
segment-anything
The repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
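A brief usage sketch of SAM's prompt-based predictor, following the upstream README; the checkpoint filename and the point-prompt coordinates are placeholders.

```python
import cv2
import numpy as np
from segment_anything import SamPredictor, sam_model_registry

# Load a pretrained checkpoint (path and model type depend on the checkpoint you download).
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

# SAM expects an RGB image in HWC format.
image = cv2.cvtColor(cv2.imread("example.jpg"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# Prompt with a single foreground point; SAM returns candidate masks with quality scores.
masks, scores, logits = predictor.predict(
    point_coords=np.array([[500, 375]]),
    point_labels=np.array([1]),
    multimask_output=True,
)
```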
SMD
PyTorch implementation of 'Improving Self-supervised Lightweight Model Learning via Hard-aware Metric Distillation' (ECCV 2022)
SMP
Pruning Pre-trained Language Models Without Fine-Tuning
superclass-FSIS
This is the official implementation of the paper "Instance-level Few-shot Learning with Class Hierarchy Mining"
XDED
Official PyTorch implementation of "Cross-Domain Ensemble Distillation for Domain Generalization" (ECCV 2022)