Repositories under the model-distillation topic:
Awesome Knowledge Distillation
Go from images to inference with no labeling: use foundation models to automatically label data and train supervised models (see the pseudo-labeling sketch after this list).
🚀 PyTorch Implementation of "Progressive Distillation for Fast Sampling of Diffusion Models" (v-diffusion)
Mechanistically interpretable neurosymbolic AI (Nature Computational Science, 2024): losslessly compressing neural networks into computer code and discovering new algorithms that generalize out-of-distribution and outperform human-designed algorithms
Our open-source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188)
The Codebase for Causal Distillation for Language Models (NAACL '22)
A framework for knowledge distillation using TensorRT inference on the teacher network
The Codebase for Causal Distillation for Task-Specific Models
Repository for the publication "AutoGraph: Predicting Lane Graphs from Traffic"
Awesome Deep Model Compression
[Master's Thesis] Research project at the Data Analytics Lab in collaboration with Daedalean AI. The thesis was submitted to both ETH Zürich and Imperial College London.
Autodistill Google Cloud Vision module for use in training a custom, fine-tuned model.
Use AWS Rekognition to train custom models that you own.
Use LLaMA to label data for use in training a fine-tuned LLM.
Model distillation of CNNs for classification of seafood images in PyTorch (a minimal distillation-loss sketch follows this list)
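Several of the repositories above (for example, the seafood CNN distillation project) build on the same core technique: training a small student network against the temperature-softened outputs of a larger teacher. The following is a rough illustrative sketch of the standard soft-target distillation loss (Hinton et al., 2015) in PyTorch, not code from any listed repository; the temperature T, the weighting alpha, and the toy shapes are assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-target distillation loss: blend the KL divergence against the
    teacher's softened distribution with ordinary cross-entropy.

    T and alpha are illustrative hyperparameters, not values taken from
    any repository listed above.
    """
    # KL divergence between temperature-softened teacher and student
    # distributions; the T*T factor keeps gradient magnitudes comparable
    # across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: random logits for a 10-class problem.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)  # in practice the teacher runs under torch.no_grad()
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

Raising T flattens both distributions, so the student learns from the teacher's relative rankings of wrong classes ("dark knowledge") rather than only its top prediction.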
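The Autodistill-style entries ("images to inference with no labeling", the Google Cloud Vision, AWS Rekognition, and LLaMA modules) all share one workflow: a large foundation model pseudo-labels raw data, and those labels supervise a small, deployable model. The sketch below illustrates that idea with zero-shot CLIP from the Hugging Face transformers library. It is a generic illustration of the workflow, not the Autodistill API; the checkpoint name, class prompts, and confidence threshold are assumptions.

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Illustrative checkpoint and class prompts (assumptions, not from the repos above).
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
class_prompts = ["a photo of a shrimp", "a photo of a trout", "a photo of a sea bass"]

@torch.no_grad()
def pseudo_label(image_paths, threshold=0.8):
    """Assign a class index to each image using zero-shot CLIP.

    Images whose best score falls below `threshold` are skipped, so only
    confident pseudo-labels enter the student's training set.
    """
    dataset = []
    for path in image_paths:
        image = Image.open(path).convert("RGB")
        inputs = processor(
            text=class_prompts, images=image, return_tensors="pt", padding=True
        )
        # logits_per_image has shape (1, num_prompts); softmax gives a
        # probability over the candidate classes.
        probs = model(**inputs).logits_per_image.softmax(dim=-1)[0]
        score, label = probs.max(dim=-1)
        if score.item() >= threshold:
            dataset.append((path, label.item()))
    return dataset  # (image_path, class_index) pairs for supervised training
```

The resulting pairs can feed any ordinary supervised training loop (or the distillation loss above), which is the "use foundation models to train supervised models" idea in a nutshell.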