There are 12 repositories under the self-distillation topic.
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
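Since the survey centers on distillation algorithms, a minimal PyTorch sketch of the standard soft-label KD objective (temperature-scaled KL to a teacher plus cross-entropy on hard labels) may help orient readers; the function name and hyperparameter values are illustrative, not taken from the repository.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic KD: blend hard-label CE with KL to the softened teacher."""
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # The T^2 factor keeps the soft-target gradient magnitude comparable
    # to the hard-label term as the temperature changes.
    soft = F.kl_div(log_student, soft_teacher, reduction="batchmean") * T * T
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```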
A PyTorch implementation of the paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', https://arxiv.org/abs/1905.08094
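The paper's core idea is that the network's deepest classifier acts as an in-network teacher for shallower auxiliary exits. Below is a minimal sketch of that objective, assuming a multi-exit model that returns a list of logits (shallowest first); the paper's feature-hint term is omitted and all names are illustrative.

```python
import torch.nn.functional as F

def byot_loss(exit_logits, labels, T=3.0, alpha=0.3):
    """exit_logits: list of logits, shallowest first, deepest (teacher) last."""
    teacher = exit_logits[-1].detach()  # deepest classifier is the teacher
    soft_teacher = F.softmax(teacher / T, dim=-1)
    loss = F.cross_entropy(exit_logits[-1], labels)
    for logits in exit_logits[:-1]:
        ce = F.cross_entropy(logits, labels)
        kl = F.kl_div(F.log_softmax(logits / T, dim=-1),
                      soft_teacher, reduction="batchmean") * T * T
        loss = loss + (1.0 - alpha) * ce + alpha * kl
    return loss
```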
Deep Hash Distillation for Image Retrieval - ECCV 2022
Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression
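One common reading of "weighted ground-truth targets" is a convex combination of the one-hot label with the model's own earlier predictions; the sketch below assumes that formulation (the repository's exact weighting may differ).

```python
import torch.nn.functional as F

def weighted_target_loss(logits, labels, prev_probs, w=0.7):
    """prev_probs: the model's softmax outputs from an earlier pass."""
    one_hot = F.one_hot(labels, num_classes=logits.size(-1)).float()
    # Soften the hard labels toward the model's own previous predictions.
    target = w * one_hot + (1.0 - w) * prev_probs
    return -(target * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()
```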
(Unofficial) Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019)
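The title describes the mechanism: feed two differently distorted views of each image through the same network and penalize disagreement between their predictions. A minimal sketch of that consistency term (the paper's feature-level loss is omitted; names are illustrative):

```python
import torch.nn.functional as F

def ddgsd_loss(model, x1, x2, y, lam=1.0):
    """x1, x2: two differently distorted versions of the same images."""
    l1, l2 = model(x1), model(x2)
    ce = F.cross_entropy(l1, y) + F.cross_entropy(l2, y)
    log_p1, log_p2 = F.log_softmax(l1, dim=-1), F.log_softmax(l2, dim=-1)
    p1, p2 = log_p1.exp(), log_p2.exp()
    # Symmetric KL pushes the two views toward consistent predictions.
    kl = (F.kl_div(log_p1, p2.detach(), reduction="batchmean") +
          F.kl_div(log_p2, p1.detach(), reduction="batchmean"))
    return ce + lam * kl
```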
Self-Distillation and Knowledge Distillation Experiments with PyTorch.
PyTorch implementation of "Emerging Properties in Self-Supervised Vision Transformers" (a.k.a. DINO)
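DINO's self-distillation loop keeps a teacher as an exponential moving average of the student and trains the student to match the teacher's centered, sharpened output on a different augmented view. A minimal sketch with illustrative temperatures and momentum; the full method also maintains an EMA center and multi-crop views, both elided here.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def ema_update(teacher, student, m=0.996):
    # Teacher weights drift slowly toward the student's (no gradients).
    for pt, ps in zip(teacher.parameters(), student.parameters()):
        pt.mul_(m).add_(ps, alpha=1.0 - m)

def dino_loss(student_out, teacher_out, center, t_s=0.1, t_t=0.04):
    # Sharpen the teacher with a low temperature and subtract a running
    # center of teacher outputs to prevent collapse to one dimension.
    p_t = F.softmax((teacher_out - center) / t_t, dim=-1).detach()
    log_p_s = F.log_softmax(student_out / t_s, dim=-1)
    return -(p_t * log_p_s).sum(dim=-1).mean()
```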
A minimalist unofficial implementation of "Self-Distillation from the Last Mini-Batch for Consistency Regularization"
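Here the regularizer comes from batch overlap: consecutive mini-batches share half their samples, and the previous iteration's softened predictions on the shared half act as soft targets. A minimal sketch of the loss bookkeeping, assuming a sampler that makes the first half of each batch repeat the previous batch's second half (names are illustrative):

```python
import torch.nn.functional as F

def dlb_step(model, x, y, prev_soft, T=3.0, beta=1.0):
    logits = model(x)
    loss = F.cross_entropy(logits, y)
    half = x.size(0) // 2
    if prev_soft is not None:
        # First half of this batch repeats the previous batch's second half.
        kl = F.kl_div(F.log_softmax(logits[:half] / T, dim=-1),
                      prev_soft, reduction="batchmean") * T * T
        loss = loss + beta * kl
    # Cache softened predictions on the second half for the next iteration.
    next_soft = F.softmax(logits[half:].detach() / T, dim=-1)
    return loss, next_soft
```

Caching only softened probabilities, rather than features or a full model copy, keeps the memory overhead of this regularizer negligible.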
A generalized self-supervised training paradigm for unimodal and multimodal alignment and fusion.
Official implementation of Self-Distillation for Gaussian Processes
Self-supervised learning through self-distillation with no labels (DINO), using Vision Transformers on the PCAM dataset.
Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation