Tebmer / Awesome-Knowledge-Distillation-of-LLMs

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.

Further distillation papers to consider

begab opened this issue · comments

Thanks for the great repo! These additional papers on masked latent semantic modeling (where pre-training is achieved by recovering latent semantic information extracted from a teacher model) might also fit the scope of the survey:

Great work! Thanks! We have added them to the repo :)