Giters
karanchahal/distiller
A large-scale study of Knowledge Distillation.
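For context, the objective such a study typically revolves around is Hinton-style knowledge distillation: a small student network is trained to match the temperature-softened output distribution of a large teacher, blended with the usual cross-entropy on hard labels. The sketch below is illustrative only (it is not the repo's code); the function names, the temperature `T=4.0`, and the weight `alpha=0.9` are assumptions chosen for the example.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation loss (illustrative sketch, not distiller's code).

    alpha * T^2 * KL(teacher_soft || student_soft) + (1 - alpha) * CE(student, labels)
    """
    p_t = softmax(teacher_logits, T)              # soft targets from teacher
    log_p_s = np.log(softmax(student_logits, T))  # student log-probs at same T
    # KL divergence, batch-averaged; T^2 rescales soft-target gradients
    kl = (p_t * (np.log(p_t) - log_p_s)).sum(axis=-1).mean() * T * T
    # standard cross-entropy on the ground-truth labels (T = 1)
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels]).mean()
    return alpha * kl + (1 - alpha) * ce
```

When student and teacher logits agree, the KL term vanishes and only the weighted cross-entropy remains, which is a quick sanity check for an implementation.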
Stargazers: 215
Watchers: 9
Issues: 5
Forks: 30
karanchahal/distiller Issues
Using Distillation on a different dataset using a trained teacher.
Updated 4 years ago · 2 comments
PyTorch Lightning?
Updated 4 years ago · 2 comments
Is it possible to train a smaller model (student)?
Closed 4 years ago · 1 comment
Is it possible to keep the learning rate constant?
Closed 4 years ago · 3 comments
Trouble training teachers
Closed 4 years ago · 4 comments