[Official] Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation

This repository is the official implementation of the paper "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation", presented at IJCAI 2021. Thanks to the contributors. [IJCAI2021Poster]

Results

You can reproduce all results reported in the paper, including the Appendix, with this code. Because the experiments are numerous, we do not list every result here; by varying the hyperparameter values in the provided .sh scripts you can reproduce the full analysis.
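For reference, the paper contrasts two distillation objectives: the standard temperature-scaled KL-divergence loss and a direct MSE loss between student and teacher logits. The sketch below is a minimal PyTorch illustration of these two losses, not the repository's exact code; the function names and the default temperature are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_kl_loss(student_logits, teacher_logits, temperature=4.0):
    """Temperature-scaled KL-divergence distillation loss (Hinton-style KD)."""
    log_p_s = F.log_softmax(student_logits / temperature, dim=1)
    p_t = F.softmax(teacher_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * temperature ** 2

def kd_mse_loss(student_logits, teacher_logits):
    """Direct mean squared error between student and teacher logits."""
    return F.mse_loss(student_logits, teacher_logits)
```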

Contact

Feel free to contact us if you have any questions. :)

Acknowledgements

This work was supported by Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) [No.2019-0-00075, Artificial Intelligence Graduate School Program (KAIST)] and [No. 2021-0-00907, Development of Adaptive and Lightweight Edge-Collaborative Analysis Technology for Enabling Proactively Immediate Response and Rapid Learning].
