[EMNLP 2022 main] Code for "Understanding and Improving Knowledge Distillation for Quantization-Aware-Training of Large Transformer Encoders"
Home Page: https://arxiv.org/abs/2211.11014