EdgeQAT

Official repo for the paper: EdgeQAT: Entropy and Distribution Guided Quantization-Aware Training for the Acceleration of Lightweight LLMs on the Edge

Implementation

Follow the instructions in the BabyLLaMA repo to set up the training environment, and the instructions for the BabyLM Challenge to set up the evaluation environment.
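
As a quick sanity check that the training environment works, a LLaMA-style checkpoint can be loaded with the standard transformers API. A minimal sketch, assuming the FP16 LLaMA-58M weights (step 4 of Usage below) have been downloaded to a local directory; the path is a placeholder, not something shipped with this repo:

```python
# Environment sanity check: load a LLaMA-style FP16 checkpoint and generate.
# The checkpoint path below is a placeholder -- point it at the LLaMA-58M
# weights downloaded from BabyLLaMA.
import torch
from transformers import AutoTokenizer, LlamaForCausalLM

ckpt = "./checkpoints/llama-58m"  # placeholder path
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = LlamaForCausalLM.from_pretrained(ckpt, torch_dtype=torch.float16)

inputs = tokenizer("The quick brown fox", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```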

Usage

  1. Download the dataset from the BabyLM Challenge
  2. Clean the dataset following BabyLLaMA
  3. Pretrain the teacher model
  4. Download the FP16 LLaMA-58M model from BabyLLaMA
  5. Run QAT with the scripts in distill_train/scripts/ (the sketch after this list illustrates the basic mechanism)
  6. Run evaluation with the scripts in evaluation_pipeline/
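
For orientation, here is a minimal sketch of what a QAT-with-distillation step looks like in PyTorch: symmetric fake quantization with a straight-through estimator (STE), plus a KL distillation loss against the FP16 teacher. This is a generic illustration, not the paper's entropy- and distribution-guided scheme; all names are placeholders, and the actual training logic lives in the scripts under distill_train/scripts/.

```python
# Generic QAT + distillation sketch (NOT the EdgeQAT method itself).
import torch
import torch.nn.functional as F


def fake_quantize(w: torch.Tensor, n_bits: int = 4) -> torch.Tensor:
    """Symmetric per-tensor fake quantization with an STE gradient."""
    qmax = 2 ** (n_bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    w_q = (w / scale).round().clamp(-qmax, qmax) * scale
    # Straight-through estimator: forward pass uses the quantized weights,
    # backward pass treats quantization as the identity.
    return w + (w_q - w).detach()


def distill_step(student, teacher, input_ids, temperature: float = 2.0):
    """One QAT step: KL divergence between teacher and student logits."""
    with torch.no_grad():
        t_logits = teacher(input_ids).logits  # frozen FP16 teacher
    s_logits = student(input_ids).logits      # fake-quantized student
    loss = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=-1),
        F.softmax(t_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return loss
```

The paper's method additionally guides quantization with entropy and distribution information, which this sketch omits.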

Citation

@article{shen2024edgeqat,
  title={EdgeQAT: Entropy and Distribution Guided Quantization-Aware Training for the Acceleration of Lightweight LLMs on the Edge},
  author={Shen, Xuan and Kong, Zhenglun and Yang, Changdi and Han, Zhaoyang and Lu, Lei and Dong, Peiyan and others},
  journal={arXiv preprint arXiv:2402.10787},
  year={2024}
}

