[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support LLaMA, Llama-2, BLOOM, Vicuna, Baichuan, etc.
Home Page: https://arxiv.org/abs/2305.11627
JunKong5 opened this issue 17 days ago
There is no random seed setting in post_training.py. Were the results in the paper obtained with a fixed random seed?
I look forward to your reply. Thank you very much!
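For reference, a common way to make a training script like post_training.py reproducible is to seed every RNG in use at startup. The helper below is a generic sketch, not code from the LLM-Pruner repo; the function name `set_seed` and the choice of seeded libraries are assumptions.

```python
import random

import numpy as np


def set_seed(seed: int) -> None:
    """Seed the Python and NumPy RNGs (and PyTorch, if installed) so that
    repeated runs with the same seed produce the same results."""
    random.seed(seed)
    np.random.seed(seed)
    try:
        # PyTorch is optional here so the sketch also runs without it.
        import torch

        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)  # no-op when CUDA is unavailable
    except ImportError:
        pass
```

Calling `set_seed(42)` once at the top of the script before building the model and data loaders would make post-training runs repeatable, which is what the question about the paper's results hinges on.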