horseee / LLM-Pruner

[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support LLaMA, Llama-2, BLOOM, Vicuna, Baichuan, etc.

Home Page: https://arxiv.org/abs/2305.11627

RecursionError: maximum recursion depth exceeded

Zhenyu001225 opened this issue

When I run

python generate.py --model_type pretrain

this error occurs, and I can't figure out the reason...

Solved.
I was previously using baffo32/decapoda-research-llama-7B-hf as the base model.
Switching to huggyllama/llama-7b made the error go away.
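
For anyone hitting the same error, here is a minimal sketch of the checkpoint swap. This is not the repo's generate.py, just an illustration with the Hugging Face transformers loaders; the only change is the checkpoint name passed to from_pretrained:

```python
# Minimal sketch (not LLM-Pruner's generate.py): loading the base checkpoint
# with Hugging Face transformers. The reported fix is simply pointing the
# loaders at huggyllama/llama-7b instead of the decapoda-research mirror.
from transformers import AutoModelForCausalLM, AutoTokenizer

# base_model = "baffo32/decapoda-research-llama-7B-hf"  # triggered the RecursionError
base_model = "huggyllama/llama-7b"  # replacement checkpoint that worked

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# Quick sanity check that tokenizer and model load and generate correctly.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```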