nlpxucan / WizardLM

LLMs built upon Evol-Instruct: WizardLM, WizardCoder, WizardMath



Quantizing with auto_gptq gives avg loss: nan

hgcdanniel opened this issue · comments

Hi, when I try to use https://github.com/PanQiWei/AutoGPTQ to quantize the model fine-tuned from StarCoder, every layer (Quantizing attn.c_attn in layer 1/40... 2/40...) reports avg loss: nan, no matter whether I use 8-bit or 4-bit. Could you share your quantization script?

thanks
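
For context, a minimal AutoGPTQ quantization script looks roughly like the sketch below. This is not the maintainers' script; the model path and calibration texts are placeholders, and the float32 load is one common workaround people try when quantization reports `avg loss: nan` (fp16 overflow during the GPTQ pass is a frequent cause).

```python
# Hypothetical minimal AutoGPTQ quantization sketch -- placeholders throughout.
import torch
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig

MODEL_DIR = "path/to/finetuned-starcoder"  # placeholder path

quantize_config = BaseQuantizeConfig(
    bits=4,          # 8 is the other option mentioned in the issue
    group_size=128,
    desc_act=False,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)

# Calibration examples: a few representative code snippets. Empty or
# unrepresentative calibration data is another common cause of nan loss.
calib_texts = [
    "def add(a, b):\n    return a + b\n",
    "for i in range(10):\n    print(i)\n",
]
examples = [tokenizer(t, return_tensors="pt") for t in calib_texts]

# Loading in float32 instead of float16 can avoid the overflow that
# surfaces as `avg loss: nan` during per-layer quantization.
model = AutoGPTQForCausalLM.from_pretrained(
    MODEL_DIR, quantize_config, torch_dtype=torch.float32
)
model.quantize(examples)
model.save_quantized(MODEL_DIR + "-4bit-gptq")
```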