leehanchung / lora-instruct

Finetune Falcon, LLaMA, MPT, and RedPajama on consumer hardware using PEFT LoRA

Error message when training MPT-7B

jianchaoji opened this issue · comments

Hi, I got an error message when I tried to use LoRA to train MPT-7B. Do you have any ideas on how to solve it?

ValueError: Unable to create tensor, you should probably activate truncation and/or padding with 'padding=True' 'truncation=True' to have batched tensors with the same length. Perhaps your features (instruction in this case) have excessive nesting (inputs type list where type int is expected).
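This ValueError usually means the tokenized examples came back as lists of different lengths (or that the `instruction` feature is a nested list rather than a plain string), so they can't be stacked into one batched tensor. With a Hugging Face tokenizer the common fix is to call it with `padding=True, truncation=True`. The snippet below is a minimal plain-Python sketch of the underlying issue, with a hypothetical pad id and length limit; it is not the repo's actual training code.

```python
PAD_ID = 0    # hypothetical pad token id (assumption, not from the repo)
MAX_LEN = 6   # hypothetical model context limit (assumption)

def pad_and_truncate(batch, pad_id=PAD_ID, max_len=MAX_LEN):
    """Pad every token-id sequence to the batch max, capped at max_len,
    so the batch can be turned into a rectangular tensor."""
    target = min(max(len(seq) for seq in batch), max_len)
    out = []
    for seq in batch:
        seq = seq[:target]                                 # truncation
        out.append(seq + [pad_id] * (target - len(seq)))   # padding
    return out

# Ragged token-id lists like the ones that trigger the error:
batch = [[5, 17, 3], [8, 2, 9, 4, 11], [6]]
padded = pad_and_truncate(batch)
assert all(len(row) == len(padded[0]) for row in padded)
print(padded)  # → [[5, 17, 3, 0, 0], [8, 2, 9, 4, 11], [6, 0, 0, 0, 0]]
```

In `transformers` the equivalent is roughly `tokenizer(texts, padding=True, truncation=True, return_tensors="pt")`; if the error persists, it is worth checking that each dataset field passed to the tokenizer is a string, not a list of strings.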

commented

Hi @jianchaoji, were you able to fine-tune the MPT model?

I'm afraid not.