FlagAI-Open / FlagAI

FlagAI (Fast LArge-scale General AI models) is a fast, easy-to-use and extensible toolkit for large-scale models.

TypeError: AquilaPreTrainedModel._set_gradient_checkpointing() got an unexpected keyword argument 'enable'

Chenjingliang1 opened this issue · comments

System Info

FlagAI version: 1.8.2
Python version: 3.10

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as T5/AltCLIP, ...)
  • My own task or dataset (give details below)

Reproduction

The script being run is https://github.com/FlagAI-Open/Aquila2/blob/main/finetune/34B/finetune.sh

Traceback (most recent call last):
  File "/home/jingliang.chen/llm/test/Aquila2/finetune/finetune.py", line 481, in <module>
    train()
  File "/home/jingliang.chen/llm/test/Aquila2/finetune/finetune.py", line 455, in train
    trainer.train()
  File "/usr/local/lib/python3.10/site-packages/transformers/trainer.py", line 1555, in train
    return inner_training_loop(
  File "/usr/local/lib/python3.10/site-packages/transformers/trainer.py", line 1668, in _inner_training_loop
    self.model.gradient_checkpointing_enable(gradient_checkpointing_kwargs=gradient_checkpointing_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1872, in gradient_checkpointing_enable
    self._set_gradient_checkpointing(enable=True, gradient_checkpointing_func=gradient_checkpointing_func)
TypeError: AquilaPreTrainedModel._set_gradient_checkpointing() got an unexpected keyword argument 'enable'
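
For context, the failure is a signature mismatch: newer transformers versions call the hook with enable= and gradient_checkpointing_func= keywords, while the Aquila modeling code still defines the older (module, value) form. Below is a minimal, self-contained sketch of that clash, using stand-in class names rather than the real transformers/FlagAI classes:

    # Stand-in classes that only mirror the two signatures involved; the real
    # ones live in transformers' modeling_utils.py and FlagAI's modeling_aquila.py.
    class BasePreTrainedModelLike:
        def gradient_checkpointing_enable(self, gradient_checkpointing_kwargs=None):
            # Newer transformers passes keyword arguments to the hook:
            self._set_gradient_checkpointing(
                enable=True,
                gradient_checkpointing_func=lambda fn, *args: fn(*args),
            )

    class AquilaLikeModel(BasePreTrainedModelLike):
        # Older-style hook, as still defined in the Aquila modeling code:
        def _set_gradient_checkpointing(self, module, value=False):
            module.gradient_checkpointing = value

    AquilaLikeModel().gradient_checkpointing_enable()
    # TypeError: _set_gradient_checkpointing() got an unexpected keyword argument 'enable'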

Expected behavior

This looks like a FlagAI bug: AquilaPreTrainedModel defines its own _set_gradient_checkpointing() function, which conflicts with the one in transformers.

A comment there says it was copied from transformers, but I checked the corresponding place in transformers and this function does not exist there.

After I deleted these lines, the error no longer occurs:
https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/aquila2/modeling_aquila.py#L504-L506
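
For reference, the override at that location presumably follows the older LLaMA-style hook pattern sketched below (an assumption based on the code it was copied from, not a verbatim quote of the FlagAI lines). Removing it makes AquilaPreTrainedModel inherit the base hook from newer transformers' PreTrainedModel, whose signature does accept the enable= and gradient_checkpointing_func= keywords passed by the Trainer:

    # Presumed shape of the deleted override (older hook style):
    def _set_gradient_checkpointing(self, module, value=False):
        if isinstance(module, AquilaModel):
            module.gradient_checkpointing = value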

commented

Which transformers version are you using? We suggest trying 4.31.0.

Confirmed: only transformers version 4.31.0 is supported.
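
As a quick guard, a script can check the pinned version before launching the fine-tuning job (a hypothetical helper, not part of FlagAI; the pinned release can be installed with, e.g., pip install transformers==4.31.0):

    # Hypothetical startup check: fail fast when the installed transformers
    # does not match the version the maintainers confirmed as supported.
    import transformers

    assert transformers.__version__ == "4.31.0", (
        "Aquila2 finetuning with FlagAI 1.8.2 expects transformers 4.31.0, "
        f"but found {transformers.__version__}"
    )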