Xirider / finetune-gpt2xl

Guide: Finetune GPT2-XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed

Crashes with new Transformers version

barakw2021 opened this issue

Here's the error:

Traceback (most recent call last):
  File "run_clm.py", line 478, in <module>
    main()
  File "run_clm.py", line 422, in main
    trainer = Trainer(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/trainer.py", line 295, in __init__
    logging.set_verbosity(log_level)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/utils/logging.py", line 161, in set_verbosity
    _get_library_root_logger().setLevel(verbosity)
  File "/root/miniconda3/lib/python3.8/logging/__init__.py", line 1409, in setLevel
    self.level = _checkLevel(level)
  File "/root/miniconda3/lib/python3.8/logging/__init__.py", line 194, in _checkLevel
    raise ValueError("Unknown level: %r" % level)

The fix was to install transformers v4.6.0 from pip.
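
For anyone hitting the same crash, a minimal way to apply that workaround, assuming a standard pip environment, is to force the older release:

    pip install transformers==4.6.0

This replaces whatever newer transformers version is currently installed with 4.6.0, which does not trigger the logging-level error above.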

Thanks for reporting! I've now pinned the transformers dependency to 4.7.0.
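
A sketch of what such a pin looks like, assuming the project declares its dependencies in a requirements.txt file:

    transformers==4.7.0

Installing with pip install -r requirements.txt then resolves exactly that release instead of the latest one, so new clones get a version known to work with the repo's run_clm.py.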