kyegomez / Andromeda

An all-new language model that processes ultra-long sequences of 100,000+ tokens, ultra-fast

Home Page: https://discord.gg/qUtxnK2NMf

ValueError: You cannot create a DummyScheduler without specifying a scheduler in the config file

kyegomez opened this issue

Traceback (most recent call last):
  File "train_distributed_accelerate.py", line 664, in <module>
    main()
  File "train_distributed_accelerate.py", line 569, in main
    optim, train_loader, lr_scheduler = accelerator.prepare(
  File "/home/ubuntu/.local/lib/python3.8/site-packages/accelerate/accelerator.py", line 1139, in prepare
    result = self._prepare_deepspeed(*args)
  File "/home/ubuntu/.local/lib/python3.8/site-packages/accelerate/accelerator.py", line 1381, in _prepare_deepspeed
    raise ValueError(
ValueError: You cannot create a DummyScheduler without specifying a scheduler in the config file.
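For context, this error is raised by Accelerate's DeepSpeed integration: a `DummyScheduler` is only a placeholder that tells Accelerate to let DeepSpeed construct the learning-rate scheduler, so the DeepSpeed config file passed to Accelerate must actually define one. A minimal sketch of the missing `scheduler` block is below; the `WarmupLR` type and the parameter values are illustrative placeholders, not values taken from this repo's training setup:

```json
{
  "scheduler": {
    "type": "WarmupLR",
    "params": {
      "warmup_min_lr": 0,
      "warmup_max_lr": 3e-4,
      "warmup_num_steps": 1000
    }
  }
}
```

Alternatively, the script could drop the `DummyScheduler` and pass a real PyTorch scheduler (e.g. one from `torch.optim.lr_scheduler`) to `accelerator.prepare(...)`, in which case the DeepSpeed config does not need a `scheduler` section.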