microsoft / DynamicHead

TypeError: add(): argument 'alpha' must be Number, not NoneType

reubenwenisch opened this issue

I was trying to train the DynamicHead model on a custom dataset and hit the following error. The training script calls detectron2's launch function, and the run fails with the traceback below.

The command I ran was:

DETECTRON2_DATASETS=$DATASET python train_net.py --config configs/dyhead_r50_retina_fpn_1x.yaml --num-gpus 8

Logs

Traceback (most recent call last):
  File "train_net_custom.py", line 222, in <module>
    launch(
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/detectron2/engine/launch.py", line 82, in launch
    main_func(*args)
  File "train_net_custom.py", line 216, in main
    return trainer.train()
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/detectron2/engine/defaults.py", line 484, in train
    super().train(self.start_iter, self.max_iter)
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/detectron2/engine/train_loop.py", line 149, in train
    self.run_step()
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/detectron2/engine/defaults.py", line 494, in run_step
    self._trainer.run_step()
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/detectron2/engine/train_loop.py", line 294, in run_step
    self.optimizer.step()
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/torch/optim/lr_scheduler.py", line 65, in wrapper
    return wrapped(*args, **kwargs)
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/torch/optim/optimizer.py", line 89, in wrapper
    return func(*args, **kwargs)
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/torch/optim/sgd.py", line 110, in step
    F.sgd(params_with_grad,
  File "/home/xxx/anaconda3/envs/detectron/lib/python3.8/site-packages/torch/optim/_functional.py", line 160, in sgd
    d_p = d_p.add(param, alpha=weight_decay)
TypeError: add(): argument 'alpha' must be Number, not NoneType
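For context, the failing line in torch/optim/_functional.py applies weight decay via d_p.add(param, alpha=weight_decay), so the crash means one of the optimizer's param groups ended up with weight_decay=None. Below is a minimal reproduction sketch in plain PyTorch 1.x, independent of DynamicHead; that a param group comes out of the optimizer builder with weight_decay=None under detectron2 0.6 is an assumption on my part:

```python
import torch

# Minimal reproduction sketch (assumption: under detectron2 0.6 the optimizer
# builder produces a param group whose weight_decay is None). The None value
# passes SGD's constructor but crashes inside step().
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(
    [{"params": model.parameters(), "weight_decay": None}],
    lr=0.01,
)

loss = model(torch.randn(3, 4)).sum()
loss.backward()
# In torch.optim._functional.sgd, `if weight_decay != 0:` is True for None,
# so it calls d_p.add(param, alpha=None) -> the TypeError above.
optimizer.step()
```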

Detectron2 version: 0.6

Expected behaviour: the training loop runs without error.

Downgrading to detectron2 0.5 makes training work fine.
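If downgrading is not an option, one possible workaround is to strip None-valued hyperparameters from the param groups before handing them to the optimizer, so SGD's own constructor defaults apply. This is only a sketch, not a fix taken from the repo; the helper name and where it would be called in build_optimizer are assumptions:

```python
import torch

def sanitize_param_groups(param_groups):
    # Drop hyperparameters that are None (e.g. weight_decay) so that
    # torch.optim.SGD falls back to the values passed to its constructor
    # instead of calling d_p.add(param, alpha=None) in step().
    return [
        {k: v for k, v in group.items() if k == "params" or v is not None}
        for group in param_groups
    ]

# Hypothetical use inside the trainer's build_optimizer (cfg and params are
# assumed names, not taken from the repo):
# optimizer = torch.optim.SGD(
#     sanitize_param_groups(params),
#     lr=cfg.SOLVER.BASE_LR,
#     momentum=cfg.SOLVER.MOMENTUM,
#     weight_decay=cfg.SOLVER.WEIGHT_DECAY,
# )
```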