facebookresearch / mae

PyTorch implementation of MAE: https://arxiv.org/abs/2111.06377

Not able to import inf from torch._six

tauruswcc opened this issue · comments

  • Describe the bug

I tried to reproduce submitit_pretrain.py by running:
python submitit_pretrain.py --job_dir '/home/yewei/ye_wei/jarvis/scripts/mae/experiments' --nodes 1 --batch_size 64 --model mae_vit_large_patch16 --norm_pix_loss --mask_ratio 0.75 --epochs 800 --warmup_epochs 40 --blr 1.5e-4 --weight_decay 0.05 --data_path '/home/yewei/ye_wei/jarvis/scripts/hy-tmp/gen1/images/train' --accum_iter .5
but met the following error:
Traceback (most recent call last):
File "/home/yewei/ye_wei/jarvis/scripts/mae/submitit_pretrain.py", line 15, in
import main_pretrain as trainer
File "/home/yewei/ye_wei/jarvis/scripts/mae/main_pretrain.py", line 30, in
import util.misc as misc
File "/home/yewei/ye_wei/jarvis/scripts/mae/util/misc.py", line 21, in
from torch._six import inf
ModuleNotFoundError: No module named 'torch._six'
Please advise. Thank you!

  • Desktop (please complete the following information):
  1. OS: Ubuntu 22.04.1
  2. PyTorch version: 2.0.1+cu118; Python 3.9.13
  3. GPU: NVIDIA RTX A5000 * 4

I figured it out. Just an update.
In line 288 of misc.py, just rewrite the check as if norm_type == float('inf') and it worked.
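For context, a minimal sketch of the pattern (the function and variable names here are illustrative, not the exact MAE code): the gradient-norm helper only uses inf to decide between the max-norm branch and the ordinary p-norm branch, so the comparison can be rewritten against float('inf') and the torch._six import dropped entirely.

```python
def pick_norm_branch(norm_type: float) -> str:
    # Sketch: the helper in util/misc.py uses `inf` only in a comparison
    # like this one, so `from torch._six import inf` is not needed.
    if norm_type == float('inf'):   # was: norm_type == inf (from torch._six)
        return "max-abs"            # infinity norm: max of |grad| over parameters
    return "p-norm"                 # ordinary p-norm reduction

pick_norm_branch(float('inf'))  # selects the infinity-norm branch
pick_norm_branch(2.0)           # selects the p-norm branch
```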

timm==0.3.2 also imports torch._six, did you find a way around this?

Please refer to README.

Changing the import to from torch import inf fixes it.
Refer to: microsoft/DeepSpeed#2845
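Both fixes can be combined into a version-tolerant import at the top of misc.py. This is a sketch, not the exact patch from the linked PR; the final math.inf fallback is only there so the snippet runs even without torch installed (torch.inf and math.inf are both equal to float('inf')).

```python
# Version-tolerant replacement for `from torch._six import inf`.
try:
    from torch._six import inf       # PyTorch < 2.0
except ImportError:                  # torch._six was removed in PyTorch 2.0
    try:
        from torch import inf        # torch.inf in recent PyTorch releases
    except ImportError:
        from math import inf         # illustration-only fallback when torch is absent
```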