Possible empty file
BabyCoder69 opened this issue · comments
In part 3 of the inference notebook, while loading

```python
vtoonify.load_state_dict(torch.load(os.path.join(MODEL_DIR, style_type+'_generator.pt'), map_location=lambda storage, loc: storage)['g_ema'])
```

I encountered the following error:
```
EOFError                                  Traceback (most recent call last)
<ipython-input-12-40b58cac2d4b> in <module>
      5
      6 vtoonify = VToonify(backbone = 'dualstylegan')
----> 7 vtoonify.load_state_dict(torch.load(os.path.join(MODEL_DIR, style_type+'_generator.pt'), map_location=lambda storage, loc: storage)['g_ema'])
      8 vtoonify.to(device)

/usr/local/lib/python3.8/dist-packages/torch/serialization.py in _legacy_load(f, map_location, pickle_module, **pickle_load_args)
   1000                     "functionality.")
   1001
-> 1002     magic_number = pickle_module.load(f, **pickle_load_args)
   1003     if magic_number != MAGIC_NUMBER:
   1004         raise RuntimeError("Invalid magic number; corrupt file?")

EOFError: Ran out of input
```
Could this be due to an empty file being provided?
You can check whether the model was actually downloaded to `os.path.join(MODEL_DIR, style_type+'_generator.pt')`, for example:

```python
print(os.listdir(MODEL_DIR))
```