Loading checkpoints
kanishkamisra opened this issue · comments
Kanishka commented
How does one load the pre-trained checkpoints?
I tried embs = torch.load('ewiser.semcor+wngt.pt', map_location='cpu'),
but it raises an error:
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-2-123918dcfdcf> in <module>
----> 1 embs = torch.load('ewiser.semcor+wngt.pt', map_location = 'cpu')
~/miniconda3/lib/python3.8/site-packages/torch/serialization.py in load(f, map_location, pickle_module, **pickle_load_args)
606 return torch.jit.load(opened_file)
607 return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
--> 608 return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
609
610
~/miniconda3/lib/python3.8/site-packages/torch/serialization.py in _legacy_load(f, map_location, pickle_module, **pickle_load_args)
785 unpickler = pickle_module.Unpickler(f, **pickle_load_args)
786 unpickler.persistent_load = persistent_load
--> 787 result = unpickler.load()
788
789 deserialized_storage_keys = pickle_module.load(f, **pickle_load_args)
ModuleNotFoundError: No module named 'qbert'
Michele Bevilacqua commented
Check out https://github.com/SapienzaNLP/ewiser/blob/master/ewiser/spacy/disambiguate.py.
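For context on the error itself: these checkpoints were saved with torch.save on objects whose classes live in the project's own package (pickled under the internal name 'qbert'), and unpickling requires that package to be importable. The snippet below is a minimal, self-contained sketch of that failure mode using plain pickle; the module name qbert_demo is invented for illustration and has nothing to do with the actual EWISER code.

```python
import pickle
import sys
import types

# Build a throwaway module (the name "qbert_demo" is made up for this demo),
# define a class inside it, and pickle an instance while the module is importable.
mod = types.ModuleType("qbert_demo")
exec("class Model:\n    pass", mod.__dict__)
sys.modules["qbert_demo"] = mod
payload = pickle.dumps(mod.Model())

# Remove the module: unpickling now fails the same way torch.load does
# when the checkpoint's defining package is not installed.
del sys.modules["qbert_demo"]
try:
    pickle.loads(payload)
except ModuleNotFoundError as err:
    print(err)  # No module named 'qbert_demo'
```

So the fix is not to treat the .pt file as a bare tensor, but to load it through the project's own code (e.g. the spacy/disambiguate.py entry point linked above), which ensures the required modules are on the path.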
Kanishka commented
Thanks, I'll take a look! For some reason I mistakenly thought the .pt files were plain tensors. Thanks for the clarification!