Riccorl / transformer-srl

Reimplementation of a BERT-based model (Shi et al., 2019), currently the state of the art for English SRL. This model also implements predicate disambiguation.


RuntimeError: Error loading state dict for TransformerSrlSpan

creatorrr opened this issue

On running the example in README.md, I get the following error:

RuntimeError: Error loading state dict for TransformerSrlSpan
        Missing keys: ['transformer.embeddings.position_ids']
        Unexpected keys: []

(Model downloaded from the Dropbox link mentioned in the README.)

Environment:

python = "^3.8"
transformer-srl = "^2.5"
spacy = "2.3"

Can you share all the packages in the env?
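One way to capture that is to print the versions of the packages most relevant to this issue directly from Python (a diagnostic sketch, not part of the repo; the package list is an assumption based on the environment above):

import pkg_resources

# Print installed versions of the packages most likely to matter here.
# The package names are assumptions based on the environment listed above.
for name in ("transformer-srl", "allennlp", "transformers", "torch", "spacy"):
    try:
        print(name, pkg_resources.get_distribution(name).version)
    except pkg_resources.DistributionNotFound:
        print(name, "not installed")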

I ran it recently, and it worked fine in a clean environment.

That’s weird. I used it in a fresh environment too. I’ll try once more and report back, but which Python version were you using?

No luck. :-/

Same error using Python 3.7.

Stacktrace:

Python 3.7.5 (default, Feb 23 2021, 13:22:40)
[GCC 8.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from transformer_srl import dataset_readers, models, predictors

>>>
>>> predictor = predictors.SrlTransformersPredictor.from_path("./srl_bert_base_conll2012.tar.gz", "transformer_srl")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/transformer_srl/predictors.py", line 159, in from_path
    load_archive(archive_path, cuda_device=cuda_device),
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/.venv/lib/python3.7/site-packages/allennlp/models/archival.py", line 208, in load_archive
    model = _load_model(config.duplicate(), weights_path, serialization_dir, cuda_device)
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/.venv/lib/python3.7/site-packages/allennlp/models/archival.py", line 246, in _load_model
    cuda_device=cuda_device,
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/.venv/lib/python3.7/site-packages/allennlp/models/model.py", line 406, in load
    return model_class._load(config, serialization_dir, weights_file, cuda_device)
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/.venv/lib/python3.7/site-packages/allennlp/models/model.py", line 349, in _load
    f"Error loading state dict for {model.__class__.__name__}\n\t"
RuntimeError: Error loading state dict for TransformerSrlSpan
        Missing keys: ['transformer.embeddings.position_ids']
        Unexpected keys: []
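For what it's worth, a missing transformer.embeddings.position_ids key is a symptom commonly associated with a transformers version mismatch between the environment that produced the archive and the one loading it, so printing the relevant versions can help narrow things down (a diagnostic sketch, not part of the repo):

import torch
import transformers

# Newer transformers releases register a position_ids buffer in the BERT
# embeddings, so loading an archive saved against an older release into a
# newer one can surface as a missing "transformer.embeddings.position_ids" key.
print("transformers:", transformers.__version__)
print("torch:", torch.__version__)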

Last time I tried Python 3.6. Someone opened a similar issue (#7), but I tried it in a new env and it worked (#7 (comment)).

Try with:

conda create -n srl-test python=3.6
conda activate srl-test
pip install transformer-srl==2.4.6
echo '{"sentence": "Did Uriah honestly think he could beat the game in under three hours?"}' | \
allennlp predict path/to/srl_bert_base_conll2012.tar.gz - --include-package transformer_srl

It should work.

Previously, I had installed transformer-srl==2.5 (the latest one on PyPI). These instructions (with 2.4.6) led to new errors, however. It seems 2.4.6 depends on an older version of allennlp.

Stacktrace:

Traceback (most recent call last):
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/.venv/bin/allennlp", line 8, in <module>
    sys.exit(run())
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/.venv/lib/python3.6/site-packages/allennlp/__main__.py", line 34, in run
    main(prog="allennlp")
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/.venv/lib/python3.6/site-packages/allennlp/commands/__init__.py", line 117, in main
    import_module_and_submodules(package_name)
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/.venv/lib/python3.6/site-packages/allennlp/common/util.py", line 354, in import_module_and_submodules
    import_module_and_submodules(subpackage)
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/.venv/lib/python3.6/site-packages/allennlp/common/util.py", line 343, in import_module_and_submodules
    module = importlib.import_module(package_name)
  File "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/ubuntu/src/github.com/Riccorl/transformer-srl/transformer_srl/predictors.py", line 3, in <module>
    from allennlp.data.tokenizers.token_class import Token
ModuleNotFoundError: No module named 'allennlp.data.tokenizers.token_class'
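The ModuleNotFoundError suggests the predictors module being imported expects a newer AllenNLP than the one installed alongside transformer-srl 2.4.6. A quick way to check whether the installed AllenNLP provides that module (a diagnostic sketch, not part of the repo):

import importlib.util

# allennlp.data.tokenizers.token_class only exists in newer AllenNLP releases;
# if this prints False, the installed AllenNLP predates it, matching the error above.
spec = importlib.util.find_spec("allennlp.data.tokenizers.token_class")
print("token_class available:", spec is not None)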

Did you install with the --upgrade option? AllenNLP got a major update between 2.4.6 (the version that works with the pretrained model) and the most recent release.

Yep, same problem. For v2.5, could it be an issue with the Dropbox archive?

Same with 2.4.11

The pretrained weights on Dropbox only work with 2.4.6.

Ah, gotcha. Could you try to reproduce it once more when you get a chance? Not sure what I'm doing wrong; I followed the instructions in your comment to a T. :-|

I just tried the following instructions on an Ubuntu machine:

wget https://www.dropbox.com/s/4tes6ypf2do0feb/srl_bert_base_conll2012.tar.gz
conda create -n srl-test python=3.6
conda activate srl-test
pip install transformer-srl==2.4.6
echo '{"sentence": "Did Uriah honestly think he could beat the game in under three hours?"}' | \
allennlp predict srl_bert_base_conll2012.tar.gz - --include-package transformer_srl

and I got the correct output:

input 0:  {"sentence": "Did Uriah honestly think he could beat the game in under three hours?"}
prediction:  {"verbs": [{"verb": "Did", "description": "[do.01: Did] Uriah honestly think he could beat the game in under three hours ?", "tags": ["B-V", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"], "frame": "do.01", "frame_score": 0.9999996423721313, "lemma": "do"}, {"verb": "think", "description": "Did [ARG0: Uriah] [ARGM-ADV: honestly] [think.01: think] [ARG1: he could beat the game in under three hours] ?", "tags": ["O", "B-ARG0", "B-ARGM-ADV", "B-V", "B-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "O"], "frame": "think.01", "frame_score": 1.0, "lemma": "think"}, {"verb": "could", "description": "Did Uriah honestly think he [go.04: could] beat the game in under three hours ?", "tags": ["O", "O", "O", "O", "O", "B-V", "O", "O", "O", "O", "O", "O", "O", "O"], "frame": "go.04", "frame_score": 0.10186540335416794, "lemma": "could"}, {"verb": "beat", "description": "Did Uriah honestly think [ARG0: he] [ARGM-MOD: could] [beat.03: beat] [ARG1: the game] [ARGM-TMP: in under three hours] ?", "tags": ["O", "O", "O", "O", "B-ARG0", "B-ARGM-MOD", "B-V", "B-ARG1", "I-ARG1", "B-ARGM-TMP", "I-ARGM-TMP", "I-ARGM-TMP", "I-ARGM-TMP", "O"], "frame": "beat.03", "frame_score": 0.9999936819076538, "lemma": "beat"}], "words": ["Did", "Uriah", "honestly", "think", "he", "could", "beat", "the", "game", "in", "under", "three", "hours", "?"]}
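For reference, the same prediction can also be run from Python with the predictor used earlier in the thread (a sketch, assuming transformer-srl 2.4.6 is installed and the archive is in the working directory):

# Importing dataset_readers and models registers the components with AllenNLP.
from transformer_srl import dataset_readers, models, predictors

predictor = predictors.SrlTransformersPredictor.from_path(
    "srl_bert_base_conll2012.tar.gz", "transformer_srl"
)

# predict_json mirrors the JSON line piped to allennlp predict above.
output = predictor.predict_json(
    {"sentence": "Did Uriah honestly think he could beat the game in under three hours?"}
)
print([v["frame"] for v in output["verbs"]])  # e.g. ['do.01', 'think.01', 'go.04', 'beat.03']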

Worked this time for some reason. 🤷 Thanks so much for patiently helping! Maybe update the README to recommend installing 2.4.6?

Glad I could help :)

Yeah, maybe you're right. I should make it clearer in the README to use 2.4.6.