facebookresearch / protein-ebm

Energy-based models for atomic-resolution protein conformations


Problem running

simoncorrea opened this issue

When running this line, I got an error:

python vis_sandbox.py --exp=transformer_gmm_uniform --resume-iter=130000 --task=rotamer_trial --sample-mode=rosetta --neg-sample 500 --rotations 10

Transfer between distributed to non-distributed
Traceback (most recent call last):
  File "vis_sandbox.py", line 715, in main_single
    model.load_state_dict(checkpoint["model_state_dict"])
  File "/home/user/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 830, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for RotomerTransformerModel:
        Missing key(s) in state_dict: "layers.0.self_attn.k_proj.weight", "layers.0.self_attn.k_proj.bias", "layers.0.self_attn.v_proj.weight", "layers.0.self_attn.v_proj.bias", "layers.0.self_attn.q_proj.weight","layers.0.self_attn.q_proj.bias", "layers.1.self_attn.k_proj.weight", "layers.1.self_attn.k_proj.bias", "layers.1.self_attn.v_proj.weight", "layers.1.self_attn.v_proj.bias", "layers.1.self_attn.q_proj.weight", "layers.1.self_attn.q_proj.bias", "layers.2.self_attn.k_proj.weight", "layers.2.self_attn.k_proj.bias", "layers.2.self_attn.v_proj.weight", "layers.2.self_attn.v_proj.bias", "layers.2.self_attn.q_proj.weight", "layers.2.self_attn.q_proj.bias", "layers.3.self_attn.k_proj.weight", "layers.3.self_attn.k_proj.bias", "layers.3.self_attn.v_proj.weight", "layers.3.self_attn.v_proj.bias", "layers.3.self_attn.q_proj.weight", "layers.3.self_attn.q_proj.bias", "layers.4.self_attn.k_proj.weight", "layers.4.self_attn.k_proj.bias", "layers.4.self_attn.v_proj.weight", "layers.4.self_attn.v_proj.bias", "layers.4.self_attn.q_proj.weight", "layers.4.self_attn.q_proj.bias", "layers.5.self_attn.k_proj.weight", "layers.5.self_attn.k_proj.bias", "layers.5.self_attn.v_proj.weight", "layers.5.self_attn.v_proj.bias", "layers.5.self_attn.q_proj.weight", "layers.5.self_attn.q_proj.bias".
        Unexpected key(s) in state_dict: "layers.0.self_attn.in_proj_weight", "layers.0.self_attn.in_proj_bias", "layers.1.self_attn.in_proj_weight", "layers.1.self_attn.in_proj_bias", "layers.2.self_attn.in_proj_weight", "layers.2.self_attn.in_proj_bias", "layers.3.self_attn.in_proj_weight", "layers.3.self_attn.in_proj_bias", "layers.4.self_attn.in_proj_weight", "layers.4.self_attn.in_proj_bias", "layers.5.self_attn.in_proj_weight", "layers.5.self_attn.in_proj_bias".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "vis_sandbox.py", line 797, in <module>
    main_single(flags_dict)
  File "vis_sandbox.py", line 729, in main_single
    model.load_state_dict(model_state_dict)
  File "/home/user/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 830, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for RotomerTransformerModel:
        Missing key(s) in state_dict: "layers.0.self_attn.k_proj.weight", "layers.0.self_attn.k_proj.bias", "layers.0.self_attn.v_proj.weight", "layers.0.self_attn.v_proj.bias", "layers.0.self_attn.q_proj.weight","layers.0.self_attn.q_proj.bias", "layers.1.self_attn.k_proj.weight", "layers.1.self_attn.k_proj.bias", "layers.1.self_attn.v_proj.weight", "layers.1.self_attn.v_proj.bias", "layers.1.self_attn.q_proj.weight", "layers.1.self_attn.q_proj.bias", "layers.2.self_attn.k_proj.weight", "layers.2.self_attn.k_proj.bias", "layers.2.self_attn.v_proj.weight", "layers.2.self_attn.v_proj.bias", "layers.2.self_attn.q_proj.weight", "layers.2.self_attn.q_proj.bias", "layers.3.self_attn.k_proj.weight", "layers.3.self_attn.k_proj.bias", "layers.3.self_attn.v_proj.weight", "layers.3.self_attn.v_proj.bias", "layers.3.self_attn.q_proj.weight", "layers.3.self_attn.q_proj.bias", "layers.4.self_attn.k_proj.weight", "layers.4.self_attn.k_proj.bias", "layers.4.self_attn.v_proj.weight", "layers.4.self_attn.v_proj.bias", "layers.4.self_attn.q_proj.weight", "layers.4.self_attn.q_proj.bias", "layers.5.self_attn.k_proj.weight", "layers.5.self_attn.k_proj.bias", "layers.5.self_attn.v_proj.weight", "layers.5.self_attn.v_proj.bias", "layers.5.self_attn.q_proj.weight", "layers.5.self_attn.q_proj.bias".
        Unexpected key(s) in state_dict: "layers.0.self_attn.in_proj_weight", "layers.0.self_attn.in_proj_bias", "layers.1.self_attn.in_proj_weight", "layers.1.self_attn.in_proj_bias", "layers.2.self_attn.in_proj_weight", "layers.2.self_attn.in_proj_bias", "layers.3.self_attn.in_proj_weight", "layers.3.self_attn.in_proj_bias", "layers.4.self_attn.in_proj_weight", "layers.4.self_attn.in_proj_bias", "layers.5.self_attn.in_proj_weight", "layers.5.self_attn.in_proj_bias".

Could you post the command you ran to obtain the model? It looks like there is a mismatch between the tested model and the loaded model.

Sorry, the model I was trying to run is the pretrained one from the GitHub README.

Then I ran:

python vis_sandbox.py --exp=transformer_gmm_uniform --resume-iter=130000 --task=rotamer_trial --sample-mode=rosetta --neg-sample 500 --rotations 10

This is the command; thank you.

Could you say which fairseq version you are using?

I'm using:

fairseq.__version__
'0.9.0'

Is it possible to try using version '0.6.2'?
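
For reference, pinning the dependency should look something like this (assuming a standard pip setup):

pip install fairseq==0.6.2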

Yeah, I did that; thank you, it works now. My bad.

Changing the version resolves the issue, so I am closing this out. @simoncorrea, thanks for your interest and please let us know if you have any more questions!
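
For future readers who cannot downgrade: the error above arises because fairseq 0.6.x checkpoints store each attention layer's query/key/value projections as a single combined in_proj_weight / in_proj_bias tensor, while later fairseq versions expect separate q_proj / k_proj / v_proj parameters. A minimal, untested sketch of remapping the checkpoint by hand (the file paths are placeholders, and it assumes the combined tensors are packed as [q; k; v] along dim 0, as in fairseq 0.6.x):

import torch

# Split the combined in_proj_weight / in_proj_bias tensors from a
# fairseq 0.6.x-style checkpoint into the separate q/k/v projection
# parameters expected by newer fairseq MultiheadAttention modules.
checkpoint = torch.load("model_130000", map_location="cpu")  # placeholder path
state = checkpoint["model_state_dict"]

for key in list(state.keys()):
    if key.endswith("self_attn.in_proj_weight") or key.endswith("self_attn.in_proj_bias"):
        prefix, suffix = key.rsplit("in_proj_", 1)  # suffix is "weight" or "bias"
        q, k, v = state.pop(key).chunk(3, dim=0)    # assumes [q; k; v] packing
        state[prefix + "q_proj." + suffix] = q
        state[prefix + "k_proj." + suffix] = k
        state[prefix + "v_proj." + suffix] = v

torch.save(checkpoint, "model_130000_remapped")  # placeholder path

Even with the keys remapped, behavior under a newer fairseq is not guaranteed to match the original, so pinning fairseq 0.6.2 as above remains the safer route.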