mit-han-lab / lite-transformer

[ICLR 2020] Lite Transformer with Long-Short Range Attention

Home Page: https://arxiv.org/abs/2004.11886


Export model to ONNX

suyuzhang opened this issue · comments

commented

Hi,

I am trying to convert the lite-transformer model to ONNX, but I run into many problems during the process and can't get past the errors. Does anyone have experience successfully exporting this model to ONNX?

Error message:

Traceback (most recent call last):
  File "generate.py", line 202, in <module>
    cli_main()
  File "generate.py", line 198, in cli_main
    main(args)
  File "generate.py", line 110, in main
    torch.onnx.export(model, args=(dummy_1, dummy_3, dummy_2), f='output.onnx', keep_initializers_as_inputs=True, opset_version=9, operator_export_type=torch.onnx.OperatorExportTypes.ONNX_ATEN_FALLBACK)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/__init__.py", line 230, in export
    custom_opsets, enable_onnx_checker, use_external_data_format)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 92, in export
    use_external_data_format=use_external_data_format)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 538, in _export
    fixed_batch_size=fixed_batch_size)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 374, in _model_to_graph
    graph, torch_out = _trace_and_get_graph_from_model(model, args)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 327, in _trace_and_get_graph_from_model
    torch.jit._get_trace_graph(model, args, strict=False, _force_outplace=False, _return_inputs_states=True)
  File "/opt/conda/lib/python3.6/site-packages/torch/jit/__init__.py", line 135, in _get_trace_graph
    outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
  File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 726, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/opt/conda/lib/python3.6/site-packages/torch/jit/_trace.py", line 116, in forward
    self._force_outplace,
  File "/opt/conda/lib/python3.6/site-packages/torch/jit/_trace.py", line 105, in wrapper
    out_vars, _ = _flatten(outs)
RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs. Dictionaries and strings are also accepted but their usage is not recommended. But got unsupported type NoneType

Thanks.

Thank you for asking! We do not support converting the model to ONNX format. We would appreciate it if anyone would like to contribute that. ;)