WildChlamydia / MiVOLO

MiVOLO age & gender transformer neural network


export onnx

tiamo405 opened this issue

I am trying to export the model to ONNX with the code snippet below. However, the export fails with an error saying the opset_version is not supported. Can anyone help me fix it?


import io

import onnx
import torch

# `model` is the loaded MiVOLO model and `inputs` is a sample input tensor.
with torch.no_grad():
    with io.BytesIO() as f:
        input_names = ['input_1']
        output_names = ['output_1']
        # Make the batch dimension dynamic for the input and every output.
        dynamic_axes = {input_names[0]: {0: 'batch'}}
        for name in output_names:
            dynamic_axes[name] = dynamic_axes[input_names[0]]
        extra_args = {'opset_version': 9, 'verbose': False,
                      'input_names': input_names, 'output_names': output_names,
                      'dynamic_axes': dynamic_axes}
        torch.onnx.export(model, inputs, f, **extra_args)
        onnx_model = onnx.load_from_string(f.getvalue())

@tiamo405

Hello!
Currently, ONNX export is only possible with opset 18, and it requires some dirty workarounds.
I've created an issue regarding this matter here:
huggingface/pytorch-image-models#1872
At that link you can also find the related PyTorch issues that cause this limitation.

If you are interested in exporting using dirty hacks, such as modifying PyTorch code, I can explain how to do that.
That way, ONNX inference will work perfectly. But TensorRT won't work either way, because TRT does not support col2im, and there is no information on whether support will be added in the future.
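
For illustration, the export call would look roughly like this once those workarounds are in place (just a sketch, untested as-is; `model` and `inputs` are the same objects as in your snippet):

import io

import onnx
import torch

with torch.no_grad(), io.BytesIO() as f:
    torch.onnx.export(
        model, inputs, f,
        opset_version=18,
        input_names=['input_1'],
        output_names=['output_1'],
        dynamic_axes={'input_1': {0: 'batch'}, 'output_1': {0: 'batch'}},
    )
    onnx_model = onnx.load_from_string(f.getvalue())

# Col2Im only exists from opset 18 onwards; it is the op TensorRT cannot run.
# Print every op type in the exported graph to spot it:
print(sorted({node.op_type for node in onnx_model.graph.node}))

Listing the op types is a quick way to confirm that the Col2Im node, the TRT blocker, really is in the graph.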

@tiamo405


Thanks for your answer. I wanted to export to ONNX in order to convert it to TRT. If TRT does not have col2im support, then I don't need it anymore. I will convert to TorchScript instead as a temporary replacement.
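
For example, something like this (a rough sketch; `model` and `inputs` are the same as in my snippet above, and the file name is just an example):

import torch

# Trace the model with a sample input and save the TorchScript module.
with torch.no_grad():
    traced = torch.jit.trace(model, inputs)
traced.save('mivolo_traced.pt')

# The saved module can later be loaded and run without the Python model code:
loaded = torch.jit.load('mivolo_traced.pt')
with torch.no_grad():
    out = loaded(inputs)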

Yeah, I see. Indeed, TorchScript appears to be the optimal choice at the moment. However, ONNX models cannot be converted to TRT, and furthermore, ONNX models perform poorly with batch processing. The speedup achieved from batching in Torch and TorchScript is significantly higher. In light of this, I recommend that we await these fixes before proceeding.
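
If you want to check the batching effect yourself, here is a rough CPU-only timing sketch (hypothetical helper, not part of the repo; it assumes `traced` is the TorchScript module from above, and the 3x224x224 input shape is only illustrative, MiVOLO's real inputs differ):

import time

import torch

def per_image_latency(module, batch_size, n_iters=50):
    # Illustrative input shape; substitute the model's real one.
    x = torch.randn(batch_size, 3, 224, 224)
    with torch.no_grad():
        module(x)  # warm-up
        start = time.perf_counter()
        for _ in range(n_iters):
            module(x)
        return (time.perf_counter() - start) / (n_iters * batch_size)

for bs in (1, 8, 32):
    print(f'batch={bs}: {per_image_latency(traced, bs):.4f} s/image')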

@tiamo405

> If you are interested in exporting using dirty hacks, such as modifying PyTorch code, I can explain how to do that. That way, ONNX inference will work perfectly. But TensorRT won't work either way, because TRT does not support col2im, and there is no information on whether support will be added in the future.

Can you please explain how to do that? TensorRT is not a requirement for me, so this solution could be helpful.