Unable to Export Chatglm3 Model to ONNX Format in Optimum
Harini-Vemula-2382 opened this issue · comments
System Info
Optimum Version: 1.18.0
Python Version: 3.8
Platform: Windows, x86_64
Who can help?
@michaelbenayoun @JingyaHuang @echarlaix
I am writing to report an issue I encountered while attempting to export a Chatglm3 model to ONNX format using Optimum.
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction (minimal, reproducible, runnable)
optimum-cli export onnx --model THUDM/chatglm3-6b chatglm_optimum_onnx/ --trust-remote-code
Expected behavior
I would expect Optimum to export the ChatGLM3 model to ONNX format successfully, without errors.
Hi, THUDM/chatglm3-6b is not a Transformers model and the export is expected to fail. Could you share your export log here?
optimum-cli export onnx --model THUDM/chatglm3-6b --framework pt --task text-generation-with-past ./chatglm_onnx --trust-remote-code
I used the command line above to export the model to ONNX format, but I got an error like:
"ValueError: Trying to export a chatglm model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as custom_onnx_configs. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type chatglm to be supported natively in the ONNX export."
Actually, I don't understand how to do the ONNX export. If possible, could you provide the steps to follow, and share any scripts if available?
Here is a snippet of the export log.
Kindly help me in this regard, and let me know if you have any concerns.