emilianavt / OpenSeeFace

Robust realtime face and facial landmark tracking on CPU with Unity integration

error in converting onnx models to tensorflow

jeffxtang opened this issue · comments

I tried onnx-tf on the models in the models folder but got the error BackendIsNotSupposedToImplementIt: FusedConv is not implemented. Any ideas how this can be fixed? Thanks.

Please try the onnx models from here: #48

Thanks, that works! Now I'm trying to convert the TF model to tflite. What should the input and output names be (the same names used in torch.onnx._export when converting the models from PyTorch to ONNX)?

I believe the names are "input" and "output", but if you load the model with onnxruntime, you should also be able to query them.

Thanks @emilianavt. Yes, you're right, they're 'input' and 'output', as printed by the code below:

import onnxruntime as ort

# Load the model and query its input/output tensor names from the session.
ort_session = ort.InferenceSession('lm_model1.onnx', providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'])
print(ort_session.get_inputs()[0].name, ort_session.get_outputs()[0].name)

Btw, using the model file in the zip of #48, this converts from ONNX to TF: onnx-tf convert -i lm_model1.onnx -o lm_model1.pb

This converts from TF to tflite:

import tensorflow as tf

# onnx-tf writes a SavedModel, so load it with the SavedModel converter.
converter = tf.compat.v1.lite.TFLiteConverter.from_saved_model('lm_model1.pb')

# Allow falling back to TF ops for anything without a built-in TFLite kernel.
converter.experimental_new_converter = True
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                       tf.lite.OpsSet.SELECT_TF_OPS]

# Convert and write the flatbuffer to disk.
tf_lite_model = converter.convert()
open('lm_model1.tflite', 'wb').write(tf_lite_model)