How to generate material/testmodels/mobilenetv3small_0.pb
AIxyz opened this issue
Hi,
I would like to generate a TF2 frozen pb model (such as material/testmodels/mobilenetv3small_0.pb), but my models (generated with TF 2.6.0 and TF 2.7.0) can't be converted to nn-meter IR. I see that dataset/generator/generate_model.py is used to generate the Keras h5 models. Is it possible to release the reference code for generating a TF2 frozen pb model? Also, which TF version was used to generate material/testmodels/mobilenetv3small_0.pb?
Thanks & Regards,
X. Zhang
Hi Zhang,
The frozen pb models such as material/testmodels/mobilenetv3small_0.pb were generated with tensorflow==1.12. We haven't thoroughly tested TF2 frozen pb models. Maybe you can try this link: https://leimao.github.io/blog/Save-Load-Inference-From-TF2-Frozen-Graph/
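For reference, the approach in that link can be sketched roughly as below. This is a minimal, untested-against-nn-meter illustration, not the project's reference code: the stand-in model, input shape, and output filename are all assumptions, and the resulting pb may still not convert to nn-meter IR.

```python
# Sketch: freeze a TF2 Keras model into a TF1-style frozen .pb graph.
# The tiny Sequential model and "model_frozen.pb" path are illustrative only;
# swap in tf.keras.applications.MobileNetV3Small for a real test.
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

# Trace the model into a concrete function with a fixed input signature.
concrete = tf.function(lambda x: model(x)).get_concrete_function(
    tf.TensorSpec([1, 224, 224, 3], tf.float32)
)

# Inline all variables as constants, producing a frozen graph.
frozen = convert_variables_to_constants_v2(concrete)

# Serialize the frozen GraphDef as a binary .pb file.
tf.io.write_graph(frozen.graph, ".", "model_frozen.pb", as_text=False)
```

Note that the frozen graph's input/output tensor names come from the traced function, so inspect `frozen.inputs` and `frozen.outputs` if a downstream tool needs explicit node names.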
In dataset/generator/generate_model.py we use the Keras h5 model as the final format, because we plan to release a converter from Keras h5 models to nn-meter IR. That tool is currently blocked on another piece of our work; I will ask my colleague whether it is ready these days.
Sorry for the inconvenience. I will post a new comment after I test the TF2 frozen pb conversion.
Best regards,
Jiahang
Hi, I just wanted to follow up: is there any update on the TF2 frozen pb conversion?