serizba / cppflow

Run TensorFlow models in C++ without installation and without Bazel

Home Page: https://serizba.github.io/cppflow/

Does cppflow::model have support for TensorFlow Lite models?

bmiftah opened this issue · comments

I tested cppflow to load and run inference on a model saved in the SavedModel format. Model loading and prediction turned out to be slow, taking up to 4 seconds. While looking for ways to optimize this, I came across the idea of converting the model into a TensorFlow Lite model, an optimized FlatBuffer format identified by the .tflite file extension. The conversion can be done following the method from the official TensorFlow page. My issue is: does cppflow::model support such a model, i.e. can it be loaded and inferred from? Or is there any tip for getting better inference speed, such as the possibility of freezing the model? Any help is very much appreciated!

Do you know the answer, please?

Hi, as you can see above, I didn't get a reply to this issue. I still don't know how to use a Lite model in cppflow (perhaps there has been some effort towards this, but I am not aware of any). So I went ahead with using a frozen model. In the process, I followed some suggestions given here to solve problems that came up while loading the frozen model.
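For anyone following the same frozen-model route: recent cppflow versions let the model constructor take a format flag, so a frozen graph (.pb) can be loaded directly. A minimal sketch, assuming the `FROZEN_GRAPH` constructor option available in cppflow master and hypothetical tensor names (the actual input/output names depend on your graph; inspect it to find them). This needs libtensorflow and a real model file, so it is not runnable as-is.

```cpp
#include <iostream>
#include "cppflow/cppflow.h"

int main() {
    // Load a frozen graph instead of a SavedModel directory
    // (second argument assumed from cppflow master).
    cppflow::model model("frozen_graph.pb", cppflow::model::FROZEN_GRAPH);

    // Hypothetical input: shape and tensor names depend on your graph.
    auto input = cppflow::fill({1, 224, 224, 3}, 1.0f);
    auto output = model({{"x:0", input}}, {"Identity:0"});

    std::cout << output[0] << std::endl;
}
```

The frozen-graph path skips the SavedModel loader, which some users report reduces model load time; inference speed itself is largely unchanged.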