serizba / cppflow

Run TensorFlow models in C++ without installation and without Bazel

Home Page: https://serizba.github.io/cppflow/


Problem reading a .pb model

FrancescaCi opened this issue · comments

Hi,
Can I get some help with reading a .pb model, please? I am trying to use the load_model example code to open a different .pb model. I have the .pb file but not the variables directory like in the example, and the error is:
SavedModel load for tags { serve }; Status: fail: NOT_FOUND: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: saved_model_cli. Took 7103 microseconds.
terminate called after throwing an instance of 'std::runtime_error'
what(): Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: saved_model_cli

How can I resolve this issue?
Thank you so much.

Just in case you are still looking for an answer:

I had a similar problem; the issue was that the model was a frozen graph and must be loaded using:

auto Model = model(model_path, cppflow::model::FROZEN_GRAPH)
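For anyone finding this later, a minimal sketch of loading a frozen graph with a recent cppflow 2 release, assuming the constant is nested under the TYPE enum (the exact qualifier differs between versions, as the follow-up comments below show); the file name is a placeholder:

```cpp
#include "cppflow/cppflow.h"

int main() {
    // Load a frozen graph (.pb file) instead of a SavedModel directory.
    // In recent cppflow versions the constant is nested: model::TYPE::FROZEN_GRAPH.
    cppflow::model model("frozen_graph.pb", cppflow::model::TYPE::FROZEN_GRAPH);

    // A frozen graph has no SavedModel signatures, so inference needs explicit
    // operation names (see the last comment in this thread for an example).
    return 0;
}
```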

Thank you so much.
I will try your solution and let you know!

Thank you so much for the support.
Do you know why I am facing this issue?
error: ‘FROZEN_GRAPH’ is not a member of ‘cppflow::model’
10 | auto Model = cppflow::model("../model", cppflow::model::FROZEN_GRAPH);

@sph001

error: ‘TYPE’ is not a member of ‘cppflow::model’
10 | auto Model = cppflow::model("../model", cppflow::model::TYPE::FROZEN_GRAPH);
| ^~~~
Yes... In fact, I don't understand why it is not being recognized as a member.

Hi @FrancescaCi

Did you manage to fix this? Did you try with the latest version of the code?

Hi, if you are still looking for a solution to this issue, let me share what I tried. To load the frozen model, I did the following (similar to what was mentioned earlier, but note that the model is a frozen one!):

cppflow::model model("Froozen_model/ab_frozen_graph.pb",cppflow::model::TYPE::FROZEN_GRAPH);

Loading the model was okay, but when I later ran inference with the loaded model using output_tensor = model(input_1); this error showed up:

No operation named "serving_default_input_1" exists

I don't have an operation named serving_default_input_1 in my model; rather, I have 'input_1' as my input node. Is that what cppflow expects? I saw elsewhere that this issue was raised, but I still couldn't find a workaround. I have been stuck here for a long time and any suggestion is appreciated!
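In case it helps, a hedged sketch of one way to work around this, assuming a recent cppflow 2 API (the output name "Identity:0" below is only a guess; replace it with whatever your graph actually contains): the serving_default_* operation names only exist in SavedModels, so the single-tensor call model(input_1) cannot find them in a frozen graph. You can list the operations that really exist with get_operations() and then call the model with explicit input/output names:

```cpp
#include <iostream>
#include "cppflow/cppflow.h"

int main() {
    cppflow::model model("Froozen_model/ab_frozen_graph.pb",
                         cppflow::model::TYPE::FROZEN_GRAPH);

    // Print every operation in the graph to find the real input/output names.
    for (const auto& op : model.get_operations()) {
        std::cout << op << std::endl;
    }

    // Placeholder input: adjust the shape and dtype to what your graph expects.
    auto input_1 = cppflow::fill({1, 224, 224, 3}, 1.0f);

    // Call the model with explicit operation names instead of relying on the
    // serving_default_* defaults. "input_1:0" matches the node you mentioned;
    // "Identity:0" is a guess for the output node.
    auto outputs = model({{"input_1:0", input_1}}, {"Identity:0"});

    std::cout << outputs[0] << std::endl;
    return 0;
}
```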