Error loading model
vicmosin opened this issue · comments
Hi there,
I am trying to get the image working with TF's built-in models, but the /load endpoint fails with the following response (without any additional error details in the service log):
{
"data": null,
"error": "Error loading model",
"success": false
}
I did add the model (to the models folder, see attachment) and the following config.json:
{
"inference_engine_name": "tensorflow_detection",
"confidence": 60,
"predictions": 15,
"number_of_classes": 2,
"framework": "tensorflow",
"type": "detection",
"network": "inception"
}
The model is from https://github.com/tensorflow/models/blob/477ed41e7e4e8a8443bc633846eb01e2182dc68a/object_detection/g3doc/detection_model_zoo.md
Hello vicmosin, could you please tell us exactly which pretrained model from the table you used, so we can test it? For more information about the error, you can add print(e) before raise ModelNotLoaded() (on line 24, base_inference_engine.py).
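As a rough sketch of what that debugging change looks like (the ModelNotLoaded class and load_model function here are simplified stand-ins, not the actual contents of base_inference_engine.py):

```python
class ModelNotLoaded(Exception):
    """Generic error surfaced by the /load endpoint."""

def load_model(path):
    try:
        # Stand-in for the real model-loading logic, which can fail
        # for many reasons (missing files, bad label map, etc.).
        raise FileNotFoundError(path)
    except Exception as e:
        # The suggested debugging line: print the underlying exception
        # so the real cause shows up in the service log instead of the
        # generic "Error loading model" response.
        print(e)
        raise ModelNotLoaded()
```

With this in place, calling the /load endpoint again should log the original exception message before the generic error is returned.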
We tested a model from the link you provided and found that the error is caused by the object-detection.pbtxt file you downloaded. Please check this link for more information regarding the expected format of the file: https://github.com/tensorflow/models/blob/master/research/object_detection/data/kitti_label_map.pbtxt
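For reference, label map files in the Object Detection API use the protobuf text format shown below. The class names here are illustrative; the ids and names must match what your model was trained with, and for a two-class detector (number_of_classes: 2) you would have two item entries:

```
item {
  id: 1
  name: 'class_one'
}
item {
  id: 2
  name: 'class_two'
}
```

Note that ids start at 1, since id 0 is reserved for the background class.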