NVIDIA / retinanet-examples

Fast and accurate object detection with end-to-end GPU optimization

TensorRT error running cppapi

natxopedreira opened this issue · comments

Hello

I'm getting this error when trying to run inference with the C++ API... what could be the cause?

./infer ./model.plan ./foto.png 
Loading engine...
6: The engine plan file is not compatible with this version of TensorRT, expecting library version 8.0.0 got 8.0.1, please rebuild.
4: [runtime.cpp::deserializeCudaEngine::74] Error Code 4: Internal Error (Engine deserialization failed.)
Segmentation fault (core dumped)
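
For reference, a quick way to confirm the mismatch is to print the TensorRT version the infer binary is actually linked against and compare it with the version reported in the error. This is only a minimal sketch; the file name trt_version.cpp is a placeholder and not part of the repo:

```cpp
// trt_version.cpp -- hypothetical one-off check, not part of the repo.
#include <NvInfer.h>
#include <cstdio>

int main() {
    // Version of the libnvinfer the binary loads at runtime, encoded as
    // MAJOR * 1000 + MINOR * 100 + PATCH (e.g. 8001 for 8.0.1).
    const int rt = getInferLibVersion();
    std::printf("runtime libnvinfer: %d.%d.%d\n", rt / 1000, (rt / 100) % 10, rt % 100);
    // Version of the headers the binary was compiled against.
    std::printf("compile-time headers: %d.%d.%d\n",
                NV_TENSORRT_MAJOR, NV_TENSORRT_MINOR, NV_TENSORRT_PATCH);
    return 0;
}
```

Compile and link it the same way the cppapi sample is built (against -lnvinfer); the printed runtime version should match the version that serialized the plan, otherwise deserialization fails as above.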

Regenerate model.plan; it looks like it was generated with a different TensorRT version than the one your infer binary is linked against.
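
In practice that means re-running the export step in the same environment (or container) that runs inference, e.g. the repo's export command for your version. If you still have the ONNX export of the model, the plan can also be rebuilt directly against the locally installed TensorRT. A minimal sketch under that assumption (rebuild_plan.cpp, model.onnx, and the 1 GiB workspace size are placeholders, not the repo's own export tool):

```cpp
// rebuild_plan.cpp -- hypothetical helper; rebuilds a plan with the local TensorRT.
#include <NvInfer.h>
#include <NvOnnxParser.h>
#include <cstdint>
#include <fstream>
#include <iostream>
#include <memory>

// Minimal logger required by the TensorRT builder interfaces.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cerr << msg << std::endl;
    }
};

int main(int argc, char** argv) {
    if (argc != 3) {
        std::cerr << "usage: rebuild_plan model.onnx model.plan" << std::endl;
        return 1;
    }

    Logger logger;
    auto builder = std::unique_ptr<nvinfer1::IBuilder>(nvinfer1::createInferBuilder(logger));
    const auto flags = 1U << static_cast<uint32_t>(
        nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    auto network = std::unique_ptr<nvinfer1::INetworkDefinition>(builder->createNetworkV2(flags));
    auto parser = std::unique_ptr<nvonnxparser::IParser>(
        nvonnxparser::createParser(*network, logger));

    if (!parser->parseFromFile(argv[1],
            static_cast<int>(nvinfer1::ILogger::Severity::kWARNING))) {
        std::cerr << "failed to parse " << argv[1] << std::endl;
        return 1;
    }

    auto config = std::unique_ptr<nvinfer1::IBuilderConfig>(builder->createBuilderConfig());
    config->setMaxWorkspaceSize(1ULL << 30);  // 1 GiB of builder scratch space (assumed)

    // Serialize with the TensorRT installed here, so the plan matches the
    // library version the infer binary links against.
    auto plan = std::unique_ptr<nvinfer1::IHostMemory>(
        builder->buildSerializedNetwork(*network, *config));
    if (!plan) {
        std::cerr << "engine build failed" << std::endl;
        return 1;
    }

    std::ofstream out(argv[2], std::ios::binary);
    out.write(static_cast<const char*>(plan->data()), plan->size());
    return 0;
}
```

Build it against the same libnvinfer and libnvonnxparser that infer uses, then point infer at the freshly written plan.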

thanks, that did the trick!