SthPhoenix / InsightFace-REST

InsightFace REST API for easy deployment of face recognition services with TensorRT in Docker.

Need config.pbtxt files for Triton server

zerodwide opened this issue

Hi,
I followed the guide on the TRT model structure in that exact comment in #60,

but when running the Triton server with this command:
docker run --gpus=1 --rm -p8000:8000 -p8001:8001 -p8002:8002 -v/model/triton:/models nvcr.io/nvidia/tritonserver:22.03-py3 tritonserver --model-repository=/models

this error comes up:

... model_repository_manager.cc:1927] Poll failed for model directory 'arcface_r100_v1': failed to open text file for read /models/arcface_r100_v1/config.pbtxt: No such file or directory
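As far as I understand from the docs, Triton polls each model directory and (in strict mode) expects a config.pbtxt next to a numbered version folder, roughly like this (the model file name below is just my assumption, it depends on the backend):

/models
    arcface_r100_v1
        config.pbtxt
        1
            model.plan (or model.onnx, depending on the backend)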

Can you provide a config.pbtxt for arcface_r100_v1, scrfd_10g_gnkps, or retinaface_r50_v1?

Thanks in advance.
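Edit: for reference, my own rough attempt at a minimal config.pbtxt for arcface_r100_v1 looks like this. The tensor names "data" and "fc1", the 112x112 input size, and the 512-d output are only my assumptions taken from the InsightFace model zoo, so please correct them if they're wrong:

name: "arcface_r100_v1"
platform: "tensorrt_plan"   # or "onnxruntime_onnx" if the model is ONNX
max_batch_size: 1           # whatever batch size the engine was built with
input [
  {
    name: "data"            # assumed input tensor name
    data_type: TYPE_FP32
    dims: [ 3, 112, 112 ]
  }
]
output [
  {
    name: "fc1"             # assumed output tensor name
    data_type: TYPE_FP32
    dims: [ 512 ]
  }
]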

Hi, you can run the Triton server with the --strict-model-config=false argument; it should generate the config automatically.
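For example, with your command from above it would be:

docker run --gpus=1 --rm -p8000:8000 -p8001:8001 -p8002:8002 -v/model/triton:/models nvcr.io/nvidia/tritonserver:22.03-py3 tritonserver --model-repository=/models --strict-model-config=false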

Though I'm not sure if this project still works with Triton; I haven't checked it in a while, since the current version works much faster with the TRT backend.

@SthPhoenix Thanks for the quick response.

I have checked it out. It still works, but as you mentioned, TRT is faster.

image: tritonserver:22.03-py3
retinaface_r50_v1 (OK)
scrfd_10g_gnkps (FAILED)
arcface_r100_v1 (OK)