SthPhoenix / InsightFace-REST

InsightFace REST API for easy deployment of face recognition services with TensorRT in Docker.


ValueError when using the CPU build via Docker

mnts-i opened this issue · comments

I am running Docker on Ubuntu and I used the provided CPU Dockerfile to build the image. The image builds successfully, but the app keeps crashing with the following error:

ValueError: This ORT build has ['AzureExecutionProvider', 'CUDAExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession
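For context, the error means the ONNX Runtime build ships more than one execution provider, and since ORT 1.9 InferenceSession refuses to pick one implicitly. A minimal sketch of the call-site fix (the helper name and model path are illustrative, not from InsightFace-REST):

```python
# Sketch of passing `providers` explicitly, as ORT >= 1.9 requires
# when the build advertises multiple execution providers.

def session_kwargs(cpu_only=True):
    """Build keyword arguments for onnxruntime.InferenceSession."""
    if cpu_only:
        # CPU-only Docker image: pin to the CPU provider.
        return {"providers": ["CPUExecutionProvider"]}
    # GPU image: prefer CUDA, keep the CPU provider as a fallback.
    return {"providers": ["CUDAExecutionProvider", "CPUExecutionProvider"]}

# Usage (hypothetical model path):
# import onnxruntime as ort
# session = ort.InferenceSession("model.onnx", **session_kwargs())
```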

After some digging I found a similar issue in a different project: Gourieff/sd-webui-reactor#108. Following the comments there, I edited requirements.txt, bumping onnx from 1.13.0 to 1.14.0 and adding onnxruntime==1.15.0, and the app started working again. Just posting it here for anyone who encounters the same error.
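For reference, the pins described above as a requirements.txt fragment (any other lines in the file stay as they are):

```
onnx==1.14.0
onnxruntime==1.15.0
```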

Hi! That's weird, I have built the image from scratch and had no errors, but I'll bump the versions anyway, just in case. Thanks!