ashishawasthi / infer


Model Inference API example

A self-contained, minimal ML model inference service example. The application demonstrates a wrapper API over models that serves predictions in batches, and includes end-to-end tests covering requests and prediction responses.

The application expects the provided model object to expose an sklearn-style predict function.
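That duck-typed predict contract is the only thing the wrapper relies on. A minimal sketch of the idea (the `DummyModel` class and `infer` helper below are hypothetical illustrations, not the repository's actual code):

```python
class DummyModel:
    """Stand-in for any model object with an sklearn-style predict().
    (Hypothetical example; the real service would load a trained model
    such as iris_svm_v1 rather than define one inline.)"""

    def predict(self, X):
        # sklearn estimators take a batch of input rows and return one
        # prediction per row; mimic that contract with a constant label.
        return [0] * len(X)


def infer(model, model_inputs):
    # The wrapper only needs the duck-typed predict(X) -> y contract,
    # so any sklearn estimator (or lookalike) can be served.
    return list(model.predict(model_inputs))


print(infer(DummyModel(), [[1, 2, 3, 4], [1, 1, 1, 1]]))  # prints [0, 0]
```

Because the batch goes through `predict` in one call, the service pays the model-invocation overhead once per request rather than once per row.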

Local Development Environment

Setup

pip install -r requirements.txt

Test

python -m pytest
coverage run -m pytest
coverage report -m

Run

flask run

Try

<flask_url>/infer?model_id=iris_svm_v1&model_inputs=[[1,2,3,4],[1,1,1,1]]
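The same request can be built programmatically. A sketch using only the standard library, assuming a local `flask run` on Flask's default port 5000 (the base URL is an assumption, not something the repo pins down):

```python
import json
from urllib.parse import urlencode

# Assumed base URL for a local `flask run` (Flask defaults to port 5000).
BASE_URL = "http://127.0.0.1:5000/infer"

params = {
    "model_id": "iris_svm_v1",
    # model_inputs is a JSON-encoded batch: one prediction per inner list.
    "model_inputs": json.dumps([[1, 2, 3, 4], [1, 1, 1, 1]]),
}
url = f"{BASE_URL}?{urlencode(params)}"
print(url)

# With the service running, the predictions could be fetched with e.g.:
# from urllib.request import urlopen
# predictions = json.load(urlopen(url))
```

`urlencode` takes care of percent-escaping the JSON brackets and commas, which is easy to get wrong when pasting the URL by hand.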

Deploy Docker Image

docker build  -t infer .
docker run -it infer

Deploy on Google Cloud Run

gcloud builds submit --tag gcr.io/<project-id>/infer
gcloud run deploy --image gcr.io/<project-id>/infer

Deployed at

https://infer-vz7lve6tka-as.a.run.app/infer?model_id=iris_svm_v1&model_inputs=[[1,2,3,4],[1,1,1,1]]
