dimitreOliveira / torchserve_od_example

Simple example of using TorchServe to serve a PyTorch Object Detection model


Repository content

  • Local Setup
  • Docker Setup (optional)

General Setup

Download FastRCNN model weights

sh scripts/get_fastrcnn.sh
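
The script itself isn't reproduced here; as a rough sketch, it is assumed to fetch torchvision's pretrained Faster R-CNN (ResNet-50 FPN, COCO) checkpoint into the models/ directory. The exact URL and filename below are assumptions:

# hypothetical sketch of scripts/get_fastrcnn.sh:
# download the pretrained Faster R-CNN ResNet-50 FPN (COCO) weights into models/
mkdir -p models
wget -O models/fasterrcnn_resnet50_fpn_coco-258fa6fa.pth \
    https://download.pytorch.org/models/fasterrcnn_resnet50_fpn_coco-258fa6fa.pth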

Run locally

Archive model

sh scripts/archive_model.sh
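
Archiving relies on TorchServe's torch-model-archiver CLI, which packages the serialized weights and a handler into a .mar file that TorchServe can load. Below is a sketch of the kind of command the script presumably wraps; the model file, weight filename, and export path are assumptions, while object_detector is TorchServe's built-in object detection handler:

# hypothetical equivalent of scripts/archive_model.sh:
# package the weights and handler into models/fastrcnn.mar
torch-model-archiver --model-name fastrcnn \
    --version 1.0 \
    --model-file utils/model.py \
    --serialized-file models/fasterrcnn_resnet50_fpn_coco-258fa6fa.pth \
    --handler object_detector \
    --export-path models/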

Start TorchServe

sh scripts/start_torchserve.sh
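
The start script is expected to point TorchServe at the model store holding the archive created above. A minimal sketch of such a command, assuming the store path and model name used elsewhere in this README; the flags are standard TorchServe options:

# hypothetical equivalent of scripts/start_torchserve.sh:
# serve the archived model from the local model store, without config snapshots
torchserve --start --ncs \
    --model-store models/ \
    --models fastrcnn=fastrcnn.mar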

Stop TorchServe

torchserve --stop

Run with Docker

Build Docker image from the Dockerfile

sudo docker build -f Dockerfile -t docker_torchserve .

Run the Docker container

sudo docker run -p 8080:8080 -u 0 -ti -v $(pwd)/models/:/home/model-server/models/ docker_torchserve /bin/bash

Archive the model (inside the container)

sh scripts/archive_model.sh

Start TorchServe

sh scripts/start_torchserve.sh

Stop TorchServe

torchserve --stop

Run with Docker compose

Build the image and run the app with Docker Compose

sudo docker-compose up
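
To keep the stack running in the background, the usual Docker Compose options apply, for example:

# run detached and follow the TorchServe logs
sudo docker-compose up -d
sudo docker-compose logs -f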

Stop the application

docker-compose down

Inference

Run a sample inference using the REST API

curl http://127.0.0.1:8080/predictions/fastrcnn -T ./samples/man2.jpg
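
The response should be a JSON list of detections (bounding boxes, class labels, and scores). Before querying, you can verify that the server is up via TorchServe's ping endpoint on the default inference port:

# health check against the inference API
curl http://127.0.0.1:8080/ping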

Or interactively run the "query_notebook.ipynb" notebook

Content

  • models — Model assets.
  • samples — Image samples used to test inference.
  • scripts — Scripts for general usage.
  • utils — Utility files.
  • query_notebook — Jupyter notebook for interactive inference.

About

License: MIT License


Languages

  • Jupyter Notebook: 98.6%
  • Python: 1.0%
  • Shell: 0.3%
  • Dockerfile: 0.1%