randombenj / detectron2onnx-inference

Export a [detectron2](https://github.com/facebookresearch/detectron2) model to [onnx](https://github.com/onnx/onnx) and run inference using the [caffe2 onnx backend](https://pytorch.org/tutorials/advanced/super_resolution_with_caffe2.html). This lets you run inference on a Raspberry Pi with acceptable inference times.

Detectron2 ONNX Inference

Exporting detectron2 models to onnx and running inference on them is surprisingly hard.

This repository contains my personal notes and lessons learned with detectron2 and onnx inference.


Languages

Language: Jupyter Notebook 100.0%