This repository contains a ResNet50 inference application using ONNXRuntime in C++.
Currently, I build and test only on Windows 10 with Visual Studio 2019. However, everything involved (build system, dependencies, etc.) is cross-platform, so you may be able to build the application in other environments as well.
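For reference, here is a minimal sketch of what inference with the ONNXRuntime C++ API looks like. The model path and the tensor names "input"/"output" are illustrative, not taken from this repo; the actual names depend on how the model was exported.

#include <onnxruntime_cxx_api.h>
#include <iostream>
#include <vector>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "ort-resnet50");
  Ort::SessionOptions options;
  // ORT_TSTR handles the wide-string model path required on Windows.
  Ort::Session session(env, ORT_TSTR("resource/resnet50.onnx"), options);

  // ResNet50 takes a 1x3x224x224 float tensor (NCHW); a real app would fill
  // this with the preprocessed image instead of zeros.
  std::vector<int64_t> shape{1, 3, 224, 224};
  std::vector<float> input(1 * 3 * 224 * 224, 0.0f);
  auto mem = Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
  Ort::Value tensor = Ort::Value::CreateTensor<float>(
      mem, input.data(), input.size(), shape.data(), shape.size());

  // "input" / "output" are placeholder tensor names.
  const char* input_names[] = {"input"};
  const char* output_names[] = {"output"};
  auto outputs = session.Run(Ort::RunOptions{nullptr},
                             input_names, &tensor, 1, output_names, 1);
  const float* logits = outputs[0].GetTensorMutableData<float>();
  std::cout << "first logit: " << logits[0] << std::endl;
  return 0;
}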
If you don't have ONNXRuntime, you have two options:
- Download the distribution package
- Build from sources
If you download the distribution package, skip the next section about building ONNXRuntime.
NOTE: If you want to link ONNXRuntime statically, you have to build it from source.
You can build ONNXRuntime from source as below:
git clone https://github.com/microsoft/onnxruntime.git
cd onnxruntime
# On Windows
build.bat --config Debug --build_shared_lib --parallel
build.bat --config Release --build_shared_lib --parallel
# On Linux/macOS
./build.sh --config Debug --build_shared_lib --parallel
./build.sh --config Release --build_shared_lib --parallel
More information about building ONNXRuntime can be found in the official docs.
After building and testing, place the libraries as below:
${ORT_ROOT}
├─include # headers
└─lib
├─Debug # DO NOT forget pdb files...
│ ├─shared
│ │ ├─onnxruntime.dll
│ │ └─onnxruntime.lib
│ └─static
│ ├─onnxruntime_common.lib
│ ├─onnxruntime_flatbuffers.lib
│ ├─onnxruntime_framework.lib
│ ├─onnxruntime_graph.lib
│ ├─onnxruntime_mlas.lib
│ ├─onnxruntime_optimizer.lib
│ ├─onnxruntime_providers.lib
│ ├─onnxruntime_session.lib
│ ├─onnxruntime_util.lib
│ └─external (dependencies)
│ ├─clog.lib
│ ├─cpuinfo.lib
│ ├─flatbuffers.lib
│ ├─libprotobuf-lited.lib
│ ├─onnx.lib
│ ├─onnx_proto.lib
│ └─re2.lib
├─Release
│ :
:
Before building the application, you have to generate the resources: the ResNet50 model in ONNX format, the ImageNet class labels, and a test image.
To do this, run python/output_resource.py as below:
python python/output_resource.py -o resource
After running the above command, you will see a directory named resource.
You can build the application with CMake as below (ORT_STATIC toggles static linking, USE_DIST_ORT toggles using the downloaded distribution package instead of your own build, and ORT_ROOT points to the ${ORT_ROOT} directory described above):
cmake -B build -DORT_STATIC=OFF -DUSE_DIST_ORT=OFF -DORT_ROOT=path/to/ORT_ROOT
cmake --build build --config Debug
After the build succeeds, you can run the application and see the inference results.
build/Debug/ORTResnet.exe -i resource/dog_input.png -r resource
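How ORTResnet formats its output is not shown here, but as a rough sketch, turning the raw 1000-class scores into top-k predictions (each index maps to a line in resource/imagenet_classes.txt) can look like this:

#include <algorithm>
#include <numeric>
#include <vector>

// Return the indices of the k largest scores, best first.
std::vector<size_t> topk(const std::vector<float>& scores, size_t k) {
  std::vector<size_t> idx(scores.size());
  std::iota(idx.begin(), idx.end(), 0);
  std::partial_sort(idx.begin(), idx.begin() + k, idx.end(),
                    [&](size_t a, size_t b) { return scores[a] > scores[b]; });
  idx.resize(k);
  return idx;
}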
To validate the results, run python/check_inference_result.py as below and compare its output with the application's results.
python python/check_inference_result.py -i resource/dog_input.png -l resource/imagenet_classes.txt
This project is licensed under the MIT License.