olilarkin / iPlug2OnnxRuntime

ML Audio plug-in example using iPlug2 & ONNX Runtime

iPlug2OnnxRuntime

Machine Learning Audio plug-in/App example using iPlug2 and Microsoft ONNX Runtime.

This example runs an LSTM neural network model trained with Steve Atkinson's Neural Amp Modeler. The C++ code that runs the model is in LSTMModelInference.h. The model itself is in ort-builder/model.onnx, converted to .ort format and serialized to a bin2c resource in ort-builder/model/model.ort.h. The project links to customised ONNX Runtime static libs which are pruned to contain only the operators and types required for this particular model, and which support CPU inference only. ORT is linked statically to make the audio plug-in more portable. These libs are created with a separate repo, ort-builder, which can be used to customise the libs for your own model and to add support for e.g. GPU inference.
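
For orientation, here is a minimal sketch of how an .ort model embedded as a bin2c resource can be loaded and run with the ONNX Runtime C++ API. It is not the actual code in LSTMModelInference.h: the symbol names `model_ort_data` / `model_ort_size`, the tensor names and the tensor shape are placeholders for illustration.

```cpp
// Minimal sketch (assumed names, not the exact code in LSTMModelInference.h):
// load the .ort bytes embedded via bin2c and run one CPU inference pass.
#include <onnxruntime_cxx_api.h>
#include <array>
#include <vector>

extern const unsigned char model_ort_data[]; // generated by bin2c in model.ort.h (name assumed)
extern const size_t model_ort_size;          // (name assumed)

std::vector<float> RunModel(const std::vector<float>& input)
{
  static Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "iPlug2OnnxRuntime");

  Ort::SessionOptions opts;
  opts.SetIntraOpNumThreads(1); // keep inference single-threaded for the audio thread

  // Create the session directly from the in-memory .ort bytes
  static Ort::Session session(env, model_ort_data, model_ort_size, opts);

  Ort::MemoryInfo memInfo = Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);

  // Shape is model-specific; [1, N, 1] is just an illustrative guess
  std::array<int64_t, 3> shape{1, static_cast<int64_t>(input.size()), 1};
  Ort::Value inputTensor = Ort::Value::CreateTensor<float>(
    memInfo, const_cast<float*>(input.data()), input.size(), shape.data(), shape.size());

  // Input/output names must match those exported with the model; these are assumptions
  const char* inputNames[]  = {"input"};
  const char* outputNames[] = {"output"};

  auto outputs = session.Run(Ort::RunOptions{nullptr},
                             inputNames, &inputTensor, 1,
                             outputNames, 1);

  float* outData = outputs.front().GetTensorMutableData<float>();
  size_t outLen  = outputs.front().GetTensorTypeAndShapeInfo().GetElementCount();
  return std::vector<float>(outData, outData + outLen);
}
```

In a real plug-in you would create the environment, session and tensors once at initialisation rather than per call, to avoid allocations in the audio callback.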

It should compile for macOS, iOS and Windows.

On Windows, you'll need to unzip onnxruntime.lib in /ort-builder/libs/win-x86_64/MinSizeRel. To build the Debug target, you'll also need to compile a debug build of onnxruntime.lib yourself (it is not included due to its size).

License: MIT
