iPlug2OnnxRuntime

Machine Learning Audio plug-in/App example using iPlug2 and Microsoft ONNX Runtime.

This example runs an LSTM neural network model trained with Steve Atkinson's Neural Amp Modeler. The C++ code that performs inference is in LSTMModelInference.h. The model itself is in ort-builder/model.onnx; it is converted to the .ort format and serialized to a bin2c resource in ort-builder/model/model.ort.h. The project links against customised ONNX Runtime static libraries that are pruned to contain only the operators and types required for this particular model, with support for CPU inference only. ORT is linked statically to make the audio plug-in more portable. These libs are created with a separate repo, ort-builder, which can be used to customise the libs for your own model and to add support for e.g. GPU inference.
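
To illustrate the overall approach, here is a minimal sketch of creating an ONNX Runtime session from an in-memory .ort blob and running it on a block of samples. This is not the repository's actual LSTMModelInference.h; the bin2c symbol names, tensor names and tensor shape below are assumptions for illustration only.

```cpp
#include <onnxruntime_cxx_api.h>
#include <array>
#include <memory>
#include <vector>

// Hypothetical bin2c output, e.g. what a header like model.ort.h might expose
extern const unsigned char model_ort_data[];
extern const unsigned int model_ort_size;

class ModelRunner
{
public:
  ModelRunner()
  : mEnv(ORT_LOGGING_LEVEL_WARNING, "iPlug2OnnxRuntime")
  {
    Ort::SessionOptions opts;
    opts.SetIntraOpNumThreads(1); // keep ORT from spawning extra threads

    // Construct the session directly from the in-memory .ort blob,
    // so no model file needs to ship alongside the plug-in binary
    mSession = std::make_unique<Ort::Session>(mEnv, model_ort_data, model_ort_size, opts);
  }

  // Run the model on a block of samples (assumed tensor shape: [1, nFrames, 1])
  std::vector<float> Process(const float* input, size_t nFrames)
  {
    std::array<int64_t, 3> shape { 1, static_cast<int64_t>(nFrames), 1 };
    Ort::MemoryInfo memInfo = Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
    Ort::Value inTensor = Ort::Value::CreateTensor<float>(
      memInfo, const_cast<float*>(input), nFrames, shape.data(), shape.size());

    const char* inputNames[] = { "input" };   // assumed input tensor name
    const char* outputNames[] = { "output" }; // assumed output tensor name
    auto outputs = mSession->Run(Ort::RunOptions{nullptr},
                                 inputNames, &inTensor, 1,
                                 outputNames, 1);

    const float* outData = outputs[0].GetTensorData<float>();
    return std::vector<float>(outData, outData + nFrames);
  }

private:
  Ort::Env mEnv;
  std::unique_ptr<Ort::Session> mSession;
};
```

Because the libraries are pruned to a single model, only the operators and types that model actually uses are compiled in; a different model would require rebuilding the static libs with ort-builder.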

It should compile for macOS, iOS and Windows.

On Windows, you'll need to unzip onnxruntime.lib in /ort-builder/libs/win-x86_64/MinSizeRel. If you need to build the Debug target, you'll have to compile a debug build of onnxruntime.lib yourself (it is not included due to its size).

License: MIT