Rust-to-Rust dynamic library boundary for tract ONNX inference. Keeps the heavy tract dependency (267 crates, 16 MB) out of your binary; loads it at runtime via dlopen.
tract-onnx is great for pure-Rust ML inference, but it pulls in 7 sub-crates and dominates compile time for any project that uses it. If all you need is "f32 tensor in, f32 tensor out," there's no reason to statically link it.
zentract puts tract behind a cdylib with a C ABI. Your application depends only on zentract-api (14 crates, 350 KB stripped) and loads the plugin at runtime.
| Crate | Type | Description |
|---|---|---|
| zentract-types | lib (no_std) | Shared `#[repr(C)]` types: `TensorMeta`, `DType`, `ErrorCode` |
| zentract-abi | cdylib | Plugin that links tract-onnx, exports `extern "C"` functions |
| zentract-api | lib | Host wrapper using libloading (no tract dependency) |
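To make the `#[repr(C)]` boundary concrete, here is a minimal sketch of what a shared tensor-metadata type could look like. The field names, the fixed-size `dims` array, and the `f32_shape` constructor are assumptions for illustration, not the actual zentract-types definitions:

```rust
// Hypothetical sketch; the real zentract-types layout may differ.
#[repr(C)]
#[derive(Clone, Copy, Debug)]
pub enum DType {
    F32 = 0,
    U8 = 1,
}

#[repr(C)]
#[derive(Clone, Copy, Debug)]
pub struct TensorMeta {
    pub dtype: DType,
    pub rank: u32,
    // Fixed-size array instead of a Vec keeps the struct FFI-safe.
    pub dims: [u64; 8],
}

impl TensorMeta {
    // Assumes shape.len() <= 8; a real implementation would validate this.
    pub fn f32_shape(shape: &[u64]) -> Self {
        let mut dims = [0u64; 8];
        dims[..shape.len()].copy_from_slice(shape);
        TensorMeta { dtype: DType::F32, rank: shape.len() as u32, dims }
    }
}

fn main() {
    let m = TensorMeta::f32_shape(&[1, 3, 320, 320]);
    assert_eq!(m.rank, 4);
    assert_eq!(m.dims[3], 320);
}
```

Everything crossing the boundary stays plain-old-data: no `Vec`, no `String`, no trait objects, so both sides agree on layout regardless of compiler version.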
```rust
use zentract_api::{InferenceEngine, TensorMeta};

// Load the plugin (once, at startup)
let engine = InferenceEngine::load("libzentract_abi.so")?;

// Load an ONNX model with a fixed input shape
let onnx_bytes = std::fs::read("model.onnx")?;
let input_shape = TensorMeta::f32_shape(&[1, 3, 320, 320]);
let model = engine.load_onnx(&onnx_bytes, input_shape)?;

// Run inference
let input: Vec<f32> = preprocess_image(/* ... */);
let output = model.infer(&input, 0)?; // output_index = 0
let scores: &[f32] = &output.data;
```

| Artifact | Size |
|---|---|
| libzentract_abi.so (tract inside) | 16 MB |
| Host binary (libloading only) | 350 KB |
The plugin exports five extern "C" functions:
- `zentract_abi_version() -> u32`
- `zentract_load(onnx_bytes, len, input_meta) -> handle`
- `zentract_infer(handle, input, len, output_index, out_ptr, out_len, out_meta) -> error`
- `zentract_output_count(handle) -> count`
- `zentract_free(handle)`
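The version export exists so the host can refuse a plugin built against an incompatible ABI before calling anything else. A minimal sketch of that guard; `EXPECTED_ABI` and the error-handling policy are assumptions, not zentract's actual values:

```rust
// Hypothetical ABI guard: compare the plugin's reported ABI version against
// the one the host was compiled for, before making any other call.
const EXPECTED_ABI: u32 = 1; // assumed value for illustration

fn check_abi(plugin_version: u32) -> Result<(), String> {
    if plugin_version == EXPECTED_ABI {
        Ok(())
    } else {
        Err(format!(
            "plugin reports ABI v{plugin_version}, host expects v{EXPECTED_ABI}"
        ))
    }
}

fn main() {
    // A matching plugin passes; a newer or older one is rejected up front.
    assert!(check_abi(1).is_ok());
    assert!(check_abi(2).is_err());
}
```

Failing fast here is cheaper than debugging a layout mismatch inside `zentract_infer`.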
Output pointers reference data inside the plugin and remain valid only until the next `infer` or `free` call on the same handle. The host copies out what it needs.
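The buffer-reuse contract can be demonstrated with a mock: each call overwrites the same plugin-owned buffer, so the host must copy before calling again. `MockPlugin` and its `infer` signature are illustrative stand-ins, not zentract's actual internals:

```rust
// Hypothetical mock of the plugin's buffer reuse: infer returns a raw
// (ptr, len) into memory the plugin owns and will overwrite next call.
struct MockPlugin {
    out: Vec<f32>,
}

impl MockPlugin {
    fn infer(&mut self, fill: f32, n: usize) -> (*const f32, usize) {
        self.out.clear();
        self.out.resize(n, fill); // overwrites the previous output
        (self.out.as_ptr(), self.out.len())
    }
}

fn main() {
    let mut plugin = MockPlugin { out: Vec::new() };
    let (ptr, len) = plugin.infer(1.0, 4);

    // Copy out immediately; the raw pointer is invalidated by the next call.
    let first: Vec<f32> = unsafe { std::slice::from_raw_parts(ptr, len) }.to_vec();

    let _ = plugin.infer(2.0, 4); // reuses and overwrites the same buffer
    assert_eq!(first, vec![1.0; 4]); // the copy survives the second call
}
```

This is the same discipline the real host wrapper has to follow: the copy happens on the host side of the boundary, before any further call on that handle.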
Both plugin and host live in the same workspace and are built together:

```sh
cargo build --release
```

This produces `target/release/libzentract_abi.so` (the plugin) and makes zentract-api available for downstream crates.
Dual-licensed: AGPL-3.0 or commercial.
I've maintained and developed open-source image server software — and the 40+ library ecosystem it depends on — full-time since 2011. Fifteen years of continual maintenance, backwards compatibility, support, and the (very rare) security patch. That kind of stability requires sustainable funding, and dual-licensing is how we make it work without venture capital or rug-pulls. Support sustainable and secure software; swap Patch Tuesday for patch leap-year.
Your options:
- Startup license — $1 if your company has under $1M revenue and fewer than 5 employees. Get a key →
- Commercial subscription — Governed by the Imazen Site-wide Subscription License v1.1 or later. Apache 2.0-like terms, no source-sharing requirement. Sliding scale by company size. Pricing & 60-day free trial →
- AGPL v3 — Free and open. Share your source if you distribute.
See LICENSE-COMMERCIAL for details.