altius


Small ONNX inference runtime written in Rust

Feel free to create issues and start discussions.

Requirements

  • Python 3.x (used by some tests; you can skip them by ignoring the tests in ./altius-py)

Run

# Download large models.
(cd models && ./download.sh)

# Run examples.
# {mnist, mobilenet, deit, vit} are available.
# You can change the number of threads used for computation by editing the example code.
./run.sh mnist
./run.sh mobilenet
./run.sh deit
./run.sh vit

# Experimental CPU backend (generates C code)
./run_cpu.sh mnist_cpu --iters 3
./run_cpu.sh mobilenet_cpu --iters 3 --profile
./run_cpu.sh deit_cpu --iters 3 --threads 8 --profile

# On macOS, you can use Apple's Accelerate framework via the 'accelerate' feature.
cargo run --release --features accelerate --example mobilenet

Run from WebAssembly

Currently, MobileNetV3 runs in web browsers.

cd wasm
cargo install wasm-pack
wasm-pack build --target web
yarn
yarn serve

Run from Python

cd altius-py
python -m venv .env
source .env/bin/activate
pip install -r requirements.txt
RUSTFLAGS="-C target-cpu=native" maturin develop -r --features blis
python mobilenet.py
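
mobilenet.py is one of the bundled example scripts. As a rough sketch of what such a script does — assuming altius_py mirrors the onnxruntime-style InferenceSession API, and with the model path and the "input" tensor name below as placeholders (see the examples under altius-py for the real code):

import numpy as np
import altius_py

# Load an ONNX model (placeholder path; large models are fetched by models/download.sh).
session = altius_py.InferenceSession("../models/mobilenetv3.onnx")

# Dummy 1x3x224x224 input; a real script would load and normalize an image.
image = np.zeros((1, 3, 224, 224), dtype=np.float32)

# Assuming an onnxruntime-style run(output_names, inputs) signature.
outputs = session.run(None, {"input": image})
print(outputs[0].shape)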
