maekawatoshiki / altius

Altius

Small ONNX inference runtime written in Rust.
Feel free to create issues and discussions!
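To give a feel for what an ONNX-style inference runtime does internally, here is a minimal sketch: walk the graph's nodes in topological order and dispatch each op to a kernel. This is an illustration only; the names and structures below are made up and are not Altius's actual API.

```python
# Toy ONNX-style executor: a dispatch table of kernels plus a loop
# over nodes in topological order. Purely illustrative.
OP_KERNELS = {
    "Add": lambda a, b: [x + y for x, y in zip(a, b)],
    "Relu": lambda a: [max(0.0, x) for x in a],
}

def run_graph(nodes, inputs):
    """nodes: list of (op_type, input_names, output_name) in topological order."""
    values = dict(inputs)
    for op_type, in_names, out_name in nodes:
        kernel = OP_KERNELS[op_type]
        values[out_name] = kernel(*(values[n] for n in in_names))
    return values

# y = Relu(x + b)
graph = [
    ("Add", ["x", "b"], "t"),
    ("Relu", ["t"], "y"),
]
out = run_graph(graph, {"x": [-2.0, 3.0], "b": [1.0, 1.0]})
print(out["y"])  # [0.0, 4.0]
```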

Requirements

  • cargo
  • rye

Run

# Download models.
(cd models && ./download.sh)
# Download minimum models.
# (cd models && ./download.sh CI)

# Run examples.
# {mnist, mobilenet, deit, vit} are available.
# You can specify the number of threads for computation by editing the code.
cargo run --release --example mnist
cargo run --release --example mobilenet
cargo run --release --example deit
cargo run --release --example vit

# Experimental CPU backend (that generates code in C)
cargo run --release --example mnist_cpu     -- --iters 10 
cargo run --release --example mobilenet_cpu -- --iters 10 --profile
cargo run --release --example deit_cpu      -- --iters 10 --threads 8 --profile
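The experimental CPU backend works by emitting C source and compiling it, rather than interpreting the graph. As a rough sketch of that idea (not Altius's real generator; the function and names here are invented for illustration), a single elementwise op can be turned into a C function as a string:

```python
# Toy code generator: emit a C function computing ReLU over a
# fixed-size float buffer. Illustrative only.
def gen_relu_c(name: str, n: int) -> str:
    return (
        f"void {name}(const float *x, float *y) {{\n"
        f"    for (int i = 0; i < {n}; i++)\n"
        f"        y[i] = x[i] > 0.0f ? x[i] : 0.0f;\n"
        f"}}\n"
    )

print(gen_relu_c("relu_1024", 1024))
```

Generating specialized code like this lets the C compiler unroll and vectorize loops whose sizes are known at compile time.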

Run from WebAssembly

Currently, MobileNetV3 runs in web browsers.

cd wasm
cargo install wasm-pack
wasm-pack build --target web
yarn
yarn serve

Run from Python

cd ./crates/altius-py
rye sync --features linux # no --features needed on macOS
rye run maturin develop -r
rye run python mobilenet.py
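Before inference, an ImageNet-trained model such as MobileNet expects its input scaled to [0, 1] and normalized with the standard ImageNet per-channel mean and std. The sketch below shows that step in plain Python; `mobilenet.py`'s actual preprocessing may differ in details.

```python
# Standard ImageNet normalization constants (RGB order).
MEAN = [0.485, 0.456, 0.406]
STD = [0.229, 0.224, 0.225]

def normalize_pixel(rgb):
    """rgb: one pixel as (r, g, b) in 0..255 -> per-channel normalized floats."""
    return [((v / 255.0) - m) / s for v, m, s in zip(rgb, MEAN, STD)]

print(normalize_pixel((124, 116, 104)))
```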


License: MIT

