WebAssembly-based AI as a Service with Kubernetes
Demo 1: Run locally with WasmEdge
Prerequisites:
- Install Rust with `rustup`
- Have GCC installed
- Git and cURL
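The prerequisites above can be verified with a small shell sketch (the tool names come straight from the list; adjust for your distribution):

```shell
# Sketch: confirm each prerequisite is on PATH before starting.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1" >&2
    return 1
  fi
}

for tool in rustup gcc git curl; do
  check_tool "$tool" || true   # report missing tools, but keep checking the rest
done
```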
Steps:
- Install the `wasm32-wasi` target
rustup target add wasm32-wasi
- Install WasmEdge
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash
source $HOME/.wasmedge/env
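Sourcing `$HOME/.wasmedge/env` effectively prepends the install prefix's `bin` directory to `PATH` so the WasmEdge binaries resolve. A simplified sketch of what the generated env file does (the real file may also set library paths):

```shell
# Simplified sketch of $HOME/.wasmedge/env: make the wasmedge binaries resolvable.
WASMEDGE_HOME="$HOME/.wasmedge"
export PATH="$WASMEDGE_HOME/bin:$PATH"

case ":$PATH:" in
  *":$WASMEDGE_HOME/bin:"*) echo "PATH updated" ;;
esac
```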
- Build the QuickJS interpreter with the WasmEdge TensorFlow extension
cd wasmedge-quickjs
cargo build --target wasm32-wasi --release --features=tensorflow
cd ..
- Run the JS examples locally
cd js_food
wasmedge-tensorflow-lite --dir .:. wasmedge_quickjs.wasm main.js
cd ..
- Run Rust examples locally
cd rust_mobilenet
cargo build --target wasm32-wasi --release
We can also AOT-compile the Wasm module to native machine code, and then run that native code inside the WasmEdge sandbox.
wasmedgec-tensorflow target/wasm32-wasi/release/classify.wasm classify.so
wasmedge-tensorflow-lite classify.so < grace_hopper.jpg
Demo 2: Deploy as a FaaS
Prerequisites:
- Install Rust with `rustup`
- Have GCC installed
- Git and cURL
- Vercel CLI
Steps:
- Install the `wasm32-wasi` target
rustup target add wasm32-wasi
- Build the Rust program to WebAssembly bytecode
cd faas
cd api/functions/image-classification/
cargo build --release --target wasm32-wasi
- Prepare the build artifacts for deployment
cp target/wasm32-wasi/release/classify.wasm ../../
- Deploy the function
cd ../../../
vercel deploy
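After the deploy, the function can be exercised over HTTP. The host and path below are placeholders, not the repo's actual route — substitute the URL that `vercel deploy` prints:

```shell
# Hypothetical smoke test; replace the URL with your actual deployment.
classify_url="https://your-deployment.vercel.app/api/classify"

# Send the sample image only if it is present in the current directory.
if [ -f grace_hopper.jpg ]; then
  curl -sS "$classify_url" --data-binary @grace_hopper.jpg
else
  echo "grace_hopper.jpg not found; skipping request"
fi
```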
Demo 3
Steps:
- Build the container image
sudo buildah build --annotation "module.wasm.image/variant=compat" -t classify .
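The `buildah build` step reads a Dockerfile from the current directory. A minimal sketch of what such a Dockerfile typically looks like for a Wasm image (an assumption — the repo's actual Dockerfile may differ): a `scratch` base holding only the AOT-compiled module, with the module as the entrypoint. The `module.wasm.image/variant=compat` annotation tells a Wasm-aware runtime such as crun to run the module with WasmEdge instead of launching a Linux process.

```shell
# Sketch of a minimal Dockerfile for a Wasm workload (written to a sample file
# here so it does not clobber an existing Dockerfile).
cat > Dockerfile.sample <<'EOF'
FROM scratch
COPY classify.so /
CMD ["/classify.so"]
EOF
```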