
Edgerunner Android

Kotlin bindings for Edgerunner

πŸ’‘ Introduction

Edgerunner Android provides a Kotlin wrapper for the Edgerunner AI inference library. This library is a work in progress, currently supporting CPU inference of TFLite models. GPU and vendor-specific NPU support, along with additional inference engines, will follow incrementally. See Edgerunner for more details.

The wrapper logic and public Kotlin classes are found in the edgerunner Android library.

πŸ›  Installation

The library will soon be published to Maven Central.

πŸ•Ή Usage

An example image classification app is bundled with the project. See imageclassifier.kt for Edgerunner API usage.

In general, the library can be used as follows:

import com.neuralize.edgerunner.Model

// ...

/* read the model file into a ByteBuffer -> modelBuffer */
// ...

val model = Model(modelBuffer.asReadOnlyBuffer())

/* ByteBuffer giving direct access to the model's input for inference */
val inputBuffer = model.getInput(0)?.getBuffer()
    ?: error("failed to access input tensor 0")

/* write input data into `inputBuffer` */
// ...

val executionStatus = model.execute()

val outputBuffer = model.getOutput(0)?.getBuffer()
    ?: error("failed to access output tensor 0")

/* interpret the output */
// ...

The full API for Model and Tensor can be found in Model.kt and Tensor.kt respectively.
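As a concrete instance of the "interpret output" step, the sketch below assumes the output tensor holds float32 class scores (as in the bundled image classifier); it views the raw output ByteBuffer as floats and picks the index of the highest score. The function and its float32 layout are assumptions for illustration, not part of the edgerunner API.

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Hypothetical post-processing helper: treat the output buffer as
// native-order float32 class scores and return the top class index.
fun argmax(outputBuffer: ByteBuffer): Int {
    val scores = outputBuffer.order(ByteOrder.nativeOrder()).asFloatBuffer()
    var best = 0
    for (i in 1 until scores.limit()) {
        if (scores.get(i) > scores.get(best)) best = i
    }
    return best
}
```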

πŸ“œ Licensing

See the LICENSING document.

About

Kotlin bindings for Edgerunner

https://github.com/neuralize-ai/edgerunner

License: MIT


Languages

Kotlin 75.2%, C++ 21.7%, CMake 3.2%