microsoft / onnxruntime-genai

Generative AI extensions for onnxruntime

[Mobile] System.DllNotFoundException "onnxruntime-genai" when debugging on Android

GuiOliv opened this issue · comments

Describe the issue

I am writing an app using MAUI Blazor and I'm trying to load a model locally on my phone. However, every time it hits the line var model = new Model(modelPath);, it throws the exception System.DllNotFoundException "onnxruntime-genai".

I have tried copying the DLL files manually, referencing them directly, and installing the package in multiple projects, but nothing worked.

To reproduce

To reproduce, you may try to load a local model by copying it to your phone and using

string modelPath = "/storage/emulated/0/onnx/cpu_and_mobile/cpu-int4-rtn-block-32";

Then simply load the model with the line var model = new Model(modelPath);, all while debugging on the phone over USB.

Visual Studio Community Version 17.10.0
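For reference, a minimal sketch of the failing call, using the model path from the steps above. Class names follow the GenAI C# API (Model, Tokenizer); the model folder is assumed to contain a valid ONNX GenAI model copied to the device:

```csharp
using Microsoft.ML.OnnxRuntimeGenAI;

string modelPath = "/storage/emulated/0/onnx/cpu_and_mobile/cpu-int4-rtn-block-32";

// This is the line that throws System.DllNotFoundException "onnxruntime-genai"
// on Android: the managed binding cannot resolve the native library.
var model = new Model(modelPath);
var tokenizer = new Tokenizer(model);
```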

Urgency

This is a personal project so not of much urgency but I would really appreciate if I could have any help.

Platform

Android

OS Version

14

ONNX Runtime Installation

Built from Source

Compiler Version (if 'Built from Source')

Debug / Any CPU

Package Name (if 'Released Package')

Microsoft.ML.OnnxRuntime

ONNX Runtime Version or Commit ID

Newest

ONNX Runtime API

C#

Architecture

Other / Unknown

Execution Provider

Default CPU

Execution Provider Library Version

No response

Might be better to add an issue here: https://github.com/microsoft/onnxruntime-genai/issues

Not sure if the C# bindings and NuGet package for GenAI are set up for mobile. You need to do different things to load the correct native library (the library name differs between Windows, Android, and iOS), as well as have a NuGet package that includes native builds for all of those platforms.
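To illustrate the per-platform naming problem, one common .NET pattern is to register a DllImportResolver that maps the logical library name to the platform-specific file name before the first P/Invoke. This is only a hedged sketch of that general technique, not how the GenAI package actually resolves its native library:

```csharp
using System;
using System.Runtime.InteropServices;

static class NativeLoader
{
    // Hypothetical resolver: maps the logical name "onnxruntime-genai" to the
    // platform-specific file name. Register() must run before the first
    // P/Invoke into the native library.
    public static void Register()
    {
        NativeLibrary.SetDllImportResolver(typeof(NativeLoader).Assembly,
            (name, assembly, searchPath) =>
            {
                if (name != "onnxruntime-genai")
                    return IntPtr.Zero; // fall back to default resolution

                string fileName;
                if (OperatingSystem.IsWindows())
                    fileName = "onnxruntime-genai.dll";
                else if (OperatingSystem.IsAndroid() || OperatingSystem.IsLinux())
                    fileName = "libonnxruntime-genai.so";
                else // iOS / macOS
                    fileName = "libonnxruntime-genai.dylib";

                return NativeLibrary.TryLoad(fileName, assembly, searchPath, out var handle)
                    ? handle
                    : IntPtr.Zero; // let the runtime report DllNotFoundException
            });
    }
}
```

Even with a resolver like this, the NuGet package still has to ship the Android .so in a runtimes/android-arm64 (or equivalent) folder for the load to succeed.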

GuiOliv The NuGet package is not currently set up for mobile. I'll add this to our internal planning document, and we will decide when we can add support for this.

Since this is a duplicate of #496, I'll close this one. We can continue discussion on the other issue.