microsoft / onnxjs

ONNX.js: run ONNX models using JavaScript

Scikit-Learn to ONNX to ONNX.js problems

jlooper opened this issue · comments

Hello, I'm looking for documentation on which models are supported when working with Scikit-Learn. I can convert a Scikit-Learn model to .onnx (using this package: https://onnx.ai/sklearn-onnx/), but I have problems using that model in a web app with ONNX.js, as some operators are not supported or require a particular backend implementation.

Is there documentation on which kinds of scikit-learn models can be used with ONNX.js, perhaps with specific backend requirements listed? So far I've had trouble getting BernoulliNB, KNeighborsClassifier, DecisionTreeClassifier, or RandomForestClassifier to work. My starting code is here:

https://github.com/jlooper/onnx-apps (the notebook is in the recipes folder).

Thank you for any guidance!

Even linear regression doesn't work.

ONNX.js supports ONNX opset v7, which is well behind the current ONNX version. To resolve this, we decided to leverage onnxruntime as a WebAssembly backend and started a new project called ONNX Runtime Web. We plan to release the official version on 6/1 along with the onnxruntime 1.8 official release. Before that, you can try a dev npm package.

hi @hanbitmyths and thank you! I'm testing onnxruntime for the web, and I consistently get this error: 'Uncaught (in promise) Error: input '0' is missing in 'feeds'.' Can you shed some light here? Thanks!

This error means the feeds object passed to InferenceSession.run() does not contain the key "0", which is the name of one of the model's inputs.
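The fix can be sketched like this. The `missingFeeds` helper is hypothetical, written only to illustrate the check behind the error; `InferenceSession.create`, `session.inputNames`, `ort.Tensor`, and `session.run` are the onnxruntime-web API as I understand it, and the model path and tensor shape in the comments are placeholders:

```javascript
// Hypothetical helper: list model input names that have no matching key in feeds
function missingFeeds(inputNames, feeds) {
  return inputNames.filter((name) => !(name in feeds));
}

// Assumed onnxruntime-web usage (placeholder model path and data; not runnable as-is):
// const session = await ort.InferenceSession.create('./model.onnx');
// const feeds = { '0': new ort.Tensor('float32', inputData, [1, 4]) };
// missingFeeds(session.inputNames, feeds) should be [] before calling:
// const results = await session.run(feeds);

console.log(missingFeeds(['0'], { input: 1 }));  // the error case: feed key doesn't match
console.log(missingFeeds(['0'], { '0': 1 }));    // fixed: every input name has a feed
```

The key point is that feed keys must match the model's declared input names exactly, not just line up positionally.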

aha! I see the issue, thank you, I am making progress now!

@hanbitmyths
Is it going to support PyTorch too? For example, will it support all of PyTorch's functions, including custom ones? The traditional ONNX.js does not really support PyTorch.

If you build a model, you have to downgrade it because ONNX.js does not support models exported from recent PyTorch versions, but the downgraded version may not support certain newer PyTorch functions. It also does not support some custom PyTorch functions, which was a pain in the neck. Does ONNX Runtime Web solve these issues?

@hanbitmyths I'm working to import the npm web package for use in a Vue.js app: https://github.com/jlooper/onnx-apps/tree/main/recipe-guesser - and still getting errors when trying to import a random forest model: ort-web.min.js?f1d4:6 Uncaught (in promise) TypeError: cannot resolve operator 'TreeEnsembleClassifier' with opsets: ai.onnx.ml v1 - this is puzzling!

It looks like you are trying to load the model with the 'webgl' backend, which does not support operators from the 'ai.onnx.ml' domain. The 'wasm' backend should work for this model.
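As a sketch, selecting the wasm backend looks like the following; 'model.onnx' is a placeholder path, and the session-options shape follows the onnxruntime-web API as I understand it:

```javascript
// Session options asking ONNX Runtime Web for the wasm execution provider,
// since 'webgl' lacks the ai.onnx.ml operators (e.g. TreeEnsembleClassifier).
const options = { executionProviders: ['wasm'] };

// Assumed usage (needs a real model file to actually run):
// const session = await ort.InferenceSession.create('model.onnx', options);
// const results = await session.run(feeds);

console.log(options.executionProviders.join(','));  // wasm
```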

@fs-eire thank you for looking - I added `{ executionProviders: ['wasm'] }`, and the new error is:

"Uncaught (in promise) Error: no available backend found. ERR: [wasm] RuntimeError: abort(CompileError: WebAssembly.instantiate(): expected magic word 00 61 73 6d, found 3c 21 44 4f @+0). Build with -s ASSERTIONS=1 for more info."

Does the onnxruntime-web npm package I'm using here have wasm available?

Yes, the NPM package contains a wasm backend implementation. This error looks weird: it looks like you have mismatched versions of ort.min.js and ort-wasm.wasm (that is one possible cause). I haven't run into this error before. Could you create a new issue in the onnxruntime repository for this error?
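For what it's worth, decoding the bytes quoted in the error message hints at what went wrong: WebAssembly.instantiate() expected the wasm magic word 00 61 73 6d but received 3c 21 44 4f, which is ASCII for the start of an HTML page. One common cause (an assumption here, not a confirmed diagnosis) is the dev server answering the request for the .wasm file with an HTML page such as index.html:

```javascript
// Bytes from the error message: "expected magic word 00 61 73 6d, found 3c 21 44 4f"
const expected = [0x00, 0x61, 0x73, 0x6d];
const found = [0x3c, 0x21, 0x44, 0x4f];

console.log(String.fromCharCode(...expected.slice(1)));  // asm  (the "\0asm" magic word)
console.log(String.fromCharCode(...found));              // <!DO (start of "<!DOCTYPE html>")
```

If that is what is happening, checking the network tab for the ort-wasm.wasm request would show an HTML response body instead of a binary one.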