xenova / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!

Home Page: https://huggingface.co/docs/transformers.js

Example not working on Chrome/Arc v124 (M1 Mac)

justrach opened this issue · comments

commented

System Info

M1 Mac
Sonoma
Using the example code.

Environment/Platform

  • Website/web-app
  • Browser extension
  • Server-side (e.g., Node.js, Deno, Bun)
  • Desktop app (e.g., Electron)
  • Other (e.g., VSCode extension)

Description

Using the repository from https://github.com/xenova/transformers.js/tree/v3/examples/webgpu-chat with "@xenova/transformers": "github:xenova/transformers.js#v3", the app gets stuck on "Loading model...".

On the Hugging Face demo, here is the console output:
[screenshot]

Meanwhile, on the local page, the output looks like this:
[screenshot]

Reproduction

  1. Clone the repo from https://github.com/xenova/transformers.js/tree/v3/examples/webgpu-chat
  2. Run npm i, then npm run dev
  3. Open the local URL in the browser
  4. Click "Load Model"
  5. The model gets stuck on "...loading..."
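The steps above can be run as follows. This is a sketch assuming the example lives at examples/webgpu-chat on the v3 branch (per the URL above) and uses a standard npm dev-server setup:

```shell
# Clone the v3 branch of transformers.js and enter the webgpu-chat example
git clone --branch v3 https://github.com/xenova/transformers.js.git
cd transformers.js/examples/webgpu-chat

# Install dependencies and start the dev server,
# then open the printed local URL in a WebGPU-capable browser
npm install
npm run dev
```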

Duplicate of #748.

It should now work (commits: here and here)!

commented

Thanks! Do you have a page with documentation on how to convert ONNX models to ONNX-Web models, by any chance?

Sure, we use Optimum, and you can find additional information there.

You can also use our conversion script, which helps with quantization.
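Both routes can be sketched roughly as below. The exact flags are assumptions based on the Optimum CLI and the transformers.js repository's conversion script, and may differ between versions; the model id is a placeholder:

```shell
# Option 1: export an ONNX model with Optimum's CLI
pip install "optimum[exporters]"
optimum-cli export onnx --model Xenova/example-model ./onnx_output/

# Option 2: use the transformers.js conversion script, which can also quantize
# (run from a clone of the transformers.js repository)
python -m scripts.convert --quantize --model_id Xenova/example-model
```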

commented

> Sure, we use Optimum, and you can find additional information there.
>
> You can also use our helper conversion script for helping with quantization.

Thanks Joshua!