mlc-ai / web-llm

High-performance In-browser LLM Inference Engine

Home Page: https://webllm.mlc.ai

[Tracking] WebLLM: Frontend Compatibility Issues and CDN Delivery

CharlieFRuan opened this issue

This issue tracks the resolution of various front-end compatibility issues.

Action items

  • Address require() issues
  • Address perf_hooks import issue
  • Add CDN delivery (see the sketch after this list)
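
As a rough illustration of what CDN delivery could look like for consumers, the sketch below pulls the published npm package through jsdelivr's ESM gateway in a plain `<script type="module">` page, with no bundler involved. The esm.run URL, the CreateMLCEngine entry point, and the model id are assumptions for illustration based on the npm package as published, not a confirmed delivery plan:

```js
// main.mjs — hedged sketch of CDN-based consumption (no bundler, no npm install).
// esm.run is jsdelivr's npm-to-ESM gateway; serving web-llm this way is an
// assumption for illustration, not the finalized delivery channel.
import * as webllm from "https://esm.run/@mlc-ai/web-llm";

// CreateMLCEngine and the model id follow the published web-llm API at the
// time of writing; exact names may differ between versions.
const engine = await webllm.CreateMLCEngine("Llama-3-8B-Instruct-q4f32_1-MLC", {
  initProgressCallback: (report) => console.log(report.text), // model download progress
});

const reply = await engine.chat.completions.create({
  messages: [{ role: "user", content: "Hello from a CDN build!" }],
});
console.log(reply.choices[0].message.content);
```

Loading the package this way sidesteps both require() and bundler module resolution entirely, which is part of the motivation for putting CDN delivery on the list.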

require() issue in Next.js projects

This file is being treated as an ES module because it has a '.js' file extension and '\web-llm\examples\next-simple-chat\node_modules\@mlc-ai\web-llm\package.json' contains "type": "module". To treat it as a CommonJS script, rename it to use the '.cjs' file extension.

This error has been reported in various issues:

  • #140
  • #353
    • For some reason, the error in this issue stems from web-tokenizer rather than web-llm (the _scriptDir part is indeed from web-tokenizer)
  • #383
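
In the meantime, a consumer-side workaround often suggested for ESM-only packages in Next.js projects is to let Next.js transpile the package itself. Below is a sketch assuming Next.js >= 13.1, where transpilePackages is built in; it is not a confirmed fix from the maintainers:

```js
// next.config.js — hedged workaround sketch, assuming Next.js >= 13.1.
// transpilePackages makes Next.js compile @mlc-ai/web-llm itself instead of
// require()-ing its ESM-only entry point from a CommonJS context.
/** @type {import('next').NextConfig} */
const nextConfig = {
  transpilePackages: ["@mlc-ai/web-llm"],
};

module.exports = nextConfig;
```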

Revisit after the following PR is merged and the npm package is updated; that should resolve all of the issues above:

perf_hooks issue

perf_hooks is a Node.js built-in module, so browser-oriented bundlers cannot resolve the import. This error is reported in the following issues: