mlc-ai / web-llm

High-performance In-browser LLM Inference Engine

Home Page: https://webllm.mlc.ai

Link in main readme doesn't work

flatsiedatsie opened this issue

The link in this README note doesn't go anywhere:

> NOTE: you don't need to build by yourself unless you would like to change the WebLLM package, follow [use WebLLM](https://github.com/mlc-ai/web-llm?tab=readme-ov-file#use-web-llm-package) instead.

Thanks for the catch, just updated it! The purpose was to point users to the get-started-style examples.
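
For readers landing here: the linked section is about using the published `@mlc-ai/web-llm` npm package instead of building from source. A minimal sketch of that flow, based on the public WebLLM API (the model id and callback below are illustrative assumptions, not taken from this thread):

```typescript
import { CreateMLCEngine, MLCEngine } from "@mlc-ai/web-llm";

// Assumed model id for illustration; substitute any id from the prebuilt model list.
const MODEL_ID = "Llama-3.1-8B-Instruct-q4f32_1-MLC";

async function main() {
  // Downloads and compiles the model in the browser, reporting load progress.
  const engine: MLCEngine = await CreateMLCEngine(MODEL_ID, {
    initProgressCallback: (progress) => console.log(progress.text),
  });

  // OpenAI-style chat completion against the locally loaded model.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello from WebLLM!" }],
  });
  console.log(reply.choices[0].message.content);
}

main();
```

The get-started examples in the repo cover the same pattern in more detail, including streaming and web-worker setups.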