Ryan-yang125 / ChatLLM-Web

🗣️ Chat with LLMs like Vicuna entirely in your browser with WebGPU — safe, private, and serverless. Powered by web-llm.

Home Page: https://chat-llm-web.vercel.app

Feature request: Desktop PWA support

matthoffner opened this issue · comments

Thanks for building this! Desktop PWA support would be useful. Looking forward to using this more.

Thanks @matthoffner! I have released a new version that supports PWA, so you can now install ChatLLM-Web from the location bar. I don't know much about PWAs yet, but they look like a good way to cache data, and I'd be very happy to hear more ideas about it.
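For context, the install prompt in the location bar appears once a site ships a web app manifest (plus a service worker for offline caching). A minimal manifest sketch — the filenames, icon paths, and values here are illustrative assumptions, not ChatLLM-Web's actual config — might look like:

```json
{
  "name": "ChatLLM-Web",
  "short_name": "ChatLLM",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#ffffff",
  "icons": [
    { "src": "/icons/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/icons/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```

The manifest is linked from the page head with `<link rel="manifest" href="/manifest.json">`; caching of model weights and other assets would then be handled separately by a service worker using the Cache API.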

Wow you move fast! Thanks again! Looking forward to contributing where I can.