Ryan-yang125 / ChatLLM-Web

🗣️ Chat with LLMs like Vicuna entirely in your browser with WebGPU, safely, privately, and with no server. Powered by web-llm.

Home Page: https://chat-llm-web.vercel.app

support setting page

Ryan-yang125 opened this issue

The settings page should cover the following (see the sketch after this list):

  1. Model:
     - model selection
     - parameter config: temperature, max-length
  2. Device:
     - GPU switch
     - cache usage
  3. UI:
     - dark/light theme
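
One way to hold these options in the app is a single typed settings object with the three groups above. The TypeScript below is only a minimal sketch; names such as `AppSettings`, `cacheLimitMB`, and the default model id are assumptions for illustration, not the project's actual store or web-llm's API.

```ts
// Hypothetical settings shape for the proposed settings page.
export interface AppSettings {
  model: {
    selectedModel: string; // which model to load, e.g. a Vicuna variant
    temperature: number;   // sampling temperature
    maxLength: number;     // max generated length
  };
  device: {
    useGpu: boolean;       // run inference on WebGPU or not
    cacheLimitMB: number;  // rough cache budget for model weights, in MB
  };
  ui: {
    theme: 'dark' | 'light';
  };
}

// Example defaults; values are placeholders, not the repo's real defaults.
export const defaultSettings: AppSettings = {
  model: { selectedModel: 'vicuna-v1-7b', temperature: 0.7, maxLength: 512 },
  device: { useGpu: true, cacheLimitMB: 4096 },
  ui: { theme: 'dark' },
};
```

Keeping all settings in one object like this makes it easy to persist them (for example in `localStorage`) and to pass model and device options to the inference worker in one place.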