Ryan-yang125 / ChatLLM-Web

🗣️ Chat with LLMs like Vicuna entirely in your browser with WebGPU, safely, privately, and with no server. Powered by WebLLM.

Home Page: https://chat-llm-web.vercel.app


ideas

mundurragacl opened this issue · comments

- support local document search: https://github.com/matthoffner/web-llm-embed
- support multiple models, such as Llama 2
- save prompts
- agent support: https://github.com/idosal/AgentLLM

Thank you for your suggestions. I appreciate your ideas for expanding the capabilities of the project. However, I currently have no plans to develop it further, as I don't see practical application scenarios for it. If you have any more ideas or questions, feel free to share them. Once again, thank you for your input!