High-performance In-browser LLM Inference Engine
Home Page: https://webllm.mlc.ai
nahidalam opened this issue a month ago · comments
Are you planning to support LLaVA? I see you have issue #276 open for this.
Do you also plan to support Video-LLaVA?