qnguyen3 / chat-with-mlx

An all-in-one LLMs Chat UI for Apple Silicon Mac using MLX Framework.

Home Page: https://twitter.com/stablequan


feature request

taozhiyuai opened this issue · comments

I set up an API server in LM Studio. I hope chat-with-mlx can connect to that API.
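For context, LM Studio's local server speaks the OpenAI chat-completions wire format, so chat-with-mlx could talk to it with a plain HTTP client. A minimal sketch, assuming LM Studio's default address `http://localhost:1234/v1` (the base URL, the model name, and the helper names here are illustrative, not part of chat-with-mlx):

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # assumption: LM Studio's default local server address


def build_chat_request(prompt, model="local-model"):
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,  # LM Studio serves whichever model is loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def chat(prompt):
    """Send the prompt to the local LM Studio server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the payload matches the OpenAI format, the same code would also work against any other OpenAI-compatible backend by changing `BASE_URL`.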

Also, is it possible to chat with multiple PDFs at the same time?

Please also show the inference speed, time consumed, etc. More performance data is needed.
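A minimal sketch of how such stats could be reported: wrap the generation call with a timer and derive tokens per second. Here `generate_fn` and `dummy_generate` are hypothetical stand-ins for the actual MLX generation call, not part of chat-with-mlx:

```python
import time


def measure_generation(generate_fn, prompt):
    """Time a generation call and report simple performance stats.

    `generate_fn` is assumed to return (text, n_tokens); swap in the
    real MLX generate call in practice.
    """
    start = time.perf_counter()
    text, n_tokens = generate_fn(prompt)
    elapsed = time.perf_counter() - start
    return {
        "text": text,
        "tokens": n_tokens,
        "seconds": round(elapsed, 3),
        "tokens_per_sec": round(n_tokens / elapsed, 2) if elapsed > 0 else float("inf"),
    }


# Usage with a dummy generator standing in for the real model:
def dummy_generate(prompt):
    return "echo: " + prompt, 5


stats = measure_generation(dummy_generate, "hi")
```

The resulting dict could be shown in the UI footer after each response (e.g. "5 tokens, 0.42 s, 11.9 tok/s").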