qnguyen3 / chat-with-mlx

An all-in-one LLMs Chat UI for Apple Silicon Mac using MLX Framework.

Home Page: https://twitter.com/stablequan


Cannot launch chat-with-mlx

taozhiyuai opened this issue · comments

I cannot launch chat-with-mlx. Can anyone help?

I have finished the manual pip installation.

I saved the model files to the directory quantized-gemma-2b-it.

[Screenshots of the error output, taken 2024-03-04]
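As a quick sanity check that the locally saved model directory itself loads, mlx-lm can be called directly. This is a minimal sketch, assuming `mlx-lm` is installed and that `./quantized-gemma-2b-it` is the directory mentioned above:

```python
# Sanity check: load the locally saved MLX model directory and generate a few tokens.
# Assumes `pip install mlx-lm` and that the path matches where the files were saved.
from mlx_lm import load, generate

model, tokenizer = load("./quantized-gemma-2b-it")  # local path instead of a HF repo id
print(generate(model, tokenizer, prompt="Hello", max_tokens=20))
```

If this loads and generates text, the LLM files themselves are fine and the launch failure is elsewhere (for example, the embedding model download discussed below).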

Hi, I think your connection to Hugging Face is not stable enough to download the embedding model.

You are right.
But I have already downloaded the model to my hard disk. Can I run it locally? @qnguyen3

@taozhiyuai Yes, I understand. However, on the first run you have to be connected to the internet so the embedding model (not the LLM) can be downloaded. After that, you can run everything locally.
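If the direct download from Hugging Face keeps failing, one workaround is to pre-fetch the embedding model into the local Hugging Face cache from a machine or network where the connection is stable, so the first launch of chat-with-mlx finds it already cached. This is a sketch, assuming `huggingface_hub` is installed; the repo id is a placeholder for whichever embedding model the app reports it is fetching:

```python
# Pre-fetch the embedding model into the local Hugging Face cache so the
# first launch of chat-with-mlx does not need a live download.
import os

# Optional: route downloads through a mirror if direct access to huggingface.co is unstable.
# os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

from huggingface_hub import snapshot_download

# Placeholder repo id; replace with the embedding model shown in the app's error output.
snapshot_download(repo_id="your-embedding-model/repo-id")
```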