qnguyen3 / chat-with-mlx

An all-in-one LLM chat UI for Apple Silicon Macs using the MLX framework.

Home Page: https://twitter.com/stablequan


Connection refused

desilinguist opened this issue · comments

Very excited to try this. I created a conda environment and installed the code in editable mode (pip install -e .). I then loaded the Mistral 7B model, uploaded and indexed a doc file, and asked a simple question. I got the following error:

File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
    yield
  File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 233, in handle_request
    resp = self._pool.handle_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
    raise exc from None
  File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
    raise exc
  File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
    stream = self._connect(request)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 122, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/site-packages/httpcore/_backends/sync.py", line 205, in connect_tcp
    with map_exceptions(exc_map):
  File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/Users/nmadnani/anaconda/envs/mlxchat/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 61] Connection refused

Hi, if you installed the manual way, you can go to chat_with_mlx/models/download, delete the Mistral 7B folder, and redownload it. HF is down so I am not able to check right now; I will let you know how it goes!
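For anyone doing that cleanup by hand, here is a minimal sketch, assuming the models live under chat_with_mlx/models/download and that the folder name matches the Mistral 7B repo (both are assumptions; adjust to your install):

import shutil
from pathlib import Path

# Assumed download location inside the package/repo; adjust to your install.
download_dir = Path("chat_with_mlx/models/download")
# Hypothetical folder name for the Mistral 7B model; use the folder you actually see on disk.
model_dir = download_dir / "Mistral-7B-Instruct-v0.2"

if model_dir.is_dir():
    shutil.rmtree(model_dir)  # remove the bad download so the app fetches it again
    print(f"Removed {model_dir}; load the model again in the UI to trigger a fresh download.")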

Same issue: the model loads and the doc indexes, but the chat responds with 'Error'. Could it accidentally conflict with a port that is already in use?

chat-with-mlx
You try to use a model that was created with version 2.4.0.dev0, however, your version is 2.4.0. This might cause unexpected behavior or errors. In that case, try to update to the latest version.

Running on local URL: http://127.0.0.1:7860

To create a public link, set share=True in launch().
Traceback (most recent call last):
  File "/Users/xxxx/anaconda3/lib/python3.11/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions
    yield
  File "/Users/xxxx/anaconda3/lib/python3.11/site-packages/httpcore/_backends/sync.py", line 100, in connect_tcp
    sock = socket.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/xxxx/anaconda3/lib/python3.11/socket.py", line 851, in create_connection
    raise exceptions[0]
  File "/Users/xxxx/anaconda3/lib/python3.11/socket.py", line 836, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
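For what it's worth, a quick probe shows whether anything is actually listening on the port the background server should be using (a minimal sketch; the 8080 default is an assumption, substitute whatever port chat-with-mlx reports):

import socket

def port_in_use(host: str = "127.0.0.1", port: int = 8080) -> bool:
    # Returns True if a TCP connection succeeds, i.e. some server is already listening there.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        return sock.connect_ex((host, port)) == 0

print("port in use" if port_in_use() else "port free (nothing listening, which matches the ConnectError above)")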

Hi @desilinguist and @hubert-fan-rft, can you two try deleting the package, reinstalling, and running export no_proxy="localhost,127.0.0.1" before you start the program?
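For completeness, the same thing can be set from inside Python before the app builds its HTTP client (a minimal sketch; whether a proxy is actually intercepting localhost traffic here is an assumption):

import os

# Equivalent of `export no_proxy=...`, applied before the app creates its HTTP client.
os.environ["no_proxy"] = "localhost,127.0.0.1"
os.environ["NO_PROXY"] = "localhost,127.0.0.1"  # some libraries check the upper-case variant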

@qnguyen3 I had a try, but it's not working...

But I was digging around and found that the script starts an MLX server in the background. I tried running the Python command manually against the downloaded model folders, and it failed with a "No safetensors found" error. That's the problem then, since the only model file I can see is a 'weights.npz'.

Anything I should fix there?
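For reference, a minimal sketch of starting that background server by hand so its error is visible directly; the mlx_lm.server module, its flags, and the model path are assumptions about this setup, so check the chat-with-mlx source for the exact command it launches:

import subprocess
import sys

# Assumed local model folder; use whatever chat-with-mlx actually downloaded.
model_path = "chat_with_mlx/models/download/Mistral-7B-Instruct-v0.2"

# If the weights are fine this blocks and serves requests; if not, the real error
# (e.g. "No safetensors found") is printed here instead of the UI's opaque Connection refused.
subprocess.run(
    [sys.executable, "-m", "mlx_lm.server", "--model", model_path, "--port", "8080"],
    check=True,
)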

Hi @hubert-fan-rft and @desilinguist, my bad for including the wrong repo for Mistral-Instruct-v0.2; it is updated now. You should remove your old chat-with-mlx with pip uninstall chat-with-mlx and reinstall, since there is already a Mistral folder on your local path and the model will not be redownloaded if you just do a simple pip install -U.

The problem still exists. Looking at the code in mlx_lm/utils.py, it only looks for *.safetensors files:

weight_files = glob.glob(str(model_path / "*.safetensors"))
if not weight_files:
    logging.error(f"No safetensors found in {model_path}")
    raise FileNotFoundError(f"No safetensors found in {model_path}")

But thanks for the good work you're sharing anyway. I successfully got my local Mistral model mounted on 127.0.0.1:8080, and the rest of chat-with-mlx works fine.
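For anyone else hitting this, a quick way to check which format a downloaded model folder actually contains (a minimal sketch; the folder path is an assumption, point it at your own download directory):

from pathlib import Path

# Assumed folder path; use the model folder chat-with-mlx actually downloaded.
model_path = Path("chat_with_mlx/models/download/Mistral-7B-Instruct-v0.2")

safetensors = sorted(model_path.glob("*.safetensors"))
npz = sorted(model_path.glob("*.npz"))

# mlx_lm raises FileNotFoundError when the first list is empty, which the UI then
# surfaces as a generic connection error once the background server dies.
print("safetensors files:", [p.name for p in safetensors] or "none")
print("npz files:", [p.name for p in npz])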

Yes, the problem is that I put in the wrong mlx repo, one that does not have .safetensors in it. It is updated now with the right one! You may need to clear your Hugging Face cache to make it work. Anyway, happy to hear you have it running on your machine.
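In case it is not obvious how to clear that cache entry, a minimal sketch (the default cache location and the repo folder name below are assumptions; adjust for your setup and for HF_HOME if you moved the cache):

import shutil
from pathlib import Path

# Default Hugging Face cache location; respect HF_HOME / HF_HUB_CACHE if you changed it.
cache_dir = Path.home() / ".cache" / "huggingface" / "hub"
# Hypothetical cached repo folder name; list cache_dir to find the entry for the old repo.
stale = cache_dir / "models--mlx-community--Mistral-7B-Instruct-v0.2"

if stale.is_dir():
    shutil.rmtree(stale)
    print(f"Removed {stale}; the corrected repo will be downloaded on the next model load.")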