SamurAIGPT / EmbedAI

An app to interact with your documents using the power of GPT, 100% privately, with no data leaks

Home Page: https://www.thesamur.ai/?utm_source=github&utm_medium=link&utm_campaign=github_privategpt

Traceback Error while Downloading a Model

MiroHaap opened this issue

I finally got it to start. I uploaded a file and ingested the data; then the model download got to 32% and this happened:

```
Traceback (most recent call last):
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\response.py", line 705, in _error_catcher
    yield
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\response.py", line 830, in _raw_read
    raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
urllib3.exceptions.IncompleteRead: IncompleteRead(1236664320 bytes read, 2548583961 more expected)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\models.py", line 816, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\response.py", line 935, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\response.py", line 874, in read
    data = self._raw_read(amt)
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\response.py", line 808, in _raw_read
    with self._error_catcher():
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\response.py", line 722, in _error_catcher
    raise ProtocolError(f"Connection broken: {e!r}", e) from e
urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(1236664320 bytes read, 2548583961 more expected)', IncompleteRead(1236664320 bytes read, 2548583961 more expected))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\flask\app.py", line 2190, in wsgi_app
    response = self.full_dispatch_request()
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\flask\app.py", line 1486, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\flask_cors\extension.py", line 165, in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\flask\app.py", line 1484, in full_dispatch_request
    rv = self.dispatch_request()
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\flask\app.py", line 1469, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
  File "C:\Users\User\Downloads\privateGPT-main\server\privateGPT.py", line 190, in download_and_save
    for chunk in response.iter_content(chunk_size=4096):
  File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\models.py", line 818, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(1236664320 bytes read, 2548583961 more expected)', IncompleteRead(1236664320 bytes read, 2548583961 more expected))
```
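The two numbers in the `IncompleteRead` message line up with the reported 32%: the bytes already read plus the bytes still expected give the full model size, and their ratio is the progress at which the connection dropped. A quick check:

```python
# Figures taken from the IncompleteRead message in the traceback above.
bytes_read = 1_236_664_320       # downloaded before the connection broke
bytes_remaining = 2_548_583_961  # bytes the server still expected to send

total = bytes_read + bytes_remaining
print(total)                          # → 3785248281 (≈ 3.5 GiB model file)
print(int(bytes_read / total * 100))  # → 32 (matches the "32%" in the report)
```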

I think this is similar to the conda case, check this:

invoke-ai/InvokeAI#1828

Looks like an internet issue. You can also manually download the model and save it in the models folder.

That worked, thank you very much.