QuivrHQ / quivr

Your GenAI Second Brain 🧠 A personal productivity assistant (RAG) ⚡️🤖 Chat with your docs (PDF, CSV, ...) & apps using Langchain, GPT 3.5 / 4 Turbo, Anthropic, VertexAI, Ollama, Groq, and private LLMs that you can share with users! A local & private alternative to OpenAI GPTs & ChatGPT, powered by retrieval-augmented generation.

Home Page: https://quivr.app

[Bug]: Unable to add any knowledge to a brain

kszys opened this issue · comments

What happened?

When I try to add any file to a brain's knowledge base, as soon as I click on Feed Brain a "Network Error" is reported in the UI, but it quickly disappears. Below are the logs I could find in Docker.

Relevant log output

2024-03-28 16:38:36 backend-core  | 2024-03-28 15:38:36,589:INFO - HTTP Request: GET http://host.docker.internal:54321/rest/v1/customers?select=email%2Cid&email=eq.admin%40quivr.app "HTTP/1.1 500 Internal Server Error"
2024-03-28 16:38:36 backend-core  | [INFO] models.databases.supabase.user_usage [user_usage.py:161]: None
2024-03-28 16:38:36 backend-core  | [ERROR] models.databases.supabase.user_usage [user_usage.py:162]: {'code': 'XX000', 'details': None, 'hint': None, 'message': 'called `Result::unwrap()` on an `Err` value: InvalidPosition'}
2024-03-28 16:38:36 backend-core  | [ERROR] models.databases.supabase.user_usage [user_usage.py:163]: Error while checking if user is a premium user. Stripe needs to be configured.
2024-03-28 16:38:36 backend-core  | [ERROR] models.databases.supabase.user_usage [user_usage.py:166]: {'code': 'XX000', 'details': None, 'hint': None, 'message': 'called `Result::unwrap()` on an `Err` value: InvalidPosition'}
2024-03-28 16:38:36 backend-core  | INFO:     192.168.65.1:39664 - "POST /upload?brain_id=40ba47d7-51b2-4b2a-9247-89e29619efb0&chat_id=21f099dd-49b0-4e0c-9f92-c1b561b0679b HTTP/1.1" 500 Internal Server Error
2024-03-28 16:38:36 backend-core  | ERROR:    Exception in ASGI application
2024-03-28 16:38:36 backend-core  | Traceback (most recent call last):
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
2024-03-28 16:38:36 backend-core  |     result = await app(  # type: ignore[func-returns-value]
2024-03-28 16:38:36 backend-core  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
2024-03-28 16:38:36 backend-core  |     return await self.app(scope, receive, send)
2024-03-28 16:38:36 backend-core  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
2024-03-28 16:38:36 backend-core  |     await super().__call__(scope, receive, send)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
2024-03-28 16:38:36 backend-core  |     await self.middleware_stack(scope, receive, send)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
2024-03-28 16:38:36 backend-core  |     raise exc
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
2024-03-28 16:38:36 backend-core  |     await self.app(scope, receive, _send)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 91, in __call__
2024-03-28 16:38:36 backend-core  |     await self.simple_response(scope, receive, send, request_headers=headers)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 146, in simple_response
2024-03-28 16:38:36 backend-core  |     await self.app(scope, receive, send)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
2024-03-28 16:38:36 backend-core  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
2024-03-28 16:38:36 backend-core  |     raise exc
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
2024-03-28 16:38:36 backend-core  |     await app(scope, receive, sender)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 758, in __call__
2024-03-28 16:38:36 backend-core  |     await self.middleware_stack(scope, receive, send)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 778, in app
2024-03-28 16:38:36 backend-core  |     await route.handle(scope, receive, send)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 299, in handle
2024-03-28 16:38:36 backend-core  |     await self.app(scope, receive, send)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 79, in app
2024-03-28 16:38:36 backend-core  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
2024-03-28 16:38:36 backend-core  |     raise exc
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
2024-03-28 16:38:36 backend-core  |     await app(scope, receive, sender)
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 74, in app
2024-03-28 16:38:36 backend-core  |     response = await func(request)
2024-03-28 16:38:36 backend-core  |                ^^^^^^^^^^^^^^^^^^^
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
2024-03-28 16:38:36 backend-core  |     raw_response = await run_endpoint_function(
2024-03-28 16:38:36 backend-core  |                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-03-28 16:38:36 backend-core  |   File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
2024-03-28 16:38:36 backend-core  |     return await dependant.call(**values)
2024-03-28 16:38:36 backend-core  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-03-28 16:38:36 backend-core  |   File "/code/modules/upload/controller/upload_routes.py", line 59, in upload_file
2024-03-28 16:38:36 backend-core  |     send_telemetry("upload_file", {"file_name": uploadFile.filename})
2024-03-28 16:38:36 backend-core  |   File "/code/packages/utils/telemetry.py", line 27, in send_telemetry
2024-03-28 16:38:36 backend-core  |     machine_key = generate_machine_key()
2024-03-28 16:38:36 backend-core  |                   ^^^^^^^^^^^^^^^^^^^^^^
2024-03-28 16:38:36 backend-core  |   File "/code/packages/utils/telemetry.py", line 20, in generate_machine_key
2024-03-28 16:38:36 backend-core  |     unique_key = hashlib.sha256(seed.encode()).hexdigest()
2024-03-28 16:38:36 backend-core  |                                 ^^^^^^^^^^^
2024-03-28 16:38:36 backend-core  | AttributeError: 'NoneType' object has no attribute 'encode'

Twitter / LinkedIn details

No response

Hey there, @kszys! Great to see you back. Hope you're doing well. 🚀

Thank you for providing the detailed logs. It seems like there are a couple of issues happening here.

Firstly, the error 'NoneType' object has no attribute 'encode' is being thrown by the generate_machine_key function in the telemetry.py file. This function is trying to generate a unique machine key by hashing the OpenAI API key. The error suggests that the OpenAI API key might not be set in your environment variables, hence os.getenv("OPENAI_API_KEY") is returning None, and the subsequent call to encode() is failing. You can resolve this by ensuring that the OpenAI API key is correctly set in your environment variables.
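As a rough sketch only (this is not Quivr's actual telemetry.py, and the fallback value is purely illustrative), a more defensive generate_machine_key would guard against a missing key instead of crashing the upload route:

import hashlib
import os

def generate_machine_key() -> str:
    # Illustration: the machine key is seeded from the OpenAI API key, as
    # described above. Fall back to a fixed placeholder so a missing key
    # does not raise AttributeError on None.encode().
    seed = os.getenv("OPENAI_API_KEY") or "quivr-anonymous"
    return hashlib.sha256(seed.encode()).hexdigest()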

Secondly, the error 'called Result::unwrap() on an Err value: InvalidPosition' is raised while the check_user_is_customer and check_subscription_validity functions in the user_usage.py file query Supabase to determine whether the user is a premium user and whether their subscription is still valid. The XX000 code and the Rust-style panic message come from the Supabase/Postgres layer behind the GET /rest/v1/customers request, which returned a 500, so the backend only sees an opaque internal error. Unfortunately, without more context, it's hard to pinpoint the exact cause. It would be helpful if you could provide more information about the state of your database and the user you're trying to add knowledge for.

Lastly, the HTTP/1.1 500 Internal Server Error suggests that there's an issue with the backend server. This could be related to the errors mentioned above or could be a separate issue altogether. It would be helpful if you could provide more information about your server setup and any additional error messages you might be seeing.

I hope this helps! If you have any more questions or need further clarification, feel free to ask.

About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

I commented out the OPENAI_API_KEY in .env because, obviously, I do not want to use OpenAI. It is a bit of a bummer that something advertised as:

Secure: Your data, your control. Always.

Cannot really run without access to OpenAI...

...not to mention that I also have TELEMETRY_ENABLED=false in the .env, and yet telemetry is still (attempted to be) sent each time a file is uploaded to a brain.
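For what it's worth, I would have expected the call to be guarded along these lines — just a sketch, assuming send_telemetry is the single entry point and the flag is read from the environment like the other settings:

import os

def telemetry_enabled() -> bool:
    # Assumption: TELEMETRY_ENABLED comes from the .env file; anything
    # other than an explicit "true" counts as opted out.
    return os.getenv("TELEMETRY_ENABLED", "false").strip().lower() == "true"

# At the top of send_telemetry (sketch, not the actual Quivr code):
# if not telemetry_enabled():
#     return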

Hey 👋

Just put a random API key in OPENAI_API_KEY.

And sorry about the telemetry, I'll fix this.

Perhaps a correctly formatted example key in the comments would help? I'll prepare a PR.
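Something along these lines in the .env comments, for example (the value below is a made-up placeholder in the usual key shape, not a real key; per the suggestion above it only needs to be non-empty when it is merely hashed as a telemetry seed):

# Use a real key to query OpenAI models; per the suggestion above, a placeholder of this shape should be enough to run without OpenAI:
# OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
OPENAI_API_KEY=sk-your-key-or-placeholder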

Even with a key set, I'm still seeing some problems with brains in my case, but I'll post a new issue in case it is a different matter altogether. For this reason I'm also unable to test whether the fake-key option works.