[Bug]: LLAMA INDEX is becoming a Dependency Hell itself
moonlightnexus opened this issue · comments
Bug Description
It's literally impossible to fix this issue without intervention from LLAMA INDEX team
Please upgrade its dependency: `llama-index-vector-stores-chroma 0.0.1 depends on llama-index-core<0.10.0 and >=0.9.32`
Please note I already tried everything possible to manage these dependencies but nothing worked.
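As an interim workaround sketch (not an official fix, and the exact versions here are illustrative, taken from the resolutions quoted later in this thread), explicitly constraining the chroma integration to a 0.10-compatible release in requirements.txt can keep pip from backtracking into the old 0.0.x builds:

```
# Illustrative pins; adjust to your other requirements
llama-index-core>=0.10.28,<0.11.0
llama-index-vector-stores-chroma>=0.1.8  # the 0.1.x line targets llama-index-core 0.10
```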
### Version
latest
### Steps to Reproduce
[requirements.txt](https://github.com/run-llama/llama_index/files/15284572/requirements.txt)
To replicate, create a venv and install from the attached requirements.txt.
### Relevant Logs/Tracebacks
```shell
#10 53.72 INFO: pip is looking at multiple versions of llama-index-vector-stores-chroma to determine which version is compatible with other requirements. This could take a while.
#10 53.74 Downloading llama_index_vector_stores_chroma-0.0.2-py3-none-any.whl.metadata (745 bytes)
#10 53.75 Downloading llama_index_vector_stores_chroma-0.0.1-py3-none-any.whl.metadata (655 bytes)
#10 53.95 ERROR: Cannot install -r requirements.txt (line 129), -r requirements.txt (line 130), -r requirements.txt (line 131), -r requirements.txt (line 133), -r requirements.txt (line 134), -r requirements.txt (line 135), -r requirements.txt (line 136), -r requirements.txt (line 137), -r requirements.txt (line 139), -r requirements.txt (line 140), -r requirements.txt (line 141), -r requirements.txt (line 142), -r requirements.txt (line 143), -r requirements.txt (line 144), -r requirements.txt (line 145), -r requirements.txt (line 146), -r requirements.txt (line 147), -r requirements.txt (line 148), -r requirements.txt (line 149), -r requirements.txt (line 150), -r requirements.txt (line 151), -r requirements.txt (line 154) and llama-index-core because these package versions have conflicting dependencies.
#10 53.95
#10 53.95 The conflict is caused by:
#10 53.95 The user requested llama-index-core
#10 53.95 llama-index 0.10.28 depends on llama-index-core<0.11.0 and >=0.10.28
#10 53.95 llama-index-agent-openai 0.2.2 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-cli 0.1.11 depends on llama-index-core<0.11.0 and >=0.10.11.post1
#10 53.95 llama-index-embeddings-huggingface 0.2.0 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-embeddings-mistralai 0.1.4 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-embeddings-openai 0.1.7 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-experimental 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.11.post1
#10 53.95 llama-index-indices-managed-llama-cloud 0.1.5 depends on llama-index-core<0.11.0 and >=0.10.0
#10 53.95 llama-index-llms-anthropic 0.1.10 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-bedrock 0.1.6 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-groq 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-huggingface 0.1.4 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-ollama 0.1.2 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-openai 0.1.15 depends on llama-index-core<0.11.0 and >=0.10.24
#10 53.95 llama-index-llms-openai-like 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-multi-modal-llms-ollama 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-multi-modal-llms-openai 0.1.5 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-program-openai 0.1.5 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-question-gen-openai 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-readers-file 0.1.17 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-readers-llama-parse 0.1.4 depends on llama-index-core<0.11.0 and >=0.10.7
#10 53.95 llama-index-vector-stores-chroma 0.1.0 depends on llama-index-core==0.10.0
#10 53.95 The user requested llama-index-core
#10 53.95 llama-index 0.10.28 depends on llama-index-core<0.11.0 and >=0.10.28
#10 53.95 llama-index-agent-openai 0.2.2 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-cli 0.1.11 depends on llama-index-core<0.11.0 and >=0.10.11.post1
#10 53.95 llama-index-embeddings-huggingface 0.2.0 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-embeddings-mistralai 0.1.4 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-embeddings-openai 0.1.7 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-experimental 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.11.post1
#10 53.95 llama-index-indices-managed-llama-cloud 0.1.5 depends on llama-index-core<0.11.0 and >=0.10.0
#10 53.95 llama-index-llms-anthropic 0.1.10 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-bedrock 0.1.6 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-groq 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-huggingface 0.1.4 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-ollama 0.1.2 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-openai 0.1.15 depends on llama-index-core<0.11.0 and >=0.10.24
#10 53.95 llama-index-llms-openai-like 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-multi-modal-llms-ollama 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-multi-modal-llms-openai 0.1.5 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-program-openai 0.1.5 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-question-gen-openai 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-readers-file 0.1.17 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-readers-llama-parse 0.1.4 depends on llama-index-core<0.11.0 and >=0.10.7
#10 53.95 llama-index-vector-stores-chroma 0.0.2 depends on llama-index-core<0.10.0 and >=0.9.32
#10 53.95 The user requested llama-index-core
#10 53.95 llama-index 0.10.28 depends on llama-index-core<0.11.0 and >=0.10.28
#10 53.95 llama-index-agent-openai 0.2.2 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-cli 0.1.11 depends on llama-index-core<0.11.0 and >=0.10.11.post1
#10 53.95 llama-index-embeddings-huggingface 0.2.0 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-embeddings-mistralai 0.1.4 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-embeddings-openai 0.1.7 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-experimental 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.11.post1
#10 53.95 llama-index-indices-managed-llama-cloud 0.1.5 depends on llama-index-core<0.11.0 and >=0.10.0
#10 53.95 llama-index-llms-anthropic 0.1.10 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-bedrock 0.1.6 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-groq 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-huggingface 0.1.4 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-ollama 0.1.2 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-llms-openai 0.1.15 depends on llama-index-core<0.11.0 and >=0.10.24
#10 53.95 llama-index-llms-openai-like 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-multi-modal-llms-ollama 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-multi-modal-llms-openai 0.1.5 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-program-openai 0.1.5 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-question-gen-openai 0.1.3 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-readers-file 0.1.17 depends on llama-index-core<0.11.0 and >=0.10.1
#10 53.95 llama-index-readers-llama-parse 0.1.4 depends on llama-index-core<0.11.0 and >=0.10.7
#10 53.95 llama-index-vector-stores-chroma 0.0.1 depends on llama-index-core<0.10.0 and >=0.9.32
#10 53.95
#10 53.95 To fix this you could try to:
#10 53.95 1. loosen the range of package versions you've specified
#10 53.95 2. remove package versions to allow pip attempt to solve the dependency conflict
#10 53.95
#10 53.95 ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
#10 ERROR: process "/bin/sh -c pip install --no-cache-dir -r requirements.txt" did not complete successfully: exit code: 1
------
> [5/5] RUN pip install --no-cache-dir -r requirements.txt:
------
Dockerfile:13
--------------------
11 |
12 | # Install any needed packages specified in requirements.txt
13 | >>> RUN pip install --no-cache-dir -r requirements.txt
14 | ARG AUTH_API_KEY
15 | ARG AUTH_API_SECRET
--------------------
ERROR: failed to solve: process "/bin/sh -c pip install --no-cache-dir -r requirements.txt" did not complete successfully: exit code: 1
Error: Process completed with exit code 1.
```
@moonlightnexus why are you installing llama-index-vector-stores-chroma==0.0.1? The latest is much newer.
You can see here the dependencies all make sense, as well as what the latest version is
If you are having trouble, you can send your requirements.txt, I'm sure one small adjustment would make it work.
@logan-markewich yeah, sending it now.
requirements.txt
@logan-markewich I didn't explicitly mention any version for chroma; I let pip decide by itself.
pro-tip @moonlightnexus: don't run `pip freeze > requirements.txt` -- you'll save yourself so much time if you just keep the actual top-level packages in your requirements, not everything. Then, not only is it clear which packages you are actually using in your code, but it also makes managing the dependencies of those top-level packages easier.
Looking at this list, I have no idea which packages you are actually using/importing in your code.
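One way to act on that tip (a sketch, which assumes the `pip-tools` package is installed for the `pip-compile` step) is to keep only the top-level packages in a `requirements.in` and let a resolver pin everything else:

```shell
# Keep only the packages you actually import in requirements.in
printf '%s\n' \
  llama-index \
  llama-index-vector-stores-chroma \
  > requirements.in

# Then, with pip-tools installed, resolve and pin the full tree:
#   pip-compile requirements.in -o requirements.txt
cat requirements.in
```

The pinned requirements.txt stays reproducible, while requirements.in documents what the code actually uses.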
In this case, you didn't specify a version for `llama-index-vector-stores-chroma`, but you did for other chroma packages, which I think is causing the issue.
Almost all the packages are interdependent; taking out any of them will result in broken code, and most importantly it would break the llama_index library itself. I've maintained two libraries and went through this kind of issue with the first one; these conflicts are next to impossible to manage without changing/loosening some dependencies on the library side.
@logan-markewich It's already 7 am here and I haven't slept all night trying to fix this. If you find a solution, please let me know; it would be a great help.
@moonlightnexus I'm not sure if you've used something like poetry before, but normally the scalable solution here is to use `poetry add <package>` for the packages you actually import in your code. Then all of those packages' dependencies get handled automatically, without you keeping track of them in some unmanageable requirements.txt file.
In any case, here are some steps for someone new to poetry.
I made a new directory, cd'd into it, and ran `poetry init`.
Then I made a file `new_requirements.txt` like this:
```
llama-index
llama-index-agent-openai
llama-index-cli
llama-index-core
llama-index-embeddings-huggingface
llama-index-embeddings-mistralai
llama-index-embeddings-openai
llama-index-experimental
llama-index-indices-managed-llama-cloud
llama-index-llms-anthropic
llama-index-llms-bedrock
llama-index-llms-groq
llama-index-llms-huggingface
llama-index-llms-ollama
llama-index-llms-openai
llama-index-llms-openai-like
llama-index-multi-modal-llms-ollama
llama-index-multi-modal-llms-openai
llama-index-program-openai
llama-index-question-gen-openai
llama-index-readers-file
llama-index-readers-llama-parse
llama-index-vector-stores-chroma
llama-index-readers-github
llama-index-readers-jira
llama-parse
llamaindex-py-client
```
Then I ran `poetry shell` to create a fresh venv for my project.
And then ran `poetry add` on each line:
```shell
cat new_requirements.txt | xargs -n 1 poetry add
```
Now I have this final set of dependencies, which all work fine together:
```toml
[tool.poetry.dependencies]
python = "^3.10"
llama-index = "^0.10.36"
llama-index-agent-openai = "^0.2.4"
llama-index-cli = "^0.1.12"
llama-index-core = "^0.10.36"
llama-index-embeddings-huggingface = "^0.2.0"
llama-index-embeddings-mistralai = "^0.1.4"
llama-index-embeddings-openai = "^0.1.9"
llama-index-experimental = "^0.1.3"
llama-index-indices-managed-llama-cloud = "^0.1.6"
llama-index-llms-anthropic = "^0.1.11"
llama-index-llms-bedrock = "^0.1.7"
llama-index-llms-groq = "^0.1.3"
llama-index-llms-huggingface = "^0.2.0"
llama-index-llms-ollama = "^0.1.3"
llama-index-llms-openai = "^0.1.18"
llama-index-llms-openai-like = "^0.1.3"
llama-index-multi-modal-llms-ollama = "^0.1.3"
llama-index-multi-modal-llms-openai = "^0.1.5"
llama-index-program-openai = "^0.1.6"
llama-index-question-gen-openai = "^0.1.3"
llama-index-readers-file = "^0.1.22"
llama-index-readers-llama-parse = "^0.1.4"
llama-index-vector-stores-chroma = "^0.1.8"
llama-index-readers-jira = "^0.1.3"
llama-parse = "^0.4.2"
llamaindex-py-client = "^0.1.19"
```
Highly recommend using poetry, and only running `poetry add` for packages that you actually import in your code. I think I saw langchain in your original requirements, so that probably needs to be added with `poetry add <package>` as well. Your life will be much easier.
Now, for installing in something like docker, you just need the toml and lock files, and you can run `poetry install` to install everything in a fresh env.
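A minimal Dockerfile sketch of that approach (the base image, paths, and flags here are my assumptions, not something from this thread):

```dockerfile
FROM python:3.10-slim
WORKDIR /app

# Only the dependency manifests are needed for the install step
COPY pyproject.toml poetry.lock ./

RUN pip install --no-cache-dir poetry \
 && poetry config virtualenvs.create false \
 && poetry install --no-interaction --no-root

# Copy the application code afterwards so the dependency layer stays cached
COPY . .
```

Copying the manifests before the source means Docker only reruns the install layer when dependencies change, not on every code edit.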
@logan-markewich Thanks for your help, I finally made it work that day.
I have one more concern: how do I get granular control over chat history? Right now I'm writing a lot of wrapper functions to customize LlamaIndex's default functions, but having built-in support would be way more convenient, and a good step toward making the application more compliant around PII and specific business use cases.