ipex-llm[xpu] is not compatible with ipex-llm[cpp]
jiafeng5513 opened this issue
ipex-llm[xpu] requires torch==2.1.0a0 and intel-extension-for-pytorch==2.1.10+xpu, but ipex-llm[cpp] requires a different torch (2.2.0), which is not an Intel-specific build. This causes several issues:
- Installing ipex-llm[cpp] even causes cuDNN to be downloaded automatically, which is ridiculous: the whole reason I wanted to try IPEX is that I have no NVIDIA device in my environment, yet ipex-llm[cpp] still pulls tons of CUDA and cuDNN libraries into my conda env, all of which are useless to me.
- ipex-llm[cpp] and ipex-llm[xpu] share many dependencies, so why can't they be combined? Why release separate portable Ollama and llama.cpp packages? This does not make using IPEX easier, only more complicated and confusing. We want ONE pip package, just like ipex-llm[xpu], that contains all the features!
- intel-extension-for-pytorch has now been updated to 2.6.10, but the latest ipex-llm still depends on IPEX 2.1.10, so there is a dependency chain: if I want to use ipex-llm, I must use IPEX 2.1.10, which in turn forces PyTorch 2.1.0a0. That is far too old! Look at IPEX itself: it keeps pace with official PyTorch, and we hope ipex-llm will do the same.
- ipex-llm[xpu] needs different versions of the oneAPI components than ipex-llm[cpp] and Ollama do, so it is hard to install ipex-llm[xpu] and Ollama together. I'm really curious why completely incompatible dependencies are needed, given that these packages (ipex-llm[xpu], the Ollama portable zip, and ipex-llm[cpp]) are published from the same repository.
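Until the extras share compatible pins, the usual workaround for conflicts like this is to keep each extra in its own environment. A minimal sketch (the env names are made up, and the exact `--extra-index-url` for the XPU wheels should be taken from the ipex-llm quickstart docs rather than from here):

```shell
# One env per ipex-llm extra, so the conflicting torch pins never meet.

# Env for the PyTorch/XPU stack (torch 2.1.0a0 + intel-extension-for-pytorch 2.1.10+xpu)
conda create -n ipexllm-xpu python=3.11 -y
conda activate ipexllm-xpu
pip install --pre --upgrade "ipex-llm[xpu]"   # add the --extra-index-url from the GPU quickstart docs
conda deactivate

# Separate env for the llama.cpp/Ollama tooling, with its own torch pin
conda create -n ipexllm-cpp python=3.11 -y
conda activate ipexllm-cpp
pip install --pre --upgrade "ipex-llm[cpp]"
```

This doesn't fix the underlying packaging problem, but it keeps the two dependency trees from clobbering each other.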
We now have portable zips for llama.cpp and Ollama users; most of them don't need PyTorch at all. See https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quickstart/llamacpp_portable_zip_gpu_quickstart.md and https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.md.
We have added support for ipex-llm[xpu] with PyTorch 2.6, and we are working on testing and documentation. For now, you can refer to this comment for how to install ipex-llm with PyTorch 2.6 support.