all-in-one with version 2.1.0b1 failed
Fred-cell opened this issue
We will try to reproduce this on our side.
Hi @Fred-cell, I followed the steps below but cannot reproduce this issue on our side:
conda create -n llm python=3.10
conda activate llm
pip install --pre --upgrade ipex-llm[xpu]==2.1.0b1 --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
pip install transformers==4.37.0
pip install omegaconf
pip install pandas
bash run-arc.sh
Could you provide your environment information using https://github.com/intel-analytics/ipex-llm/blob/main/python/llm/scripts/env-check.sh?
After syncing offline with @Fred-cell, we found the root cause: oneAPI 2024.1 was installed in their environment. We recommended installing oneAPI 2024.0 as described in the guide, which resolved the problem.
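For reference, a minimal sketch of how one might guard against this kind of mismatch by checking the installed oneAPI version before running the benchmark. This helper is hypothetical (not part of ipex-llm), and obtaining the version string (e.g. from `icpx --version` output or the install path under `ONEAPI_ROOT`) is left to the caller; the expected version `2024.0` follows the resolution above.

```python
import re

# ipex-llm 2.1.0b1 is expected to run against oneAPI 2024.0 (per the guide).
EXPECTED_ONEAPI = "2024.0"

def oneapi_version_matches(version_string: str, expected: str = EXPECTED_ONEAPI) -> bool:
    """Return True if the major.minor oneAPI version matches the expected one.

    `version_string` is assumed to look like "2024.1.0" or "2024.0";
    how it is obtained (tool output, install path, etc.) is up to the caller.
    """
    match = re.match(r"(\d{4})\.(\d+)", version_string)
    if not match:
        raise ValueError(f"Unrecognized oneAPI version: {version_string!r}")
    return f"{match.group(1)}.{match.group(2)}" == expected

# A 2024.1 install would be flagged as a mismatch:
print(oneapi_version_matches("2024.1.0"))  # False
print(oneapi_version_matches("2024.0"))    # True
```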