wangcx18 / llm-vscode-inference-server

An endpoint server for efficiently serving quantized open-source LLMs for code.


Can't pip install requirements.txt on CPU-only system

ciprianelies opened this issue · comments

I saw that the README lists running on CPU as a goal. Is that supported yet, or is there still work to be done to get there?
Currently I see this error when I run pip install -r requirements.txt:

      RuntimeError: Cannot find CUDA_HOME. CUDA must be available to build the package.
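For context, this error typically comes from a dependency that compiles CUDA kernels at install time (the inference backend in requirements.txt, likely vLLM in this repo, though that is my assumption). Below is a minimal, hypothetical diagnostic snippet, not part of this repository, that checks for the same condition the build step trips over on a CPU-only machine:

    # check_cuda.py -- hypothetical diagnostic, not part of this repo.
    # Checks whether a CUDA toolkit is discoverable, which is what the
    # failing build step requires before it will compile GPU kernels.
    import os
    import shutil

    cuda_home = os.environ.get("CUDA_HOME") or os.environ.get("CUDA_PATH")
    nvcc = shutil.which("nvcc")

    if cuda_home is None and nvcc is None:
        print("No CUDA toolkit found: dependencies that build CUDA kernels "
              "will fail with 'Cannot find CUDA_HOME' on this machine.")
    else:
        print(f"CUDA toolkit detected: CUDA_HOME={cuda_home!r}, nvcc={nvcc!r}")

If the snippet reports no toolkit, the install will keep failing until either a CUDA toolkit is present or the project gains a CPU-only dependency set.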

Thanks for asking. CPU support is still a work in progress, and I have limited time to push it forward right now. Thanks for your patience.

I've encountered this as well. Hope this can be added!