li-plus / chatglm.cpp

C++ implementation of ChatGLM-6B & ChatGLM2-6B & ChatGLM3 & GLM4

error: wheels for chatglm.cpp on windows

srdevore opened this issue

I get an error with pip install chatglm.cpp due to a problem building the wheel, despite multiple troubleshooting attempts. Details are:

Context: I want to use Chinese LLMs in xinference.

Windows machine (local); 100+ GB free space
Python 3.9.12
Installed CMake 3.29.1
Visual Studio 2022 including Desktop development with C++ (build tools included)
These installs fixed a similar issue with llama-cpp-python, so it doesn't seem like the config is the problem (a quick toolchain check is sketched below).
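
A quick sanity check of that setup (only a sketch; it assumes the "x64 Native Tools Command Prompt for VS 2022" exposes the same MSVC toolchain that the pip build uses) is to open that prompt and run:

cl
cmake --version

Both should report the expected versions (MSVC 19.39.x and CMake 3.29.1); if either fails in that prompt, the problem is the build environment rather than chatglm.cpp itself.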

ERROR: Could not build wheels for chatglm.cpp, which is required to install pyproject.toml-based projects
The full output is attached (chatglm_output.txt), but the root cause seems to be here:

-- Building for: Visual Studio 17 2022
-- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.22631.
-- The CXX compiler identification is MSVC 19.39.33523.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - failed
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual
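
For reference, the failing configure step can be reproduced outside of pip to capture the complete compiler error. This is only a sketch, assuming the standard build layout from the chatglm.cpp README (the clone URL and cmake invocation below come from the README, not from this issue):

git clone --recursive https://github.com/li-plus/chatglm.cpp.git
cd chatglm.cpp
cmake -B build

If the ABI-detection failure reproduces here, the logs CMake writes under build/CMakeFiles/ usually contain the underlying MSVC error.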

Any help is greatly appreciated!

commented

Apparently the online package file has not been updated (and requires direct download from GitHub)? xorbitsai/inference#1393
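
If that is the case, one possible workaround (a sketch only; it assumes the repository root supports a pip source install as described in the project README, and the local MSVC build still has to succeed) is to install the Python bindings directly from the GitHub source instead of the published wheel:

pip install git+https://github.com/li-plus/chatglm.cpp.git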