The current revision bf11a49 fails to build: Cannot find source file: ../../../../include/bitnet-lut-kernels.h
yurivict opened this issue
CMake Error at 3rdparty/llama.cpp/ggml/src/CMakeLists.txt:1324 (add_library):
Cannot find source file:
../../../../include/bitnet-lut-kernels.h
Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm
.ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90
.f95 .f03 .hip .ispc
CMake Error at 3rdparty/llama.cpp/ggml/src/CMakeLists.txt:1324 (add_library):
No SOURCES given to target: ggml
I ran into the same problem. I believe it is caused by a bad symbolic link in the llama.cpp fork, at https://github.com/Eddie-Wang1120/llama.cpp/tree/814d0ee5440495255a4e3a5a8abf001b27b539d4/spm-headers.
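A quick way to confirm this (just a sketch, assuming the fork is checked out as the 3rdparty/llama.cpp submodule, as the CMake error above suggests) is to look for dangling symlinks in spm-headers:

# List the spm-headers entries; dangling symlinks will point at missing targets.
ls -l 3rdparty/llama.cpp/spm-headers
# With -L, find follows links, so any entry still reported as type "l" is broken.
find -L 3rdparty/llama.cpp/spm-headers -type l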
BTW, that fork is now over 400 commits behind the head of llama.cpp. Is there a documented strategy for keeping pace with llama.cpp development?
We had to roll a patch on top of the custom llama.cpp/ggml used as a submodule in this repo:
https://github.com/eugenehp/bitnet-cpp-rs/blob/main/bitnet-cpp-sys/patches/llama.cpp.patch#L11
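If anyone wants to try the same workaround, here is a rough sketch of fetching and applying that patch to the vendored submodule. The raw URL and the directory layout are assumptions based on the link above, and the paths inside the patch may expect a different working directory, so do a dry run with --check first:

# Download the patch from the bitnet-cpp-rs repo (raw view of the file linked above).
curl -L -o llama.cpp.patch \
  https://raw.githubusercontent.com/eugenehp/bitnet-cpp-rs/main/bitnet-cpp-sys/patches/llama.cpp.patch
# Try it against the llama.cpp submodule; --check only does a dry run.
cd 3rdparty/llama.cpp
git apply --check ../../llama.cpp.patch && git apply ../../llama.cpp.patch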
This repository is broken as it is now.
The header file include/bitnet-lut-kernels.h is generated by utils/codegen_tl*.py and should be produced automatically when setup_env.py runs successfully. Could you check whether each step completed successfully and look at the error logs in the logs/ folder?
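For reference, a minimal way to check this (the --help flag and the log file layout are assumptions; adjust to whatever setup_env.py actually reports):

# See which arguments setup_env.py expects, then re-run it with your model settings.
python setup_env.py --help
# After a successful run, the generated kernel header should exist:
ls -l include/bitnet-lut-kernels.h
# If it is missing, the per-step logs usually show which step failed:
ls logs/
tail -n 50 logs/*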
It seems to be a compiler bug; try upgrading gcc/g++ or switching to clang/clang++.
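For the clang route, a sketch of forcing the toolchain (the build directory name is arbitrary, and CMake only reads the compiler on a fresh configure, so remove any stale build directory first):

# Option 1: make clang the default compilers for whatever configures CMake next,
# e.g. before re-running setup_env.py.
export CC=clang
export CXX=clang++
# Option 2: pass the compilers to CMake explicitly if you configure by hand.
cmake -S . -B build -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++
cmake --build build

Either approach should have the same effect; the key point is that the clang/clang++ choice has to be visible to CMake when it first configures the build.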
Duh ... I didn't notice that installing clang didn't set it as the default CC and CXX. Thanks for catching that.
I'm up and running now and will leave it to the OP to decide if this can be closed. Thanks again!