ggerganov / llama.cpp

LLM inference in C/C++

relocation R_X86_64_32 against hidden symbol `__TMC_END__' can not be used when making a shared object

asarubbo opened this issue

On b2879, when I enable generation of the static libraries (-DLLAMA_STATIC=ON), I get:

[1/97] /usr/bin/x86_64-pc-linux-gnu-gcc -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_LLAMAFILE -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/.  -O2 -march=x86-64 -pipe -pipe -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -mf16c -mfma -mavx -mavx2 -MD -MT CMakeFiles/ggml.dir/ggml.c.o -MF CMakeFiles/ggml.dir/ggml.c.o.d -o CMakeFiles/ggml.dir/ggml.c.o -c /var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/ggml.c
[2/97] /usr/bin/x86_64-pc-linux-gnu-gcc -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_LLAMAFILE -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/.  -O2 -march=x86-64 -pipe -pipe -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -mf16c -mfma -mavx -mavx2 -MD -MT CMakeFiles/ggml.dir/ggml-alloc.c.o -MF CMakeFiles/ggml.dir/ggml-alloc.c.o.d -o CMakeFiles/ggml.dir/ggml-alloc.c.o -c /var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/ggml-alloc.c
[3/97] /usr/bin/x86_64-pc-linux-gnu-gcc -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_LLAMAFILE -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/.  -O2 -march=x86-64 -pipe -pipe -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -mf16c -mfma -mavx -mavx2 -MD -MT CMakeFiles/ggml.dir/ggml-backend.c.o -MF CMakeFiles/ggml.dir/ggml-backend.c.o.d -o CMakeFiles/ggml.dir/ggml-backend.c.o -c /var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/ggml-backend.c
[4/97] /usr/bin/x86_64-pc-linux-gnu-gcc -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_LLAMAFILE -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/.  -O2 -march=x86-64 -pipe -pipe -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -mf16c -mfma -mavx -mavx2 -MD -MT CMakeFiles/ggml.dir/ggml-quants.c.o -MF CMakeFiles/ggml.dir/ggml-quants.c.o.d -o CMakeFiles/ggml.dir/ggml-quants.c.o -c /var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/ggml-quants.c
[5/97] /usr/bin/x86_64-pc-linux-gnu-g++ -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_LLAMAFILE -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/.  -O2 -march=x86-64 -pipe -pipe -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -std=gnu++11 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -Wno-format-truncation -Wextra-semi -mf16c -mfma -mavx -mavx2 -MD -MT CMakeFiles/ggml.dir/sgemm.cpp.o -MF CMakeFiles/ggml.dir/sgemm.cpp.o.d -o CMakeFiles/ggml.dir/sgemm.cpp.o -c /var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879/sgemm.cpp
[6/97] : && /usr/bin/cmake -E rm -f libggml_static.a && /usr/bin/x86_64-pc-linux-gnu-ar qc libggml_static.a  CMakeFiles/ggml.dir/ggml.c.o CMakeFiles/ggml.dir/ggml-alloc.c.o CMakeFiles/ggml.dir/ggml-backend.c.o CMakeFiles/ggml.dir/ggml-quants.c.o CMakeFiles/ggml.dir/sgemm.cpp.o && /usr/bin/x86_64-pc-linux-gnu-ranlib libggml_static.a && :
[7/97] : && /usr/bin/x86_64-pc-linux-gnu-g++ -fPIC -O2 -march=x86-64 -pipe -pipe -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0  -Wl,-O1 -Wl,--as-needed -Wl,--defsym=__gentoo_check_ldflags__=0   -static -shared -Wl,-soname,libggml_shared.so -o libggml_shared.so CMakeFiles/ggml.dir/ggml.c.o CMakeFiles/ggml.dir/ggml-alloc.c.o CMakeFiles/ggml.dir/ggml-backend.c.o CMakeFiles/ggml.dir/ggml-quants.c.o CMakeFiles/ggml.dir/sgemm.cpp.o   && :
FAILED: libggml_shared.so 
: && /usr/bin/x86_64-pc-linux-gnu-g++ -fPIC -O2 -march=x86-64 -pipe -pipe -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0  -Wl,-O1 -Wl,--as-needed -Wl,--defsym=__gentoo_check_ldflags__=0   -static -shared -Wl,-soname,libggml_shared.so -o libggml_shared.so CMakeFiles/ggml.dir/ggml.c.o CMakeFiles/ggml.dir/ggml-alloc.c.o CMakeFiles/ggml.dir/ggml-backend.c.o CMakeFiles/ggml.dir/ggml-quants.c.o CMakeFiles/ggml.dir/sgemm.cpp.o   && :
/usr/lib/gcc/x86_64-pc-linux-gnu/13/../../../../x86_64-pc-linux-gnu/bin/ld: /usr/lib/gcc/x86_64-pc-linux-gnu/13/crtbeginT.o: relocation R_X86_64_32 against hidden symbol `__TMC_END__' can not be used when making a shared object
/usr/lib/gcc/x86_64-pc-linux-gnu/13/../../../../x86_64-pc-linux-gnu/bin/ld: failed to set dynamic section sizes: bad value
collect2: error: ld returned 1 exit status
ninja: build stopped: subcommand failed.
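The failing link command (step [7/97]) passes both -static and -shared. With -static, gcc selects the static CRT startup object crtbeginT.o, which contains absolute R_X86_64_32 relocations; those cannot be placed in a shared object, which is exactly what the linker reports. A minimal sketch of the conflict, assuming gcc and GNU ld on x86-64 glibc (dummy.c is a throwaway file made up for the example):

```shell
# Any trivial translation unit will do
printf 'int foo(void) { return 42; }\n' > dummy.c
gcc -fPIC -c dummy.c -o dummy.o

# Combining -static and -shared makes gcc link the static CRT
# (crtbeginT.o); the linker rejects its non-PIC relocations, typically
# with the same "relocation R_X86_64_32 ... shared object" error:
gcc -static -shared -o libdummy.so dummy.o
```

So the error is not specific to the ggml objects themselves (they are all built with -fPIC); it comes from the contradictory link flags on the shared-library target.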

The project was configured as:

cmake -C /var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879_build/gentoo_common_config.cmake -G Ninja -DCMAKE_INSTALL_PREFIX=/usr -DLLAMA_NATIVE=OFF -DLLAMA_STATIC=ON -DLLAMA_CCACHE=OFF -DLLAMA_BLAS=no -DLLAMA_CUBLAS=no -DLLAMA_LTO=no -DLLAMA_BUILD_TESTS=no -DLLAMA_HIPBLAS=no -DLLAMA_BUILD_SERVER=OFF -DCMAKE_SKIP_BUILD_RPATH=ON -DBUILD_NUMBER=1 -DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_TOOLCHAIN_FILE=/var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879_build/gentoo_toolchain.cmake /var/tmp/portage/dev-cpp/llama-cpp-2879/work/llama.cpp-b2879
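Since -DLLAMA_STATIC=ON puts -static on every link line, including the libggml_shared.so target, one possible workaround is to stop generating shared libraries when a static build is requested. This is only a sketch: it assumes the shared targets in the b2879 CMakeLists are gated on the standard BUILD_SHARED_LIBS variable, which should be verified against the tree (the variable may be forced on by the Gentoo config file passed via -C):

```shell
# Hypothetical workaround: disable shared-library targets so that
# -static never ends up on a -shared link line. Verify that b2879's
# CMakeLists actually gates libggml_shared.so on BUILD_SHARED_LIBS.
cmake -G Ninja -DLLAMA_STATIC=ON -DBUILD_SHARED_LIBS=OFF ...
```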