Warning(s) when starting rLLM-cpp
dluc opened this issue
Devis Lucato commented
Steps
cd rllm-cpp
./cpp-server.sh phi2
Result
- The server builds and starts
- The log contains one warning:
WARN [llama_cpp_low] llm_load_vocab: mismatch in special tokens definition ( 910/51200 vs 944/51200 )
- The following entry reads like a warning but is logged at INFO level, seven times:
INFO [hf_hub] Token file not found "/Users/tester/.cache/huggingface/token"
System
- system: macOS 14.3, Apple M3
- cargo 1.75.0
- cmake version 3.28.2
- ccache version 4.9
Michał Moskal commented
The special tokens warning comes from llama.cpp; I'm not sure what we can do about it.
The hf_hub INFO line is only relevant if a "permission denied" error follows (e.g., for some of the Meta models).
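For anyone who wants to silence the repeated INFO line (or who actually needs a token for gated models), a minimal sketch of creating the token file hf_hub looks for. This assumes the default cache location from the log above; `huggingface-cli login` writes the same file, and the token value below is a placeholder, not a real token:

```shell
# Create the HuggingFace token file that hf_hub checks for.
# Only required for gated models (e.g. some Meta models); otherwise
# it just suppresses the "Token file not found" INFO line.
# HF_HOME overrides the default ~/.cache/huggingface location.
TOKEN_DIR="${HF_HOME:-$HOME/.cache/huggingface}"
mkdir -p "$TOKEN_DIR"
# Placeholder value -- substitute a real token from huggingface.co/settings/tokens
printf 'hf_your_real_token_here' > "$TOKEN_DIR/token"
```

Alternatively, `huggingface-cli login` performs the same setup interactively.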