bug: Error("control character (\\u0000-\\u001F) found while parsing a string")
katopz opened this issue · comments
Summary
I get the error Error("control character (\\u0000-\\u001F) found while parsing a string") when using llama-api-server.wasm.
It works fine via llama-chat.wasm.
Reproduction steps
### Model
```shell
curl -LO https://huggingface.co/openthaigpt/openthaigpt-1.0.0-beta-13b-chat-gguf/resolve/main/ggml-model-q4_0.gguf
```
### API
```shell
curl -LO https://github.com/second-state/llama-utils/raw/main/api-server/llama-api-server.wasm
wasmedge --dir .:. --nn-preload default:GGML:AUTO:ggml-model-q4_0.gguf llama-api-server.wasm
curl -X POST http://0.0.0.0:8080/v1/chat/completions -H 'accept:application/json' -H 'Content-Type: application/json' -d '{"messages":[{"role":"system", "content":"You are a helpful AI assistant"}, {"role":"user", "content":"กทม ย่อมาจากอะไร"}], "model":"
openthaigpt-1.0.0-beta-13b-chat"}'
```
Screenshots
Any logs you want to share for showing the specific issue
```
thread 'main' panicked at llama-api-server/src/backend/ggml.rs:204:87:
called `Result::unwrap()` on an `Err` value: Error("control character (\\u0000-\\u001F) found while parsing a string", line: 2, column: 0)
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
[2023-12-26 15:43:18.615] [error] execution failed: unreachable, Code: 0x89
[2023-12-26 15:43:18.616] [error] In instruction: unreachable (0x00) , Bytecode offset: 0x00168a0f
[2023-12-26 15:43:18.616] [error] When executing function name: "_start"
```
Model Information
Operating system information
macOS 14.1.2 (23B2091)
ARCH
arm64
CPU Information
Apple M3 Max
Memory Size
64GB
GPU Information
Metal
VRAM Size
64GB
@katopz I tried openthaigpt-1.0.0-beta-13b-chat with the curl command you provided and reproduced the error. Our investigation shows the issue is caused by the curl command itself: it contains special characters. In particular, the JSON payload has a literal line break inside the "model" string value, and JSON forbids raw control characters (U+0000–U+001F) inside strings, which is exactly what the error message reports. The following snapshot compares the curl command from this issue (red circle) with the one we created (blue circle). Please try the one we created on your local system, and let us know if you run into any further issues. Thanks!
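The failure can be reproduced outside the server with any strict JSON parser. Below is a minimal sketch in Python (not the Rust serde_json code the server actually uses, but the same class of strict parser) showing that a literal newline inside a JSON string is rejected, while keeping the model name on a single line parses cleanly:

```python
import json

# Payload with a literal line break inside the "model" string,
# mirroring the curl command from this issue. A strict JSON parser
# must reject raw control characters (U+0000-U+001F) in strings.
broken = '{"model": "\nopenthaigpt-1.0.0-beta-13b-chat"}'
try:
    json.loads(broken)
except json.JSONDecodeError as e:
    print("rejected:", e)

# The same payload with the value kept on one line parses fine.
fixed = '{"model": "openthaigpt-1.0.0-beta-13b-chat"}'
print("parsed model:", json.loads(fixed)["model"])
```

If a real newline is actually wanted inside a JSON string, it must be written as the escape sequence `\n` rather than a raw line break.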
Oh, my bad! Super sorry! 😨