janhq / jan

Jan is an open-source alternative to ChatGPT that runs 100% offline on your computer, with support for multiple engines (llama.cpp, TensorRT-LLM).

Home Page: https://jan.ai/

bug: Can't send a 2nd message

zeinalfa708 opened this issue · comments

  • I have searched the existing issues

Current behavior

I can't send a single word/message to Gemma 2B Q4, as the screenshot shows.
[screenshot]

Minimum reproduction step

  1. Start a conversation.
  2. Enter whatever you like and send it.
  3. After the AI finishes generating, type anything again; the error shown below appears (a request reproducing the same failure is sketched after this list).
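To take the UI out of the picture, here is a minimal sketch of the same second-turn request sent directly to Jan's local API server. It assumes the local API server is enabled on the default port 1337 and uses a placeholder model id (`gemma-2b`); adjust both to match your install.

```typescript
// Hypothetical repro against Jan's local OpenAI-compatible API
// (assumes the local API server is running on http://localhost:1337).
// The message history mirrors the one captured in the log below.
async function repro(): Promise<void> {
  const res = await fetch("http://localhost:1337/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gemma-2b", // placeholder: use the model id shown in Jan
      messages: [
        { role: "user", content: "oi" },
        { role: "assistant", content: "Hello! It's nice to hear from you. How can I assist you today?" },
        { role: "user", content: "Long time no see" }, // the second user turn that fails
      ],
      stream: false,
    }),
  });
  console.log(res.status, await res.text());
}

repro().catch(console.error);
```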

Expected behavior

The conversation should continue normally.

Screenshots / Logs

LOG:

20240625 11:59:12.400000 UTC 8124 INFO sent the non stream, waiting for respone - llamaCPP.cc:424
20240625 11:59:16.945000 UTC 8060 INFO Messages:[
{
"content" : "oi",
"role" : "user"
},
{
"content" : "Hello! It's nice to hear from you. How can I assist you today?",
"role" : "assistant"
},
{
"content" : "Long time no see",
"role" : "user"
}
] - llamaCPP.cc:211
20240625 11:59:16.945000 UTC 8060 INFO Stop:[] - llamaCPP.cc:212
20240625 11:59:16.945000 UTC 9608 INFO Wait for task to be released:4 - llamaCPP.cc:413
20240625 11:59:16.945000 UTC 8060 DEBUG [makeHeaderString] send stream with transfer-encoding chunked - HttpResponseImpl.cc:535
20240625 11:59:17.032000 UTC 8060 INFO Error during inference - llamaCPP.cc:386
20240625 11:59:17.050000 UTC 9608 INFO Task completed, release it - llamaCPP.cc:416
20240625 11:59:18.038000 UTC 8060 INFO Messages:[
{
"content" : "Summarize in a 10-word Title. Give the title only. "oi"",
"role" : "user"
}
] - llamaCPP.cc:211
20240625 11:59:18.038000 UTC 8060 INFO Stop:[] - llamaCPP.cc:212
20240625 11:59:18.038000 UTC 8060 INFO sent the non stream, waiting for respone - llamaCPP.cc:424
20240625 11:59:18.088000 UTC 8060 INFO Here is the result:1 - llamaCPP.cc:428
20240625 11:59:19.985000 UTC 8124 INFO Here is the result:0 - llamaCPP.cc:428
20240625 11:59:44.720000 UTC 8124 INFO Clean cache threshold reached! - llamaCPP.cc:199
20240625 11:59:44.720000 UTC 8124 INFO Cache cleaned - llamaCPP.cc:201
20240625 11:59:44.720000 UTC 8124 INFO Messages:[
{
"content" : "p",
"role" : "user"
}
] - llamaCPP.cc:211
20240625 11:59:44.720000 UTC 8124 INFO Stop:[] - llamaCPP.cc:212
20240625 11:59:44.726000 UTC 8124 DEBUG [makeHeaderString] send stream with transfer-encoding chunked - HttpResponseImpl.cc:535
20240625 11:59:48.461000 UTC 8124 INFO reached result stop - llamaCPP.cc:373
20240625 11:59:48.461000 UTC 8124 INFO End of result - llamaCPP.cc:346
20240625 11:59:48.507000 UTC 9608 INFO Task completed, release it - ll

[screenshot]

Jan version

0.5.1

In which operating systems have you tested?

  • macOS
  • Windows
  • Linux

Environment details

2024-06-25T11:52:47.474Z [SPECS]::Version: 0.5.1
2024-06-25T11:52:47.474Z [SPECS]::CPUs: [{"model":"Intel(R) Core(TM) i3-7100 CPU @ 3.90GHz","speed":3912,"times":{"user":179625,"nice":0,"sys":114078,"idle":1733703,"irq":11968}},{"model":"Intel(R) Core(TM) i3-7100 CPU @ 3.90GHz","speed":3912,"times":{"user":137437,"nice":0,"sys":59656,"idle":1830140,"irq":796}},{"model":"Intel(R) Core(TM) i3-7100 CPU @ 3.90GHz","speed":3912,"times":{"user":401093,"nice":0,"sys":143578,"idle":1482578,"irq":1875}},{"model":"Intel(R) Core(TM) i3-7100 CPU @ 3.90GHz","speed":3912,"times":{"user":243031,"nice":0,"sys":89265,"idle":1694953,"irq":1250}}]
2024-06-25T11:52:47.475Z [SPECS]::Machine: x86_64
2024-06-25T11:52:47.475Z [SPECS]::Endianness: LE
2024-06-25T11:52:47.475Z [SPECS]::Parallelism: 4
2024-06-25T11:52:47.476Z [SPECS]::Free Mem: 470646784
2024-06-25T11:52:47.476Z [SPECS]::OS Version: Windows 11 Pro
2024-06-25T11:52:47.476Z [SPECS]::OS Platform: win32
2024-06-25T11:52:47.476Z [SPECS]::OS Release: 10.0.22631
2024-06-25T11:52:47.476Z [SPECS]::Total Mem: 8539951104

Hi @zeinalfa708, can you try removing the extensions folder and restarting Jan?
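For reference, a minimal sketch of what that clean-up could look like, assuming the default Jan data folder at `~/jan` (the actual location is shown in Jan under Settings > Advanced Settings > Jan Data Folder and may differ per install):

```typescript
// Sketch: remove the extensions folder from the (assumed) default Jan data
// folder, then restart Jan so it reinstalls the default extensions.
import { rmSync } from "node:fs";
import { join } from "node:path";
import { homedir } from "node:os";

const extensionsDir = join(homedir(), "jan", "extensions"); // assumed default path
rmSync(extensionsDir, { recursive: true, force: true });
console.log(`Removed ${extensionsDir}; restart Jan to reinstall default extensions.`);
```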

Sorry for the delay. Where is it? I can't find it anywhere.
[screenshot]

The issue is reproducible from our side, cc: @hahuyhoang411
Somehow the stop word is empty ❌
[screenshot]
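If the empty stop list is indeed the culprit, one possible workaround to test is passing the stop sequence explicitly per request instead of relying on the model's defaults. The sketch below assumes Jan's local API server on the default port 1337, a placeholder model id, and that the per-request `stop` field is forwarded to the llama.cpp engine; Gemma's chat template ends turns with `<end_of_turn>`.

```typescript
// Workaround sketch: supply the stop sequence explicitly in the request.
// Assumes Jan's local API server at http://localhost:1337 and that "stop"
// is passed through to the engine (not verified against this Jan version).
const body = {
  model: "gemma-2b", // placeholder: use the model id shown in Jan
  messages: [{ role: "user", content: "Long time no see" }],
  stop: ["<end_of_turn>"], // Gemma's end-of-turn marker
  stream: false,
};

fetch("http://localhost:1337/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(body),
})
  .then((res) => res.text())
  .then(console.log)
  .catch(console.error);
```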