Feature Request: Is it possible to see the output realtime?
alexcardo opened this issue · comments
Summary
We used to be able to watch the model answer the question in real time, especially in the UI mode. Please add this feature.
Thank you!
Appendix
No response
Hi @alexcardo
Do you mean supporting streaming output in the web chatbot UI? If so, we're working on it. Stay tuned.
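For context, streaming output just means flushing each token to the user as it arrives instead of waiting for the full completion. A minimal sketch of the idea (hypothetical code, not LlamaEdge's actual implementation; `fake_token_stream` is a stand-in for tokens coming from the model):

```python
import sys

def fake_token_stream():
    # Stand-in for tokens arriving from the model one at a time.
    for tok in ["The", " model", " answers", " in", " real", " time", "."]:
        yield tok

def stream_to_console(tokens):
    """Print each token as soon as it arrives, then return the full text."""
    parts = []
    for tok in tokens:
        sys.stdout.write(tok)
        sys.stdout.flush()  # flush so the user sees each token immediately
        parts.append(tok)
    sys.stdout.write("\n")
    return "".join(parts)
```

In a web UI the same pattern is usually carried over HTTP with chunked responses or server-sent events, with the browser appending each chunk to the chat bubble.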
Yes, this is exactly what I mean. I'm waiting for this!
Also, it would be great to tune up settings visually. I mean temperature, top_k, top_p, etc.
I know I can set these from the console when starting the server, but it would also be great to adjust them in the UI.
Thank you.
By the way, since I've already opened this issue: in the CLI version, there is no multiline input. Correct me if I'm wrong.
I mean, for instance, I want to act like this:
Summarize this:
bla bla bla....
In llama.cpp, there is a `--multiline-input` option, where I can use a `\` symbol.
PR #91 provides support for multiline inputs in the console.
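The continuation behavior being asked about here can be sketched as follows (a hypothetical illustration, not the code from PR #91): keep reading lines while each one ends with a trailing backslash, and join them into a single prompt.

```python
def read_multiline(lines):
    """Join input lines, treating a trailing backslash as a continuation marker."""
    parts = []
    for line in lines:
        if line.endswith("\\"):
            parts.append(line[:-1] + "\n")  # strip the backslash, keep the newline
        else:
            parts.append(line)  # no trailing backslash: the prompt is complete
            break
    return "".join(parts)
```

For example, the two lines `Summarize this: \` and `bla bla bla....` would be joined into one prompt spanning two lines before being sent to the model.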
Can you give an example of how to use the multiline input? Once I type Enter, the bot starts answering the question.
I used this instruction to install LlamaEdge: https://www.secondstate.io/articles/run-llm-sh/
I ran `curl -LO https://github.com/second-state/LlamaEdge/releases/latest/download/llama-chat.wasm` hoping that it includes these changes:
https://github.com/second-state/LlamaEdge/pull/91/commits/bb78cf8f9d44db9b187871b327e3be3468d65270#diff-46cfefd927062cfabcabb14bc2a3cf0b5dab89b0470471c7a39a98521270447f
Either I don't understand how this implementation works, or some instructions are needed. I would be happy to see them.
@alexcardo At the end of each line, just type `\`, then press Enter. You'll get a new line. Hope the video can help you.
multiline-inputs.mov
Looks like this issue is resolved. Is it fine to close it?