mybigday / llama.rn

React Native binding of llama.cpp


Feature Request: TextStreaming

hushaudio opened this issue · comments

commented

Is it possible to add a text streaming feature? It looks like you're loading a local cpp server, so I wonder: does Swift support sockets for React Native? Inference is so slow on mobile devices right now that streaming would help the user know something is happening. I'm interested in contributing if you need contributors. I believe streaming is supported by llama.cpp in LangChain's implementation, but I'm not sure if that's custom.

Hey there, I maintain an app that uses llama.rn, so I have some pointers on this. Text can be streamed using the callback function of LlamaContext.completion:

async completion(
    params: CompletionParams,
    callback?: (data: TokenData) => void,
): Promise<NativeCompletionResult>

Simply pass in a callback function that appends each new token to some state; a rough sketch follows.
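
Here's a minimal sketch of how that can look in a React Native app. It assumes you already have an initialized LlamaContext, that TokenData exposes the newly generated text as data.token, and that CompletionParams accepts prompt and n_predict; the hook name and the n_predict value are just for illustration, so check llama.rn's type definitions for the exact fields.

import { useState } from 'react'
import type { LlamaContext } from 'llama.rn'

// Hook that streams completion output into React state so the UI can
// render tokens as they arrive instead of waiting for the full result.
export function useStreamedCompletion(context: LlamaContext) {
  const [text, setText] = useState('')

  const run = async (prompt: string) => {
    setText('')
    const result = await context.completion(
      // Assumed CompletionParams fields: prompt and n_predict.
      { prompt, n_predict: 256 },
      (data) => {
        // data.token is assumed to hold the newly generated token text.
        setText((prev) => prev + data.token)
      },
    )
    return result
  }

  return { text, run }
}

Rendering the text value in a component then shows partial output while completion is still running, which is the "something is happening" feedback the feature request asks for.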

commented

oh amazing, thank you so much!