ngxson / wllama

WebAssembly binding for llama.cpp - Enabling in-browser LLM inference

Home Page: https://ngxson.github.io/wllama/examples/basic/


[Idea] Stream data from main thread to worker

ngxson opened this issue · comments

Data is currently passed as a single Uint8Array. We can do better by using the Streams API: https://developer.mozilla.org/en-US/docs/Web/API/Streams_API/Using_readable_streams
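A minimal sketch of the idea: instead of buffering everything into one Uint8Array before handing it to the worker, the consumer reads a ReadableStream chunk by chunk and processes data incrementally. The `consume` helper and the in-memory demo stream below are hypothetical illustrations, not wllama's actual API.

```typescript
// Sketch: consume a ReadableStream<Uint8Array> chunk by chunk, so the
// receiver can process data incrementally instead of getting one large
// copied Uint8Array. In a real setup, the stream could come from
// fetch(url).body and be transferred to the worker via postMessage.
async function consume(stream: ReadableStream<Uint8Array>): Promise<number> {
  const reader = stream.getReader();
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.length; // process the chunk here (e.g. feed the model loader)
  }
  return total;
}

// Demo with an in-memory stream standing in for a network response body.
const demo = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(new Uint8Array([1, 2, 3]));
    controller.enqueue(new Uint8Array([4, 5]));
    controller.close();
  },
});
consume(demo).then((totalBytes) => console.log(totalBytes));
```

Note that ReadableStream objects are transferable in modern browsers, so a stream created on the main thread can be moved to a worker without copying the underlying bytes.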

We are now using Blob, which already provides a ReadableStream via its stream() method.
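To illustrate the point above: a Blob exposes its bytes as a stream directly, so no intermediate Uint8Array copy is needed before reading. The `blobByteLength` helper below is a hypothetical example, not part of wllama.

```typescript
// Sketch: read a Blob through its built-in ReadableStream, counting
// bytes as they arrive. Blob.prototype.stream() returns a
// ReadableStream<Uint8Array>, so the data never has to be materialized
// as one big buffer on this side.
async function blobByteLength(blob: Blob): Promise<number> {
  const reader = blob.stream().getReader();
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.length;
  }
  return total;
}

// Blob parts are concatenated: 3 raw bytes + "ab" (2 UTF-8 bytes).
const blob = new Blob([new Uint8Array([10, 20, 30]), "ab"]);
blobByteLength(blob).then((n) => console.log(n));
```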

Where's the "on fire" emoji when you need it