ngxson / wllama

WebAssembly binding for llama.cpp - Enabling in-browser LLM inference

Home Page: https://ngxson.github.io/wllama/examples/basic/


warning: munmap failed: Invalid argument

flatsiedatsie opened this issue · comments

I'm seeing this warning when loading a model:

............................................................................................warning: munmap failed: Invalid argument

Is it anything to worry about?

That seems like a problem with mmap support in emscripten. I'm not sure where the problem comes from; I should check whether there's a related issue in emscripten's repo: https://github.com/emscripten-core/emscripten/issues?q=munmap

In the meantime, could you please give more details on the model and configs that you're using?

I believe this happens with every model I load.

I also get this warning for any model I use.

It's also reproducible in the demo.

I'm closing this issue because this warning is not important and can safely be ignored.

Technical explanation: llama.cpp uses mmap() to read files, which may use less memory than the traditional read() syscall. This benefits us because wllama also uses mmap() to avoid copying files (see #39). However, we don't implement munmap() because we never need to unmap a file: to unload a model, we can simply call wllama.exit()
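
For illustration, here is a minimal TypeScript sketch of loading a model (the step during which the harmless warning appears) and then releasing it with `wllama.exit()`. The `Wllama` constructor's asset-path keys, the wasm paths, and the model URL below are assumptions that depend on your wllama version and bundler setup; adjust them to match your own installation.

```ts
import { Wllama } from '@wllama/wllama';

// Placeholder paths to the wasm assets; the exact keys and locations
// depend on the wllama version and how you serve the files.
const CONFIG_PATHS = {
  'single-thread/wllama.wasm': '/assets/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': '/assets/multi-thread/wllama.wasm',
};

const wllama = new Wllama(CONFIG_PATHS);

// Loading a model mmap()s the GGUF file inside the wasm filesystem.
// The "warning: munmap failed: Invalid argument" message printed at
// this point is harmless and can be ignored.
// Any small GGUF model URL works here; this one is just an example.
await wllama.loadModelFromUrl(
  'https://huggingface.co/ggml-org/models/resolve/main/tinyllamas/stories260K.gguf'
);

// There is no separate "unload model" call: to free the model,
// tear down the whole module instance.
await wllama.exit();
```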