LlamaEdge / LlamaEdge

The easiest & fastest way to run customized and fine-tuned LLMs locally or on the edge

Home Page: https://llamaedge.com/


bug: process getting aborted on raspberrypi-5

akashicMarga opened this issue · comments

Summary

I am getting the error below when running the api-server (llama-api-server.wasm), both with the prebuilt binary downloaded via curl and with one built on the Raspberry Pi 5 running Raspberry Pi OS. It works well on my Mac.

free(): invalid pointer
Aborted

Reproduction steps

I have followed all the steps in the api-server README.

Screenshots

No response

Any logs you want to share for showing the specific issue

No response

Model Information

TinyLlama 4-bit, dolphin-phi 4-bit

Operating system information

Raspberry Pi OS

ARCH

arm64

CPU Information

64-bit quad-core Arm Cortex-A76 processor

Memory Size

8GB

GPU Information

none

VRAM Size

0

@hydai Could you please help with this issue? Thanks a lot!

Hi @singhaki
Please share the detailed rpi OS version so we can find a docker image to reproduce this issue.

Operating System: Debian GNU/Linux 12 (bookworm)
Kernel: Linux 6.1.0-rpi8-rpi-2712
Architecture: arm64

Hi @singhaki
Could you please install wasmedge with the ggml plugin manually:

curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --dist ubuntu20.04 --plugins wasi_nn-ggml wasmedge_rustls

and then try to run the model again?
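Once that install finishes, the server can be started roughly along these lines. This is only a sketch: the GGUF file name, prompt template, and model name below are assumptions for illustration, so substitute the values for your own model.

```shell
# Pick up the freshly installed wasmedge (install.sh writes this env file)
source "$HOME/.wasmedge/env"

# Launch the API server; the model file and template here are example values
wasmedge --dir .:. \
  --nn-preload default:GGML:AUTO:tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf \
  llama-api-server.wasm \
  --prompt-template chatml \
  --model-name tinyllama
```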

Or you can use run-llm.sh, replacing this line with the above command: https://github.com/LlamaEdge/LlamaEdge/blob/main/run-llm.sh#L249

Working now, thanks! What was the reason? Was the build for an older kernel/OS?

The installer could not recognize Raspberry Pi OS, so it fell back to installing the manylinux2014 build (which is based on CentOS 7, a very old system), and that build appears to be incompatible with Raspberry Pi OS.
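For anyone hitting the same mis-detection, it can help to check what identity the OS reports, since fields like these are what install scripts typically branch on. This is a quick diagnostic check, not the installer's actual detection logic:

```shell
# Show the distro identity fields that install scripts commonly inspect.
# On the reporter's system above, these identify as Debian 12 (bookworm),
# not as a distro the installer has a dedicated prebuilt package for.
grep -E '^(ID|VERSION_ID|PRETTY_NAME)=' /etc/os-release

# Confirm the architecture (aarch64/arm64 on the Pi 5)
uname -m
```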