llama cannot run
17862687921 opened this issue · comments
Summary
Follow the execution steps given in https://mp.weixin.qq.com/s/Ovkb2DT39DMkoUBvwyjXZQ. The final result is:
$ wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3-8B-Instruct-Q5_K_M.gguf llama-chat.wasm -p llama-3-chat
[2024-04-23 10:01:26.911] [error] instantiation failed: module name conflict, Code: 0x60
[2024-04-23 10:01:26.911] [error] At AST node: module
[2024-04-23 10:01:26.938] [error] instantiation failed: unknown import, Code: 0x62
[2024-04-23 10:01:26.938] [error] When linking module: "wasi_ephemeral_nn" , function name: "load_by_name_with_config"
[2024-04-23 10:01:26.938] [error] At AST node: import description
[2024-04-23 10:01:26.938] [error] At AST node: import section
[2024-04-23 10:01:26.938] [error] At AST node: module
Please tell me how to solve the problem, thank you.
Current State
$ wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3-8B-Instruct-Q5_K_M.gguf llama-chat.wasm -p llama-3-chat
[2024-04-23 10:01:26.911] [error] instantiation failed: module name conflict, Code: 0x60
[2024-04-23 10:01:26.911] [error] At AST node: module
[2024-04-23 10:01:26.938] [error] instantiation failed: unknown import, Code: 0x62
[2024-04-23 10:01:26.938] [error] When linking module: "wasi_ephemeral_nn" , function name: "load_by_name_with_config"
[2024-04-23 10:01:26.938] [error] At AST node: import description
[2024-04-23 10:01:26.938] [error] At AST node: import section
[2024-04-23 10:01:26.938] [error] At AST node: module
Expected State
[USER]:
Who is Robert Oppenheimer?
[ASSISTANT]:
Robert Oppenheimer was an American theoretical physicist and director of the Manhattan Project, which developed the atomic bomb during World War II. He is widely regarded as one of the most important physicists of the 20th century and is known for his contributions to the development of quantum mechanics and the theory of the atomic nucleus. Oppenheimer was also a prominent figure in the post-war nuclear weapons debate and was a strong advocate for international cooperation on nuclear weapons control.
Reproduction steps
- curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- -v 0.13.5 --plugins wasi_nn-ggml wasmedge_rustls
- curl -LO https://huggingface.co/second-state/Llama-3-8B-Instruct-GGUF/resolve/main/Meta-Llama-3-8B-Instruct-Q5_K_M.gguf
- curl -LO https://github.com/LlamaEdge/LlamaEdge/releases/latest/download/llama-chat.wasm
- wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3-8B-Instruct-Q5_K_M.gguf llama-chat.wasm -p llama-3-chat
- Got the following error:
[2024-04-23 10:01:26.911] [error] instantiation failed: module name conflict, Code: 0x60
[2024-04-23 10:01:26.911] [error] At AST node: module
[2024-04-23 10:01:26.938] [error] instantiation failed: unknown import, Code: 0x62
[2024-04-23 10:01:26.938] [error] When linking module: "wasi_ephemeral_nn" , function name: "load_by_name_with_config"
[2024-04-23 10:01:26.938] [error] At AST node: import description
[2024-04-23 10:01:26.938] [error] At AST node: import section
[2024-04-23 10:01:26.938] [error] At AST node: module
Screenshots
Any logs you want to share for showing the specific issue
No response
Components
CLI, C SDK, Rust SDK, Others
WasmEdge Version or Commit you used
0.13.5
Operating system information
ubuntu22.04
Hardware Architecture
x86_64
Compiler flags and options
No response
What's the output of curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- -v 0.13.5 --plugins wasi_nn-ggml wasmedge_rustls ?
This error shows that the wasi_nn plugin is not installed correctly.
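Whether the plugin file actually landed on disk can be verified with a quick check. The sketch below demonstrates the check logic on a temporary directory; for a real installation, point PLUGIN_DIR at $HOME/.wasmedge/plugin (the default installer layout).

```shell
# Sketch: check whether the wasi_nn plugin file is present.
# A fake layout is built in a temp dir so the check can be demonstrated;
# substitute PLUGIN_DIR="$HOME/.wasmedge/plugin" for a real installation.
PLUGIN_DIR=$(mktemp -d)
touch "$PLUGIN_DIR/libwasmedgePluginWasiNN.so"   # stand-in for the real plugin
if [ -f "$PLUGIN_DIR/libwasmedgePluginWasiNN.so" ]; then
  STATUS="wasi_nn plugin found"
else
  STATUS="wasi_nn plugin missing - rerun the installer with --plugins wasi_nn-ggml"
fi
echo "$STATUS"
rm -rf "$PLUGIN_DIR"
```

If the file is missing, the "unknown import: wasi_ephemeral_nn" error above is expected, since the runtime has nothing to link the import against.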
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- -v 0.13.5 --plugins wasi_nn-ggml wasmedge_rustls
Using Python: /usr/bin/python3
WARNING - Experimental Option Selected: plugins
WARNING - plugins option may change later
INFO - Compatible with current configuration
INFO - Running Uninstaller
ls: cannot access '/home/wwl/.wasmedge': No such file or directory
sed: -e expression #1, char 4: unknown command: `"'
ERROR - Exception on process - rc= 1 output= b'\x1b[0;33mNo root permissions.\x1b[0m\n\x1b[0;32mInstallation path found at /home/wwl/.wasmedge\x1b[0m\nRemoving /home/wwl/.wasmedge/env\nRemoving ERROR : Found 1 file(s) only\nRemoving /home/wwl/.wasmedge/plugin\nRemoving /home/wwl/.wasmedge\nRemoving /home/wwl/.wasmedge\n' command= ['bash /tmp/wasmedge.8285/uninstall.sh -p /home/wwl/.wasmedge -q']
WARNING - SHELL variable not found. Using bash as SHELL
INFO - shell configuration updated
INFO - Downloading WasmEdge
|============================================================|100.00 %INFO - Downloaded
INFO - Installing WasmEdge
INFO - WasmEdge Successfully installed
INFO - Downloading Plugin: wasi_nn-ggml-cuda
|============================================================|100.00 %INFO - Downloaded
INFO - Downloading Plugin: wasmedge_rustls
|============================================================|100.00 %INFO - Downloaded
INFO - Run:
source /home/wwl/.bashrc
Is your shell bash? Did you run source /home/wwl/.bashrc before the execution? And what are the contents of the /home/wwl/.wasmedge folder? Try tree /home/wwl/.wasmedge to get the full output.
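If tree is not installed, find produces a comparable listing. The sketch below demonstrates this on a temporary directory with a stand-in plugin file; substitute $HOME/.wasmedge for a real check.

```shell
# Sketch: fall back to `find` when `tree` is unavailable.
# Demonstrated on a temp dir; use find "$HOME/.wasmedge" for the real layout.
DIR=$(mktemp -d)
mkdir -p "$DIR/plugin"
touch "$DIR/plugin/libwasmedgePluginWasiNN.so"   # stand-in for the real plugin
LISTING=$(find "$DIR" -print)                    # depth-first listing of the tree
echo "$LISTING"
rm -rf "$DIR"
```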
Yes, I have run source /home/wwl/.bashrc before the execution.
tree /home/wwl/.wasmedge
/home/wwl/.wasmedge
├── bin
│ ├── wasmedge
│ └── wasmedgec
├── env
├── include
│ └── wasmedge
│ ├── enum_configure.h
│ ├── enum_errcode.h
│ ├── enum.inc
│ ├── enum_types.h
│ ├── int128.h
│ ├── version.h
│ └── wasmedge.h
├── lib
│ ├── libwasmedge.so -> libwasmedge.so.0
│ ├── libwasmedge.so.0 -> libwasmedge.so.0.0.3
│ └── libwasmedge.so.0.0.3
└── plugin
├── libwasmedgePluginWasiNN.so
└── libwasmedge_rustls.so
5 directories, 15 files
Looks like the installation is done. libwasmedgePluginWasiNN.so exists.
What's the output of WASMEDGE_PLUGIN_PATH=/home/wwl/.wasmedge/plugin wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3-8B-Instruct-Q5_K_M.gguf llama-chat.wasm -p llama-3-chat ?
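Rather than prefixing every command, the variable can be exported once for the session. A minimal sketch, assuming the default installer layout under $HOME:

```shell
# Sketch: export the plugin search path once so every subsequent
# wasmedge invocation in this shell picks it up.
# The path assumes the default installer layout under $HOME.
export WASMEDGE_PLUGIN_PATH="$HOME/.wasmedge/plugin"
echo "plugin search path: $WASMEDGE_PLUGIN_PATH"
```

Appending the export line to ~/.bashrc would make it persistent across shells.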
WASMEDGE_PLUGIN_PATH=/home/wwl/.wasmedge/plugin wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3-8B-Instruct-Q5_K_M.gguf llama-chat.wasm -p llama-3-chat
[2024-04-23 11:29:01.779] [error] instantiation failed: module name conflict, Code: 0x60
[2024-04-23 11:29:01.779] [error] At AST node: module
[2024-04-23 11:29:01.779] [error] instantiation failed: module name conflict, Code: 0x60
[2024-04-23 11:29:01.779] [error] At AST node: module
[2024-04-23 11:29:01.779] [error] instantiation failed: module name conflict, Code: 0x60
[2024-04-23 11:29:01.779] [error] At AST node: module
[INFO] Model alias: default
[INFO] Prompt context size: 512
[INFO] Number of tokens to predict: 1024
[INFO] Number of layers to run on the GPU: 100
[INFO] Batch size for prompt processing: 512
[INFO] Temperature for sampling: 0.8
[INFO] Top-p sampling (1.0 = disabled): 0.9
[INFO] Penalize repeat sequence of tokens: 1.1
[INFO] presence penalty (0.0 = disabled): 0
[INFO] frequency penalty (0.0 = disabled): 0
[INFO] Use default system prompt
[INFO] Prompt template: Llama3Chat
[INFO] Log prompts: false
[INFO] Log statistics: false
[INFO] Log all information: false
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: CUDA_USE_TENSOR_CORES: yes
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 4060 Laptop GPU, compute capability 8.9, VMM: yes
[INFO] Plugin version: b2694 (commit 0d56246f)
================================== Running in interactive mode. ===================================
- Press [Ctrl+C] to interject at any time.
- Press [Return] to end the input.
- For multi-line inputs, end each line with '\' and press [Return] to get another line.
[Bot]:
Hello! How can I help you today?
You:
Does the error still occur after you prepend WASMEDGE_PLUGIN_PATH=/home/wwl/.wasmedge/plugin to the wasmedge command?
[2024-04-23 11:29:01.779] [error] instantiation failed: module name conflict, Code: 0x60
[2024-04-23 11:29:01.779] [error] At AST node: module
[2024-04-23 11:29:01.779] [error] instantiation failed: module name conflict, Code: 0x60
[2024-04-23 11:29:01.779] [error] At AST node: module
[2024-04-23 11:29:01.779] [error] instantiation failed: module name conflict, Code: 0x60
[2024-04-23 11:29:01.779] [error] At AST node: module
The above errors still appear, but "llama3" now works fine.
I cannot reproduce the error on our runner. In that environment, if there is no other wasmedge installation and the plugin is installed in the HOME directory (~/.wasmedge/plugin), WASMEDGE_PLUGIN_PATH does not need to be set.
OS:
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 22.04.4 LTS
Release: 22.04
Codename: jammy
Arch:
x86_64
GPU:
Tue Apr 23 11:32:38 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 545.23.08 Driver Version: 545.23.08 CUDA Version: 12.3 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA GeForce GTX 1080 On | 00000000:01:00.0 Off | N/A |
| 0% 28C P0 57W / 220W | 1MiB / 8192MiB | 0% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
+---------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=======================================================================================|
| No running processes found |
+---------------------------------------------------------------------------------------+
The module name conflict is a known false-positive error and can be ignored. Do you have multiple wasmedge installations in your environment?
Yes, I have multiple wasmedge installations on this machine, but they are all in different locations.
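Conflicting installations like this can be spotted by scanning every PATH entry for a wasmedge binary. A minimal sketch (more than one match means an earlier PATH entry shadows the others):

```shell
# Sketch: list every wasmedge binary reachable via PATH.
# More than one match suggests conflicting installations, which is
# consistent with the "module name conflict" warnings above.
count=0
IFS=:
for d in $PATH; do
  if [ -x "$d/wasmedge" ]; then
    echo "found: $d/wasmedge"
    count=$((count + 1))
  fi
done
unset IFS
echo "wasmedge installations on PATH: $count"
```

The first directory listed is the one the shell actually runs; removing or reordering the others avoids mixing binaries and plugins from different versions.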