LlamaEdge / LlamaEdge

The easiest & fastest way to run customized and fine-tuned LLMs locally or on the edge

Home Page: https://llamaedge.com/

bug: invalid value 'mixtral-instruct' for '--prompt-template <TEMPLATE>'

vicnaum opened this issue · comments

Summary

I'm not sure whether a separate prompt template for Mixtral exists or whether it should just use mistral-instruct, but either way I'm getting the error below.

Replacing it with mistral-instruct makes everything work.
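For reference, here is the failing command from the logs with the workaround applied, i.e. the existing mistral-instruct template name substituted for the invalid mixtral-instruct (model file and paths as in the original report; this is a sketch of the workaround, not the official fix, which lands in the script update mentioned below):

```shell
# Workaround: pass 'mistral-instruct' (a valid template name) instead of
# the invalid 'mixtral-instruct'; all other arguments unchanged.
wasmedge --dir .:. \
  --nn-preload default:GGML:AUTO:mixtral-8x7b-instruct-v0.1.Q4_0.gguf \
  llama-chat.wasm \
  --prompt-template mistral-instruct
```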

Reproduction steps

  1. Run the run-llm.sh script
  2. Choose 35) Mixtral-8x7B-Instruct-v0.1
  3. Choose CLI
  4. Choose whether to enable logs (either choice reproduces the error)
  5. Get an error

Any logs you want to share for showing the specific issue

+ wasmedge --dir .:. --nn-preload default:GGML:AUTO:mixtral-8x7b-instruct-v0.1.Q4_0.gguf llama-chat.wasm --prompt-template mixtral-instruct
error: invalid value 'mixtral-instruct' for '--prompt-template <TEMPLATE>'
  [possible values: llama-2-chat, codellama-instruct, mistral-instruct-v0.1, mistral-instruct, mistrallite, openchat, belle-llama-2-chat, vicuna-chat, vicuna-1.1-chat, chatml, baichuan-2, wizard-coder, zephyr, intel-neural, deepseek-chat, deepseek-coder, solar-instruct]

  tip: a similar value exists: 'mistral-instruct'

For more information, try '--help'.
+ set +x

Model Information

Mixtral-8x7B-Instruct-v0.1

Operating system information

macOS 13.4

ARCH

x86_64

CPU Information

Intel Core i9 - 2,4 GHz 8-Core

Memory Size

64 GB

GPU Information

AMD Radeon Pro 5500M

VRAM Size

4 GB

@vicnaum Thanks for the report. The issue is caused by a typo in the prompt template name. We will release v0.2.5 today, which fixes it. Once the release is published, please update the run-llm.sh script on your side with the new one. Thanks a lot!