llm-edge / hal-9100

Edge full-stack LLM platform, written in Rust.

How to use the Mistral API endpoint?

failable opened this issue · comments

Hi,

I have the following settings in .env:

MODEL_URL="https://api.mistral.ai/v1/chat/completions"
MODEL_API_KEY="...mistral_api_key..."

and

model: "mistral-tiny",

in the quickstart.js example.
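Pulled together, the setup looks roughly like this (a minimal sketch of the assumed shape, not verbatim from quickstart.js; the `config` object and field names are illustrative):

```javascript
// Minimal sketch (assumed shape, not the project's verified API):
// the hal-9100 server reads MODEL_URL / MODEL_API_KEY from .env,
// while the client script only supplies the model name.
const config = {
  // Remote backend the server should proxy chat completions to.
  modelUrl: process.env.MODEL_URL || "https://api.mistral.ai/v1/chat/completions",
  // Keep real keys in .env, never in source.
  apiKey: process.env.MODEL_API_KEY || "",
  // Model name passed when creating the assistant in quickstart.js.
  model: "mistral-tiny",
};

console.log(config.model); // → "mistral-tiny"
```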

But I cannot get it to work; the run fails with this error:

GETTING RUN:  {
  "id": "99c9b319-c88c-4fd7-87a8-10fa8a8b1187",
  "object": "",
  "created_at": 1704542367,
  "thread_id": "ceda1e03-eb4c-4870-b5a0-5834992a6c21",
  "assistant_id": "ba542f0c-e480-40a6-b160-3fe457365afb",
  "status": "failed",
  "required_action": null,
  "last_error": {
    "code": "server_error",
    "message": "Failed to decide tool: Unknown model"
  },
  "expires_at": null,
  "started_at": null,
  "cancelled_at": null,
  "failed_at": 1704542367,
  "completed_at": null,
  "model": "",
  "instructions": "You are a weather bot. Use the provided functions to answer questions.",
  "tools": [],
  "file_ids": [],
  "metadata": {}
}

Can I use the Mistral API endpoint right now? Thanks.

Ah, looking into the Rust code, it seems I should use the `model: "some_prefix/mistral-tiny"` format for the model. Sorry for the noise.
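For anyone hitting the same error: a sketch of how such a "prefix/model" string might be split to route the request (the helper name is illustrative, and `some_prefix` is kept as the issue's placeholder, not the actual prefix hal-9100 expects):

```javascript
// Illustrative helper: split a "prefix/model" string the way a router
// might, to decide which backend serves the request.
function splitModelName(name) {
  const i = name.indexOf("/");
  if (i === -1) {
    // No prefix: the server cannot tell which backend to use,
    // consistent with the "Unknown model" failure above.
    return { prefix: null, model: name };
  }
  return { prefix: name.slice(0, i), model: name.slice(i + 1) };
}

console.log(splitModelName("some_prefix/mistral-tiny"));
// → prefix "some_prefix", model "mistral-tiny"
console.log(splitModelName("mistral-tiny"));
// → prefix null, model "mistral-tiny"
```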

@liebkne sorry about that - it's a bit of a hack at the moment, until a better solution is found