simonw / llm-mistral

LLM plugin providing access to Mistral models using the Mistral API

Initial prototype

simonw opened this issue

Got this working:

llm -m mistral-tiny '1 short reason to get a pet pelican'               

A pelican makes for an intriguing and unique pet due to its distinctive appearance and interesting behaviors. They are social animals that form tight communities, which can provide companionship for their owners. Additionally, pelicans have a gentle and inquisitive nature, making them an enjoyable addition to any home. However, it is important to note that owning a pelican as a pet is not feasible for most people due to their large size, specialized dietary needs, and legal restrictions.

llm -m mistral-small '1 short reason to get a pet pelican'

One potential reason to get a pet pelican is that they are highly intelligent and can be trained to perform tricks or tasks. However, it's important to note that pelicans are wild animals and may not be suitable as pets in all circumstances. They require a lot of space, a specialized diet, and specific care that may be difficult for some people to provide. Additionally, in many places it is illegal to keep pelicans as pets without a permit. It's always best to research and consider the needs and welfare of the animal before deciding to keep any wild animal as a pet.

llm -m mistral-medium '1 short reason to get a pet pelican'

One short reason to get a pet pelican is that they are unique and fascinating birds with distinct personalities and can provide companionship and entertainment. However, it is important to note that pelicans are wild animals and require specialized care, housing, and permits to keep legally as pets in most places. They are not suitable for everyone and should only be considered by experienced and dedicated caretakers.
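
Because the plugin registers these models with LLM, the same prompt should also work from LLM's Python API. A minimal sketch, assuming llm-mistral is installed and a Mistral API key has been configured:

    import llm

    # Assumes the llm-mistral plugin is installed and a Mistral API key is set
    model = llm.get_model("mistral-tiny")
    response = model.prompt("1 short reason to get a pet pelican")
    print(response.text())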

I added conversation support, so llm chat -m mistral-tiny now works:

llm chat -m mistral-tiny
Chatting with mistral-tiny
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> hello who are you
Hello there! I'm an artificial intelligence designed to assist with various tasks and answer questions. How can I help you today?
> what is a pelican?
A pelican is a large water bird that belongs to the order Pelecaniformes and the family Pelecanidae. Pelicans have a distinctive pouch under their beak which they use to scoop up fish and other prey from the water. They are found in various parts of the world, including North and South America, Europe, Africa, and Asia. Pelicans are known for their graceful flight and their social behavior, often seen in large groups, called colonies. There are eight different species of pelicans, with sizes ranging from 1.5 to 5 feet in length.
> bird or mammal?
A pelican is a bird, not a mammal. Birds are a group of animals characterized by their feathers, laying hard-shelled eggs, a beak without teeth, and a high metabolic rate to maintain a body temperature above their environment. Mammals, on the other hand, are a group of animals that have mammary glands for producing milk to feed their young, fur or hair covering their bodies, and three middle ear bones. Pelicans are obviously not mammals, as they do not have mammary glands or fur.
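
Conversation support means each new prompt is sent along with the prior exchanges. By the time the third question is asked, the messages list sent to the Mistral API looks roughly like this (a sketch of the payload shape only, with the earlier replies abbreviated):

    # Rough shape of the "messages" payload for the "bird or mammal?" turn
    messages = [
        {"role": "user", "content": "hello who are you"},
        {"role": "assistant", "content": "Hello there! I'm an artificial intelligence ..."},
        {"role": "user", "content": "what is a pelican?"},
        {"role": "assistant", "content": "A pelican is a large water bird ..."},
        {"role": "user", "content": "bird or mammal?"},
    ]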

System prompts work too:

cat llm_mistral.py | llm -m mistral-medium -s 'explain this code'

This code appears to be for a language model (LLM) interface that uses the Mistral AI API to generate text responses. The Mistral AI API is a service that allows you to use pre-trained language models to generate text based on a prompt.

Here is a breakdown of the code:

  1. The httpx and httpx_sse libraries are imported to make HTTP requests and handle server-sent events (SSE), respectively.
  2. The llm library is imported to use the LLM interface for interacting with language models.
  3. The register_models function is a hook that registers the available models with the LLM interface. In this case, three instances of the Mistral model are registered with different model IDs: "mistral-tiny", "mistral-small", and "mistral-medium".
  4. The Mistral class is defined, which inherits from the llm.Model class. This class represents a language model that can be used through the LLM interface.
  5. The Mistral class has a can_stream attribute set to True, which indicates that it supports server-sent events for streaming responses.
  6. The __init__ method of the Mistral class takes a model_id argument, which is used to specify the ID of the model to use when generating text.
  7. The build_messages method takes a prompt object and a conversation object as arguments. It constructs a list of messages to send to the Mistral AI API based on the prompt and conversation objects.
  8. The execute method takes a prompt, stream, response, and conversation objects as arguments. It sends an HTTP POST request to the Mistral AI API with the messages constructed by the build_messages method. The API request includes an authorization header with a bearer token, which is obtained from the llm.get_key() function.
  9. The execute method uses the connect_sse() function from the httpx_sse library to handle server-sent events. If the API request is successful,

Error: peer closed connection without sending complete message body (incomplete chunked read)
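
The explanation was cut off there by a streaming error, but the structure it describes would look roughly like the sketch below: a register_models hook, a Mistral model class with can_stream = True, build_messages folding in the system prompt and conversation history, and an execute method streaming over SSE. This is a sketch based on that summary, not a copy of llm_mistral.py; the key alias, endpoint URL, and response field names are assumptions about the Mistral API:

    import json

    import httpx
    import llm
    from httpx_sse import connect_sse


    @llm.hookimpl
    def register_models(register):
        # Register the three hosted Mistral models with LLM
        for model_id in ("mistral-tiny", "mistral-small", "mistral-medium"):
            register(Mistral(model_id))


    class Mistral(llm.Model):
        can_stream = True  # responses are streamed via server-sent events

        def __init__(self, model_id):
            self.model_id = model_id

        def build_messages(self, prompt, conversation):
            messages = []
            if prompt.system:
                # -s/--system becomes a leading system message
                messages.append({"role": "system", "content": prompt.system})
            if conversation is not None:
                # Replay earlier exchanges so the API sees the chat history
                for prev in conversation.responses:
                    messages.append({"role": "user", "content": prev.prompt.prompt})
                    messages.append({"role": "assistant", "content": prev.text()})
            messages.append({"role": "user", "content": prompt.prompt})
            return messages

        def execute(self, prompt, stream, response, conversation):
            # Key alias and environment variable name are assumptions in this sketch
            key = llm.get_key("", "mistral", "LLM_MISTRAL_KEY")
            with httpx.Client() as client:
                with connect_sse(
                    client,
                    "POST",
                    "https://api.mistral.ai/v1/chat/completions",
                    headers={"Authorization": f"Bearer {key}"},
                    json={
                        "model": self.model_id,
                        "messages": self.build_messages(prompt, conversation),
                        "stream": True,
                    },
                ) as event_source:
                    # Non-streaming path omitted; each SSE chunk carries a delta
                    for sse in event_source.iter_sse():
                        if sse.data == "[DONE]":
                            break
                        delta = json.loads(sse.data)["choices"][0]["delta"]
                        if delta.get("content"):
                            yield delta["content"]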