zeozeozeo / ellama

Friendly interface to chat with an Ollama instance.

πŸ¦™ Ellama

Ellama is a friendly interface to chat with a local or remote Ollama instance.

Ellama, a friendly Ollama interface, running LLaVA

πŸ¦™ Features

  • Chat History: create and delete chats, and edit model settings per chat.
  • Multimodality: easily use vision capabilities of any multimodal model, such as LLaVA.
  • Ollama: no need to install new inference engines, connect to a regular Ollama instance instead.
  • Resource Efficient: minimal RAM and CPU usage.
  • Free: no need to buy any subscriptions or servers, just fire up a local Ollama instance.

πŸ¦™ Quickstart

  1. Download the latest Ellama release from the Releases page.
    • or build & install from source:
      $ git clone https://github.com/zeozeozeo/ellama.git
      $ cd ellama
      $ cargo install --path .
  2. In the Settings βš™οΈ tab, change the Ollama host if needed (the default is http://127.0.0.1:11434).
  3. In the same tab, select a model that will be used for new chats by default. Ellama will try to select the best model on the first run.
  4. Close the Settings tab, create a new chat by pressing the "βž• New Chat" button, and start chatting!
  5. To add images, click the βž• button next to the text field, drag them onto Ellama's window, or paste them from your clipboard.
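Before connecting, you can confirm that your Ollama instance is reachable. A quick check, assuming the default host (Ollama's `/api/tags` endpoint lists the models available locally):

```shell
# Ping the local Ollama instance (default host http://127.0.0.1:11434).
# /api/tags returns a JSON list of locally available models.
curl -s http://127.0.0.1:11434/api/tags || echo "Ollama is not reachable"
```

If the command prints a JSON model list, Ellama should be able to connect; otherwise start Ollama first (e.g. with `ollama serve`).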

πŸ¦™ Gallery

GASLIGHT.MP4

Ellama's greeting screen

LLaVA counting people, in Ellama

Ellama's settings panel

Ellama's chat edit panel

πŸ¦™ Wishlist

These features are not yet present in Ellama, but they would be nice to have:

  • Support OpenAI-compatible APIs: Ellama currently supports only Ollama
  • A "Notes" section, where you can write and edit LLM-assisted notes
  • Publish on crates.io: currently still relies on some Git dependencies

License

MIT OR Apache-2.0
