amitkot / raycast_ollama

Raycast extension for Ollama


Raycast Ollama

Use Ollama for local LLM inference from Raycast.

Requirements

  1. Ollama installed and running.
  2. At least one model installed. Use the 'Manage Models' command to pull models, or the ollama CLI:

    ollama pull orca-mini
    ollama pull llama2
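Once Ollama is running, it serves a local HTTP API that extensions like this one can talk to. A minimal sketch (not the extension's actual code) of checking which models are installed, assuming Ollama's default endpoint on port 11434:

```typescript
// Model entry as returned by Ollama's /api/tags endpoint.
interface OllamaModel {
  name: string; // e.g. "llama2:latest"
}

interface TagsResponse {
  models: OllamaModel[];
}

// Pure helper: extract model names from a /api/tags response payload.
function modelNames(data: TagsResponse): string[] {
  return data.models.map((m) => m.name);
}

// Query a running Ollama instance for its installed models.
async function listInstalledModels(
  baseUrl = "http://localhost:11434"
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`Ollama not reachable: HTTP ${res.status}`);
  return modelNames((await res.json()) as TagsResponse);
}
```

If the list comes back empty, pull a model first with `ollama pull` as shown above.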

Use a different model

This extension allows you to select a different model for each command. Keep in mind that you need to have the corresponding model installed on your machine. You can find all available models here.
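Conceptually, the per-command model choice just becomes the `model` field of a request to Ollama's `/api/generate` endpoint. A hedged sketch, with illustrative names rather than the extension's real internals:

```typescript
// Request body for a non-streaming call to Ollama's /api/generate endpoint.
interface GenerateRequest {
  model: string;  // the model selected for this command, e.g. "orca-mini"
  prompt: string; // the user's input
  stream: boolean;
}

// Build the JSON body for a single, non-streaming generation.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  return { model, prompt, stream: false };
}

// Send it to a locally running Ollama (default port 11434).
async function generate(model: string, prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

Requesting a model that is not installed will fail, which is why each command's model must first be pulled locally.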

Create your own custom commands

With 'Create Custom Command' you can create your own custom command or chatbot using any model you want.

About

Raycast extension for Ollama

License: MIT License


Languages

Language: TypeScript 100.0%