# Raycast Ollama

Use Ollama for local LLM inference directly from Raycast.
## Requirements

- Ollama installed and running.
- At least one model installed. Use the 'Manage Models' command to pull models, or use the Ollama CLI:

```shell
ollama pull orca-mini
ollama pull llama2
```
## Use a different model

This extension lets you select a different model for each command. Keep in mind that the corresponding model must be installed on your machine. You can find all available models in the Ollama model library.
## Create your own custom commands

With 'Create Custom Command' you can build your own command or chatbot backed by whatever model you want.
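The extension's internals aren't shown here, but conceptually a custom command pairs a model with a prompt template and sends the result to Ollama. The sketch below assumes Ollama's `/api/generate` endpoint (POST with `model`, `prompt`, and `stream` fields); the `{input}` placeholder and the helper function are hypothetical illustrations, not the extension's actual code:

```python
import json

def build_generate_request(model: str, prompt_template: str, user_input: str) -> dict:
    # A custom command combines a fixed prompt template with the user's
    # input (e.g. selected text) before sending it to the model.
    return {
        "model": model,
        "prompt": prompt_template.format(input=user_input),
        "stream": False,  # set True for token-by-token streaming
    }

payload = build_generate_request(
    "llama2",
    "Summarize the following text:\n\n{input}",
    "Ollama runs large language models locally.",
)
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to `http://localhost:11434/api/generate`.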