e-s-a-i / obsidian-ollama


🦙 Obsidian Ollama Chat

This plugin allows you to ask your local LLM about your own notes.

Requirements:

Indexing is slow and hard to do in JS, so you will need to run a lightweight Python server alongside your Ollama instance to handle the indexing for you.

TODO
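
The indexing server's API is not documented here, so the snippet below is only a rough sketch of how the plugin side could push a changed note to such a server. The port, endpoint path, and payload shape are assumptions for illustration, not the plugin's actual contract.

```ts
// Hedged sketch: notify a hypothetical local indexing server about a modified note.
// The URL, route, and JSON payload are assumptions, not the plugin's real API.
const INDEXER_URL = "http://localhost:8000"; // assumed port for the Python indexer

export async function indexNote(path: string, content: string): Promise<void> {
  const res = await fetch(`${INDEXER_URL}/index`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ path, content }),
  });
  if (!res.ok) {
    throw new Error(`Indexing failed for ${path}: ${res.status} ${res.statusText}`);
  }
}
```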

Features:

  • Run your own model locally: point the plugin at the model's URL and you're ready to go
  • Index your files on startup and whenever a file is modified
  • Open a modal via hotkey or command to ask your question (see the sketch after this list)
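
As a rough illustration of the "ask your question" flow, here is a minimal sketch that sends a prompt to a locally running Ollama instance through its standard /api/generate endpoint. How the plugin actually builds the prompt from your indexed notes is not shown, and the model name is just an example.

```ts
// Minimal sketch: query a local Ollama model over its HTTP API.
// The base URL and model name are examples; adjust them to your setup.
const OLLAMA_URL = "http://localhost:11434";

export async function askOllama(prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // example model; use whichever model you pulled with `ollama pull`
      prompt,
      stream: false, // return the full answer in a single JSON response
    }),
  });
  const data = await res.json();
  return data.response; // Ollama puts the generated text in the `response` field
}
```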

Future plans:

  • Text streaming when querying the LLM (see the streaming sketch after this list)
  • A chat window for chat-style communication instead of one-off queries
  • Commands for useful queries so they can be run quickly, such as:
    • Summarize note
    • Summarize topic
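
For the planned text streaming, the sketch below shows one way to consume Ollama's streamed output: with streaming enabled, /api/generate returns newline-delimited JSON chunks, each carrying a piece of the answer in its response field. The callback wiring is an assumption about how the plugin might surface partial text.

```ts
// Sketch: stream tokens from Ollama and hand each partial chunk to a callback.
export async function streamOllama(
  prompt: string,
  onChunk: (text: string) => void
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Each complete line is a JSON object like {"response": "...", "done": false}
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.response) onChunk(chunk.response);
    }
  }
}
```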

Any feature recommendation is welcome.

About

License: MIT License


Languages

  • TypeScript 69.3%
  • JavaScript 18.7%
  • CSS 12.1%