David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts


Feature request - Add conversation support

kjjuno opened this issue · comments

I would like to be able to have an iterative conversation with the AI. This is supported with the web UIs for ollama already.

Example:

user: Write a lambda in typescript that returns "Hello World"

AI: <lambda code>

user: Can you modify the lambda to get the response message from the process_body method?

AI: <modified lambda code>
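For illustration, the two AI answers in the conversation above might look roughly like the sketch below. The event shape and the `process_body` helper are hypothetical placeholders (the original request doesn't define them); real lambda handlers are usually async, but the sketch keeps them synchronous for simplicity.

```typescript
// Hypothetical shapes for the lambda's input and output.
type LambdaEvent = { body?: string };
type LambdaResult = { statusCode: number; body: string };

// Hypothetical helper that the follow-up prompt asks the AI to use.
function process_body(event: LambdaEvent): string {
  return event.body ?? "";
}

// First answer: a lambda that always returns "Hello World".
export function handler(): LambdaResult {
  return { statusCode: 200, body: "Hello World" };
}

// Second answer: the modified lambda, taking its response message
// from process_body instead of a hard-coded string.
export function modifiedHandler(event: LambdaEvent): LambdaResult {
  return { statusCode: 200, body: process_body(event) };
}
```

The point of the feature request is that producing the second answer requires the model to still have the first answer in context, which is what conversation support would provide.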

Hi @kjjuno ,

That's an excellent suggestion, and I've thought about this as well.

Maybe one could add a custom command

:GenPrompt <some prompt>

which sends another prompt and appends the result to the current buffer.

However, that would change the ollama invocation, so I need to think about this.

Will be fixed by #36