Feature request: Add conversation support
kjjuno opened this issue · comments
Kevin Johnson commented
I would like to be able to have an iterative conversation with the AI. This is already supported by the web UIs for ollama.
Example:

user: Write a lambda in TypeScript that returns "Hello World"
AI: <lambda code>
user: Can you modify the lambda to get the response message from the process_body method?
AI: <modified lambda code>
Dr. David A. Kunz commented
Hi @kjjuno ,
That's an excellent suggestion, and I've also thought about this.
Maybe one could add a custom command
:GenPrompt <some prompt>
which sends another prompt and appends the result to the current buffer.
However, that would change the ollama invocation, so I need to think about this.
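For illustration, the suggested command could be registered in Neovim roughly like this. This is only a sketch: `query_ollama` is a hypothetical placeholder for however the plugin actually invokes the model, not a real API.

```lua
-- Hypothetical sketch of a :GenPrompt command that appends the model's
-- reply to the current buffer. `query_ollama` is a placeholder name.
vim.api.nvim_create_user_command("GenPrompt", function(opts)
  local reply = query_ollama(opts.args) -- assumed helper, not a real API
  -- Append the reply line by line at the end of the current buffer.
  vim.api.nvim_buf_set_lines(0, -1, -1, false, vim.split(reply, "\n"))
end, { nargs = "+" })
```

The open question noted above still applies: keeping the conversation context would require changing how the ollama process is invoked, since each prompt would need the prior exchange passed along.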
Kevin Johnson commented
Will be fixed by #36