[Roadmap] Ollama python API - enable 'format' parameter use
sealad886 opened this issue · comments
Why
Models with Ollama tend to do better overall when the format='json'
argument is set in the API request. We can update the prompt, and certain models will do quite well with that, but it would be good to have this available to configure, especially with how I've been using Beam lately to evaluate a number of models serially on various tasks.
Description
The only supported value for format within the API call is currently format='json' (see here for that note), but it sounds like it may be extended in the future.
Request is to have a drop-down menu on this config page per model:
This one-item drop-down menu could be extended in the future. Default value = None. Alt-text = "Set to the format that you are also requesting explicitly in your system or user prompt."
Functionally would only be used to populate that API parameter.
Requirements
- Ollama config page UI change
- Save new field to config files
- Update Ollama API call
No change to how output is processed, necessarily.
Sorry, this note in the API parameters documentation is actually where it says json is the only supported value right now.
Thanks @sealad886, this has been implemented as a "Source" parameter, meaning you'll toggle it once for this source rather than per model.