enricoros / big-AGI

Generative AI suite powered by state-of-the-art models and providing advanced AI/AGI functions. It features AI personas, AGI functions, multi-model chats, text-to-image, voice, response streaming, code highlighting and execution, PDF import, presets for developers, and much more. Deploy on-prem or in the cloud.

Home Page: https://big-agi.com


[Roadmap] Ollama python API - enable 'format' parameter use

sealad886 opened this issue

Why
Models running on Ollama tend to do better overall when the format='json' argument is set in the API request. We can update the prompt, and certain models will do quite well with that, but it would be good to have this available to configure, especially with how I've been using Beam lately to evaluate a number of models serially on various tasks.

Description
The only value currently supported for format within the API call is format='json' (see here for that note), but it sounds like it may be extended in the future?
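For context, a minimal sketch of what that parameter looks like in a request against a local Ollama instance (default port 11434 assumed; 'json' is the only value accepted today):

```ts
// Minimal sketch (not big-AGI code): calling Ollama's /api/chat with format='json'.
// Assumes a local Ollama instance on the default port 11434.
async function chatAsJson(model: string, prompt: string): Promise<string> {
  const response = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model,
      messages: [{ role: 'user', content: prompt }],
      format: 'json', // currently the only supported value
      stream: false,
    }),
  });
  const data = await response.json();
  return data.message.content; // a JSON-formatted string for the caller to parse
}
```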

The request is to have a drop-down menu on this config page, per model:
[screenshot of the Ollama model configuration page]
This one-item drop-down menu could be extended in the future. Default value = None. Alt-text = "Set to the format that you are also requesting explicitly in your system or user prompt."
Functionally, it would only be used to populate that API parameter.

Requirements

  • Ollama config page UI change
  • Save new field to config files
  • Update the Ollama API call (see the sketch below)

No change to how output is processed, necessarily.
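To illustrate the last two requirements, here is a rough sketch of how a stored setting could be folded into the request body; the `outputFormat` field name is hypothetical, not an existing big-AGI config key:

```ts
// Sketch only: 'outputFormat' is a hypothetical config field.
interface OllamaModelConfig {
  modelId: string;
  outputFormat?: 'json'; // default: undefined, i.e. the parameter is omitted
}

function buildChatRequestBody(config: OllamaModelConfig, messages: { role: string; content: string }[]) {
  return {
    model: config.modelId,
    messages,
    // only populate 'format' when a value was picked in the drop-down
    ...(config.outputFormat ? { format: config.outputFormat } : {}),
  };
}
```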

Sorry, this note in the API parameters documentation is actually where it says json is the only supported value right now.

Thanks @sealad886, this has been implemented as a "Source" parameter, meaning you'll toggle it once for the source rather than per model.

To turn it on:
[screenshot of the source settings toggle]

Then you can use it:
[screenshot of the option in use]