enricoros / big-AGI

Generative AI suite powered by state-of-the-art models, providing advanced AI/AGI functions. It features AI personas, AGI functions, multi-model chats, text-to-image, voice, response streaming, code highlighting and execution, PDF import, presets for developers, and much more. Deploy on-prem or in the cloud.

Home Page: https://big-agi.com


Add a checkbox for models to "remove" the max response limit (i.e., don't pass this variable when making the LLM call), so you get whatever output the model produces without any restriction; this is what I would expect from a model.

enricoros opened this issue · comments

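A minimal sketch of how such an option could work, assuming the request is built as a plain object before the provider call. The function name `buildChatRequest` and the `capResponse` flag are illustrative, not big-AGI's actual code; the key point is that when the checkbox is unchecked, the `max_tokens` field is omitted entirely rather than set to a large number:

```typescript
// Hypothetical request shape: 'max_tokens' is optional, so leaving it
// out means no cap is serialized and sent to the provider at all.
interface ChatRequest {
  model: string;
  messages: { role: string; content: string }[];
  max_tokens?: number;
}

function buildChatRequest(
  model: string,
  messages: { role: string; content: string }[],
  capResponse: boolean, // the proposed checkbox state
  maxTokens: number,    // the configured limit, used only when capped
): ChatRequest {
  const request: ChatRequest = { model, messages };
  // Only attach the limit when the user wants one; otherwise the
  // property is simply absent from the JSON payload.
  if (capResponse) request.max_tokens = maxTokens;
  return request;
}
```

With the checkbox off, `JSON.stringify(buildChatRequest(...))` contains no `max_tokens` key, which most providers treat as "generate until a natural stop or the context window is exhausted".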