nomic-ai / gpt4all

gpt4all: run open-source LLMs anywhere

Home Page: https://gpt4all.io


[Feature] python: Provide a way to get or use the max context size supported by the LLM

Ben-Epstein opened this issue · comments

Feature Request

Currently, the default context length is always 2048, but many models support a larger context window. I tried to showcase a simple first step by adding the context limit to models3.json, but that was rejected with a request to open an issue instead.
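For illustration, the rejected approach of carrying the limit in models3.json might have looked something like the entry below. The field names here are hypothetical, not the file's actual schema:

```json
{
  "name": "Example Model",
  "filename": "example-model.gguf",
  "maxContextLength": 4096
}
```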

According to @cebtenzzre (ref), this is already handled in the GUI.

Request: default the context length in the bindings to None, and when it is not set, determine it dynamically from the chosen model's maximum context window.
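The requested fallback logic could be sketched as below. This is a minimal illustration of the proposed behavior, not the bindings' actual API; the function name and parameters are hypothetical:

```python
# Hypothetical sketch: if the caller passes no context length (None, the
# proposed default), fall back to the model's reported maximum context
# window; only use the hard-coded 2048 when neither is available.

def resolve_context_length(requested, model_max, fallback=2048):
    """Pick the effective n_ctx for a model.

    requested: value passed by the user, or None (the proposed default).
    model_max: max context window detected for the chosen model, or None
               if it cannot be determined.
    fallback:  used only when neither of the above is available.
    """
    if requested is not None:
        return requested
    if model_max is not None:
        return model_max
    return fallback
```

With this default, existing callers that explicitly pass a context length keep their behavior, while callers that pass nothing get the model's full window instead of a silently truncated 2048.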