MAX_TOKEN needs to be parameterized to support various models
G-Hung opened this issue · comments
Describe the bug
https://github.com/sqlchat/sqlchat/blob/main/src/components/ConversationView/index.tsx#L26-L28
Currently, MAX_TOKEN is hardcoded to 4000, which is close to the 4096-token context limit of gpt-3.5-turbo.
However, GPT-4 supports 8192 tokens, double that limit, so the app could handle a larger context with GPT-4 if MAX_TOKEN were adjusted per model.
reference: https://platform.openai.com/docs/models/overview
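One way to parameterize the limit is a per-model lookup table instead of a single constant. This is only an illustrative sketch (the names `MODEL_TOKEN_LIMITS`, `COMPLETION_RESERVE`, and `getMaxToken` are hypothetical, not from the sqlchat codebase), keeping the same ~96-token headroom that the hardcoded 4000-of-4096 value implied:

```typescript
// Hypothetical sketch: map each model to its context window
// instead of hardcoding MAX_TOKEN = 4000.
const MODEL_TOKEN_LIMITS: Record<string, number> = {
  "gpt-3.5-turbo": 4096,
  "gpt-4": 8192,
};

// Headroom reserved below the context limit, matching the
// original 4000-of-4096 margin.
const COMPLETION_RESERVE = 96;

function getMaxToken(model: string): number {
  // Fall back to the smallest known window for unrecognized models.
  const limit = MODEL_TOKEN_LIMITS[model] ?? 4096;
  return limit - COMPLETION_RESERVE;
}
```

With this, `getMaxToken("gpt-3.5-turbo")` keeps the current 4000-token behavior, while `getMaxToken("gpt-4")` yields 8096.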
To reproduce
As above
Additional context
No response
Fixed in 088f090