anse-app / anse

Supercharged experience for multiple models such as ChatGPT, DALL-E and Stable Diffusion.

Home Page: https://anse.app

BUG: `context_length_exceeded` error; the `Max History Message Size` setting does not seem to work

ailoha opened this issue · comments

What operating system are you using?

Mac

What browser are you using?

Safari, Chrome

Describe the bug

After chatting through the OpenAI API for a while, requests start failing with a token-limit error. Even after clearing all history, the error still occurs.

When I switch to a newly created blank conversation, the history of other conversations appears to be sent along with it, so the token-limit error from the previous conversation carries over into the new one.

The `Max History Message Size` setting does not seem to take effect.

(Screenshot: 2023-09-15, 12:06 PM)
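For context, a "max history message size" setting is typically applied by keeping only the most recent N messages before each request. The sketch below is illustrative only — the names and logic are assumptions, not Anse's actual implementation — but if trimming like this were skipped, or applied across conversations rather than per conversation, it would produce the behavior described above.

```typescript
// Illustrative sketch of per-conversation history trimming.
// `Message` and `trimHistory` are hypothetical names, not Anse's API.
interface Message {
  role: string;
  content: string;
}

// Keep only the last `maxHistorySize` messages of ONE conversation.
function trimHistory(messages: Message[], maxHistorySize: number): Message[] {
  if (maxHistorySize >= messages.length) return messages;
  return messages.slice(messages.length - maxHistorySize);
}
```

The key point is that the slice must be taken from the current conversation's message list alone; pulling from a shared, global history would let other conversations' tokens leak into a fresh chat.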

What provider are you using?

OpenAI

What prompt did you enter?

(Any prompt — the error occurs no matter what I enter.)

Console Logs

Error: This model's maximum context length is 4097 tokens. However, you requested 16332 tokens (13 in the messages, 16319 in the completion). Please reduce the length of the messages or completion.
at (entry.mjs:63:6060)
at (entry.mjs:63:18548)
at (entry.mjs:63:18705)
at (entry.mjs:48:1281)
at (entry.mjs:58:56367)
at (entry.mjs:100:617)

Participation

  • I am willing to submit a pull request for this issue.

gpt-3.5-turbo only supports up to 4096 tokens, shared between the prompt and the completion. This is unrelated to `Max History Message Size`. You just need to set `Max Tokens` to a value lower than 4096 (e.g. 2048) and it will work.
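To see why this fixes it: the error in the logs shows only 13 prompt tokens, but a requested completion budget of 16319 tokens, and the model's context window must hold both. A minimal sketch of that arithmetic (the helper name is made up for illustration):

```typescript
// gpt-3.5-turbo's context window, per the error message in the logs.
const CONTEXT_LIMIT = 4097;

// A request is valid only if prompt tokens plus the requested
// completion budget (max_tokens) fit inside the context window.
function fitsContext(promptTokens: number, maxTokens: number): boolean {
  return promptTokens + maxTokens <= CONTEXT_LIMIT;
}

// The failing request from the logs: 13 + 16319 = 16332 > 4097.
console.log(fitsContext(13, 16319)); // false
// With Max Tokens lowered to 2048, the same prompt fits: 13 + 2048 = 2061.
console.log(fitsContext(13, 2048)); // true
```

So the rejection happens before any history is even considered, which is why clearing history or lowering `Max History Message Size` has no effect here.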