BUG: `context_length_exceeded` error; the `Max History Message Size` setting does not seem to work
ailoha opened this issue · comments
What operating system are you using?
Mac
What browser are you using?
Safari, Chrome
Describe the bug
After chatting via the OpenAI API for a while, requests start failing because the token limit is exceeded. Even after clearing all history, the problem persists.
When I switch to a newly created blank conversation, the history of the other conversations apparently gets sent along with the request, so the token-limit error from the previous conversation carries over into the new one.
The `Max History Message Size` setting does not seem to take effect.
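For reference, the behavior I would expect from a history-size cap can be sketched as follows. This is only an illustration of the expected trimming logic, not the app's actual code; the function name and message format are assumptions:

```python
def trim_history(messages, max_history_size):
    """Keep the system prompt (if any) plus only the most recent
    `max_history_size` messages, so old conversation turns never
    leak into the request payload."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-max_history_size:]

# A new blank conversation should only ever send its own messages:
history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "old question 1"},
    {"role": "assistant", "content": "old answer 1"},
    {"role": "user", "content": "new question"},
]
trimmed = trim_history(history, max_history_size=2)
# trimmed keeps the system prompt plus only the last 2 messages
```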
![Screenshot 2023-09-15, 12:06:00 PM](https://private-user-images.githubusercontent.com/134615514/268169309-9a4bfc41-f93e-4f86-9dd4-dcb9950bf5ae.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MjI2MDU2NDcsIm5iZiI6MTcyMjYwNTM0NywicGF0aCI6Ii8xMzQ2MTU1MTQvMjY4MTY5MzA5LTlhNGJmYzQxLWY5M2UtNGY4Ni05ZGQ0LWRjYjk5NTBiZjVhZS5wbmc_WC1BbXotQWxnb3JpdGhtPUFXUzQtSE1BQy1TSEEyNTYmWC1BbXotQ3JlZGVudGlhbD1BS0lBVkNPRFlMU0E1M1BRSzRaQSUyRjIwMjQwODAyJTJGdXMtZWFzdC0xJTJGczMlMkZhd3M0X3JlcXVlc3QmWC1BbXotRGF0ZT0yMDI0MDgwMlQxMzI5MDdaJlgtQW16LUV4cGlyZXM9MzAwJlgtQW16LVNpZ25hdHVyZT1hZTk2NDkyNDM4MGZhNDhiY2VjMTBlOGQzYzBlMGIyYTdlYTgwZWIzZGZiOGYwNDljOGY1ZGIwM2MyMDQ4Y2ExJlgtQW16LVNpZ25lZEhlYWRlcnM9aG9zdCZhY3Rvcl9pZD0wJmtleV9pZD0wJnJlcG9faWQ9MCJ9.l2zVK5Vyrg5zBeORpRLq0NYENuDrrzSj06K2j1TDuQs)
What provider are you using?
OpenAI
What prompt did you enter?
(It happens no matter what prompt I enter ...)
Console Logs
Error: This model's maximum context length is 4097 tokens. However, you requested 16332 tokens (13 in the messages, 16319 in the completion). Please reduce the length of the messages or completion.
at (entry.mjs:63:6060)
at (entry.mjs:63:18548)
at (entry.mjs:63:18705)
at (entry.mjs:48:1281)
at (entry.mjs:58:56367)
at (entry.mjs:100:617)
Participation
- I am willing to submit a pull request for this issue.
gpt-3.5-turbo only supports up to 4096 tokens of context, so this is unrelated to `Max History Message Size`. You just need to set `Max Tokens` to a value lower than 4096 (e.g. 2048) and it will work.
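The arithmetic behind the error message follows from how the API budgets tokens: prompt tokens plus `max_tokens` (the completion budget) must fit within the model's context window, and the log above shows a completion budget of 16319 against a 4097-token window. A minimal sketch of clamping the completion budget to fit (the function name is illustrative, not part of any API):

```python
def clamp_max_tokens(prompt_tokens, requested_max_tokens, context_window):
    """Cap the completion budget so prompt + completion fits the context."""
    available = context_window - prompt_tokens
    return max(0, min(requested_max_tokens, available))

# From the log above: 13 prompt tokens, max_tokens set to 16319,
# but the model's context window is only 4097 tokens total.
print(clamp_max_tokens(13, 16319, 4097))   # 4084 -- fits the window
print(clamp_max_tokens(13, 2048, 4097))    # 2048 -- already within budget
```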