Error message: This model's maximum context length is 4096 tokens
oleteacher opened this issue · comments
Installed latest release/commit.
Now getting this error at random times. If I reload the page, all is good and working:
Q: Install the OpenAI API package for PHP
A: Server returned an error message: This model's maximum context length is 4096 tokens. However, you requested 4222 tokens (1222 in the messages, 3000 in the completion). Please reduce the length of the messages or completion.
The JSON error is fixed, but the error above requires a page reload to start over.
Sorry I must post in English; I do not speak your language but love your work!
Thank you for your compliment.
The issue you mentioned is mainly about token length. The OpenAI API supports at most 4096 tokens per request, which covers the question, the context, and the completion returned by the API. Submitting a request that exceeds this length triggers the error above.
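A common client-side fix is to clamp the requested completion budget so that prompt tokens plus completion tokens never exceed the model's window. This is a minimal sketch, not the project's actual code; the `clamp_max_tokens` helper is hypothetical, and the token counts are taken from the error message above (1222 prompt tokens, 3000 requested completion tokens, 4096 limit).

```python
CONTEXT_LIMIT = 4096          # model's maximum context length
REQUESTED_COMPLETION = 3000   # the client's default max_tokens

def clamp_max_tokens(prompt_tokens: int,
                     requested: int = REQUESTED_COMPLETION,
                     limit: int = CONTEXT_LIMIT) -> int:
    """Return the largest completion budget that still fits in the window."""
    return max(0, min(requested, limit - prompt_tokens))

# The failing request: 1222 + 3000 = 4222 tokens, which exceeds 4096.
# Clamping yields 4096 - 1222 = 2874 completion tokens instead.
print(clamp_max_tokens(1222))  # → 2874
```

With a clamp like this the request would succeed instead of erroring, so no page reload would be needed.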
Thanks for reply.
The issue is that one has to reload the whole page to reset and get rid of that message. Entering a new prompt makes the message reappear; once the error shows up, it will not go away until the page is reloaded.
I have never seen this issue with any of the other clients I have tried, so I guess it is specific to this one.
Wishing you a super day!