dirk1983 / chatgpt

The demo site now offers free ChatGPT conversation and image generation. This is the easiest-to-deploy, fastest-responding ChatGPT environment on the web. The PHP implementation calls the OpenAI API for Q&A and image generation, communicating in stream mode so output is rendered as it is generated. The front end uses EventSource and supports Markdown parsing, formula rendering, syntax-highlighted code, and image generation. The page UI is clean and supports continuous, context-aware conversation. The source code is only a few files, uses no framework, works with all PHP versions, is fully open source, and is very easy to customize. Step-by-step tutorials, accounts, and other related resources are available; you are welcome to join the group chat. Everything is free.

Home Page: https://mm1.ltd

Error message: This model's maximum context length is 4096 tokens

oleteacher opened this issue · comments

commented

Installed the latest release/commit.

Now getting this error at random times. If I reload the page, everything is good and working again:

Q: Install the OpenAI API package for PHP
A: Server returned an error message:This model's maximum context length is 4096 tokens. However, you requested 4222 tokens (1222 in the messages, 3000 in the completion). Please reduce the length of the messages or completion.

The JSON error is fixed, but the above error requires a page reload to start over.

Sorry I must post in English; I do not speak your language but love your work!

Thank you for your compliment.
The issue you mentioned is mainly about token length. The OpenAI API supports at most 4096 tokens per request, and that budget covers the question, the conversation context, and the completion returned by the API. Submitting a request that exceeds this length triggers the error above.
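For reference, here is a minimal sketch in PHP of one way to stay under the limit: trim the oldest messages and cap max_tokens so the estimated prompt plus the completion budget fits in 4096 tokens. The helper names, the 4-characters-per-token estimate, and the COMPLETION_BUDGET value are illustrative assumptions, not code from this repository.

```php
<?php
// Rough sketch (not this repo's code): keep the request inside the
// 4096-token limit by trimming old messages and capping max_tokens.

const MODEL_CONTEXT_LIMIT = 4096; // shared budget for prompt + completion
const COMPLETION_BUDGET   = 1000; // value we will pass as max_tokens

function estimateTokens(string $text): int
{
    // Crude byte-based estimate; a real tokenizer would give exact counts.
    return (int) ceil(strlen($text) / 4);
}

function trimMessages(array $messages, int $promptBudget): array
{
    $total = 0;
    foreach ($messages as $m) {
        $total += estimateTokens($m['content']);
    }
    // Drop the oldest non-system turn until the prompt fits,
    // always keeping the system prompt and the latest user message.
    while ($total > $promptBudget && count($messages) > 2) {
        $removed = array_splice($messages, 1, 1);
        $total  -= estimateTokens($removed[0]['content']);
    }
    return $messages;
}

$messages = [
    ['role' => 'system', 'content' => 'You are a helpful assistant.'],
    // ...earlier turns of the conversation...
    ['role' => 'user', 'content' => 'Install the OpenAI API package for PHP'],
];

$messages = trimMessages($messages, MODEL_CONTEXT_LIMIT - COMPLETION_BUDGET);

$payload = [
    'model'      => 'gpt-3.5-turbo',
    'messages'   => $messages,
    'max_tokens' => COMPLETION_BUDGET,
    'stream'     => true,
];
// $payload would then be JSON-encoded and POSTed to the
// chat completions endpoint with cURL.
echo json_encode($payload, JSON_PRETTY_PRINT), PHP_EOL;
```

With this kind of budgeting, the request from the error above (1222 prompt tokens + 3000 completion tokens = 4222) would instead be sent with a completion budget small enough to fit, or with older turns dropped first.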

commented

Thanks for the reply.

The issue is that one has to reload the whole page to reset and get rid of that message. Entering a new prompt results in the message reappearing; once the error shows up, it will not go away until the page is reloaded.
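For what it's worth, here is a sketch of how it could recover without a full reload: detect the context-length message in the API response, drop the oldest stored turn, and retry once. The function names, the message-matching check, and the one-retry policy are just illustrative assumptions on my part, not how this repo currently works.

```php
<?php
// Hypothetical recovery sketch: if the API says the context is too long,
// drop the oldest non-system turn and retry once, instead of making the
// user reload the page.

function isContextLengthError(array $response): bool
{
    $msg = $response['error']['message'] ?? '';
    return stripos($msg, 'maximum context length') !== false;
}

function dropOldestTurn(array $messages): array
{
    // Keep index 0 (system prompt) and the most recent user message.
    if (count($messages) > 2) {
        array_splice($messages, 1, 1);
    }
    return $messages;
}

// $callApi is assumed to POST $messages to the chat completions endpoint
// and return the decoded JSON response as an associative array.
function askWithRecovery(callable $callApi, array $messages): array
{
    $response = $callApi($messages);
    if (isContextLengthError($response)) {
        $messages = dropOldestTurn($messages);
        $response = $callApi($messages); // one retry with a shorter prompt
    }
    return $response;
}
```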

I have never seen this issue with any of the other implementations I have tried, so I guess it is specific to yours.

Wishing you a super day!