Ulov888 / chatpdflike

An approximate implementation of ChatPDF

[bug] Multi-turn conversations cause the token limit to be exceeded

ac1982 opened this issue · comments

commented

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 4658 tokens (3158 in your prompt; 1500 for the completion). Please reduce your prompt; or completion length.

commented

The reason for this error is that gpt-3.5-turbo limits the total context (prompt plus completion) to 4096 tokens. You can reduce the max_tokens value in the response function, but that may limit the length of the generated text.
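A minimal sketch of one way to handle this, assuming the repo's reply function is something like `response(messages)` using the legacy `openai` (pre-1.0) ChatCompletion API (the function name and constants below are hypothetical, not taken from this repo): trim the oldest conversation turns until the prompt fits the context window, and reserve a smaller completion budget instead of 1500 tokens.

```python
import openai
import tiktoken

MAX_CONTEXT = 4096       # total context window for gpt-3.5-turbo
MAX_COMPLETION = 1000    # tokens reserved for the reply (smaller than the original 1500)

def num_tokens(messages, model="gpt-3.5-turbo"):
    """Rough token count of a message list (ignores per-message overhead)."""
    enc = tiktoken.encoding_for_model(model)
    return sum(len(enc.encode(m["content"])) for m in messages)

def trim_history(messages, budget=MAX_CONTEXT - MAX_COMPLETION):
    """Drop the oldest non-system turns until the prompt fits the budget."""
    trimmed = list(messages)
    while len(trimmed) > 1 and num_tokens(trimmed) > budget:
        del trimmed[1]  # keep the system prompt at index 0, drop the oldest turn
    return trimmed

def response(messages):
    # Hypothetical stand-in for the project's response function.
    messages = trim_history(messages)
    return openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=messages,
        max_tokens=MAX_COMPLETION,
    )
```

Trimming old turns keeps multi-turn chats working without shrinking max_tokens so far that answers get cut off; the trade-off is that the model loses the earliest context of the conversation.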