yvann-ba / Robby-chatbot

AI chatbot 🤖 to chat with CSV, PDF, TXT files 📄 and YouTube videos 🎥 | using Langchain 🦜 | OpenAI | Streamlit ⚡

Maximum context length error

deedeeharris opened this issue · comments

Hey,
Thanks for sharing your project.

I tried uploading a few CSV files, but with all of them I got the following error:
Error: This model's maximum context length is 4097 tokens. However, your messages resulted in 6317 tokens. Please reduce the length of the messages.

That's after uploading the file, and prompting "Hello".

Does your script chunk the uploaded csv file?

Thanks,
Yedidya

commented

Hey, the CSVLoader splits the file so that each row becomes its own document, so if a single row contains a lot of text this error can appear. I've never run into it myself. Do you get the error even with a smaller file, like the sample file?
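
For illustration, a minimal sketch of that per-row behavior (the file path is hypothetical, not part of the repo):

```python
# Minimal sketch: langchain's CSVLoader produces one Document per CSV row,
# so one very long row becomes one very long document.
from langchain.document_loaders import CSVLoader

docs = CSVLoader(file_path="data.csv").load()  # hypothetical file

for doc in docs[:3]:
    # page_content holds the whole row as "column: value" lines
    print(len(doc.page_content), doc.page_content[:80])
```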

Thanks for the quick response.
A sample CSV works. I guess you're right, my records have multiple features with a lot of text. Do you have an idea how to solve this?

commented

Yes, you can chunk after CSVLoader. It will index a bit less well because some rows will be cut and therefore split across chunks, but apart from that I don't see another option. Alternatively, skip CSVLoader entirely and use CharacterTextSplitter directly, setting a token limit of around 4000.
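
A rough sketch of both options; the file path, separator, and chunk sizes below are illustrative choices, not the repo's actual settings (a chunk size smaller than 4000 tokens leaves room for the prompt and retrieved context):

```python
# Hedged sketch of the two workarounds described above; values are illustrative.
from langchain.document_loaders import CSVLoader
from langchain.text_splitter import CharacterTextSplitter

# Token-based splitter so chunk_size is measured in tokens, not characters.
splitter = CharacterTextSplitter.from_tiktoken_encoder(
    separator="\n",    # CSVLoader's page_content uses single newlines
    chunk_size=1000,   # keep chunks well under the model's 4097-token limit
    chunk_overlap=100,
)

# Option 1: load one document per row, then split the long rows again.
docs = CSVLoader(file_path="data.csv").load()  # hypothetical file
chunks = splitter.split_documents(docs)

# Option 2: skip CSVLoader and split the raw file text directly.
with open("data.csv", encoding="utf-8") as f:
    chunks = splitter.split_text(f.read())
```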