Feature Request: Read a file content from the prompt.
danisztls opened this issue · comments
It would be useful to have a way to read the contents of a file directly into the prompt without resorting to copy-paste.
e.g. /read notes.md
This feature is at the top of my to-do list. I'm just wondering what the best way to implement it is: it could be a command-line argument like ./chatgpt.py --context file.txt
or a command inside the prompt as you suggest, or both...
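The command-line-argument approach could look roughly like this. This is only a minimal sketch: the `--context` flag name matches the example above, but `build_prompt` and the prompt layout are illustrative assumptions, not the actual chatgpt.py implementation.

```python
import argparse


def build_prompt(context_path: str, question: str) -> str:
    """Prepend the contents of a context file to the user's question.

    Hypothetical helper: reads the file given via --context and wraps
    it together with the question into a single prompt string.
    """
    with open(context_path, encoding="utf-8") as f:
        context = f.read()
    return f"Use the following file as context:\n\n{context}\n\nQuestion: {question}"


parser = argparse.ArgumentParser()
parser.add_argument("--context", help="file whose contents are added to the prompt")
```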
Initially I thought about piping via stdin,
e.g. cat notes.md | chatgpt
which would be practical, but then how would we prompt the LLM with it?
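For what it's worth, detecting piped input is straightforward in Python. A minimal sketch, assuming the standard `isatty()` check (the `stream` parameter is only there for illustration/testing):

```python
import sys


def read_piped_input(stream=None):
    """Return piped stdin content, or None when run interactively.

    Uses isatty() to tell `cat notes.md | chatgpt` apart from an
    interactive terminal session.
    """
    stream = stream or sys.stdin
    if stream.isatty():
        return None
    return stream.read()
```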
One possible way is to pass the content of the file to the LLM as a "system" message, per the documentation at https://platform.openai.com/docs/guides/chat/introduction
However, they also say:
"gpt-3.5-turbo-0301 does not always pay strong attention to system messages. Future models will be trained to pay stronger attention to system messages."
That's not very encouraging. I need to test this a bit...
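Concretely, the system-message idea just means building the `messages` list for the Chat Completions API like this. A sketch under assumptions: `build_messages` and the system-prompt wording are mine, not from chatgpt.py; only the `role`/`content` message shape comes from the OpenAI docs linked above.

```python
def build_messages(file_content: str, question: str) -> list:
    """Build a Chat Completions `messages` list that injects file
    content as a "system" message, with the actual question as the
    "user" message.
    """
    return [
        {
            "role": "system",
            "content": f"The user has provided this file as context:\n{file_content}",
        },
        {"role": "user", "content": question},
    ]
```

The resulting list would then be passed as the `messages` argument of a chat-completion request.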
After a few experiments with system messages, I think it works quite well. I've opted to implement it as a standard command-line option, so even Windows users can use it.
That's great!