Connections aren't being closed
dougli opened this issue
The library uses `requests` to fetch data from OpenAI servers, but we never call `.close()` on those connections, which leaves dangling open file handles on the OS.
Can we clean these connections up gracefully, either after a period of time or after making a request? There is an advantage to keeping a connection open, since it avoids repeating the HTTPS handshake on every request, but we should still close it properly after a period of idleness.
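One way to get the best of both: keep the pooled connection for back-to-back requests, but close it after a quiet period. A minimal sketch, assuming the library holds a shared `requests.Session` somewhere; the class and parameter names here (`IdleClosingSession`, `idle_timeout`) are illustrative, not existing library API:

```python
import threading

import requests


class IdleClosingSession:
    """Hypothetical wrapper: reuse one requests.Session, but close it
    (freeing pooled sockets / file handles) after idle_timeout seconds
    with no requests. The session is recreated lazily on the next call."""

    def __init__(self, idle_timeout=60.0):
        self.idle_timeout = idle_timeout
        self._session = None
        self._timer = None
        self._lock = threading.Lock()

    def request(self, method, url, **kwargs):
        with self._lock:
            if self._session is None:
                self._session = requests.Session()
            # Reset the idle timer on every request.
            if self._timer is not None:
                self._timer.cancel()
            self._timer = threading.Timer(self.idle_timeout, self.close)
            self._timer.daemon = True
            self._timer.start()
            session = self._session
        return session.request(method, url, **kwargs)

    def close(self):
        with self._lock:
            if self._session is not None:
                self._session.close()  # releases pooled connections
                self._session = None
```

This keeps the TLS handshake savings for bursts of traffic while guaranteeing that sockets don't linger indefinitely once the process goes quiet.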
This bug interacts very poorly with another bug in Docker Desktop (moby/vpnkit#587): the openai library stops working entirely after running in a Docker container on Mac or Windows for a few minutes, and all requests to OpenAI servers time out indefinitely.
Hi @dougli!
Thanks for writing in. I'll surface this to the team and get it on our roadmap. That said, we happily accept PRs if you'd like to take a shot at fixing this.
Any pointers to where the sockets are opened or where they should be closed? I'm running into this issue too.
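For tracking down where the leaked sockets are created (a general CPython debugging sketch, not specific to this library), you can enable `tracemalloc` so that each `ResourceWarning` for an unclosed socket reports the stack where the object was allocated:

```python
import tracemalloc
import warnings

# Record allocation tracebacks (up to 25 frames deep) so that
# ResourceWarning messages for unclosed sockets include an
# "Object allocated at ..." traceback pointing at the open site.
tracemalloc.start(25)

# Make sure ResourceWarnings are shown every time instead of once.
warnings.simplefilter("always", ResourceWarning)
```

Running the library under this and exercising a few requests should show exactly which call path opens the sockets that are never closed.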
In my case it helps to ignore the `ResourceWarning` by adding:

```python
import warnings

warnings.simplefilter("ignore", ResourceWarning)
```

This silences the warning but doesn't fix the open file descriptors, though.
We have deployed a dockerized application in production that crashes periodically because of this. Over time, more and more connections are opened until the container crashes with a memory error.
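Until the library cleans up after itself, one workaround is to make each call through a short-lived session that is closed explicitly, so no pooled socket outlives the request. This gives up the handshake reuse in exchange for predictable file-handle usage. A sketch, where the URL and helper name are placeholders rather than the library's actual call:

```python
import requests


def fetch_once(url, **kwargs):
    """Perform one GET through a fresh session; the context manager
    calls session.close() on exit, releasing the pooled connection."""
    with requests.Session() as session:
        response = session.get(url, **kwargs)
        response.raise_for_status()
        return response.json()
```

Wrapping the library's HTTP calls this way (or calling `.close()` on whatever session object it exposes, if any) should stop the descriptor count from growing in long-running containers.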