[Question] Best practice to install additional packages to the image
mrtolkien opened this issue
The usually recommended best practice for Python development with Docker is to resolve dependencies in a local virtual environment. For example, one would use poetry, add new packages with `poetry add`, and pin the transitive dependencies with `poetry export -o requirements.txt` to ensure that the build is fully reproducible and that the Docker image does not need poetry.
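That workflow might look like the following sketch (the package name `requests` is just an illustration; run these on the host, outside Docker):

```shell
# Add a new dependency to pyproject.toml and the lockfile
poetry add requests

# Export the fully resolved, pinned dependency set to a plain
# requirements file, so the Docker build does not need poetry itself
poetry export -o requirements.txt
```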
But this seems backwards to me here, as the Docker image already has packages installed. If poetry does dependency resolution in a fresh, local virtual environment, it might produce versions that conflict with the environment that already exists in the Docker image.
Is there a better approach to dependency resolution? If not, I might start working on a 100% Docker-based dependency resolver, as that sounds like how it should be done.
Yep, you would probably have your dependencies defined outside. If you do, installing them while building the Docker image will override the default ones. The default ones are just a convenience to simplify things, but you are probably much better off defining the dependencies outside.
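A minimal Dockerfile sketch of that override, assuming a `requirements.txt` exported next to the Dockerfile (the image tag and paths are illustrative):

```dockerfile
FROM tiangolo/uvicorn-gunicorn-fastapi:python3.9

# Installing the pinned requirements during the build overrides
# the package versions that ship with the base image
COPY ./requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /app/requirements.txt

# Then copy the application code
COPY ./app /app
```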
Also, there's a high chance that you don't even need this Docker image: https://github.com/tiangolo/uvicorn-gunicorn-fastapi-docker#-warning-you-probably-dont-need-this-docker-image
Anyway, thanks for coming back and closing the issue! ☕
Sorry for the long delay! 🙈 I wanted to personally address each issue/PR and they piled up through time, but now I'm checking each one in order.