A set of command line tools to help you keep your `pip`-based packages fresh, even when you've pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)
Similar to `pip`, `pip-tools` must be installed in each of your project's virtual environments:
$ source /path/to/venv/bin/activate
(venv)$ python -m pip install pip-tools
Note: all of the remaining example commands assume you've activated your project's virtual environment.
The `pip-compile` command lets you compile a `requirements.txt` file from your dependencies, specified in either `setup.py` or `requirements.in`.
Run it with `pip-compile` or `python -m piptools compile`. If you use multiple Python versions, you can run `pip-compile` as `py -X.Y -m piptools compile` on Windows and `pythonX.Y -m piptools compile` on other systems.
`pip-compile` should be run from the same virtual environment as your project, so that conditional dependencies that require a specific Python version, or other environment markers, resolve relative to your project's environment.
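For example, a `requirements.in` entry can carry an environment marker so that a pin only applies where the marker matches; the package below is purely illustrative:

```
# requirements.in
# only needed on older Pythons (illustrative example)
importlib-metadata; python_version < "3.8"
```

Compiled in a Python 3.8+ environment, such a requirement is skipped; compiled in an older environment, it is resolved and pinned. This is why compiling from the project's own virtual environment matters.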
Note: ensure you don't have a `requirements.txt` if you compile `setup.py` or `requirements.in` from scratch; otherwise it might interfere.
Suppose you have a Django project, and want to pin it for production. If you have a `setup.py` with `install_requires=['django']`, then run `pip-compile` without any arguments:
$ pip-compile
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile
#
asgiref==3.2.3 # via django
django==3.0.3 # via my_django_project (setup.py)
pytz==2019.3 # via django
sqlparse==0.3.0 # via django
`pip-compile` will produce your `requirements.txt`, with all the Django dependencies (and all underlying dependencies) pinned. You should put `requirements.txt` under version control.
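The `setup.py` used in the example above could be as minimal as the following sketch (the project name and version are placeholders):

```python
# setup.py -- a minimal sketch; name and version are placeholders
from setuptools import setup

setup(
    name="my_django_project",
    version="0.1.0",
    install_requires=["django"],
)
```

With this in place, `pip-compile` reads `install_requires` as the top-level dependency list to resolve and pin.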
If you don't use `setup.py` (it's easy to write one), you can create a `requirements.in` file to declare the Django dependency:
# requirements.in
django
Now, run `pip-compile requirements.in`:
$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile requirements.in
#
asgiref==3.2.3 # via django
django==3.0.3 # via -r requirements.in
pytz==2019.3 # via django
sqlparse==0.3.0 # via django
And it will produce your `requirements.txt`, with all the Django dependencies (and all underlying dependencies) pinned. You should put both `requirements.in` and `requirements.txt` under version control.
If you would like to use the Hash-Checking Mode available in `pip` since version 8.0, `pip-compile` offers the `--generate-hashes` flag:
$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile --generate-hashes requirements.in
#
asgiref==3.2.3 \
--hash=sha256:7e06d934a7718bf3975acbf87780ba678957b87c7adc056f13b6215d610695a0 \
--hash=sha256:ea448f92fc35a0ef4b1508f53a04c4670255a3f33d22a81c8fc9c872036adbe5 \
# via django
django==3.0.3 \
--hash=sha256:2f1ba1db8648484dd5c238fb62504777b7ad090c81c5f1fd8d5eb5ec21b5f283 \
--hash=sha256:c91c91a7ad6ef67a874a4f76f58ba534f9208412692a840e1d125eb5c279cb0a \
# via -r requirements.in
pytz==2019.3 \
--hash=sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d \
--hash=sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be \
# via django
sqlparse==0.3.0 \
--hash=sha256:40afe6b8d4b1117e7dff5504d7a8ce07d9a1b15aeeade8a2d10f130a834f8177 \
--hash=sha256:7c3dca29c022744e95b547e867cee89f4fce4373f3549ccd8797d8eb52cdb873 \
# via django
To update all packages, periodically re-run `pip-compile --upgrade`.
To update a specific package to the latest or to a specific version, use the `--upgrade-package` or `-P` flag:
# only update the django package
$ pip-compile --upgrade-package django
# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests
# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0
You can combine `--upgrade` and `--upgrade-package` in one command, to provide constraints on the allowed upgrades. For example, to upgrade all packages whilst constraining `requests` to the latest version less than 3.0:
$ pip-compile --upgrade --upgrade-package 'requests<3.0'
To output the pinned requirements in a filename other than `requirements.txt`, use `--output-file`. This might be useful for compiling multiple files, for example with different constraints on `django`, to test a library with both versions using `tox`:
$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt
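A hypothetical `tox.ini` consuming those two compiled files might look like the following sketch (the env names and test command are placeholders):

```ini
[tox]
envlist = django0x, django1x

# each env installs one of the compiled requirements files,
# so the library is tested against both Django pins
[testenv:django0x]
deps = -rrequirements-django0x.txt
commands = pytest

[testenv:django1x]
deps = -rrequirements-django1x.txt
commands = pytest
```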
Or to output to standard output, use `--output-file=-`:
$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt
You might be wrapping the `pip-compile` command in another script. To avoid confusing consumers of your custom script, you can override the update command generated at the top of requirements files by setting the `CUSTOM_COMPILE_COMMAND` environment variable.
$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# ./pipcompilewrapper
#
asgiref==3.2.3 # via django
django==3.0.3 # via -r requirements.in
pytz==2019.3 # via django
sqlparse==0.3.0 # via django
If you need to install different but compatible packages in different environments, you can create layered requirements files and use one layer to constrain the other.
For example, if you have a Django project where you want the newest `2.1` release in production, and when developing you want to use the Django debug toolbar, then you can create two `*.in` files, one for each layer:
# requirements.in
django<2.2
At the top of the development requirements `dev-requirements.in`, you use `-c requirements.txt` to constrain the dev requirements to packages already selected for production in `requirements.txt`.
# dev-requirements.in
-c requirements.txt
django-debug-toolbar
First, compile `requirements.txt` as usual:
$ pip-compile
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile
#
django==2.1.15 # via -r requirements.in
pytz==2019.3 # via django
Now compile the dev requirements; the `requirements.txt` file is used as a constraint:
$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile dev-requirements.in
#
django-debug-toolbar==2.2 # via -r dev-requirements.in
django==2.1.15 # via -c requirements.txt, django-debug-toolbar
pytz==2019.3 # via -c requirements.txt, django
sqlparse==0.3.0 # via django-debug-toolbar
As you can see above, even though a `2.2` release of Django is available, the dev requirements only include a `2.1` version of Django, because they were constrained. Now both compiled requirements files can be installed safely in the dev environment.
To install the requirements in production, use:
$ pip-sync
To install the requirements in development, use:
$ pip-sync requirements.txt dev-requirements.txt
Now that you have a `requirements.txt`, you can use `pip-sync` to update your virtual environment to reflect exactly what's in there. This will install/upgrade/uninstall everything necessary to match the `requirements.txt` contents.
Run it with `pip-sync` or `python -m piptools sync`. If you use multiple Python versions, you can also run `py -X.Y -m piptools sync` on Windows and `pythonX.Y -m piptools sync` on other systems.
`pip-sync` must be installed into and run from the same virtual environment as your project to identify which packages to install or upgrade.
Be careful: `pip-sync` is meant to be used only with a `requirements.txt` generated by `pip-compile`.
$ pip-sync
Uninstalling flake8-2.4.1:
Successfully uninstalled flake8-2.4.1
Collecting click==4.1
Downloading click-4.1-py2.py3-none-any.whl (62kB)
100% |................................| 65kB 1.8MB/s
Found existing installation: click 4.0
Uninstalling click-4.0:
Successfully uninstalled click-4.0
Successfully installed click-4.1
To sync multiple `*.txt` dependency lists, just pass them in via command line arguments, e.g.
$ pip-sync dev-requirements.txt requirements.txt
If no arguments are passed, `pip-sync` defaults to `requirements.txt`.
Note: `pip-sync` will not upgrade or uninstall packaging tools like `setuptools`, `pip`, or `pip-tools` itself. Use `pip install --upgrade` to upgrade those packages.
- `pipdeptree` to print the dependency tree of the installed packages.
- `requirements.in`/`requirements.txt` syntax highlighting:
  - requirements.txt.vim for Vim.
  - Python extension for VS Code.