Your intuition is correct - it can be achieved through separate requirements files.
I usually have 4 of them (2 maintained by hand, 2 generated automatically):
requirements.in - the list of packages that you want to include in the final build, e.g. "django". You can be more specific: "django<3.0"
requirements.txt - pinned versions - this is generated by pip-tools, so you never touch this file by hand, e.g. "django==2.2.2"
requirements-dev.in - packages used by developers only (not included in the final build), e.g. "pytest"
requirements-dev.txt - pinned versions of the developer packages, also generated by pip-tools.
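To make the split concrete, here is a hypothetical example of what the hand-written file and its generated counterpart might look like (the exact pins and the "via" annotations will of course depend on your resolved versions):

```
# requirements.in  (maintained by hand)
django<3.0

# requirements.txt  (generated by pip-compile - do not edit)
django==2.2.2
    # via -r requirements.in
pytz==2019.1
    # via django
```

Note that the generated file also pins the transitive dependencies (here pytz, pulled in by django), which is the whole point: the build is fully reproducible.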
You add packages to the *.in files, regenerate the *.txt files with, say, a Makefile command, and use the *.txt files to install packages on the destination servers.
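A minimal sketch of such a Makefile, assuming pip-tools is installed (`pip install pip-tools`); the target names here are just an example:

```make
compile:  # regenerate the pinned files from the .in files
	pip-compile requirements.in          # writes requirements.txt
	pip-compile requirements-dev.in      # writes requirements-dev.txt

sync:  # install exactly the pinned set into the current environment
	pip-sync requirements.txt requirements-dev.txt
```

On the servers you would typically run only `pip install -r requirements.txt` (or `pip-sync requirements.txt`), so the dev-only packages never reach production.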