In the world of Python packaging, the trend is moving toward using pyproject.toml to store all of the package's metadata as well as settings for different tools. Until recently, this meant adding another tool like Poetry, Flit or PDM instead of using the regular Setuptools (and maybe build). But since Setuptools 61.0.0, you can keep using plain Setuptools and still have a modern Python package.
Why?
- Avoid having a plethora of Python package managers installed. This is especially noticeable when some of them wrap or replace virtualenv, so you end up installing them globally.
- With Setuptools you stay close to the PEP 621 standard and keep your pyproject.toml generic.
Why not?
- You still need to manage virtual environments manually.
- You need a very recent version of Setuptools. At the time of writing this post, I couldn't find a distribution that packages a new enough version of Setuptools.
- You need a supported version of Python. Currently this means Python 3.7 or later.
- You can get very close with an equivalent setup.cfg and keep backwards compatibility.
How?
I created a sample Python project that has everything you need to get started. It's tested to make sure it builds a valid and working package, that editable installations work, etc. You can skip directly to the pyproject.toml file.
A few notes. The only Setuptools-specific part is the attribute used to get the version dynamically; if you specify the version explicitly, the file is completely generic. The other settings that can be set dynamically have to be read from separate files, which isn't great. I would like to see the description become an attribute like version, so I could point it at the docstring like Flit does.
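To illustrate, here is a minimal sketch of what such a pyproject.toml can look like. The package name example, the description and the example.__version__ attribute are placeholders for this sketch, not the contents of my sample project:

[build-system]
requires = ["setuptools>=61.0.0"]
build-backend = "setuptools.build_meta"

[project]
name = "example"
description = "An example package"
requires-python = ">=3.7"
# The version is filled in dynamically, see below.
dynamic = ["version"]

# Where to read the version attribute from.
[tool.setuptools.dynamic]
version = {attr = "example.__version__"}

Everything under [project] is plain PEP 621; only the build-system requirements and the [tool.setuptools.dynamic] table are Setuptools-specific.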
For editable installations you still need a setup.py file, but it's a two-line file that calls setup(). You don't need a setup.cfg file at all.
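For reference, a sketch of that two-line shim; it takes no arguments, since all the metadata lives in pyproject.toml:

from setuptools import setup

# All metadata is read from pyproject.toml.
setup()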
Looking forward
Now that PEP 621 is accepted, I'm pretty sure other package managers will adapt and tool-specific configuration will be dropped. Also, support for using pyproject.toml in Setuptools will stop being experimental. I think that once that happens, and the only tool-specific difference is the two lines specifying the build system, we will see projects that want easier onboarding adopt Setuptools to avoid forcing more tools on their users. Over time, distributions will package newer versions of Setuptools and you will be able to have a working Python environment with just the distribution's packages.
Appendix - PEP 582
Another PEP that's yet to be approved is PEP 582. This proposal is for a package directory that's local to the project (in the spirit of node_modules). The reason I mention it here is that it removes the need for virtual environments (and with them the other major selling point of the different package managers). While it hasn't been approved and no version of Python supports it out of the box, we can mimic it with a little creative shell scripting:
PYTHON_VERSION="$(python3 -c 'from sys import version_info as v; print(f"{v[0]}.{v[1]}")')"
export PYTHONPATH="$PWD/__pypackages__/$PYTHON_VERSION:${PYTHONPATH:-}"
export PATH="$PWD/__pypackages__/$PYTHON_VERSION/bin:$PATH"
With the above snippet we point Python to load packages from the __pypackages__ directory and add its bin directory to the PATH so executables are available. Lastly, to install packages in that location we run python3 -m pip install -t "__pypackages__/$PYTHON_VERSION" foo. I use direnv quite a lot and I have the following snippet in a few projects:
PYTHON_VERSION="$(python3 -c 'from sys import version_info as v; print(f"{v[0]}.{v[1]}")')"
export PYTHONPATH="$PWD/__pypackages__/$PYTHON_VERSION:${PYTHONPATH:-}"
export PATH="$PWD/__pypackages__/$PYTHON_VERSION/bin:$PATH"
python3 -m pip install -t "__pypackages__/$PYTHON_VERSION" -e .
This performs an editable installation of the package and all its dependencies, so from that moment on I'm set and I don't need any other package manager or even a virtual environment.