Latest post is a big one: “Why you shouldn’t invoke setup​.py directly”

A lot of people don’t know about this because we haven’t been great about getting the word out. This blog post is in part an attempt to remedy this.

Please help spread the word!
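For anyone skimming: the core of the post is a set of direct replacements for the old setup.py invocations. A rough summary of the mapping as I understand it from the post (your project may need a PEP 517-compliant pyproject.toml first):

```shell
# Old setup.py invocation               Modern replacement
# python setup.py sdist bdist_wheel  ->  python -m build
# python setup.py install            ->  python -m pip install .
# python setup.py develop            ->  python -m pip install -e .
# python setup.py test               ->  pytest (or tox)
# python setup.py upload             ->  twine upload dist/*
```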

Comments available on HN (on the front page now!) and on the r/Python and r/programming subreddits.


@pganssle I admit that Python's packaging and building story is the area of Python where I've struggled the most since starting with it. Reading your post shows me there's more work I need to do, but I do appreciate the effort of writing it all up in a place that I can come back to later. Thank you for writing it.

@pganssle one thing: why bother with setuptools when it is, in fact, another "hot mess of legacy code"?

Do you have some examples (maybe it's another post?) of:

> These will generate out-of-date metadata files and packages based on old standards, holding back the evolution of the ecosystem.

(I have a distutils project with a C extension that I want to move to something better.)

@saper In my experience, a lot of the new class of build tools are optimized for making life easier for projects on the “happy path”: simple scripts, pure-Python projects, and so on.

For C extensions or other niche applications, the best supported option is probably still setuptools.
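To make that concrete, here is a minimal sketch of what staying on setuptools looks like for a C extension (the package name and source path below are hypothetical placeholders):

```python
# Sketch of the ext_modules portion of a setup.py for a C extension;
# "mypkg._native" and the source path are hypothetical placeholders.
from setuptools import Extension

extensions = [
    Extension(
        "mypkg._native",            # importable name of the compiled module
        sources=["src/_native.c"],  # C sources compiled at build time
    )
]

# A real setup.py would then call:
#   from setuptools import setup
#   setup(name="mypkg", version="0.1", ext_modules=extensions)
```

Paired with a pyproject.toml declaring `build-backend = "setuptools.build_meta"`, this gets built with `python -m build` rather than a direct setup.py invocation.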

@saper As for the other question, the following paragraph, “If it ain’t broke?” explains how the old upload endpoint was uploading bad metadata.

There’s also PEP 643, which will really help clean up dependency resolution once setuptools gains support for it, provided people are using recent versions of setuptools.
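For context, PEP 643 bumps the sdist metadata version to 2.2 and adds a Dynamic field, so any field not listed there can be trusted as static. A hypothetical PKG-INFO header under that scheme (project name and version are placeholders):

```
Metadata-Version: 2.2
Name: mypkg
Version: 1.0
Dynamic: Requires-Dist
```

Resolvers can then read Name and Version straight out of the sdist and only fall back to running a build for the fields explicitly declared Dynamic.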

@pganssle thanks, yes, I've seen the Python version issue described there; I was wondering if there is anything else. I am going to dust off this project, start producing wheels, etc., and I need to make sure the metadata is correct even if the tooling is out of date.

I read the "test" issue before, and I must say I understand why the situation is so heated now.

@pganssle I guess you must be tired of this, but let me share my personal frustration with you:

Circa 2020 I inherited (another) project and wanted to convert it to something resembling the current state of affairs: modern setuptools, tox, etc. We used pyscaffold 3.x to help us with that, and we ended up with a sane, almost-everything-in-setup.cfg setup with setuptools_scm. This all worked fine until we had to introduce cryptography, and then it was all in vain, since the RHEL 8 pip is old /cont

@pganssle so we got caught in the cryptography Rust surprise, like probably many other people. So now I carved out something that makes sure modern setuptools, wheel, pip, C compilers, swig, Rust, and other stuff get installed on top of the obsolete RHEL things, but now I read... that I should not install setuptools globally.

Pyscaffold 4 requires a very recent version of tox (because of one little thing), so we will need to overwrite the RHEL-installed global tox too. Distro lag is a major issue...

@saper I think this is a general problem, and not one that’s easy to fix. Software is an ecosystem, and even stable ecosystems are in some form of dynamic equilibrium, with a constant ebb and flow that tends to wear away static elements.

How you want to orchestrate your builds really depends on you. All the distro packagers do install setuptools globally, but that’s because they are trying to maintain a separate packaging stack.

@saper If you are using the normal Python stack, you basically just have to bootstrap to the point where you can get some sort of isolated environment that you can run pip or some equivalent in.

Most of the reason distro folks don’t usually like doing that sort of thing is because of some notion of mitigating supply chain risks by always installing from “source”.

I think this is mostly security theater (though there’s some core of value to it), but python -m build came about mainly because the Arch folks wanted a simple standards-compliant way to bootstrap the build ecosystem.

@saper So you may want to try from that entry point — bootstrap build and use that to build wheels for your whole dependency tree yourself. You can do it in a chroot or a docker container or whatever isolated environment mechanism your build system uses to separate context-specific dependencies.
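A bootstrap along those lines might start like this (only the stdlib venv step actually runs anywhere; the pip/build steps need network or mirror access, so they are shown as comments, and the project path is a placeholder):

```shell
# 1. Create an isolated environment using only the standard library:
python -m venv buildenv
# 2. From inside it, install the PEP 517 build frontend and build wheels
#    (needs network access or a local package mirror):
#   buildenv/bin/pip install build
#   buildenv/bin/python -m build --wheel path/to/project
#   -> wheels land in path/to/project/dist/
```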

@pganssle also - big thanks for this post. It is very informative and #Python needs more stuff like this!

@pganssle That explains why upload did not work properly for more than a year without a fix. Eventually I moved to twine -- thanks for the explanation! #python
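For anyone landing here later, the twine-based flow that replaces `setup.py upload` looks roughly like this (commented out since it assumes an already-configured project and PyPI credentials):

```shell
# python -m build          # produce sdist + wheel in dist/
# twine check dist/*       # validate the metadata before uploading
# twine upload dist/*      # upload via the supported API
```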
