Pinned toot

From the archives of my blog but still very relevant: "pytz: The Fastest Footgun in the West", about why you probably shouldn't be using pytz:

blog.ganssle.io/articles/2018/

Doing reviews is also incredibly valuable, IMO, which is why I always take them seriously. Looking at someone’s work and then figuring out what could be improved and how to justify why it’s an improvement teaches you so much about how to produce that kind of work.

You can apply the skills you learn from reviewing to your own work, but you can’t gain quite the same skills from critically reviewing your own work.

I think the first draft of my keynote is the best first pass at a talk I’ve ever written, and I feel like implementing the feedback I’ve gotten from the few people who’ve already seen it has made it so much better.

Good reviews are so valuable.

When PEP 621 is implemented in setuptools, declaring your metadata with PEP 621 will also work (and is probably the best option): python.org/dev/peps/pep-0621/
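As a sketch of what that looks like, a minimal PEP 621 `[project]` table in pyproject.toml might be something like this (the package name and dependencies are illustrative placeholders, not a real project):

```toml
# Hypothetical example of static PEP 621 metadata in pyproject.toml.
[project]
name = "example-pkg"
version = "1.0.0"
description = "A package whose metadata is declared statically"
dependencies = [
    "requests>=2.0",
]
```

Because everything here is declared statically, a build backend can emit it into the package metadata without executing any project code.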

With the right implementation in setuptools, 90% of packages will start cutting releases with reliable dependency metadata without any action needed by their maintainers.

You can ensure that your package will have properly annotated, reliable metadata by either:

specifying install_requires in setup.cfg, or using literals in your setup.py. If you have conditional dependencies, use environment markers:

python.org/dev/peps/pep-0508/#
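For illustration, a static install_requires with a PEP 508 environment marker in setup.cfg could look like this (the dependencies are hypothetical examples):

```ini
# Hypothetical setup.cfg fragment; package names are illustrative.
[options]
install_requires =
    requests>=2.0
    # PEP 508 environment marker: only installed on Python < 3.8
    importlib-metadata; python_version < "3.8"
```

Because the conditional logic lives in the declarative marker rather than in an `if` statement inside setup.py, tools can know the full dependency set without running any code.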

For some background on one problem this solves, see @di_codes@twitter.com’s 2018 article “Why PyPI Doesn’t Know Your Project’s Dependencies”: dustingram.com/articles/2018/0

PEP 643 can’t 100% fix this, but it makes it possible for a project to indicate that it doesn’t have this failure mode.
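Concretely, PEP 643 bumps Metadata-Version to 2.2 and adds a Dynamic field: an sdist lists only the fields that may still change at build time, and anything *not* listed (like Requires-Dist below) can be trusted as-is. A hypothetical PKG-INFO might look like:

```
Metadata-Version: 2.2
Name: example-pkg
Version: 1.0.0
Requires-Dist: requests (>=2.0)
Dynamic: Classifier
```

Here a resolver can rely on Requires-Dist without building the sdist, since only Classifier is declared dynamic.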

I am happy to announce that I have accepted PEP 643: Metadata for Package Source Distributions, which has the potential to dramatically simplify Python package metadata resolution in the future. python.org/dev/peps/pep-0643/

Soon you may be able to build reliable dependency graphs!

Finding old summaries of the history of X, then following any archive links (or just believing the summary) helps somewhat.

Any time I do historical research I feel like a pre-search engine internet user. I’d love to be able to search “the internet as of 2005” or something of that nature, but I think many pages from that era that still exist aren’t even indexed anymore!

boost from birdsite, Python, packaging

boosting @ThePyPA

pyfound.blogspot.com/2020/11/pip-20-3-new-resolver.html pip 20.3 is out. See pip.pypa.io/en/latest/user_gui for what's new (including the dependency resolver) and how to migrate. Thanks @ChanZuckerberg and @mozilla for funding!

As an interesting aside, it seems that Python itself was on PyPI at the time (2004-04-13): web.archive.org/web/2004041303

PyPI went online in late 2002, but easy_install wasn’t released until 2004.

Does anyone know how people installed stuff from PyPI before then? Did you download an sdist and unzip it manually?

Lately, I’ve been increasingly using Super + ↑↓→← to move my windows on a grid, but I’ve been frustrated by the lack of keyboard shortcuts to move them between monitors. Turns out you just need to do Shift + Super + ← / →:

Pro tip: If you’re ever in a book club, but you haven’t read the book, just say, “I thought the allegory for the Catholic Church was a bit ham-handed.”

Image #denoising with FFmpeg *without destroying detail*.
1.) Use the nlmeans filter. No, don't even look at hqdn3d unless you want a blurry mess.
2.) Crop out small parts of the image containing noise for testing (nlmeans can be very slow). 640x640 is okay.
3.) Check if you have noise in your chroma via the extractplanes filter.
4.) Set the strength to 2 for heavy noise and 1 for light noise. Higher strengths usually result in detail loss and oil paintings.
5.) Tune the patch parameter. This is your noise period. Get the period wrong and you'll be filtering in the wrong frequencies. If you have noise in your chroma as well, set the pc and rc options. If your chroma is subsampled, start from half your luma patch size.
6.) Bump up the research window. This is your quality parameter. Higher is generally better.
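The steps above might translate into commands like these (filenames, crop offsets, frame counts, and parameter values are my own illustrative guesses, not from the original post):

```shell
# 2.) Crop out a small noisy patch for fast parameter testing
#     (offsets are placeholders; pick a region that actually has noise):
#   ffmpeg -i noisy.mkv -vf "crop=640:640:200:100" -frames:v 50 patch.mkv
#
# 3.) Extract the chroma planes to check them for noise:
#   ffmpeg -i patch.mkv -filter_complex "extractplanes=u+v[u][v]" \
#       -map "[u]" u.mkv -map "[v]" v.mkv
#
# 4.-6.) nlmeans: s = strength, p = luma patch size, pc = chroma patch
#        size (roughly half of p for subsampled chroma), r/rc = research
#        windows (bigger = better quality, slower).
FILTER='nlmeans=s=2.0:p=7:pc=5:r=21:rc=11'
echo ffmpeg -i noisy.mkv -vf "$FILTER" denoised.mkv
```

Once the parameters look right on the cropped patch, run the same filter string on the full input.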

You’ll know you’ve made it when you overhear this in a café:

“People were able to Photoshop teeth onto stuff in the past, this is nothing new! Heck, image editing has been around almost as long as images!”

“It’s a matter of scale! Kids today can see anything with human teeth!”

Startup idea: build an ML model that adds human teeth to any picture.

After launching your MVP, target enterprise customers with a model that adds human teeth to 3D models. Maybe some defense contracting adding teeth to predator drones.

Saw two crows attacking a hawk (I think red-tailed hawk) right above my house the other day. Harassing it and chasing it away.

Couldn’t get great pictures because the action was happening so fast, but it was pretty cool to see.

Northern Flicker at my feeder the other day.

These are beautiful birds — and they are even more colorful in flight, because they have yellow-shafted feathers and a yellow underside.

I’ve only seen them at my feeder twice, and they got scared off pretty quickly when they saw me both times.

Apparently this guy is watching his cholesterol — doesn’t want to eat any of these hard-boiled egg yolks…