From the archives of my blog but still very relevant: "pytz: The Fastest Footgun in the West", about why you probably shouldn't be using pytz:
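The core footgun the article describes, as a minimal sketch (the zone name is just an example): passing a pytz zone directly to the `datetime` constructor silently attaches the zone's first historical offset (Local Mean Time), not the correct standard/daylight offset.

```python
from datetime import datetime
import pytz  # third-party: pip install pytz

# Wrong: tzinfo= picks up the zone's LMT offset of -4:56, not EST's -5:00
wrong = datetime(2020, 1, 1, tzinfo=pytz.timezone("US/Eastern"))
print(wrong.utcoffset())  # offset of -4:56, off by 56 minutes

# Right (in pytz's model): attach the zone with localize()
right = pytz.timezone("US/Eastern").localize(datetime(2020, 1, 1))
print(right.utcoffset())  # offset of -5:00
```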
Doing reviews is also incredibly valuable, IMO, which is why I always take them seriously. Looking at someone’s work and then figuring out what could be improved and how to justify why it’s an improvement teaches you so much about how to produce that kind of work.
You can apply the skills you learn from reviewing to your own work, but you can’t gain quite the same skills by critically reviewing your own work.
I think the first draft of my #PyConfHyd2020 keynote is the best first pass at a talk I’ve ever written, and I feel like implementing the feedback I’ve gotten from the few people who’ve already seen it has made it so much better.
Good reviews are so valuable.
With the right implementation in setuptools, 90% of packages will start cutting releases with reliable dependency metadata without any action needed by their maintainers.
You can ensure that your package will have properly annotated, reliable metadata by either specifying install_requires in setup.cfg or using literals in your setup.py. If you have conditional dependencies, use environment markers:
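For example, a static install_requires in setup.cfg might look like this (the package names and version pins are illustrative, not recommendations):

```ini
[options]
install_requires =
    attrs >= 19.0
    # Environment marker: only a dependency on Python < 3.8
    importlib-metadata >= 1.0; python_version < "3.8"
```

Because these values are literals rather than computed at build time, tools can read them without executing any project code.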
For some background on one problem this solves, see Dustin Ingram’s 2018 article “Why PyPI Doesn’t Know Your Project’s Dependencies”: https://dustingram.com/articles/2018/03/05/why-pypi-doesnt-know-dependencies/
PEP 643 can’t 100% fix this, but it makes it possible for a project to indicate that it doesn’t have this failure mode.
I am happy to announce that I have accepted PEP 643: Metadata for Package Source Distributions, which has the potential to dramatically simplify Python package metadata resolution in the future. https://python.org/dev/peps/pep-0643/
Soon you may be able to build reliable dependency graphs!
Finding old summaries of the history of X, then following any archive links (or just believing the summary) helps somewhat.
pyfound.blogspot.com/2020/11/pip-20-3-new-resolver.html pip 20.3 is out. See https://pip.pypa.io/en/latest/user_guide/#changes-to-the-pip-dependency-resolver-in-20-3-2020 for what's new (including the dependency resolver) and how to migrate. Thanks @ChanZuckerberg and @mozilla for funding!
As an interesting aside, it seems that Python itself was on PyPI at the time (2004-04-13): https://web.archive.org/web/20040413032446/http://www.python.org/pypi?:action=display&name=Python&version=2.3.2
PyPI went online in late 2002, but easy_install wasn’t released until 2004.
Does anyone know how people installed stuff from PyPI before then? Did you download an sdist and unzip it manually?
I don’t even see a download link on this wayback snapshot: https://web.archive.org/web/20031101220800/http://www.python.org/pypi?:action=display&name=docutils&version=0.3
Image #denoising with FFmpeg *without destroying detail*.
1.) Use the nlmeans filter. No, don't even look at hqdn3d unless you want a blurry mess.
2.) Crop out small parts of the image containing noise for testing (nlmeans can be very slow). 640x640 is okay.
3.) Check if you have noise in your chroma via the extractplanes filter.
4.) Set the strength to 2 for heavy noise and 1 for light noise. Higher strengths usually result in detail loss and oil paintings.
5.) Tune the patch parameter. This is your noise period. Get the period wrong and you'll be filtering in the wrong frequencies. If you have noise in your chroma as well, set the `pc` and `rc` options. If your chroma is subsampled, start from half your luma patch size.
6.) Bump up the research window. This is your quality parameter. Higher is generally better.
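Putting the steps together on the command line (filenames and parameter values are illustrative starting points, not recommendations for your footage):

```shell
# Step 2: crop a 640x640 test patch; step 3: dump the U plane to inspect chroma noise
ffmpeg -i input.mkv -vf "crop=640:640:200:200,extractplanes=u" -frames:v 1 u_plane.png

# Steps 1 and 4-6: nlmeans with strength s, luma patch size p,
# chroma patch pc (about half of p for subsampled chroma), and research window r/rc
ffmpeg -i input.mkv -vf "crop=640:640:200:200,nlmeans=s=2:p=7:pc=3:r=21:rc=11" test.mkv
```

Once the cropped test clip looks right, drop the crop filter and run the same nlmeans settings on the full frame.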
You’ll know you’ve made it when you overhear this in a café:
“People were able to Photoshop teeth onto stuff in the past, this is nothing new! Heck, image editing has been around almost as long as images!”
“It’s a matter of scale! Kids today can see anything with human teeth!”
Northern Flicker at my feeder the other day.
These are beautiful birds — and they are even more colorful in flight, because they have yellow-shafted feathers and a yellow underside.
I’ve only seen them at my feeder twice, and they got scared off pretty quickly when they saw me both times.
Programmer working at Google in NYC. Maintainer of python-dateutil, Python core developer and general FOSS contributor. Opinions are my own.