@monnier IANAA but wasn't there https://plaudit.pub ?
@0x0 used to operate a cool paste bin where you could use curl to upload snippets of things you wanted to share with your IRC buddies. Sadly, some AI fuckers decided to abuse the service.
I just started scrolling through the feed. Madness! It's total madness.
A video of the scrolling access log.
https://movsw.0x0.st/notes/ajoros2ny6anlc6s
A message about privacy self-violation: "claude ALSO loves putting people’s real names and email addresses in the UA string because the frontpage tells it to" (plus screenshot)
https://movsw.0x0.st/notes/ajilplw9qzf2k2xj
Screenshot of a message saying: "Some software actually includes contact info in the string in case someone needs to talk to the author, and I think that’s a great idea, especially if you’re an AI agent and can take screenshots of sensitive data and trade secrets! 0x0.st is just the perfect place to upload anything that should stay confidential!"
https://movsw.0x0.st/notes/ajimdj92b3wdv5pq
The gravity of the situation: "…the clankers are uploading pictures of scanned ID cards too"
https://movsw.0x0.st/notes/ajxref6jq64fbqq9
The size of the problem: "i’m not running this site for 200 GB of AI slop and crypto scams every goddamn week"
https://movsw.0x0.st/notes/akb0jtm9iqyv7k09
Last note: "can’t run a service that relies on civilized society to function as long as techbros and californians aren’t hanged for their psychopathy"
https://movsw.0x0.st/notes/akbfe49sp4guk4cj
As somebody who has spent hours and days and weeks fighting the bots, I feel this so much. Sooo much! #ButlerianJihad
@freya I've been inspired by mesh networking ever since I heard about the One Laptop Per Child project.
There's a lot of old hardware and phones in the world.
@phf not to detract from your spot-on sentiment, but bullshit was a derogatory term overused in colonial times to demean Indian practices. As a blessed agrarian society, the dung of the Indian cow was literally sacred as prosperity depended on the versatile manure it became, and that made it a target for the colonizer.
I can't suggest a replacement word, since all animal shit is manure, and human fecal transplants are the epitome of allopathic intervention!
@doctormo are you underestimating the importance most people attach to the birthday reminder service that GOOG/FCBK provide?
@alex at least Bengalis are vocal about it (at least now). The Chinese have Chinese keyboards on laptops, so they are better off whatever the Unicode situation is. Even an ASCII equivalent of Chinese/Bengali/Hindi would do as an input method, and it sort of exists with the ITRANS system (albeit Unicode-based).
@corbet Can we stop using the propaganda language "machine-generated code" for this? It's copyright-laundered code of unknown origin.
@tante correct. burn the planet down from every desktop, now get to it
I’ve read a bunch of posts in the last few weeks that say ‘Moore’s Law is over’, not as their key point but as an axiom from which they make further claims. The problem is: this isn’t really true. A bunch of things have changed since Moore’s paper, but the law still roughly holds.
Moore’s law claims that the number of transistors that you can put on a chip (implicitly, for a fixed cost: you could always put more transistors in a chip by paying more) doubles roughly every 18 months. This isn’t quite true anymore, but it was never precisely true and it remains a good rule of thumb. But a load of related things have changed.
First, a load of the free lunches were eaten. Moore’s paper was written in 1965. Even 20 years later, modern processors had limited arithmetic. The early RISC chips didn’t do (integer) divide (sometimes even multiply) in hardware because you could these with a short sequence of add and shift operations in a loop (some CISC chips had instructions for these but implemented them in microcode). Once transistor costs dropped below a certain point, of course you would do them in hardware. Until the mid ‘90s, most consumer CPUs didn’t have floating-point hardware. They had to emulate floating point arithmetic in software. Again, with more transistors, adding these things is a no brainer: they make things faster because they are providing hardware for things that people were already doing.
This started to end in the late ‘90s. Superscalar out-of-order designs existed because just running a sequence of instructions faster was no longer something you got for free. Doubling the performance of something like an 8086 was easy: it couldn’t even execute one instruction per cycle, and a lot of things were multi-instruction sequences that could become single instructions if you had more transistors. Once you get above one instruction per cycle, with hardware integer multiply and divide and hardware floating point, doubling is much harder.
Next, around 2007, Dennard Scaling ended. Prior to this, smaller feature sizes meant lower power per transistor. This meant that you got faster clocks in the same power budget. The 100 MHz Pentium shipped in 1994. The 1 GHz Pentium 3 in 2000. Three years after that, Intel shipped a 3.2 GHz Pentium 4, which was incredibly power hungry in comparison. Since then, we haven’t really seen an increase in clock speed.
Finally, and most important from a market perspective, demand slowed. The first computers I used were fun but you ran into hardware limitations all of the time. There was a period in the late ‘90s and early 2000s when every new generation of CPU meant you could do new things. These were things you already had requirements for, but the previous generation just wasn’t fast enough to manage. But the things people use computers for today are not that different from the things they did in 2010. Moore’s Law outpaced the growth in requirements. And the doubling in transistor count is predicated on having money from selling enough things in the previous generation. The profits from the 7 nm process funded 4 nm, which funds 2 nm, and so on.
The cost of developing new processes has also gone up, which requires more sales (or higher margins) to fund. And we’ve had that, but mostly driven by bubbles causing people to buy very expensive GPUs and similar. The rise of smartphones was a boon because it drove a load of demand: billions of smartphones now exist, and they have a shorter lifespan than desktops and laptops.
Somewhere, I have an issue of BYTE magazine about the new one micron process. It confidently predicted we’d hit physical limits within a decade. That was over 30 years ago. We will eventually hit physical limits, but I suspect that we’ll hit limits of demand being sufficient to pay for new scaling first.
The slowing demand is, I believe, a big part of the reason hyperscalers push AI: they are desperate for a workload that requires the cloud. Businesses’ compute requirements are growing maybe 20% year on year (for successful, growing companies). Moore’s Law is increasing the supply per dollar by 100% every 18 months. A few iterations of that and outsourcing compute stops making sense, unless you can convince them that they have some new requirements that massively increase their demand.
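The gap those two growth rates open up compounds quickly. A back-of-the-envelope sketch, using the 20%/year demand and 18-month doubling figures from the paragraph above (illustrative numbers only, both normalized to 1.0 at year zero):

```python
def demand_vs_supply(demand_growth=0.20, doubling_months=18, horizon=10):
    """Compare compounding compute demand against transistor supply per
    dollar that doubles every `doubling_months`. Returns a list of
    (year, demand, supply_per_dollar, supply/demand ratio) tuples."""
    rows = []
    for year in range(horizon + 1):
        demand = (1 + demand_growth) ** year
        supply = 2 ** (12 * year / doubling_months)
        rows.append((year, demand, supply, supply / demand))
    return rows

for year, demand, supply, ratio in demand_vs_supply():
    print(f"year {year:2d}: demand {demand:6.2f}x, "
          f"supply/$ {supply:7.2f}x, ratio {ratio:6.2f}x")
```

After five years demand has grown about 2.5x while supply per dollar has grown about 10x, a fourfold surplus per dollar — which is the point being made: on these numbers, the case for renting ever more compute erodes unless demand itself is inflated.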
Time flies! It's already 5 years since the release of Crystal 1.0.
We're going to celebrate that! Meet the Core Team and community members looking back at that milestone, what happened since then, and what's coming next.
You can tag your questions with #Crystal2026AMA
Friday, 27 March 2026 16:00 UTC
Video Call: https://man.as/crystal2026ama-zoom
Live Stream: https://man.as/crystal2026ama
More info: https://forum.crystal-lang.org/t/5-years-anniversary-of-crystal-1-0/8801
@alex @theresmiling Well, wikipedia is not a wiki, even though it does use a wiki engine for its backend, so I think that doesn't count.
I pulled my venerable ThinkPad T450s out of storage and spent the evening puttering with a Debian 13 / Haiku OS dual boot situation.
I am really loving Haiku OS. It's come a long way since I last played with it. This is my first bare metal install as well.
Superb work, @haiku
@janl isn't JS the language LLMs are about the best at?
@davidjamesweir Finally, an honest book about programming.
I still can't get over the fact that my now nearly 80-year-old mom has been on Fedora on a ThinkPad E520 (2011) for a decade and it all just works.
I checked her laptop just now and it's fully up to date on fedora 43, so she's done like 20 version upgrades autonomously too. The battery has degraded a little but the whole thing still works fine and she's very happy with it.
This is how things should be, this is peak computing tbh.
@cy I read about one such outcome in India as well, maybe 20 years ago.
@GhostOnTheHalfShell
pro-libre software, pro-holisticism
pro-communalism, anti-consumerism
fan of #Plan9 and #HaikuOS
anti-witchhunt, see https://stallmansupport.org
I write software (C++) for a living.