
I turn on my PC.

The local wildlife is immediately sterilised by the excessive radiation from my four exposed nuclear reactors running at a critically risky heat output.

The sound of 10,000 cooling fans deafens the city’s morning traffic as my warehouse of 500 parallel-chained AMD Ryzen Threadrippers bursts into life, forming the equivalent processing power of 2015 in one room.

Nearby, my cluster of Nvidia 5090 Test Cards begins to warp the local time continuum as they calculate answers man was never meant to know.

Very gently, I open Microsoft Teams.

Instantly, the already deafening noise of the fans increases to a murderous wail as they try to keep my equipment at operating temperature. One nuclear reactor’s fusion outruns its cooling and it explodes, destroying the lives of millions. The floor begins to melt away as my processors, overclocked ten-fold, reach critical mass and descend directly into hell. My Nvidia cluster collapses into a singularity and begins to devour the planet.

Quickly now, I open a text chat; it’s a bit laggy.

The sheer struggle of loading some text destroys the remaining systems. My equipment and I are deleted from reality by an unknown overseer.

Humanity is not ready for instant messaging

stop doing reference counting

- references were never supposed to be counted
- hours of time spent refcounting with no use outside rust devs spiking compile times

"yes please give me a shitty thing that's worse than a garbage collector" - Statements dreamed up by the utterly deranged

LOOK what Rustaceans have been demanding your respect for all this time, with all the borrow checkers we built for them

let x = 4;
let y = &x;

"Hello I would like &variable please"
They have played us for absolute fools

@stux

This, and people thinking "The Purge" was a good concept, is why we have police.

AI rant 

@2ck

It was the start of a dream really.

There are many other published papers that expanded on this one, even now in 2021. It is the start of machine learning actually being considered practical. Later papers compare convergence rates and error bounds between models, along with limitations on representability.

Before this it was an AI winter. No one took artificial neurons seriously after it was demonstrated that XOR was not even representable with a single layer. Almost everything had to be built directly, and so progress was limited.
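That XOR limitation is easy to check by hand. A small sketch (the weight grid and the particular two-layer construction are illustrative choices of mine, not from the original perceptron papers): brute-force single-layer perceptron weights and find none that computes XOR, then show that two layers suffice:

```python
import itertools

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def perceptron(w1, w2, b, x, y):
    # A single linear threshold unit.
    return 1 if w1 * x + w2 * y + b > 0 else 0

# No single-layer perceptron matches XOR: it is not linearly separable,
# so even a coarse brute-force search over weights finds nothing.
grid = [i / 2 for i in range(-8, 9)]
found = any(
    all(perceptron(w1, w2, b, x, y) == t for (x, y), t in XOR.items())
    for w1, w2, b in itertools.product(grid, repeat=3)
)
print(found)  # False

# Two layers suffice: XOR(x, y) = OR(x, y) AND NOT AND(x, y).
def two_layer(x, y):
    h1 = perceptron(1, 1, -0.5, x, y)       # OR gate
    h2 = perceptron(1, 1, -1.5, x, y)       # AND gate
    return perceptron(1, -2, -0.5, h1, h2)  # h1 AND NOT h2

print(all(two_layer(x, y) == t for (x, y), t in XOR.items()))  # True
```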

@mgrondin

You could try LibreWolf. There are tons of browsers based on Firefox, just with the shady stuff ripped out.

It is like how the Linux kernel includes proprietary blobs, so there are many distros that simply remove these things in their own versions.

AI rant 

If I were to write a book about the most influential theorems of the 1900s, this would definitely make it into the book.

"Approximation by superpositions of a sigmoidal function"

The signal-processing and control applications it envisioned are basically the same thing as deep learning. Later papers show that deep networks are even less limited. But think: all of modern computing and electronics lives in a subset of what machine learning can simulate.

api.semanticscholar.org/Corpus
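Cybenko's result says finite sums of sigmoids are dense in the continuous functions on [0, 1]. A rough illustrative sketch (the steepness `k`, the cell count, and the sine target are arbitrary choices of mine): build near-indicator "bumps" from pairs of steep sigmoids and tile them to approximate a target function:

```python
import math

def sigmoid(t):
    return 1 / (1 + math.exp(-t))

def bump(x, a, b, k=200):
    # Two steep sigmoids make an approximate indicator of [a, b].
    return sigmoid(k * (x - a)) - sigmoid(k * (x - b))

def target(x):
    return math.sin(2 * math.pi * x)

def approx(x, n=50):
    # One bump per cell, weighted by the target at the cell midpoint:
    # a smoothed piecewise-constant approximation built purely from sigmoids.
    total = 0.0
    for i in range(n):
        a, b = i / n, (i + 1) / n
        total += target((a + b) / 2) * bump(x, a, b)
    return total

err = max(abs(target(x / 1000) - approx(x / 1000)) for x in range(1001))
print(f"max error: {err:.3f}")
```

Shrinking the cells and steepening the sigmoids drives the error toward zero, which is the intuition behind the density argument.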

RT @bodil
So many takes today on how severely a WhatsApp outage affects the developing world where the conclusion is "therefore you're a bad person if you want Facebook gone" rather than "how did we let so much essential infrastructure fall into the hands of the worst people possible?"

@quad

I just ignored it. They can't invalidate my purchase just because I don't want to give them free data.

@DeveloperMemes

It has been a while since I messed with Perl 6 though. I liked some of its ideas, but the syntax leaves my brain as soon as I step away from Perl-likes.

"What if we tried designing C a second time?"

submitted by neo1971tq

This paper is great. It is like a good slice of what is going on in formal methods and machine learning right now.

- Search-space convergence methods from AI can, and do, outperform the brightest algorithm writers, especially when the time complexity is high. And inaccuracy is easy to control for.

- It is all just programming. It is research-level, but still. ML algorithms can be used easily in code logic, if there is a layer to handle uncertainty.

- Many of the referenced papers are about software that generates code without programmers. Obviously computer vision is going to beat any human, but the best tech for other domains is referenced here. Much of it is about indirect code reasoning.

- Real data is not needed for supervised ML research to progress. This dataset, CLEVR, is synthetic. Sometimes richer synthetic datasets make a better playground.

pages.cs.wisc.edu/~aws/papers/

"Android phones collect more data by volume, but iPhones collect more types of data, a study finds"

tomsguide.com/news/android-ios

@dqn

Thousands of years of civilization and so few seem to completely own property. :/

I hope I can be as creative as Nikola, dreaming up cellphones before we even had digital computers. :blobwizard:
