@joeyh @evangreer Indeed. We have no algorithm; such a ruling would be bad for Google, Facebook, and Twitter, but very good for the fediverse.
Without algorithms driving engagement numbers, Trump would never have become President, and I think a lot of Internet folks don't want to acknowledge just how much death can be attributed to the immunity Section 230 provides.
@ocdtrekkie @joeyh@octodon.social @evangreer If you think this site doesn't have any algorithms, you really don't understand how any of this works.
@LouisIngenthron @joeyh @evangreer I do understand how any of this works. =)
There is no recommendation algorithm here. There are algorithms in the sense that "any math problem is technically an algorithm", but not in a way that would be legally claimable as "recommending content". Even in the case of "trending" mechanisms, the mechanism is publicly inspectable, and not profit-driven.
@LouisIngenthron @joeyh @evangreer That last bit is what, unfortunately, Section 230 defenders completely miss: Section 230 *is not used to do good faith moderation*. It's used to do *profit-motivated* moderation.
Tech companies moderate content in ways that make money, not protect people. And that's why 230 *has to go*. Because it turns out, radicalizing an entire half of the US is very, very profitable.
@ocdtrekkie @joeyh@octodon.social @evangreer I remember when people understood that shooting the messenger was a bad idea.
If you think YouTube is the necessary catalyst to radicalization, you have a lot of pre-YouTube history to explain.
The problem isn't tech: it's human nature. And you're not going to fix that by restricting speech on the internet. You're going to make it worse. You're going to make it less visible, festering, hidden.
Social media, like any communication medium, brings that darkness out into the light where it can be fought with counter-speech.
The answer is education, not muzzles.
@LouisIngenthron @joeyh @evangreer My Mastodon server falls under German law, there's zero Section 230 protection here! And yet, most people would argue mastodon.social is one of the looser-moderated servers on the generally-accepted fediverse! It isn't inundated with frivolous lawsuits about any which person having said any which thing.
You are buying a complete fabrication by extremely harmful companies trying to prevent being responsible for their actions.
@ocdtrekkie @joeyh@octodon.social @evangreer So you wouldn't mind if I made a bunch of complaints to the German government about your server hosting Nazi content (not that I would do so, but just for the sake of argument)? Or signing up a bunch of accounts and actually posting such content and then reporting you to the government? How often would that have to happen before you gave up on hosting?
In America, the Nazis are the ones responsible for their content, not the people trying to run a website in good faith.
@LouisIngenthron @joeyh @evangreer I mean, it's not my server. But I'd be really interested if you could prove that the Internet couldn't function in any country but the United States because of this one bad law. The idea that you apparently believe that just seems hilarious.
And the problem is, Google and Facebook and Twitter AREN'T RUN IN GOOD FAITH! That's the whole point of why 230 has to go.
@ocdtrekkie @joeyh@octodon.social @evangreer They already are responsible for the actions they willfully take when operating their service. That just doesn't make them responsible for the content of others just because of how they choose to order it.
@LouisIngenthron @joeyh @evangreer Incorrect. The entire case at hand is about whether or not Google and Twitter can be held responsible for *their actions in handling the content* as opposed to the content itself. This is about the actions they take as service operators.
@ocdtrekkie @joeyh@octodon.social @evangreer No, it's trying to hold them responsible for others' content because of their actions.
I'm all for them being held responsible for their own speech, but choosing the order in which third-party speech is presented shouldn't shift the liability.
@LouisIngenthron @joeyh @evangreer We're not talking about a bubble sort here. We're talking about a system that goes "hey, this person seems to be susceptible to fringe political positions, let me push terrorism recruitment on them!"
@ocdtrekkie @joeyh@octodon.social @evangreer Wait, you think they're intentionally trying to radicalize people?
@LouisIngenthron @joeyh @evangreer They operate a psychopathic mechanism solely designed to keep people watching, regardless of the impact on the individual or society.
Section 230 means they don't have to care about the impact, all they need to care about is maximizing advertising profit, and that means increasing eyeball time at any cost.
Every other company has to handle legal liability for their actions.
@ocdtrekkie @joeyh@octodon.social @evangreer Their "actions" are simply giving people what they ask for.
Should we also make McDonald's liable for diabetes?
@LouisIngenthron @joeyh @evangreer They didn't ask to be recruited into ISIS. YouTube's algorithm discerned it could increase watch time by presenting ISIS videos to them. This is like "videos pushed to users' front page" stuff, not search results.
@ocdtrekkie @joeyh@octodon.social @evangreer Really? Because I've watched quite a few YouTube videos and I've never seen an ISIS recruitment video. Have you?
Seems to me that someone has to be searching or watching something pretty ISIS-adjacent for the algorithm to even offer that.
@ocdtrekkie @joeyh@octodon.social @evangreer Also, even if you were shown an ISIS recruitment video, would that make you want to join ISIS?
Or would you already have to have some sympathy for their cause before even considering that?
@LouisIngenthron @joeyh @evangreer Radicalization is a process of sliding the window over, bit by bit. Maybe they start with information about the COVID vaccine, and spend a bit too much time with a crazy theory. Next they're getting videos about Bill Gates and what "the Jews" are up to. And then it keeps going downhill.
But it sure keeps those view counts going up!
@ocdtrekkie @joeyh@octodon.social @evangreer You're proving my point. These people are seeking out this information.
@LouisIngenthron @joeyh @evangreer They weren't seeking to become a terrorist when they were curious if the vaccine was safe.
YouTube goes for maximizing engagement. Handing out little carrots to increasingly outrageous ideas.
And it goes well beyond just foreign terrorism considered in this case: Section 230 is squarely to blame for the mass shooting problem and the election of Donald Trump. Our democracy depends on repealing this bad law.
@LouisIngenthron @joeyh @evangreer Let me take a different tack: If I check your profile, you've boosted several people's comments about Section 230. Are you aware how many of them are Google-funded?
@ocdtrekkie @joeyh@octodon.social @evangreer I'm aware of the allegations you're preparing to spew, yes.
@ocdtrekkie @joeyh@octodon.social @evangreer Really? See, I thought the assholes pulling the triggers were responsible for mass shootings. Silly me.
It's weird how humans have no agency in your world, but corporations do.
@LouisIngenthron @joeyh @evangreer It's important to understand both individuals and the systems and influences that push them around.
For instance, if we only consider individual agency, why do we concern ourselves with who funnels money into political lobbying groups at all? People are still just... going to vote what they believe, right?
@ocdtrekkie @joeyh@octodon.social @evangreer People are, yes.
But politicians may not because politicians wield power and power corrupts.
@LouisIngenthron @joeyh @evangreer So there's no point in political advertising at all? All the billions spent on TV commercials are for nothing?
@ocdtrekkie Of course influence is a factor. But it's not mind control.
Just because McDonald's advertises to me doesn't mean it's not my decision whether or not to eat there. It doesn't make them liable for my choices.
@LouisIngenthron And nobody's trying to charge Google with murder here. They're trying to charge Google with aiding and abetting, with influencing and contributing to a terrorist act. And there's no doubt that YouTube's profit-maximizing algorithm helped recruit a lot of terrorists.
@LouisIngenthron @joeyh @evangreer Removing Section 230 doesn't restrict speech online. Weird blanket immunity for tech companies is a solely United States thing, nobody else does it, and free speech is, strangely, not solely for American Internet servers.
Removing Section 230 will require companies be responsible for the actions they willfully take when operating their service. Every apocalyptic line of bull you've read beyond that is false.