Has anyone considered the impact of #section230 on Mastodon clients and hosts?
As a U.S.-based developer with a U.S.-based product (@trunksapp), I will likely cease operating this client if Section 230 is weakened or neutered in a meaningful way.
I don't have the desire, the will, or the resources to deal with the possibility of being held liable for the content you see (or don't see) from other users.
If you think a client couldn't be held liable, I'd suggest reading some of the arguments being made in the Supreme Court case.
It can be inferred (an extreme reading, admittedly) from the arguments presented before the Supreme Court that any algorithm that surfaces offensive content could be treated as the client's or developer's own speech, regardless of where the content is hosted.
I'm not a lawyer, but the possibility of that is enough of a chilling effect that I'd rather not deal with the hassle.
I feel the need to share this since there seems to be a gap in understanding of what an algorithm is.
An algorithm isn't a recommendation engine; an algorithm is just any set of steps or rules followed to produce a result.
Like what happens when you click the "login" button in this app. Or how to divide two numbers. Or even as complex as how to show you things you might be interested in.
If you litigate the consequences of an algorithm showing you content, that potentially covers anything in software.
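To make that concrete, here's a minimal sketch (hypothetical types, not actual Trunks code) of the most boring thing a client does: sorting a home timeline newest-first. Under the broad definition above, even this is an algorithm, with no ranking or recommendations involved.

```swift
import Foundation

// Hypothetical post type, for illustration only.
struct Post {
    let id: String
    let createdAt: Date
}

// Sort a timeline newest-first. No ranking, no recommendations,
// no engagement signals: just a fixed rule applied to inputs to
// produce an ordered result, i.e. an algorithm in the plain sense.
func reverseChronological(_ posts: [Post]) -> [Post] {
    return posts.sorted { $0.createdAt > $1.createdAt }
}
```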
@anders @trunksapp "algorithm" isn't used in the statute at all. The relevant term in the law is "interactive computer service," which is actually even broader.
@trunksapp @anders Are you referring to the Gonzalez argument regarding recommendation algorithms, or something else?
@LouisIngenthron It's not just recommendation algorithms; it touches on presentation algorithms as well.
@LouisIngenthron @anders The arguments before the court talk about algorithms; go read them.