Olives boosted

Open Rights Group fights to protect digital rights 🦾

Becoming a member helps us to campaign, lobby, go to court – whatever it takes to challenge restrictions to our human rights.

Join the movement today ⬇️

#DigitalRightsAreHumanRights

openrightsgroup.org/join/

Television is a broadcast medium, often with a more limited number of channels. Also, depending on how many TV sets someone has, they might have to watch whatever they're watching with a bunch of other people.

The Internet is a marketplace with a potentially limitless number of people offering content, and someone can look for whatever content they like (and avoid that which they don't).

Between Australia and the U.K., the U.K. is actually the worse one. Australia's problem is more one of archaic standards, but the U.K. actively looks to create new problems.

For instance, one of the proponents of internet censorship in Australia went on about *television* from decades ago, and even how television is pre-recorded. He seemed confused as to what this new technology was.

I think Mother Run 3D has been banned before, so I don't know why it appeared again. Maybe they pushed a new version, which got assessed and then banned too.

Olives
More games banned by Australia.
https://www.refused-classification.com/censorship-timelines/game-iarc/

Mother Run 3D. A game about eating vegetables and giving birth to babies.
https://play.google.com/store/apps/details?i...

Some sort of scientific VR drug game was banned for drug references.
usagixr.com

All they do is look at what minorities are doing, play that up as something spooky, then punish them more harshly. How is that racist?

It's not like the U.S. could possibly be racist. Not when the government passed laws where one group of people was punished much more harshly for doing the same thing.

It's ironic when someone makes a meme making fun of people who scapegoat minorities, and then someone replies in a ridiculous fashion that basically scapegoats minorities, essentially proving the point of the person who made the original post.

"Twitch claims their policies haven't changed."

Big Tech companies (Twitch is owned by Amazon) often have two sets of policies. There's a vague public set of "guidelines" which exists to justify whatever decision they've made.

And then, there are the more detailed internal policies which are distributed to moderators.

I can't speak for Twitch specifically but there are other companies which are like this.

And hey, I think their rules are silly and censorious, I'm not defending them.

Of course, someone might then ask a question like "what to do if an index from the origin server is corrupt?" As I said, this one is just an idea off the top of my head; it's here to prompt thinking about how it might be done.

Practically speaking, it might be done differently. For instance, one server? Multiple servers? How might it work? What if a server or set of servers fails?
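
Below is a minimal sketch of how the failure handling raised here could look, in TypeScript, assuming the gateway/indexer split described in the post below. Everything in it (the /query endpoint, the entry shape, the function names, the set of indexer URLs) is a made-up assumption for illustration, not an existing fediverse API: the client just walks through a set of indexers and moves on to the next one when a request fails or the response looks bad.

```typescript
// Hypothetical sketch: query a set of indexers, falling back to the next one on failure.
// The /query endpoint, the entry shape, and every name here are assumptions.

interface CompactEntry {
  objectUrl: string; // pointer to the real post; the indexer stores no content itself
  tags: string[];
}

async function queryAnyIndexer(
  indexers: string[],  // e.g. a load-balanced set picked by the client app
  interests: string[], // what the user subscribes to
): Promise<CompactEntry[]> {
  let lastError: unknown;
  for (const base of indexers) {
    try {
      const res = await fetch(`${base}/query`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ interests }),
      });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return (await res.json()) as CompactEntry[];
    } catch (err) {
      lastError = err; // this indexer is unhealthy or its index is bad; try the next one
    }
  }
  // Every indexer failed; the caller could fall back to asking its own gateway directly.
  throw new Error(`no indexer responded: ${String(lastError)}`);
}
```

A corrupt index could be handled the same way: if a response fails validation, treat it like a network error and move on to the next indexer (or re-fetch from the origin server).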

I dunno how content curation could work. One idea off the top of my head might be to have servers which act as gateways and servers which act as indexers.

An individual server might have some sort of compact index which the indexer could index. The indexer wouldn't store the actual data, so as to save as much disk / database space as possible.

The user could have an app (maybe a web app) which queries a set of indexers (a set, for load balancing purposes) for public posts, and queries the gateway for visibility-restricted posts (and anything else, if the index isn't adequately complete). The query might denote something like which interests they subscribe to, and maybe also negative prompts (say, someone doesn't like AIArt).

They could then hand the results to the gateway to get the actual data.

The idea is that rather than having every gateway maintain a complete index of the fediverse (or the accompanying data), the logic of figuring out what is on the fediverse could be off-loaded in a more efficient manner to an indexer. Alternatively, a server could do both, if resources permit.

This would still be fairly decentralized, although I suppose that someone might prefer the model of each server being its own unit of decentralization. If an indexer, or a set of indexers, isn't functioning properly, pick another one / set. I dunno.
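
To make the gateway/indexer split above a bit more concrete, here is a rough TypeScript sketch. The endpoints (/query, /objects), the request and entry shapes, and the URLs are all hypothetical, not anything that exists today; the point is only that the indexer returns compact pointers while the gateway is the one that fetches (and access-checks) the actual posts.

```typescript
// Hypothetical sketch of the indexer/gateway split; none of these endpoints exist.

interface IndexQuery {
  interests: string[]; // topics/tags the user subscribes to
  exclude: string[];   // "negative prompts", e.g. ["AIArt"]
  limit: number;
}

interface IndexEntry {
  objectUrl: string;   // where the gateway can fetch the real post from
  tags: string[];
  publishedAt: string;
}

// The indexer only ever returns compact entries, never full post data,
// which keeps its disk / database footprint small.
async function queryIndexer(indexerBase: string, q: IndexQuery): Promise<IndexEntry[]> {
  const res = await fetch(`${indexerBase}/query`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(q),
  });
  if (!res.ok) throw new Error(`indexer returned HTTP ${res.status}`);
  return (await res.json()) as IndexEntry[];
}

// The user's gateway resolves the compact entries into actual posts, and is
// also where visibility-restricted content would be handled.
async function resolveViaGateway(gatewayBase: string, entries: IndexEntry[]): Promise<unknown[]> {
  const res = await fetch(`${gatewayBase}/objects`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ urls: entries.map((e) => e.objectUrl) }),
  });
  if (!res.ok) throw new Error(`gateway returned HTTP ${res.status}`);
  return (await res.json()) as unknown[];
}

// Example usage (hypothetical URLs): subscribe to #gamedev, filter out AI art.
async function example() {
  const entries = await queryIndexer("https://indexer.example", {
    interests: ["gamedev"],
    exclude: ["AIArt"],
    limit: 50,
  });
  const posts = await resolveViaGateway("https://gateway.example", entries);
  console.log(posts.length, "posts resolved");
}
```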

Part of the "content problem" might not even be a problem of users or content. It might just be basic content curation / collection. Right now, it is a bit of a hose pipe. Information is indiscriminately pulled in, if anyone shows a sliver of interest in it. Conversely, information that might be interesting is not pulled in, if someone doesn't already know about it.

Practically speaking, there is no way to subscribe to a particular related interest or anything. And that is one of the things mainstream social media kind of does. They guess at what someone is interested in, they guess at what content they might like, and they serve that. Of course, this model has its flaws.

The fediverse is kind of anti-social-media in design because it goes out of its way to *not do that*. It's not that particular users or content don't exist on social media. They do. But, generally, you might not see them, and they might be buried by what someone is *actually* interested in finding. Likewise, particular types of users might engage with their own content, and that might keep them busy, rather than the subject becoming some sort of drama.

Then again, that is just a theory.

What I suspect happens is that someone markets the fediverse as some sort of "safe space" (rather than a "pick your own server" social network*), then someone comes in with that expectation, and that leads to more toxicity than is really needed.

* An idea which is interesting in and of itself.

Even if marketing it as a "safe space" gets a few more users, the so-called growth is not worth the toxicity it creates. Is indiscriminately trying to pull in more users even what is really wanted? Chances are it is never going to be Facebook.

"how to deal with block list mania"

For starters, stop marketing the fediverse as some sort of "safe space" unique among social networks, as if that is ever a thing.

At the very least, the Recall thing which Microsoft wants to add should be opt-in (and the user should understand how it works). Don't show lots of prompts to try to get the user to turn it on.

One of the cardinal rules of privacy is that someone is supposed to *minimize* the amount of data held. This is the exact opposite.

So, that is already one red flag against it.

While it is true that there might be stuff like cached data, and maybe that data should be easier to manage, this is also far more indiscriminate in what it collects.

The utility of it at this time is also largely hypothetical, although even supposing it is plausibly useful to someone, it is very over-engineered for the utility it actually provides and it's not clear why someone needs to use *this particular tool*. But, you know what, let's give it the benefit of the doubt momentarily in this post.

It is a treasure trove of information. If someone lives with an abuser (or with some sort of abusive dynamic going on), that might be troublesome.

It could be legally hazardous. What if someone inadvertently encounters illegal content? What if someone does moderation? Is that content going to be "saved"?

How might the data be managed? Can it be managed? Is it clear what is in it?

Someone tried to explain to me how a location tracking app is supposedly good, but the more I heard, the creepier it sounded.

I could be more specific with A"I" but that would look weird.
