
Join @WoodhullFreedom for a free conversation on how censorship has impacted adult performers, how porn can be beneficial for sex education, and how it affirms sexual freedom. eventbrite.com/e/censored-plea

We signed the @dontdelete.art manifesto along with 1,800 artists and institutions because censoring art from social media is censoring constitutionally protected self-expression. Join us and sign here: www.dontdelete.art.

From Japanese nonprofit NPO Uguisu Ribbon, three new author talks by @JudithLevine, @jilliancyork, and Nadine Strossen on pornography, sex, children, and human rights. youtube.com/playlist?list=PLxK

The dangerous, anti-encryption #EarnItAct passed out of committee this morning. We know this bill: it's back from the dead to restrict the internet and make everyone less safe online. Thanks to 🐦SenAlexPadilla for entering a letter from 133 human rights orgs into the record.

Twitter has banned @VirPedSpeaks, one of the platform's last bastions for non-offending MAPs. This is despite years of warnings from child protection experts that such bans make it harder to prevent child sexual abuse.
Read an open letter from 2018: prostasia.org/wp-content/uploa

@FSCArmy's Capitol Hill Fly-In is your opportunity to meet with members of Congress or their key staff to communicate how financial discrimination affects your business and the adult industry. action.freespeechcoalition.com

"Reporting numbers to the CyberTipline program sponsored by the National Center for Missing and Exploited Children (NCMEC), however, indicate that adult content prohibitions have little to do with image-based sexual abuse prevention." ynot.com/does-banning-adult-so

Does Banning Adult Content on Social Media Sites Prevent CSAM?

WASHINGTON — The National Center on Sexual Exploitation (NCOSE) sent a joint letter to Reddit demanding that the platform remove and censor thousands of subreddits hosting hundreds of millions of NSFW images, GIFs, and videos that are otherwise legally produced by consenting adults. NCOSE's letter was signed by 320 "experts" accusing Reddit of failing to prevent image-based sexual abuse material, including child sexual abuse material and non-consensual intimate imagery. The letter, however, offers no policy recommendations for updating the platform's terms of service to prevent that material; it simply calls for removing all adult content, even content that is legal and consensual.

This is the latest flashpoint in a broader push to block adult content on social media sites. Reddit is one of the most popular social networks and has long been a relatively speech-friendly platform for a wide range of viewpoints and subject matter, including adult posts, videos, and images. But a recent trend toward censoring sex-friendly networks has led to a significant crackdown. Partly because of the highly controversial FOSTA-SESTA, signed into law by former President Donald Trump, sex workers report that Reddit has de-platformed them time and again. The trend follows other platforms' removal of adult content, most notably Tumblr's in 2018. Nor is Reddit the first platform to face accusations of hosting widespread image-based sexual abuse.

Much of the justification for these campaigns to censor legal adult content is tied to efforts to counter CSAM circulation on some of the world's most popular sites. The social networks owned by Meta Platforms Inc., Instagram and Facebook, have long had terms of service that prohibit adult content and pornography. Reporting numbers to the CyberTipline program sponsored by the National Center for Missing and Exploited Children (NCMEC), however, indicate that adult content prohibitions have little to do with image-based sexual abuse prevention. The Meta-owned platforms made over 22 million CyberTipline reports to NCMEC in 2021, the vast majority of all reports that year. As Vice News points out, Reddit transmitted only 10,059 reports to NCMEC in the same period. NCMEC data for 2022 indicates that Reddit submitted 52,592 reports.

Imgur, a popular host for images shared on social media sites like Reddit, recently faced a similar controversy. Imgur updated its terms of service to ban uploads of new adult content and will also remove years of existing sexually explicit material, mainly virtual pornography and consensual adult material, from its site. The move caused a stir, including among subreddit communities with millions of members, such as r/Gonewild (4.4 million members), r/NSFW (4 million), r/Sex (2.4 million), and r/RealGirls (3.6 million), whose members commonly use Imgur to host images and GIFs. Imgur will remove petabytes of content that, according to some sources, would "break" entire web communities.

But does banning adult-oriented and sexual content prevent CSAM? In the CyberTipline datasets for both 2021 and 2022, Imgur reported to NCMEC even while allowing nudity on its platform.

In no way am I suggesting a direct connection between CSAM reporting numbers and the terms of service of networks that permit or ban adult content. The point is to understand everything that can account for harm reduction in site policy.


Corporal punishment is child abuse. And it's legal in many parts of the world, including in many schools.

Abolish corporal punishment worldwide now! end-violence.org/sites/default
