The Guardian has published an article claiming that, beyond Gab, the Fediverse contains dozens of instances that use peer-to-peer technology and open source software to engage in far-right politics and to promote conspiracy theories and neo-Nazism. The article also cites an "expert" called Megan Squire, who claims that mainstream open source culture has long been an embodiment of "extreme misogyny", "full of toxicity towards society" and "abusive of everyone". It points out that, unlike giants such as Facebook and Twitter, social platforms like Pleroma, Mastodon and Matrix cannot be censored because of their decentralized nature. In the Guardian's view, this is a very serious problem, because it means that authorities and tech giants cannot censor speech. The Guardian implies that to fight "hate speech" and "Nazism", speech must be placed under the control of governments and the big platforms, and that these decentralized platforms must not be allowed to exist.
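For context on what "decentralized" and "federated" mean mechanically, here is a minimal sketch, not taken from the article, of how one Fediverse server discovers an account hosted on another: a WebFinger lookup followed by fetching the ActivityPub actor document. The handle used, and the choice of Python with the requests library, are illustrative assumptions only; real servers like Mastodon and Pleroma do considerably more (signatures, caching, delivery queues).

```python
# Minimal sketch of Fediverse "federation" in practice: one server (or client)
# discovering an account hosted on a completely independent server. Assumes the
# remote server implements WebFinger and ActivityPub, as Mastodon and Pleroma do;
# "gargron@mastodon.social" is only an illustrative handle.
import requests


def resolve_actor(handle: str) -> dict:
    user, domain = handle.lstrip("@").split("@")
    # Step 1: WebFinger lookup maps user@domain to an ActivityPub actor URL.
    webfinger = requests.get(
        f"https://{domain}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{domain}"},
        timeout=10,
    ).json()
    actor_url = next(
        link["href"]
        for link in webfinger["links"]
        if link.get("rel") == "self"
        and link.get("type") == "application/activity+json"
    )
    # Step 2: fetch the actor document; its inbox/outbox are the endpoints
    # other servers push to and pull from when they federate posts.
    return requests.get(
        actor_url,
        headers={"Accept": "application/activity+json"},
        timeout=10,
    ).json()


if __name__ == "__main__":
    actor = resolve_actor("gargron@mastodon.social")
    print(actor.get("inbox"), actor.get("outbox"))
```

Any server can perform this lookup against any other; there is no central registry to revoke, which is the property the whole dispute below turns on.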
"Beyond Gab’s ambiguous place in the fediverse, the Guardian found dozens of servers using peer-to-peer, open source tools, which were either exclusively or disproportionately devoted either to far-right politics, or to conspiracy theories that mainstream social media services have previously cracked down on, including coronavirus denialism, “incel” culture and neo-Nazism."
"Megan Squire is a professor of computer science at Elon University who has published research on both the far right and open source software communities. She says that “the dominant open source culture historically has been one of extreme misogyny, unfounded meritocracy, toxicity and abuse of everyone,” and that Smith is one of those resisting efforts to change that culture."
"Some open source communications platforms go a step beyond this, and do away with the need for servers altogether by implementing a “peer-to-peer” network. PeerTube, for example, allows users to browse and watch videos in a similar way to YouTube, but instead of streaming it to users from a central server, each user watching a video acts as a relay point.
The technical details are perhaps less important than the practical effect: no one has authority over these platforms: no one owns them. While governments and users can place pressure on the big social media companies to ban problematic users or communities, for better or worse, no one can stop anyone creating their own servers or peer-to-peer networks.
These technologies, then, are effectively uncensorable. According to a report by Emmi Bevensee, the co-founder of research consultancy Rebellious Data and the social media monitoring tool SMAT, extremists have been advocating, and even developing them, for years."
https://amp.theguardian.com/world/2021/mar/12/far-right-open-source-technology-censorship
https://poa.st/objects/3c198fd5-7927-4a14-acce-4b9c74b2f8c5
People may read this report differently. I don't have time to rebut it point by point, so I'm posting the original text for everyone to judge for themselves; anyone interested is welcome to translate the article:
Far-right supporters move to open source to evade censorship
A suicide and a strange bitcoin bequest have opened a window on to the new frontier of extremist online media
Fri 12 Mar 2021 10.10 GMT
On 8 December last year, a Frenchman called Laurent Bachelier gave away a total of 28.5 bitcoins – worth $556,000 – to 22 people. On the same day, he killed himself.
In suicide notes written in French and English, he explained that the burden of illness (he suffered from a neurological pain disorder) and his loss of hope for the future had led him to despair. After railing against the decline of western civilization and attacks on free speech, he wrote that he had decided to “leave his modest wealth to certain causes and people”.
Allusions to the “14 words” slogan used by white supremacists offered a clue as to the causes he favored. The beneficiaries of Bachelier’s largesse were all either prominent far-right agitators, or platforms offering them a home. The donations immediately attracted the attention of cybersecurity researchers, extremism watchers and law enforcement officers.
Bachelier gave the video platform BitChute two bitcoins (in January, the price of a single bitcoin ranged between $30,000 and $40,000). The neo-Nazi website the Daily Stormer got one, the French Holocaust denier Vincent Reynouard got 1.5, and the US white nationalist celebrity Nick Fuentes, an attendee of the riots in Charlottesville and the rally that preceded the storming of the Capitol in Washington, received 13.5 – worth over $450,000.
A Guardian investigation can now reveal that one of the lesser-known beneficiaries is a YouTube influencer of sorts – one with a history of promoting far-right political ideology. Luke Smith, now a Florida resident, maintains a monetized YouTube channel with 109,000 subscribers. He received at least one bitcoin from Bachelier, valued at the time of writing at just over $30,000.
It’s possible that Bachelier saw in Luke Smith a like mind and a shared purpose. Beyond their common ground in far-right politics, each saw technology as a weapon in their war against liberal, tolerant societies.
Like Bachelier, Smith eschews so-called proprietary software – like MacOS or Microsoft Word – and communications tools like Facebook or Twitter, built and controlled by Silicon Valley firms. Instead, Smith is an advocate for so-called “open source software” – the kind that makes it possible to use, copy, redistribute and modify software legally. And recently, he has been promoting communications platforms that might help extremists to operate beyond the reach of censorship – and even the law.
What Smith preaches: a war against the modern world
The man being funded by Bachelier’s donation likes to present himself as a latter-day Ted Kaczynski – the so-called Unabomber, whose infamous manifesto Smith has at times earnestly recommended to his followers.
Kaczynski, a terrorist still imprisoned for a 17-year bombing campaign that killed three and injured 23, was motivated by a hatred of the modern technological world. In recent years, his apocalyptic account of an industrial civilization on the brink of collapse has resonated with rightwing extremists – including the Christchurch mosque murderer, Brenton Tarrant – who describe themselves as “eco-fascists”.
In 2019, Smith said in a video he wanted to live in a “Unabomber cabin” to escape the surveillance and censorship which he believes is especially aimed at the far right. In a post on his blog in the same year – since deleted – he described the modern world as one “where your every action is watched, if you use proprietary software and communicate only via social media services”.
The fantasy of the US splintering along ethnic lines has long been entertained by white nationalists
Public records show that Smith moved to a rural property that year near Mayo, in northern Florida, whose title is held by a family member. Since then, most of his videos have been recorded in and around the property.
In various videos and podcasts, Smith rehearses other ideas associated with the far right. He advocates breaking the US up – potentially into racial enclaves “maybe [by] dividing by states, maybe [by] dividing by ethnic groups”. The fantasy of the US splintering along ethnic lines has long been entertained by white nationalists, who have taken to calling themselves the “Balk Right”.
This is not the only place where Smith touches on ideas associated with white nationalism. In a 2018 podcast, he offers an account of human history that relies on arguments made in The 10,000 Year Explosion, described by the Southern Poverty Law Center as a white nationalist book. Smith also directed readers to websites like radishmag, where readers are asked to “reconsider” slavery and lynching is painted in a positive light.
Luke Smith did not respond to repeated requests for comment.
Taken together, these beliefs come back to another far-right splinter ideology: the neoreactionary movement, which in the last decade has been enjoying an online renaissance of sorts, especially among some of Silicon Valley’s tech elite.
The birth of the neoreactionary movement
The neoreactionary movement traces its history to 2007, when the Silicon Valley entrepreneur Curtis Yarvin started a popular blog under the pseudonym Mencius Moldbug. He used it to attack liberalism, democracy and equality, discussed racial hierarchy in the euphemistic terms of “human biodiversity”, and counseled followers to simply detach themselves from the society ruled by the institutions of liberalism.
Journalist Corey Pein wrote an account of the culture of Silicon Valley which, in part, examines the influence that Yarvin’s ideas had in the tech world. Pein says that while neoreactionary ideology is somewhat incoherent, what is consistent is the members’ commitment to extricate themselves from liberal democracy. This “exit” doctrine was influential among some Silicon Valley leaders, including the tech billionaire Peter Thiel, who once memorably said: “I no longer believe that freedom and democracy are compatible.”
Smith follows the same ideological path. His principal outlet for these ideas is his YouTube channel, where he offers tutorials on how to use austere open source software applications, encouraging viewers to detach themselves from Silicon Valley’s products. The channel is both relatively successful and lucrative, and followers rate him highly. His videos have had more than 18.7m views, meaning he could earn anywhere up to $31,100 a year from his channel on current numbers.
Smith has been pushing users in the direction of decentralized social media platforms in the so-called 'fediverse'
YouTube confirmed that Smith’s channel remained in their partner program, meaning that he continues to earn money from the channel, but that they had removed one video, featuring racial slurs, which the Guardian had asked about.
Media representatives for Google responded to requests for comment with their own request for clarification of questions about Smith’s channel and their community guidelines, but ultimately offered no comment.
Smith has lately been pushing users in the direction of decentralized, resilient social media platforms in the so-called “fediverse”, a network of independent social media sites that communicate with one another, and allow people to interact across different sites. This could allow far-right activists to operate in ways that make them very difficult to shut down.
Though many prominent programmers and advocates in both the wider open source software movement and the fediverse are motivated by progressive, anti-corporate or anti-authoritarian political ideals, now the tools they have created might be used to shelter far-right extremists from the consequences of their hate speech and organizing.
Manipulating the open source movement for nefarious ends
The free and open source software movement has attracted many people with progressive politics, who have used it to help provide digital tools to those with few resources, to breathe new life into hardware that might otherwise have been added to a growing mountain of e-waste, or to move public institutions from Barcelona to Brasília away from dependence on expensive software.
However, experts say that it is not surprising that someone like Smith would be tolerated or even welcomed by some elements of open source culture.
Megan Squire is a professor of computer science at Elon University who has published research on both the far right and open source software communities. She says that “the dominant open source culture historically has been one of extreme misogyny, unfounded meritocracy, toxicity and abuse of everyone,” and that Smith is one of those resisting efforts to change that culture.
In recent years, and especially since the Gamergate movement intensified scrutiny on toxicity in tech, some responded to the blatant sexism, antisemitism and racism online with codes of conduct after realizing this behavior was actually starting to hurt them (Squires says they couldn’t recruit and retain developers).
The provision of safer online spaces for marginalized groups is a large part of the motivation of many of the people who have created the underlying software. On those platforms, tools for moderation and easy ways to flag sensitive content are baked in by design. But Smith is among a small group who repeatedly rail against the introduction of such codes of conduct within open source projects.
Some open source communications platforms do away with the need for servers by implementing a 'peer-to-peer' network
In a video recorded a week after the Capitol riots, when social media bans were removing rightwingers from Donald Trump down to prevent further violence, Smith said that those who wanted to bypass censorship should use the Twitter-like platform, Pleroma.
Open source software like Pleroma, Mastodon and Matrix reproduce the functions of Twitter, allowing users to send out brief messages to followers. But their implementation and structure are much more decentralized, allowing anyone to set up their own platform on their own server, after which they can join up, or “federate”, with other such communities.
Some open source communications platforms go a step beyond this, and do away with the need for servers altogether by implementing a “peer-to-peer” network. PeerTube, for example, allows users to browse and watch videos in a similar way to YouTube, but instead of streaming it to users from a central server, each user watching a video acts as a relay point.
The technical details are perhaps less important than the practical effect: no one has authority over these platforms: no one owns them. While governments and users can place pressure on the big social media companies to ban problematic users or communities, for better or worse, no one can stop anyone creating their own servers or peer-to-peer networks.
These technologies, then, are effectively uncensorable. According to a report by Emmi Bevensee, the co-founder of research consultancy Rebellious Data and the social media monitoring tool SMAT, extremists have been advocating, and even developing them, for years.
The reason I want it as a trans anti-fascist is the same reason that a Nazi wants it; we just have opposite ends
“Every marginalized community knows what it’s like to be systematically deplatformed”, says Bevensee, who uses non-binary pronouns, pointing to the way in which groups such as sex workers have adopted platforms like Mastodon after finding themselves unable to advertise their services.
But as Bevensee’s report shows, peer-to-peer platforms are a double-edged sword. “The reason I want it as a trans anti-fascist is the same reason that a Nazi wants it; we just have opposite ends,” they explain.
“You know who really doesn’t understand it? The FBI,” Bevensee adds: “we’re talking about a technology that can’t be subpoenaed. It can’t be surveiled” and, in order to carry out remote surveillance of private chats, “you would have to back door every single device in the world”.
This opens the way for extremists to propagandize and organize on platforms that are beyond the reach of legal authorities and tech giants alike. After the far right-friendly social media site Gab encountered hosting problems and app store bans, it rebuilt itself on Mastodon’s software, despite determined opposition from the platform’s creators and users.
Beyond Gab’s ambiguous place in the fediverse, the Guardian found dozens of servers using peer-to-peer, open source tools, which were either exclusively or disproportionately devoted either to far-right politics, or to conspiracy theories that mainstream social media services have previously cracked down on, including coronavirus denialism, “incel” culture and neo-Nazism.
With the far right under pressure from mainstream social media companies and internet hosts, this may be just the beginning.
But experts say that despite their recurrent complaints about Silicon Valley’s platforms, extremists will maintain their foothold in the mainstream for as long as they can. As Squire says of Smith’s internet activity: “Why is he still on YouTube? Because that’s where the eyeballs are, that’s where the money is.”
• In the US, the National Suicide Prevention Lifeline is at 800-273-8255 or chat for support. You can also text HOME to 741741 to connect with a crisis text line counselor. In the UK and Ireland, Samaritans can be contacted on 116 123 or email jo@samaritans.org or jo@samaritans.ie. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at www.befrienders.org
A friend questioned whether the Guardian article really advocates placing speech under the control of governments and tech giants and suppressing decentralized technology. This was my answer; since the original thread is not public, I am reposting it here:
Q: I'd like you to explain: where exactly does the original article imply that "to fight 'hate speech' and 'Nazism', speech must be placed under the control of governments and the big platforms, and these decentralized platforms must not be allowed to exist"?
A: If you don't think the implication in my original toot is obvious enough, I'm happy to add a few more passages:
The technical details are perhaps less important than the practical effect: no one has authority over these platforms: no one owns them. While governments and users can place pressure on the big social media companies to ban problematic users or communities, for better or worse, no one can stop anyone creating their own servers or peer-to-peer networks.
These technologies, then, are effectively uncensorable. According to a report by Emmi Bevensee, the co-founder of research consultancy Rebellious Data and the social media monitoring tool SMAT, extremists have been advocating, and even developing them, for years.
" The reason I want it as a trans anti-fascist is the same reason that a Nazi wants it; we just have opposite ends "
“Every marginalized community knows what it’s like to be systematically deplatformed”, says Bevensee, who uses non-binary pronouns, pointing to the way in which groups such as sex workers have adopted platforms like Mastodon after finding themselves unable to advertise their services.
But as Bevensee’s report shows, peer-to-peer platforms are a double-edged sword. “The reason I want it as a trans anti-fascist is the same reason that a Nazi wants it; we just have opposite ends,” they explain.
“You know who really doesn’t understand it? The FBI,” Bevensee adds: “we’re talking about a technology that can’t be subpoenaed. It can’t be surveiled” and, in order to carry out remote surveillance of private chats, “you would have to back door every single device in the world”.
This opens the way for extremists to propagandize and organize on platforms that are beyond the reach of legal authorities and tech giants alike. After the far right-friendly social media site Gab encountered hosting problems and app store bans, it rebuilt itself on Mastodon’s software, despite determined opposition from the platform’s creators and users.
The article notes that if users are on decentralized platforms, those platforms cannot be pressured from outside into banning "problematic users". (Decentralization protects problematic users.)
The article also notes that "extremists" have been advocating for and developing these technologies for years. (The people developing decentralization, p2p and similar technologies are bad actors.)
The article further notes that because these technologies cannot be surveilled or subpoenaed, extremists can use them to propagandize and organize. (The people using these technologies are bad actors.)
Q: But I also saw:
"a double-edged sword"
"The reason I want it as a trans anti-fascist is the same reason that a Nazi wants it; we just have opposite ends"
Why do you read the Guardian's mention of "can't be surveiled" as an argument for the importance of censorship? Also, isn't what follows, about technology being used for evil, simply factual? A large share of the people who have turned to developing these technologies in order not to be silenced really are extremists.
A: "Can't be surveiled" comes from the article's citation of Bevensee's report, after which the article adds "This opens the way for extremists to propagandize and organize on platforms that are beyond the reach of legal authorities and tech giants alike." Read together with the earlier "While governments and users can place pressure on the big social media companies to ban problematic users or communities", one can infer that the article wants these people kept confined to centralized platforms. Yes, it is a fact that extremists use technology to do harm. But first, we should focus on the harm done with technology by the powerful, that is, by governments and the tech oligopoly. Second, every technology has benefits and drawbacks, and on balance I think decentralized technology does far more good than harm. Third, before decentralized technology there were already printing, the telegraph and the telephone as ways of spreading information; bad people used those too, yet I don't think printing, the telegraph or the telephone should have been banned. Even without any technology at all, as long as people have mouths, bad ideas can spread, but I don't think that justifies gagging everyone. Finally, the only way to eliminate human wrongdoing entirely is to eliminate human free will; complete safety is possible only where there is no freedom, and that kind of safety is meaningless.
@Vectorfield @HuangTang This might be something 荒堂师 could weigh in on.
@Vectorfield The lefties want to be Big Brother, watching everyone
@Vectorfield@qoto.org poa.st ... that one really is a Nazi instance, though :nacho_look:
@376668346 Do you think that rendering of the article needs a detailed rebuttal?
@376668346 @Svartalfheim @KirisameUkiyo
「The article also cites an "expert" called Megan Squire, who claims that mainstream open source culture has long been an embodiment of "extreme misogyny", "full of toxicity towards society" and "abusive of everyone".」
The phrase "has long been" here is easily read as "equals", implying that "the expert claims 'open source equals neo-Nazism'".
The original wording is "historically has been". So could the point instead be understood as "neo-Nazification is only a historical, temporary problem of open source culture, one that can be solved; open source itself is not the problem"?
The original immediately continues with "and that Smith is one of those resisting efforts to change that culture"; "change that culture" is a call to change that culture, not to do away with open source. Further verification requires the surrounding context:
@376668346 @Svartalfheim @KirisameUkiyo
The preceding text: 「
Though many prominent programmers and advocates in both the wider open source software movement and the fediverse are motivated by progressive, ...
## Manipulating the open source movement for nefarious ends
The free and open source software movement has attracted many people with progressive politics, who have used it to help provide digital ...
」
The "though" clause of the first paragraph holds that "many prominent open source and fediverse people are progressives", and the second paragraph holds that "free and open source software has been widely used for progressive causes". Clearly the article's author does not think "open source is neo-Nazi".
@376668346 @Svartalfheim @KirisameUkiyo
The following text: 「
In recent years, and especially since the Gamergate movement intensified scrutiny on toxicity in tech, some responded to the blatant sexism, antisemitism and racism online with codes of conduct after realizing this behavior was actually starting to hurt them (Squires says they couldn’t recruit and retain developers).
」
That paragraph acknowledges the open source community's anti-discrimination efforts in recent years, and the "expert"'s point is that "(discrimination meant) the community couldn't recruit and retain developers". So the "expert" accepts that "open source and Nazism are fundamentally incompatible", and likewise does not think "open source is neo-Nazi".
@Svartalfheim @KirisameUkiyo
According to @376668346, "it feels like the original just doesn't imply that 'all speech needs to be placed under control'".
Mapped onto the translation:
A. In the Guardian's view, this is a very serious problem
B. because it means that authorities and tech giants cannot censor speech
C. The Guardian implies that to fight "hate speech" and "Nazism", speech must be placed under the control of governments and the big platforms, and that these decentralized platforms must not be allowed to exist
(B->A)=>C
@Svartalfheim @KirisameUkiyo @376668346
First, the passage corresponding to B does exist, namely: 「
The technical details are perhaps less important ... extremists have been advocating, and even developing them, for years.
」
Summed up, it does indeed mean "governments and the public cannot censor their speech".
The closing paragraph is 「
But experts say that ... As Squire says of Smith’s internet activity: “Why is he still on YouTube? Because that’s where the eyeballs are, that’s where the money is.”
」
That paragraph describes a pattern: at exceptional moments neo-Nazis loudly promote "decentralization", but once the heat dies down they go back to the Big Tech platforms to spread their message and make their living. So the author's outlook is fairly optimistic. And if the "problem" is not in fact "very serious", then the article's ability to stoke panic and thereby call for censorship is greatly diminished.
@Svartalfheim @KirisameUkiyo @376668346
The next three paragraphs are the crux: 「
“Every marginalized community knows what it’s like to be systematically deplatformed” ...
But as Bevensee’s report shows, ... they explain.
“You know who really doesn’t understand it? The FBI,”... in the world”.
」
The subject here is a progressive, and the third paragraph is their comment on censorship. If the article were trying to push "for the sake of progressive causes we must oppose decentralization and support censorship" as an implicit message, this passage would undercut it:
1. "A double-edged sword" means P2P platforms have an upside, and the subject themselves uses Mastodon for progressive causes. That contradicts "P2P platforms are regressive".
2. "We're talking about a technology the FBI doesn't understand" pits "us" against "the FBI", and the claim that "you would have to back door every single device in the world" clearly does not help a pro-censorship message.
@376668346
There are two questions here. The first is what the criterion for a Nazi instance is: under what circumstances can we consider an instance a Nazi instance? My view is that an instance can be considered a Nazi instance when it establishes Nazism as its official ideology or organizing principle. I just searched poa.st + nazi and found no evidence that poa.st is a Nazi instance, perhaps because the instance has disabled its directory index. I have had no prior contact with poa.st. You may know much more about it, so I hope you can give your reasons for calling it a Nazi instance.
The second question is what attitude to take towards Nazi instances and Nazi users (I think individual users who are Nazis and instances organized around Nazism are very different things). For individual users who hold Nazi views, my position is that, unless an instance explicitly bans Nazi users, they have the right to register on Fediverse instances and express their views, or to set up their own instances; but if there is clear evidence that they use the Fediverse to carry out illegal activity in the real world, then law enforcement or instance admins may punish them according to the law. I do not support purging Nazi users and Nazi speech in the name of anti-Nazism. First, so long as it does not infringe on others' rights, freedom of speech is everyone's right and one of the core principles of democracy, and ideology-based censorship is itself far more dangerous than Nazi speech. Second, the word "Nazi" has been heavily overused, stretched across the whole political spectrum, so in practice "anti-Nazism" almost inevitably expands without limit. Third, and most importantly, censoring all Nazi speech could only be achieved by a highly concentrated central power with extreme capacity for control, and the very existence of such a power is a far more dangerous bomb.
As for Nazi instances, my view is: first, follow the law; if they break it they must accept punishment, and if they act legally then state power should not interfere. But since Nazi instances are generally more aggressive than isolated Nazi individuals, people should stay highly alert to them, provided the instance really is a Nazi instance. If an instance just talks among itself and has no intention of proselytizing, there is no need to worry too much; but if it is full of ideological fervor and intent on spreading the faith, then other instances should organize and actively fight it.
That is my view. However you look at it, what the Guardian advocates, centralized control of thought and speech by tech giants and government agencies and the suppression of decentralized social platforms (not said outright, but the meaning is plain), is a shameless attack on freedom of thought and expression, and reflects a dangerous anti-democratic tendency.
Neo-Nazism is not necessarily identical to the Nazism that existed historically, nor does an instance have to declare Nazism its official ideology in writing for it to count.
Just as I can judge "入關學" to be an expression of a certain fanatical Chinese nationalism without its adherents having to admit as much themselves.
As for the question of state power / centralized censorship,
I don't think the Guardian article implies or supports anything of the sort.
The article simply points out the dark side of decentralized technology,
and that genuinely is a fact.
The article also mentions how Mastodon has let marginalized groups deplatformed by mainstream SNS (sex workers, for example) find an outlet,
which is the article laying out the upside of having no centralized censorship.
@Vectorfield@qoto.org
1.
https://poa.st/@RacistVirgin
https://poa.st/@Aimin
https://poa.st/@mac_ack
https://poa.st/@Plerome
https://poa.st/@guttersessions
Also, search for keywords at https://poa.st/search, say "racist", and a whole new world opens up immediately.
2.
No one is trying to bring state power in to censor anything. For an instance like poa.st, full of racism, neo-Nazi posts and hate speech, we have always just steered clear, cutting off contact with these people by means such as instance blocking (see the sketch after this post for what such a block looks like in practice).
3.
I don't see any sign of the Guardian trying to nudge readers toward "governments and big companies should step in and centralize censorship of speech". Could you screenshot a passage for me?
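As an illustration of the "instance blocking" mentioned above: a minimal sketch, assuming a Mastodon home server and a user OAuth token with the usual write scope, of the user-level domain block exposed by Mastodon's standard API. The server URL and access token below are placeholders; instance admins can additionally impose server-wide domain blocks from their own moderation tools.

```python
# Minimal sketch, assuming a Mastodon home server and a user OAuth token:
# ask your own server to hide everything from a given remote domain.
# INSTANCE and ACCESS_TOKEN are placeholders, not real values.
import requests

INSTANCE = "https://mastodon.example"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"


def block_domain(domain: str) -> None:
    """Hide all accounts and posts from `domain` in this user's timelines."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/domain_blocks",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        data={"domain": domain},
        timeout=10,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    block_domain("poa.st")  # the instance being discussed in this thread
```

The point made in the thread still stands: each server and each user decides this for themselves; there is no central switch.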
1. You have a point. I looked at all of these accounts' profiles, and they are indeed promoting racism and Nazism; the admin even proudly calls himself a racist and a Nazi. That said, these people use anime avatars, and their timelines are mostly anime pictures and griping, plus some racist slurs such as "nigger". My search for "racist" returned thirty-odd results, mostly in support of racial discrimination, but "Nazist" and "Nazism" turned up only one. Some of them put Nazi symbols next to their avatars, some boost memes glorifying Nazism, and some post things like looking forward to the Nazis conquering the world. I think it is fair to say this instance promotes racism and Nazism, but its users' Nazism seems to stop at surface-level symbol worship (I may simply not have seen enough). I think this kind of Nazi should still be distinguished from the other kind, the tightly organized, ideologically homogeneous Nazis; but broadly speaking, yes, this can be called a racist and Nazi instance.
2. I think blocking is the right approach, and if these people really are engaged in illegal activity, intervention by state authorities would also be warranted. Of course, we don't live in the same country and can't manage another country's affairs; our own problems are more numerous and more complicated.
3. The technical details are perhaps less important than the practical effect: no one has authority over these platforms: no one owns them. While governments and users can place pressure on the big social media companies to ban problematic users or communities, for better or worse, no one can stop anyone creating their own servers or peer-to-peer networks.
These technologies, then, are effectively uncensorable. According to a report by Emmi Bevensee, the co-founder of research consultancy Rebellious Data and the social media monitoring tool SMAT, extremists have been advocating, and even developing them, for years.
" The reason I want it as a trans anti-fascist is the same reason that a Nazi wants it; we just have opposite ends "
“Every marginalized community knows what it’s like to be systematically deplatformed”, says Bevensee, who uses non-binary pronouns, pointing to the way in which groups such as sex workers have adopted platforms like Mastodon after finding themselves unable to advertise their services.
But as Bevensee’s report shows, peer-to-peer platforms are a double-edged sword. “The reason I want it as a trans anti-fascist is the same reason that a Nazi wants it; we just have opposite ends,” they explain.
“You know who really doesn’t understand it? The FBI,” Bevensee adds: “we’re talking about a technology that can’t be subpoenaed. It can’t be surveiled” and, in order to carry out remote surveillance of private chats, “you would have to back door every single device in the world”.
This opens the way for extremists to propagandize and organize on platforms that are beyond the reach of legal authorities and tech giants alike. After the far right-friendly social media site Gab encountered hosting problems and app store bans, it rebuilt itself on Mastodon’s software, despite determined opposition from the platform’s creators and users.
The article notes that if users are on decentralized platforms, those platforms cannot be pressured from outside into banning "problematic users". (Decentralization protects problematic users.)
The article also notes that "extremists" have been advocating for and developing these technologies for years. (The people developing decentralization, p2p and similar technologies are bad actors.)
The article further notes that because these technologies cannot be surveilled or subpoenaed, extremists can use them to propagandize and organize. (The people using these technologies are bad actors.)
I did see "double-edged sword", but on the whole the article expresses hostility toward decentralization and related technologies.
@Vectorfield @376668346 the correct way to address the nazi problem is to defer to law. It makes no sense to take down platforms. Why not ban English since neonazis speak English? Why not ban telephones because neonazis use smart phones? I stopped using Facebook because everyone on my facebook is a successful and beautiful 35 yr old F500 director with 2-3 kids and sporting a 6 pack, and all of a sudden I am a fascist because I am using a platform that doesn’t stress that I need to buy Botox 😓😓😓
@Vectorfield Wait, this is coming from a media outlet that claims to report neutrally? Staring at the sky in bafflement
@Vectorfield I see the Guardian failed to figure out that Gab is no longer in the Fediverse.
Someone should email them a complaint asking for a correction. I would do so myself, but I don't know if they listen to people who aren't British and I have too much stuff to do anyway.
I agree with your later points,
but this sentence:
//one can infer that the article wants these people kept confined to centralized platforms.//
How did you derive that?
Your quotes only say that, when it comes to policing extremists,
decentralized technology is worse than centralized platforms;
they don't say that decentralized technology is therefore more harm than good / should be banned, with everyone confined to centralized platforms.
@amokhuxley Right, the issue centers on oversight, or to put it more bluntly, on censorship. Being uncensorable, or at least extremely hard to censor, is precisely one of the core properties of decentralized technology; you could say censorship (by which I mean censorship by concentrated power, not an instance's own self-governance) is inherently at odds with decentralization. So here is the question: does the Guardian think censorship can be sacrificed for the sake of decentralized technology?
In this article's view, the downsides of decentralization are:
1. It gives the far right and conspiracy theorists a space to speak.
(This could allow far-right activists to operate in ways that make them very difficult to shut down)
(be used to shelter far-right extremists from the consequences of their hate speech and organizing.)
2. Other people cannot get the platforms to ban problematic speech.
(While governments and users can place pressure on the big social media companies to ban problematic users or communities, for better or worse, no one can stop anyone creating their own servers or peer-to-peer networks.)
3. It cannot guarantee "safe spaces" for vulnerable groups.
[“the dominant open source culture historically has been one of extreme misogyny, unfounded meritocracy, toxicity and abuse of everyone,” and that Smith is one of those resisting efforts to change that culture.
In recent years, and especially since the Gamergate movement intensified scrutiny on toxicity in tech, some responded to the blatant sexism, antisemitism and racism online with codes of conduct after realizing this behavior was actually starting to hurt them (Squires says they couldn’t recruit and retain developers).
The provision of safer online spaces for marginalized groups is a large part of the motivation of many of the people who have created the underlying software. On those platforms, tools for moderation and easy ways to flag sensitive content are baked in by design. But Smith is among a small group who repeatedly rail against the introduction of such codes of conduct within open source projects.
In a video recorded a week after the Capitol riots, when social media bans were removing rightwingers from Donald Trump down to prevent further violence, Smith said that those who wanted to bypass censorship should use the Twitter-like platform, Pleroma.]
Some developers build sensitive-content flagging into their designs in order to safeguard safe spaces, yet people like Smith, who refuse to change a "misogynistic, toxic and abusive open source culture", can still use p2p and decentralization to build platforms that do not flag sensitive content (see the sketch after this post for how such flagging works on the Mastodon side). In short, the decentralized nature of these systems means you cannot readily force every instance to accept such "codes of conduct", so "safe spaces" cannot be guaranteed.
The upsides are:
1. It gives other minority groups, sex workers for example, a platform. That is essentially the only upside the whole article mentions; perhaps the fact that many Fediverse advocates are progressive leftists counts for a little too, yet the article also says that, historically, open source culture has been a misogynistic culture full of toxicity and abuse.
So then: which is better, giving the far right and conspiracy theorists a space to speak, letting problematic speech spread, and stripping vulnerable groups of the protection of "safe spaces", or giving another subset of vulnerable groups a platform? I think the implication is already plain. Set downsides 1 and 2 aside and weigh only downside 3, the loss of "safe space" protection for vulnerable groups, against upside 1, giving other minority groups such as sex workers a platform. First, sex workers are only one part of the vulnerable population: decentralized platforms protect some vulnerable groups, perhaps some others as well, but the scope of that protection is narrower than a centralized "safe space". A centralized "safe space" can ensure a Black user never has to see "nigger"; a decentralized platform cannot: you have to click block yourself, and by the time you do, you have most likely already seen the word. A centralized "safe space" can ensure LGBT users never see "a man can never become a woman"; a decentralized one cannot. The protection a centralized "safe space" offers covers more than what decentralized platforms provide, and if more people moved to decentralized platforms, the "safe space" could no longer protect as much. Perhaps you think the Guardian does not necessarily regard centralized "safe spaces" as better than decentralized platforms, but add back 1. the far right and conspiracy theorists also get a space to speak, and 2. problematic users cannot be banned, and clearly what is being portrayed here is harm outweighing benefit.
Does "more harm than good" then mean banning decentralized platforms, or keeping people confined to centralized ones? In my view, if decentralization and centralization were two technologies that could simply coexist, there would be no need for an either/or; but the two are in conflict: their modes of operation and design philosophies are diametrically opposed. The conflict is not obvious now only because decentralization is still at an early stage, but if decentralization flourished and became a force that could not be ignored, say if half the world's population were on decentralized platforms, Facebook and Twitter would long since have come into fundamental conflict with it. If decentralization really did defeat the centralized giants, then, on the Guardian's view, the prospect of an uncensorable world with no "safe spaces", full of far-right and conspiracist problem users, would have become reality, so there would be every reason to strangle these technologies in the cradle (if they ever really looked like taking off). Toward decentralized technology in its current state, this Guardian article, and the outlook it represents, takes an attitude of hostility and wariness without explicitly calling for a ban; but if decentralized technology ever grows strong, a ban is a certainty.
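On the "flagging sensitive content" mechanism referred to above: a minimal sketch, assuming a Mastodon-compatible server and a user OAuth token, of how a post is published behind a content warning through the standard statuses API. Whether an instance requires or even honors such flags is exactly the per-instance choice argued over above; the server URL and token are placeholders.

```python
# Minimal sketch, assuming a Mastodon-compatible server and a user OAuth token:
# publish a post folded behind a content warning, with its body marked sensitive.
# INSTANCE and ACCESS_TOKEN are placeholders, not real values.
import requests

INSTANCE = "https://mastodon.example"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"


def post_with_content_warning(text: str, warning: str) -> dict:
    """Create a status whose body is hidden behind `warning` until expanded."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/statuses",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={
            "status": text,
            "spoiler_text": warning,  # the content-warning line readers see first
            "sensitive": True,        # hide the body/media until clicked
            "visibility": "unlisted",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    post_with_content_warning("Notes on a slur-filled timeline.", "CW: slurs")
```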
// 3. It cannot guarantee "safe spaces" for vulnerable groups. //
But that isn't what the Guardian says.
The very passage you quoted mentions that
providing safe spaces for marginalized groups is, to a large extent, the motivation of open source developers; the article affirms this:
//The provision of safer online spaces for marginalized groups is a **large** part of the motivation of many of the people who have created the underlying software.//
By contrast, Smith represents only a small minority ("Smith is among a small group").
So point 3 doesn't hold.
Also, the article criticizes decentralized technology for giving the far right a space to speak, but it also quotes Bevensee (an anti-fascist who uses decentralized technology), so the "upsides" should probably also include something like giving the left a space to speak.
As for your later comparison of centralization with decentralization, that is just your personal opinion, not an inference that can be drawn from the article.
You say the Guardian article is wary of the dark side of decentralized technology; I agree. But to say it advocates strangling that technology is an overstatement.
@amokhuxley No, the article does not treat Smith as a minority, with most open source developers wanting to protect vulnerable groups. The original wording is "the dominant open source culture historically has been one of extreme misogyny, unfounded meritocracy, toxicity and abuse of everyone," and that Smith is one of those resisting efforts to change that culture." What does "dominant" mean? If it were a minority, would it still be called dominant? The article says the "dominant" open source culture is one of misogyny, toxicity and abuse (and historically always has been), so in this article's eyes the misogynistic, oppressive, malign open source developers are unlikely to be a minority; they should be the majority. Smith is merely "one of those resisting efforts to change that culture": that is, Smith is defending the traditional mainstream open source culture and resisting the change the left wants to bring about, and since Smith stands with mainstream open source culture, what Smith represents is unlikely to be a minority either. The article does say that providing safe spaces for vulnerable groups "is a large part of the motivation of many of the people who have created the underlying software", but it is vague about which underlying software that is, so we have to read the context. The preceding text is "In recent years, and especially since the Gamergate movement intensified scrutiny on toxicity in tech, some responded to the blatant sexism, antisemitism and racism online with codes of conduct after realizing this behavior was actually starting to hurt them". I checked: the Gamergate movement was not within open source circles, and "toxicity in tech" is a general phrase, not specific to open source communities. The following text is "On those platforms, tools for moderation and easy ways to flag sensitive content are baked in by design. But Smith is among a small group who repeatedly rail against the introduction of such codes of conduct within open source projects." Clearly the author takes the "moderation and sensitive-content flagging baked into the design" to come from outside the open source community, otherwise there would be no need for an "introduction" (those of us who use Mastodon ourselves know this too), and Smith is one of the few within the open source community who repeatedly reject that "introduction". Taken together, the "underlying software" most likely refers to centralized platforms like Facebook and Twitter that have adopted moderation and flagging mechanisms, not the decentralized fediverse communities; so "providing safe spaces for vulnerable groups" should describe the motivation of Facebook's and Twitter's developers, not of open source developers, and your rebuttal does not stand. As for the far right, sex workers and the left all gaining a space to speak (free from censorship), that can be summed up as freedom of speech, which follows from the nature of decentralized technology; but does Bevensee really regard freedom of speech as an upside? Hard to say. What they may hope for is freedom of speech limited to some people, whereas decentralized technology, at least for now, provides freedom of speech for everyone. Finally, on your last two points, about the relation between personal opinion and actual intent: I can only say that the author's true motives are known only to the author, which is why I can only call it an implication. But my reading is not baseless. We cannot know other people's inner thoughts, but we can use objective evidence, their outward words and actions, to make the inference closest to the facts. On that evidence, the conclusion is that the article dwells on the downsides of decentralized technology while touching only lightly on its upsides, and points out the irreconcilable conflict between decentralization and controlling problematic speech; so I think my opinion has a basis. Of course I do not treat my opinion as absolute truth; you are welcome to share the article again with your own, different assessment.
(My time is limited and I don't want to spend more of it on this thread, so there will be no further replies. Thank you.)
By the way, Twitter is now developing a decentralized social platform of its own,
called Bluesky.
A while back they put out a report surveying the various existing decentralized social platforms;
Mastodon developer Gargron was also among those consulted.
https://matrix.org/_matrix/media/r0/download/twitter.modular.im/981b258141aa0b197804127cd2f7d298757bad20
And even leaving the Guardian article aside,
a direct comparison of the degree of protection offered by decentralization and by centralization
is not as clear-cut as it might seem.
Granted, centralized platforms can nominally guarantee a uniform set of community rules (actual enforcement is another matter), whereas decentralization relies on self-discipline.
But decentralized technology also comes in many varieties.
In a federated network like Mastodon,
an instance admin actually has far more power than a centralized SNS moderator,
so the protection is correspondingly stronger.
@Vectorfield Nazis do love to cry "stop, thief" while doing the stealing 🙄
New York Times columnist Kevin Roose also published an article this year attacking encrypted messaging apps, arguing that encrypted communication beyond the control of the authorities "fuels the spread of misinformation". The same writer has also published a piece calling on the Biden administration to set up a "reality czar".
https://www.nytimes.com/2021/02/03/technology/personaltech/telegram-signal-misinformation.html
https://www.nytimes.com/2021/02/02/technology/biden-reality-crisis-misinformation.html