@VirginiaEubanks @epicprivacy @shriramk Am extremely uneasy with this demonization of "algorithms". Reminds me of ignorant people becoming distraught on learning they have "chemicals" in their bodies, that their house contains "atomic matter", that they have bacteria in their intestines, or that mobile phones emit "radiation".
Of course algorithms quietly run every government. That's why governments have been buying lots of computers since the 1950s: to run algorithms on them. That's all you can do with a computer!
How long until we hear of professors and grad students being lynched for working on developing type inference algorithms?
Is shameful to see this kind of rhetoric in a Wired article, and is shameful to see people who should know better endorsing it.
@VirginiaEubanks @epicprivacy @shriramk I'm not accusing you of lynching, nor am I threatening you with lynching. I am saying that the rhetoric used in that article's title will predictably lead to the literal, lethal lynching of people like me.
Creating that threat is what is not OK. Pointing out the threat is OK.
@VirginiaEubanks @epicprivacy @shriramk Nor am I accusing you of *being* either ignorant or shameful. I am saying that the people who will do the lynching are ignorant, and that the *act* of whipping them up like that is shameful.
@radehi @VirginiaEubanks @epicprivacy I read the article. I don't agree with your characterization of it.
I love and teach and create algorithms, but they HAVE had significant bad social impacts. The article is talking about their use in just such domains. Critically:
- "often without residents’ knowledge"
- "city agencies would not provide full details of how their technology worked or was used"
THAT is what it criticizes, and it is right to do that.
@shriramk @VirginiaEubanks @epicprivacy I'm criticizing the *title* of the article, and the way it uses the word "algorithms".
I agree that we should criticize harmful ways of using algorithms, of which there are of course many. But the things you're describing are a question of lack of transparency in government, not of the use of algorithms. Keeping secret files on citizens on paper, or conducting secret hearings, would also be harmful, for the same reasons. Secret algorithms can be more harmful because they are more efficient, of course, and so more extreme.

The article's example of placing automated traffic cameras disproportionately in Black communities is an example of how this can produce injustice: heavily policing Black neighborhoods has been a known problem for decades, but automated cameras remove the cost of that policing, potentially enabling far deeper levels of oppression.
However, the article consistently uses the term "algorithm" in a way I am sure you will agree is nonsense:
'The nonprofit spent 14 months investigating the city’s use of algorithms and found they were used across 20 agencies, with more than a third deployed in policing or criminal justice. ... The project team concluded that the city is likely using still more algorithms that they were not able to uncover.'
If we interpret the term 'algorithm' according to its standard meaning, these sentences make no sense, because they are so trivially, obviously true (except for the part about "more than a third"), and yet they are presented as if they are news, the damning results of a 14-month investigation!
This is like the political candidate who accused his opponent of having a sister who was a known thespian and of having himself matriculated in his youth. It's utterly indefensible, shameful scaremongering, and it leads to the kind of travesty mentioned at the end of the article:
'Last month, lawmakers in Pennsylvania...proposed an algorithm registry law.'
In less law-abiding places than the US, expect mob violence instead.
I haven't read EPIC's report, and so I don't know if this is entirely Wired's fault.
@shriramk @VirginiaEubanks @epicprivacy What I said above is slightly wrong: I should clarify that disproportionately placing surveillance cameras in Black neighborhoods is an example of how greater efficiency can make algorithmic law enforcement more harmful than human law enforcement even without secrecy; secrecy is a separate issue.
@radehi @VirginiaEubanks @epicprivacy There are several spheres where "algorithm" has lost its previous neutral meaning. As algorithmic impacts have grown in society, those impacted — but without a technical background — have reinterpreted the word. (See also "crypto", etc.)
This will lead to some frictions. Language is messy.
I'm not sure the vehemence of your response was warranted; or at least it could have been directed far better.
@shriramk @VirginiaEubanks @epicprivacy Perhaps you are correct; I will have to consider it. Thank you.
@radehi @VirginiaEubanks @epicprivacy My friend Claire Mathieu, an algorithms expert, was a professor at Collège de France for a year. During that time she answered questions in the media (TV, papers…). She has mentioned how disorienting she found the way the word "algorithm" had been repurposed, and how she had to adjust so she could answer the intended question.
@radehi @epicprivacy @shriramk The report doesn't demonize. I am neither ignorant nor shameful. And not OK to use the language of lynching.