Nice piece in Wired referencing a report I worked on with @epicprivacy.
"Washington, DC, is ... home to 690,000 people—and 29 obscure algorithms that shape their lives [by] screen[ing] housing applicants, predict[ing] criminal recidivism, identify[ing] food assistance fraud, determin[ing] if a high schooler is likely to drop out, inform[ing] sentencing decisions for young people, and many other things."
"Algorithms Quietly Run the City of DC—and Maybe Your Hometown" https://www.wired.com/story/algorithms-quietly-run-the-city-of-dc-and-maybe-your-hometown/
@VirginiaEubanks @epicprivacy @shriramk I am extremely uneasy with this demonization of "algorithms". It reminds me of ignorant people becoming distraught on learning that they have "chemicals" in their bodies, that their house contains "atomic matter", that they have bacteria in their intestines, or that mobile phones emit "radiation".
Of course algorithms quietly run every government. That's why governments have been buying lots of computers since the 1950s: to run algorithms on them. That's all you can do with a computer!
How long until we hear of professors and grad students being lynched for working on type inference algorithms?
It is shameful to see this kind of rhetoric in a Wired article, and shameful to see people who should know better endorsing it.
@radehi @epicprivacy @shriramk The report doesn't demonize. I am neither ignorant nor shameful. And it is not OK to use the language of lynching.
@VirginiaEubanks @epicprivacy @shriramk I'm not accusing you of lynching, nor am I threatening you with lynching. I am saying that the rhetoric used in that article's title will predictably lead to the literal, lethal lynching of people like me.
Creating that threat is what is not OK. Pointing out the threat is OK.
@VirginiaEubanks @epicprivacy @shriramk Nor am I accusing you of *being* either ignorant or shameful. I am saying that the people who will do the lynching are ignorant, and that the *act* of whipping them up like that is shameful.
@radehi @VirginiaEubanks @epicprivacy I read the article. I don't agree with your characterization of it.
I love and teach and create algorithms, but they HAVE had significant bad social impacts. The article is talking about their use in just such domains. Critically:
- "often without residents’ knowledge"
- "city agencies would not provide full details of how their technology worked or was used"
THAT is what it criticizes, and it is right to do that.
@shriramk @VirginiaEubanks @epicprivacy What I said above is slightly wrong: I should clarify that disproportionately placing surveillance cameras in Black neighborhoods is an example of how greater efficiency can make algorithmic law enforcement more harmful than human law enforcement even without secrecy; secrecy is a separate issue.