Attention @black_in_ai, @TaelurAlexis, @ModFigsPodcast.
---
RT @osazuwa
An image of @BarackObama getting upsampled into a white guy is floating around because it illustrates racial bias in machine learning. Just in case you think it isn't real, it is, I got the code working locally. Here is me, and here is @AOC.
twitter.com/osazuwa/status/127

@realcaseyrollins @peterdrake most neural networks that process photos of real people struggle with anyone who isn't European or who wears glasses. Inputs like that just confuse a network that has never seen them, or hasn't been trained on enough of them.

I'm not a scientist, so I'm not sure how this can be solved technically. But I guess it would be an interesting and game-changing paper if somebody started to research this.
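
To make the "never seen that or not trained enough" point concrete, here's a toy sketch of my own (synthetic data, not any real system's code): a classifier trained on data where one group makes up only 2% of the examples does near-chance on that group, and simply rebalancing the training data narrows the gap.

```python
# Toy illustration: an underrepresented group gets much worse accuracy,
# and rebalancing the training set narrows the gap. Everything here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, group):
    """Two synthetic groups whose labels depend on different features."""
    x = rng.normal(size=(n, 2))
    # Group 0's label is driven by feature 0, group 1's by feature 1.
    y = (x[:, 0] > 0).astype(int) if group == 0 else (x[:, 1] > 0).astype(int)
    return x, y

def accuracy_by_group(model, n_test=5000):
    return {g: model.score(*make_group(n_test, g)) for g in (0, 1)}

# Imbalanced training set: 98% group 0, 2% group 1.
x0, y0 = make_group(9800, 0)
x1, y1 = make_group(200, 1)
imbalanced = LogisticRegression().fit(np.vstack([x0, x1]), np.concatenate([y0, y1]))
print("imbalanced:", accuracy_by_group(imbalanced))
# Typically ~0.97 on group 0 but ~0.5 (chance) on group 1.

# Same model family, trained with both groups equally represented.
x1b, y1b = make_group(9800, 1)
balanced = LogisticRegression().fit(np.vstack([x0, x1b]), np.concatenate([y0, y1b]))
print("balanced:  ", accuracy_by_group(balanced))
# The gap largely disappears (a linear model can't fit both rules perfectly,
# so both groups land around ~0.75 here; the point is representation, not the model).
```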

@blight @a1batross @realcaseyrollins

Not in the sense of "the AI has beliefs about white supremacy", but yes in the sense that AI systems can be used to deliberately or accidentally perpetuate racism (also sexism, ableism, etc.). The books "Weapons of Math Destruction" and "Technically Wrong" enumerate many examples.

One small example is the "racist soap dispenser", which astonishingly has been developed more than once:

youtube.com/watch?v=YJjv_OeiHm

youtube.com/watch?v=WHynGQ9Vg3

There's no reason to believe that a person (or AI) *decided* to behave this way. It's much more likely that the development team and the training data were both devoid of dark skin.
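
For what it's worth, the basic check that catches this before shipping is just reporting failure rates per group instead of one overall number. A made-up example (the log data and column names here are hypothetical):

```python
# Hedged sketch: disaggregated evaluation of fictional field-test logs.
import pandas as pd

# Each row is one attempt to trigger the dispenser; testers are mostly light-skinned.
logs = pd.DataFrame({
    "skin_tone": ["light"] * 95 + ["dark"] * 5,
    "dispensed": [True] * 93 + [False] * 2 + [False] * 5,
})

overall = logs["dispensed"].mean()
by_group = logs.groupby("skin_tone")["dispensed"].mean()

print(f"overall success rate: {overall:.0%}")   # looks fine: 93%
print(by_group)                                  # dark: 0%, light: ~98%
# The aggregate metric hides a total failure for the underrepresented group;
# per-group reporting (and a test pool that includes dark skin) exposes it.
```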
