nytimes.com/2018/08/30/technol

Relevant to AI, the culture war, fairness, and ethics. The author's choice of examples is interesting; another well-known example of surprising results is the images that pop up if you search for "American scientist". Just as the author's examples are taken as evidence of right-wing (using the term loosely) bias, that example was taken as evidence of the opposite.

It is easy to see where this comes from - a mention of an "American scientist" is commonly a mention of an "African-American scientist". For cultural and historical reasons, the ethnicity of scientists is treated as important, their nationality not so much.

For that matter, Google just "scientist" and see if you think the image results are remotely representative of scientists. I had to scroll down to find an image of Einstein, and it is of course the one where he sticks his tongue out.

But it is a dilemma. Even accurate and technically unbiased algorithms may - and often will - give results that are surprising, deemed undesirable, and thus interpreted and condemned as bias. And our unwillingness to accept reality is perhaps the first hurdle we need to overcome.

"Eventually, this error was fixed. But how many other such errors are hidden in Google? We have no idea."

Huh. This is like somebody demanding their mirror be altered when they don't like what they see in it.
