The Politics of Images in Machine Learning Training Sets

www.excavating.ai

You open up a database of pictures used to train artificial intelligence systems. At first, things seem straightforward. You’re met with thousands of images: apples and oranges, birds, dogs, horses, mountains, clouds, houses, and street signs. But as you probe further into the dataset, people begin to appear: cheerleaders, scuba divers, welders, Boy Scouts, fire walkers, and flower girls. Things get strange: A photograph of a woman smiling in a bikini is labeled a “slattern, slut, slovenly woman, trollop.”

A fascinating essay on bias (or politics) in AI, built on an investigation into the politics of training sets and the fundamental problems with classifying humans.
