
Recently, people online have been asking an AI tool what it sees when it looks at their face. The results have been surprising, sometimes flattering, and often quite racist…
ImageNet Roulette uses a neural network to classify pictures of people uploaded to the site. You simply paste the URL of a photo you want analysed (or upload your own), and it will tell you what the algorithm sees in your photograph.
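Under the hood, this is the standard image-classification recipe: preprocess the photo, run it through a network trained on ImageNet, and read off the highest-scoring labels. Here is a minimal sketch of that pipeline in Python using torchvision's stock 1000-class ImageNet model. This is not ImageNet Roulette's actual code (the site drew on ImageNet's "person" categories specifically), and the file path is a placeholder:

```python
# A minimal sketch (not ImageNet Roulette's actual code) of how a
# pretrained ImageNet classifier labels a photo.
import torch
from torchvision import models, transforms
from PIL import Image

# Load a ResNet-50 pretrained on the standard 1000-class ImageNet set.
weights = models.ResNet50_Weights.IMAGENET1K_V2
model = models.resnet50(weights=weights)
model.eval()

# Standard ImageNet preprocessing: resize, crop, normalise.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# "photo.jpg" is a placeholder for whatever image you upload.
image = Image.open("photo.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)

with torch.no_grad():
    probs = torch.softmax(model(batch)[0], dim=0)

# Print the model's top five guesses, mirroring the kind of output
# a site like ImageNet Roulette shows you for your photograph.
categories = weights.meta["categories"]
top5 = torch.topk(probs, 5)
for p, idx in zip(top5.values, top5.indices):
    print(f"{categories[idx]}: {p.item():.3f}")
```

Whatever labels come out, they are only ever drawn from the categories the training set happened to contain, which is exactly where the trouble starts.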
If you try it and get a bad result, remember that there are much worse things it can call you.
is imagenet roulette tryna fuck pic.twitter.com/VeKjRq7xI4
— laura lux (@DarthLux) September 16, 2019
It’s also known to be quite insulting.
Aww right sound nae bother pmsl x https://t.co/LJAxh0tnWK pic.twitter.com/iJ80KqqZTp
— IRAldo (@aldomax_) September 16, 2019
And as the title suggests, even slightly racist…
No matter what kind of image I upload, ImageNet Roulette, which categorizes people based on an AI that knows 2500 tags, only sees me as Black, Black African, Negroid or Negro.
Some of the other possible tags, for example, are “Doctor,” “Parent” or “Handsome.” pic.twitter.com/wkjHPzl3kP
— Lil Uzi Hurt (@lostblackboy) September 18, 2019
The whole internet loves Imagenet AI, an image classifier that makes quirky predictions! *5 seconds later* We regret to inform you that the AI is racist pic.twitter.com/bWFKddlePG
— Eric (@khale_) September 16, 2019
This tool was created by artist Trevor Paglen and Kate Crawford, co-founder of New York University’s AI Now Institute, and it is built on one of the most “historically significant training sets” in AI. In 2009, computer scientists at Stanford and Princeton set out to train computers to recognise a wide range of objects. To do this, they amassed a huge database of photographs of everything from trees to goats, then got people to sort the photos into categories.
The result was ImageNet.
“We want to shed light on what happens when technical systems are trained on problematic training data. AI classifications of people are rarely made visible to the people being classified. ImageNet Roulette provides a glimpse into that process – and to show the ways things can go wrong,” Paglen and Crawford explain on the tool’s website.
“ImageNet Roulette is meant in part to demonstrate how various kinds of politics propagate through technical systems, often without the creators of those systems even being aware of them.”
Essentially, the machines become racist and misogynistic because the humans who built and labelled their training data are racist and misogynistic.
“ImageNet contains a number of problematic, offensive, and bizarre categories, all drawn from WordNet. Some use misogynistic or racist terminology. Hence, the results ImageNet Roulette returns will also draw upon those categories.”
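For a concrete sense of where those categories come from: ImageNet’s people labels are “synsets” (synonym sets) pulled from WordNet’s hierarchy under the noun “person”. Here is a minimal sketch using NLTK’s WordNet interface; the traversal depth and truncation are just for readability, not how ImageNet actually sampled its labels:

```python
# A minimal sketch of where ImageNet's "person" categories come from:
# they are synsets (synonym sets) in the WordNet lexical database.
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)  # fetch the WordNet data on first run

person = wn.synset("person.n.01")

def show_hyponyms(synset, depth=0, max_depth=2):
    """Walk a few levels of hyponyms (subcategories) under a synset to
    see the kind of category tree ImageNet's people labels drew on."""
    print("  " * depth + synset.name().split(".")[0].replace("_", " "))
    if depth < max_depth:
        for child in synset.hyponyms()[:3]:  # truncated for readability
            show_hyponyms(child, depth + 1, max_depth)

show_hyponyms(person)
```

Every judgemental, outdated, or slur-laden term sitting in that hierarchy became a category a classifier could assign to someone’s face, which is precisely the process Paglen and Crawford set out to expose.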