(Slate) – Image recognition software is still in its infancy. Sometimes that means it’s a little silly, as when Wolfram Alpha’s algorithms confuse cats with sharks or goats with dogs. Sometimes it’s a little creepy, as it was when Facebook announced that it could identify you even if your face isn’t showing. And sometimes it’s just really, really icky.
When Brooklyn-based computer programmer Jacky Alciné looked over a set of images that he had uploaded to Google Photos on Sunday, he found that the service had attempted to classify them according to their contents. Google offers this capability as a selling point of its service, boasting that it lets you “search by what you remember about a photo, no description needed.” In Alciné’s case, many of those labels were basically accurate: A photograph of an airplane wing had been filed under “Airplanes,” one of two tall buildings under “Skyscrapers,” and so on.
Then there was a picture of Alciné and a friend. They’re both black. And Google had labeled the photo “Gorillas.” On further investigation, Alciné found that many more photographs of the pair—and nothing else—had been filed under this literally dehumanizing rubric.