Google Photos Auto-Tagging Feature Labels Two Black Americans As ‘Gorillas’


Google was forced to issue an apology on Monday after its image recognition software in Google Photos labeled photos of two black Americans, and the folder containing those photos, as “Gorillas.”

A bug in the software’s auto-tagging feature, which groups photos uploaded to the Google service, labeled a folder containing images of Jacky Alcine, 21, a black American computer programmer based in New York, and his female friend as “Gorillas.”


Alcine immediately sent a series of tweets to Google, pointing out the error.


“Google Photos, y’all f***** up. My friend’s not a gorilla. The only thing under this tag is my friend and I being tagged as a gorilla. What kind of sample image data you collected that would result in this son?

“And it’s only photos I have with her it’s doing this with. This is how you determine someone’s target market.”


Alcine’s tweets received a prompt response from Yonatan Zunger, Google’s chief architect of social, who issued an apology, saying that Google was “appalled” and “genuinely sorry” about the error.

“Holy f**k. G+ CA here. No, this is not how you determine someone’s target market. This is 100 percent not okay. Thank you for telling us so quickly. Sheesh. High on my list of bugs you *never* want to see happen. Shudder.”


Zunger asked if Alcine would allow Google to access his account to fix the problem, and shortly after told Alcine that the problem had been fixed.

But after Zunger said the problem had been fixed, Alcine said two photos were still being labeled “Gorilla” and “Gorillas.”

This forced Google to temporarily disable the classification “Gorilla/Gorillas,” to stop the software from grouping anything — including real gorillas — under that category.

Zunger added: “We’re also working on longer-term fixes around both linguistics – words to be careful about in photos of people – and image recognition itself, e.g. better recognition of dark skinned faces. Lots of work being done and lots still to be done. But we’re very much on it.”

According to Yahoo Tech, a company spokesperson said, “We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing.

“There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

Alcine said the problem could have been avoided “with accurate and more complete classifying of black people, especially darker-toned people of color like myself and my friend.”

But a Google spokesperson said that “we test our image recognition systems on people of all races and colors.”

This is not the first time image recognition software’s auto-tagging has committed a major blunder. Flickr’s auto-tagging program recently labeled people as “apes” and “animals,” and concentration camps as “jungle gyms.”

In May, Google Maps returned the White House among results for searches containing racial slurs such as “nigger house.”

HP’s facial recognition software, introduced in 2009, was repeatedly unable to recognize black people’s faces, although it could recognize white people’s faces.

Google launched its standalone Google Photos app in May with extra features, such as the ability to automatically categorize images of people, objects, food, and landscapes.

[Images: Twitter/Jacky Alcine]