Facial Recognition Can't Distinguish Picture From Actual Person

David Johnson

Facial recognition is all the rage. Everyone seems excited about it, that is, until it turns against them. Some have reason to believe we are pushing the technology much faster than we should, deploying it before it has been proven. That leads to the sort of incident that happened in Zhejiang province this week. The Verge provides more detail:

"As first reported by Abacus, it all took place in the Zhejiang province, south of Shanghai. The face of Dong Mingzhu, a president of China's top air-conditioning company, flashed on a large screen displayed to the public listing nearby jaywalkers caught by cameras. A line of text captioned her photo, saying she had broken the law. It also listed part of her government ID number and her name, but misidentified her surname as 'Ju.'

"But what the camera actually saw was an ad featuring Dong's face on the side of a bus."


This is not the first time facial recognition has made an embarrassing mistake. Slashdot has collected other incidents in which facial recognition went wrong. John Gass had his driver's license revoked by mail because the software decided he looked too much like another driver who had broken the law. The State of Massachusetts began using the software thanks to a grant from the Department of Homeland Security; since then, it has been used in over 1,000 cases and has produced many misidentifications.

Amazon's facial recognition has similar problems. Not long ago, it infamously misidentified 28 of the 535 members of Congress as criminals. Worse, 39 percent of those mistakes involved people of color, even though people of color make up only about 20 percent of Congress. Researchers from the Massachusetts Institute of Technology and Stanford University have shown that facial recognition software makes more errors with women than with men, and with darker skin than with lighter skin.

There is no magic to facial recognition software; it mirrors the biases of the data used to train it. If the training set consists mostly of white male faces, the system will likely be most accurate on white male faces, not because those faces are more distinct or easier to recognize, but because the model has seen far more of them. That points to a more fundamental problem: women and people of color are the ones most likely to be victimized by technology that is supposed to keep us safe.
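The disparities the researchers report come from a straightforward kind of audit: run the system against a labeled test set, then break the error rate down by demographic group. A minimal sketch of such a breakdown in Python (the group names and numbers below are made up for illustration, not real audit data):

```python
from collections import defaultdict

def error_rates_by_group(results):
    """Compute the misidentification rate for each demographic group.

    `results` is a list of (group, correct) pairs, where `correct`
    is True when the system identified the face correctly.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, correct in results:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    # Error rate = misidentifications / total attempts, per group.
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit results, chosen only to illustrate a disparity.
results = (
    [("lighter-skinned men", True)] * 99 + [("lighter-skinned men", False)] * 1 +
    [("darker-skinned women", True)] * 65 + [("darker-skinned women", False)] * 35
)
print(error_rates_by_group(results))
# → {'lighter-skinned men': 0.01, 'darker-skinned women': 0.35}
```

If the per-group error rates diverge this sharply, the problem is usually the composition of the training set rather than anything about the faces themselves.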