Are Face-Detection Cameras Racist?


YouTube commenters expressed what was on a lot of people's minds. "Seems they rushed the product to market before testing thoroughly enough," wrote one. "I'm guessing it's because all the people who tested the software were white," wrote another. HP declined to comment on its methods for testing the webcam or on how involved it was in designing the software, saying only that the software was based on "standard algorithms." Camera-component manufacturers often supply the accompanying software to well-known brands as well, which might explain why HP isn't the only company whose cameras have exhibited an accidental prejudice against minorities: many brands could be running the same flawed code. TIME tested two of Sony's latest Cyber-shot models with face detection (the DSC-TX1 and DSC-WX1) and found that they, too, had a tendency to ignore subjects with dark complexions.

But why? It's not necessarily the programmers' fault. It comes down to the fact that the software is only as good as its algorithms, the mathematical rules used to determine what a face is. There are two ways to create them: by hard-coding a list of rules for the computer to follow when looking for a face, or by showing it a sample set of hundreds, if not thousands, of images and letting it figure out what the ones with faces have in common. In this way, a computer can create its own list of rules, which programmers then tweak. You might think that the more images a computer is fed, and the more diverse those images, the better the system would get, but sometimes the opposite is true. New images can begin to generate rules that contradict the old ones. "If you have a set of 95 images and it recognizes 90 of those, and you feed it five more, you might gain five, but lose three," says Vincent Hubert, a software engineer at Montreal-based Simbioz, a tech company that is developing futuristic hand-gesture technology like the kind seen in Minority Report. It's the same kind of problem speech-recognition software faces in handling unusual accents.
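The gain-five-lose-three effect Hubert describes can be sketched with a deliberately tiny toy model. This is not how any real face detector works; it is a one-rule "classifier" with invented numbers, just to show how adding new training samples can break old ones when the learned rules are too simple to fit everybody.

```python
# Toy illustration (not a real face detector): a "classifier" that learns a
# single brightness threshold from labeled samples. All numbers hypothetical.

def learn_threshold(samples):
    """Pick the threshold that classifies the most training samples correctly.

    samples: list of (brightness, is_face) pairs; predict "face" if
    brightness >= threshold.
    """
    best_t, best_score = None, -1
    for t in sorted({b for b, _ in samples}):
        score = sum((b >= t) == is_face for b, is_face in samples)
        if score > best_score:
            best_t, best_score = t, score
    return best_t

def num_correct(samples, t):
    return sum((b >= t) == is_face for b, is_face in samples)

# Initial training set: every face happens to be bright.
train = [(0.8, True), (0.7, True), (0.6, True), (0.3, False), (0.2, False)]
t1 = learn_threshold(train)          # 0.6: all 5 of 5 samples correct

# Add two darker faces: no single threshold can now satisfy everyone.
train += [(0.25, True), (0.15, True)]
t2 = learn_threshold(train)          # 0.15: the new faces are detected,
                                     # but both non-faces are now wrong,
                                     # so 5 of 7 correct instead of 5 of 5
```

Real systems learn thousands of far subtler rules, but the trade-off is the same shape: retraining to cover new cases can degrade performance on old ones.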

And just as the software is only as good as its code and the hardware it lives in, it's also only as good as the light it's got to work with. As HP noted in its blog post, the lighting in the YouTube video was dim, and, the company said, there wasn't enough contrast to pick up the facial shadows the computer needed to see a face. (An overlit person with a fair complexion might have had the same problem.) A better camera wouldn't necessarily have guaranteed a better result, because there's another bottleneck: computing power. The constant flow of images is usually too much for the software to handle, so it downsamples them, reducing the level of detail, before analyzing them. That's one reason why a person watching the YouTube video can easily make out the black employee's face while the computer can't. "A racially inclusive training set won't help if the larger platform is not capable of seeing those details," says Steve Russell, founder and chairman of 3VR, which creates face recognition for security cameras.
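Why downsampling erases exactly the low-contrast details at issue can be shown with a toy grayscale patch. The numbers below are invented for illustration; the averaging step, though, is a standard way images are shrunk before analysis.

```python
# Toy sketch of why downsampling hides small, low-contrast features.
# A 4x4 grayscale patch from a dim scene (0 = black, 255 = white),
# with one darker pixel standing in for a small facial detail:
patch = [
    [40, 40, 40, 40],
    [40, 10, 40, 40],   # the 10 is the small dark feature
    [40, 40, 40, 40],
    [40, 40, 40, 40],
]

def downsample_2x(img):
    """Shrink the image by averaging each 2x2 block into one pixel."""
    out = []
    for r in range(0, len(img), 2):
        row = []
        for c in range(0, len(img[0]), 2):
            block = [img[r + dr][c + dc] for dr in range(2) for dc in range(2)]
            row.append(sum(block) // 4)
        out.append(row)
    return out

small = downsample_2x(patch)
# The dark pixel is diluted across its block: (40+40+40+10)//4 = 32,
# barely distinguishable from the surrounding 40s. A detector running on
# the shrunken image can miss a feature a human sees plainly in the original.
```

A human watching the full-resolution video has none of this handicap, which is exactly the gap Russell describes.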

The blink problem Wang complained about has less to do with lighting than with the plain fact that her Nikon was incapable of distinguishing her narrow eyes from half-closed ones. An eye might be only a few pixels wide, and a camera that's downsampling the images can't see the necessary level of detail. So a trade-off has to be made: the blink warning could either tend to miss half blinks or tend to trigger on narrow eyes. Nikon did not respond to questions from TIME as to how the blink detection was designed to work.
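That trade-off can be made concrete with a toy sketch. Nikon didn't disclose how its blink detection works, so everything here is hypothetical: suppose the camera measures the visible height of the eye in pixels and warns when it falls below a threshold.

```python
# Hypothetical sketch of the blink-warning trade-off. The measurement and
# all pixel values are invented; this is not Nikon's algorithm.

# At low resolution, a naturally narrow open eye and a wider eye caught
# mid-blink can measure exactly the same:
wide_open_eye   = 5   # pixels of visible eye height
narrow_open_eye = 2   # a narrow eye, fully open
half_blink      = 2   # a wider eye, half closed
full_blink      = 0

def blink_warning(openness, threshold):
    """Warn when too little of the eye is visible."""
    return openness < threshold

eyes = (wide_open_eye, narrow_open_eye, half_blink, full_blink)

# A strict threshold never bothers the narrow-eyed subject, but misses
# the half blink:
strict = [blink_warning(e, threshold=2) for e in eyes]
# -> [False, False, False, True]

# A lenient threshold catches the half blink, but falsely flags the
# narrow open eye every time:
lenient = [blink_warning(e, threshold=3) for e in eyes]
# -> [False, True, True, True]
```

Because the two ambiguous cases measure identically at this resolution, no threshold can separate them; only more pixels, or a richer measurement, could.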

Why these glitches weren't ironed out before the cameras hit Best Buy is a question that HP, Nikon and Sony, when contacted by TIME, declined to answer. Perhaps in this market of rapidly developing technologies, consumers who fork over a few hundred dollars for the latest gadget are the test market. A few years ago, speech-recognition software was teeth-gnashingly unreliable. Today, it's up to 99% accurate. With the flurry of consumer complaints out there, most of the companies seem to be responding. HP has offered instructions on how to adjust its webcam's sensitivity to backlighting. Nikon says it's working to improve the accuracy of the blink-warning function on its Coolpix cameras. (Sony wouldn't comment on the performance of its Cyber-shot cameras and said only that it's "not possible to track the face accurately all the time.") Perhaps in a few years' time, the only faces cameras won't be able to pick up will be those of the blue-skinned humanoids from Avatar.
