When Joz Wang and her brother bought their mom a Nikon Coolpix S630 digital camera for Mother's Day last year, they discovered what seemed to be a malfunction. Every time they took a portrait of each other smiling, a message flashed across the screen asking, "Did someone blink?" No one had. "I thought the camera was broken!" Wang, 33, recalls. But when her brother posed with his eyes open so wide that he looked "bug-eyed," the messages stopped.
Wang, a Taiwanese-American strategy consultant who goes by the Web handle "jozjozjoz," thought it was funny that the camera had difficulties figuring out when her family had their eyes open. So she posted a photo of the blink warning on her blog under the title, "Racist Camera! No, I did not blink... I'm just Asian!" The post was picked up by Gizmodo and Boing Boing, and prompted at least one commenter to note, "You would think that Nikon, being a Japanese company, would have designed this with Asian eyes in mind."
Nikon isn't the only big brand whose consumer cameras have displayed an occasional though clearly unintentional bias toward Caucasian faces. Face detection, one of the latest "intelligent" technologies to trickle down to consumer cameras, is supposed to make photography more convenient. Some cameras with face detection are designed to warn you when someone blinks; others are programmed to automatically take a picture when somebody smiles, a feature that, theoretically, makes the whole problem of timing your shot to catch the brief flash of a grin obsolete. Face detection has also found its way into computer webcams, where it can track a person's face during a video conference or enable face-recognition software to prevent unauthorized access.
The principle behind face detection is relatively simple, even if the math involved can be complex. Most people have two eyes, eyebrows, a nose and lips, and an algorithm can be trained to look for those common features or, more specifically, their shadows. (For instance, when you take a normal image and heighten the contrast, eye sockets can look like two dark circles.) But even if face detection seems pretty straightforward, the execution isn't always smooth.
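To make the idea concrete, here is a minimal sketch, in Python, of the kind of contrast test classic face detectors rely on. This is an illustration of the general technique, not the algorithm any camera maker actually ships: it checks whether a horizontal "eye" band in a candidate window is darker than the "cheek" band below it, the shadow pattern described above. The function names and the toy image are invented for the example.

```python
def region_mean(img, top, bottom, left, right):
    """Mean pixel intensity of a rectangular region of a 2-D grayscale image."""
    total = count = 0
    for row in img[top:bottom]:
        for px in row[left:right]:
            total += px
            count += 1
    return total / count

def eye_shadow_score(img, top, bottom, left, right):
    """Contrast between the upper (eye) half and the lower (cheek) half
    of a candidate face window. Large positive values suggest the
    dark-eye-sockets pattern a detector can be trained to look for."""
    mid = (top + bottom) // 2
    eyes = region_mean(img, top, mid, left, right)
    cheeks = region_mean(img, mid, bottom, left, right)
    return cheeks - eyes  # positive when the eye band is darker

# Toy 4x4 window: a dark upper band (eye shadows) over a bright lower band.
window = [
    [40, 35, 38, 42],
    [45, 30, 33, 44],
    [200, 210, 205, 198],
    [195, 202, 199, 201],
]
print(eye_shadow_score(window, 0, 4, 0, 4) > 0)  # True: "face-like" to this test
```

A real detector combines thousands of such rectangle tests at many positions and scales, but the sketch also hints at the failure mode in these stories: when lighting is poor or a feature's shadow contrast is weaker than the training data assumed, the score shrinks and the test can simply miss the face.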
Indeed, just last month, a white employee at an RV dealership in Texas posted a YouTube video showing a black co-worker trying to get the built-in webcam on an HP Pavilion laptop to detect his face and track his movements. The camera zoomed in on the white employee and panned to follow her, but whenever the black employee came into the frame, the webcam stopped dead in its tracks. "I think my blackness is interfering with the computer's ability to follow me," the black employee jokingly concludes in the video. "Hewlett-Packard computers are racist."
The "HP computers are racist" video went viral, with almost 2 million views, and HP, naturally, was quick to respond. "Everything we do is focused on ensuring that we provide a high-quality experience for all our customers, who are ethnically diverse and live and work around the world," HP's lead social-media strategist Tony Welch wrote on a company blog within a week of the video's posting. "We are working with our partners to learn more." The post linked to instructions on adjusting the camera settings, something both Consumer Reports and Laptop Magazine tested successfully in Web videos they put online.
Still, some engineers question how a webcam even made it onto the market with this seemingly glaring flaw. "It's surprising HP didn't get this right," says Bill Anderson, president of Oculis Labs in Hunt Valley, Md., a company that develops security software that uses face recognition to protect work computers from prying eyes. "These things are solvable." Case in point: Sensible Vision, which develops the face-recognition security software that comes with some Dell computers, says its software had no trouble picking up the black employee's face when the company tested it against the YouTube video.