HP computers are racist (youtube.com)
67 points by jawngee on Dec 19, 2009 | 45 comments



It's probably because there weren't any black coworkers around to test on. Otherwise they would have noticed it.

See? As if we didn't already have it, here's proof that diversity = better software :-)


Or, giving them the benefit of the doubt: they tested on a black person, but one with lighter skin than the man in the video.


Reminds me of the story about Nikon cameras' blink detection asking "Did someone blink?" when taking photos of Asian people (http://www.flickr.com/photos/jozjozjoz/3529106844/).


Considering that the camera was designed and tested by Japanese engineers, there probably is another explanation.


As funny as this is on the surface, I think the underlying causes are incredibly interesting.


Black people's faces obviously reflect less light. This is a much bigger problem for the cheap computer cameras that are not very good at collecting light. The software probably could not distinguish his facial features because there was less contrast due to the lower light.
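
If the low-contrast theory is right, it should be measurable. Here's a minimal sketch with OpenCV, assuming a hypothetical captured frame.png: compute the RMS contrast of the frame, then histogram-equalize it, which is about the cheapest fix a webcam pipeline could apply before handing the frame to a detector.

  # A sketch of the low-contrast hypothesis (frame.png is hypothetical):
  # measure RMS contrast, then equalize the histogram so detail in the
  # darker tones is spread across the full intensity range.
  import cv2
  
  gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
  
  # RMS contrast: standard deviation of intensities, normalized to [0, 1].
  print(f"RMS contrast: {gray.std() / 255.0:.3f}")
  
  equalized = cv2.equalizeHist(gray)
  print(f"RMS contrast after equalization: {equalized.std() / 255.0:.3f}")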


Nonsense; if that were the case we'd have heard about it by now. Also, face-tracking algorithms can already operate in far less optimal conditions than the ones presented here, so in _this_ case it shouldn't be a problem.

Forgetting that the algorithm was tuned to a specific color range, however, seems like an entirely plausible scenario.
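
For illustration, here's what such a tuning can look like in practice: a naive skin-segmentation pass that thresholds in YCrCb space (a sketch with OpenCV; the bounds are illustrative guesses, not anything HP actually uses, and frame.png is hypothetical). Any face whose pixels fall outside the hard-coded range simply vanishes from the mask.

  # A sketch of the "tuned to a specific color range" failure mode.
  import cv2
  import numpy as np
  
  frame = cv2.imread("frame.png")  # hypothetical webcam frame
  ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
  
  # Hard-coded bounds like these are where the tuning bias hides: the Y
  # (luma) floor of 60 rejects darker skin outright. Values are common
  # textbook guesses, not anything from the actual product.
  lower = np.array([60, 135, 85], dtype=np.uint8)
  upper = np.array([255, 180, 135], dtype=np.uint8)
  mask = cv2.inRange(ycrcb, lower, upper)
  
  print(f"Fraction of frame flagged as skin: {mask.mean() / 255.0:.2%}")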


You have not heard about dark skin reflecting less light?


The same thing happened in early motion pictures. Black actors' faces appeared washed out because the cameras were optimized for white actors.


I wonder how hard this would be to spoof. At 2:05 the camera "follows" edging to the left even though Wanda's face has already exited left. Also, the very last time Wanda comes into frame the camera doesn't pan anymore, but picks up a last minute zoom.


I suspect that the tracking is 100% software-based. It would probably be prohibitively expensive to have optical pan and zoom. I'm so glad that modern cameras generally have an option to disable digital "zoom". It's really just a euphemism for premature cropping.
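
For what it's worth, digital zoom is trivially reproduced in a few lines (a sketch with OpenCV, assuming a hypothetical frame.png): crop the center, scale it back up, and notice that no new detail appears.

  # Digital "zoom" as premature cropping: crop the center of the frame
  # and resize it back to full size, inventing no new information.
  import cv2
  
  frame = cv2.imread("frame.png")  # hypothetical frame
  h, w = frame.shape[:2]
  zoom = 2  # 2x "zoom"
  crop = frame[h // 2 - h // (2 * zoom): h // 2 + h // (2 * zoom),
               w // 2 - w // (2 * zoom): w // 2 + w // (2 * zoom)]
  zoomed = cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)
  cv2.imwrite("frame_zoomed.png", zoomed)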


Seems like it would be really easy. Think about it: we don't see the actual computer or camera anywhere in the video. For all we know it's a guy with a handycam.


Interesting. So, I guess they trained the software, and presumably the training consisted of tracking white developers' faces.
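
For comparison, here's what off-the-shelf detection looks like with one of OpenCV's pre-trained Haar cascades (a sketch, assuming a hypothetical frame.png; I have no idea what HP actually uses). Everything the detector "knows" comes from its training images, so a skewed training set skews the results.

  # A sketch of detection with a pre-trained Haar cascade that ships
  # with OpenCV. The cascade is only as good as its training set.
  import cv2
  
  cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
  detector = cv2.CascadeClassifier(cascade_path)
  
  gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame
  faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
  print(f"Faces found: {len(faces)}")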


Going back about 5-8 years, it was a known issue in the face detection field that African-American faces were just not as distinguishable as others. But for today's face detection systems it should be no problem. My only guess is that the guy in the video is somehow a rare "false negative".

Edit: it should be "false negative".


Off-topic, I know, but still: saying that face detection systems have issues with "faces of African-Americans" is really weird, and wrong. Face detection systems have issues with people's skin color, and not all people with dark skin are "African-Americans".

Please, don't be PC when there is no reason to. This is one moment that it is more correct to say "black people" than anything else.


Many Indians are just as dark-skinned and would probably have the same problem. Do you also refer to them as "black people"?

In the U.S., "black" is a race, not a skin color. About 70% of "black" people in movies and music have a light enough skin color that they would probably not have the same camera problem.

So technically, saying "black people" would be wrong. The technically correct term in this particular example would be "dark skinned people".


It's something of a generational thing. I'm overgeneralizing, but self-identifying as black is more common in Generations X and Y and self-identifying as African-American is more common among Baby Boomers.


Presumably you mean "false negative," because true negative would seem to indicate that the software correctly recognizes that he does not have a face.


I really doubt this was the case. This is what is supposed to happen once we have AI, but then again, as you surely know, true AI is always ten years from now.


What? OCR and pattern recognition software can easily be implemented with a trained neural network, but that has nothing to do with actual AI.
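
A minimal example of the point, using scikit-learn's bundled 8x8 digit images: train a small neural network, score it, and note that nothing resembling general intelligence is involved.

  # Ordinary supervised pattern recognition with a trained neural net.
  from sklearn.datasets import load_digits
  from sklearn.model_selection import train_test_split
  from sklearn.neural_network import MLPClassifier
  
  X, y = load_digits(return_X_y=True)
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
  
  clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
  clf.fit(X_train, y_train)
  print(f"Digit recognition accuracy: {clf.score(X_test, y_test):.2%}")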


All we have are two data points, and while there is obviously a skin tone difference between the two samples, that's not the only difference. Without more information we can't even be sure that skin color is the issue.

For example, the algorithm might just fail for the black man's facial features, which differ greatly from those of the white woman.

Can anyone test this?

Edit: By "differ greatly" I mean from the point of view of the algorithm. We don't know what features (as in data extracted from the image, not facial features) they're using.


Perhaps if you removed the webcam's IR filter the algorithm would have a better chance of working for faces of all shades.


Excellent point, though you might need to take other steps to deal with the increase in thermal noise (which lowers the color quality somewhat, not that anyone seriously uses a webcam for its photographic fidelity).

There's a market opportunity here if your theory is correct.


Btw, there are so many cool things one could do if, say, the iPhone could toggle its IR filter like a shutter :-)


This feels fake, in a "viral" kind of way.


This is why we need diversity in the work environment. The face-tracking software was probably designed by a bunch of white kids, and they forgot about people of color.

This isn't the first time, either: remember the voice recognition software that did not recognize female voices at all?


I found an excellent solution: http://www.amazon.com/Hees-Design-MIME-Male-Mask/dp/B0019L1R...

HP should brand this and sell it as a peripheral.


Youtube Comment:

"The xbox360's new system had the same issue, "Project Natal" that was the reason it's release got set back was I believe it was ying yang twins or one of them were testing it and it couldn't detect their movements."


It was Newsweek's N'Gai Croal:

http://bitmob.com/index.php/mobfeed/Project-Natal-Hands-Feet...

>When game consultant and former Newsweek writer N'Gai Croal gave Paradise a test drive, however, the game had trouble reading his steering actions. The footwork (gas and brakes) worked fine, but Croal couldn't steer his car at all. It wasn't clear whether this was a problem of calibration differences between Tsunoda and Croal's very different body types, or if Croal's crazy dreadlocks threw Natal off. But it was working just fine when Tsunoda was at the "wheel."


Crazy dreadlocks? They look normal to me ;-) .


Really? I thought Natal was using infrared sensors to get a depth map. I guess dark skin does absorb more infrared energy, but I'd think they'd already have been able to calibrate it against dark clothing too.


If anyone has been following Natal development... apparently the infrared camera that Natal uses for depth information has trouble with darker skin. I can't wait for the hilarity upon release if they don't fix it.


Where have you heard this? I've seen a black co-worker use a Natal without any noticeable difference in skeletal tracking.


Was anyone else reminded of the Better Off Ted episode where they install motion detectors that can't detect black people?


Could the problem be the exposure level? To get good contrast for the details in a face with darker skin you need a longer exposure, which would blow out light areas in the background.

(I agree it could also be that it was trained on white faces.)
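
Since the exposure can't be changed after the fact, a gamma curve is the usual software stand-in. Here's a sketch with OpenCV, assuming a hypothetical frame.png: brightening recovers shadow detail but pushes bright backgrounds toward pure white.

  # Brightening via a gamma LUT, standing in for a longer exposure.
  import cv2
  import numpy as np
  
  frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame
  
  gamma = 0.5  # < 1 brightens
  lut = np.array([((i / 255.0) ** gamma) * 255 for i in range(256)],
                 dtype=np.uint8)
  brightened = cv2.LUT(frame, lut)
  
  # Count pixels pushed to near-white: the "blown out" background areas.
  print(f"Near-white pixels after brightening: {(brightened >= 250).mean():.2%}")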


  Uploaded Using HP MediaSmart
At least that much works...


Why is the image mirrored?


Never taken a photo in Photo Booth on the Mac? It mirrors the image shown to you so that your screen appears to be a... well... mirror. I'm guessing that when they record, they save the video as you saw it on screen.
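
The mirroring itself is a one-liner; a sketch with OpenCV, assuming a hypothetical captured frame.png:

  # Flip the frame around the vertical axis so the on-screen preview
  # behaves like a mirror.
  import cv2
  
  frame = cv2.imread("frame.png")     # hypothetical captured frame
  mirrored = cv2.flip(frame, 1)       # flipCode=1 flips horizontally
  cv2.imwrite("frame_mirrored.png", mirrored)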


Difficulty processing an image with a low contrast ratio != racism. What a moronic claim.


The claim is obviously made in a tongue-in-cheek fashion rather than in anger, but it's effective at pointing out the net result of not testing on a wide enough sample of your potential customers: some of them don't get the functionality they paid for.


I would assume the title is facetious...


So you're defending the title by claiming that the title is linkbait?


It's the same title as the YouTube video.


That's right; the title is linkbait.


You are a great person.



