Black people's faces obviously reflect less light, which is a much bigger problem for cheap computer cameras that aren't very good at collecting light to begin with. The software probably couldn't distinguish his facial features because the lower light meant less contrast.
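A rough way to see the effect (a sketch only, assuming OpenCV/NumPy, a placeholder image name, and an assumed noise figure for a cheap sensor): scale the pixel values down to simulate less reflected light and watch the contrast the detector has to work with shrink toward the fixed noise floor.

    import cv2
    import numpy as np

    # Placeholder image name -- any grayscale face photo will do.
    gray = cv2.imread("face.jpg", cv2.IMREAD_GRAYSCALE).astype(np.float32)
    SENSOR_NOISE = 5.0  # assumed constant read noise of a cheap webcam, in gray levels

    for scale in (1.0, 0.5, 0.25):       # simulate less and less reflected light
        signal = gray * scale
        contrast = signal.std()          # rough proxy for the detail a detector sees
        snr = contrast / SENSOR_NOISE    # the noise floor stays put, so SNR drops too
        print(f"illumination x{scale}: contrast={contrast:5.1f}  SNR={snr:4.1f}")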
Nonsense, if that were the case we'd have heard about it by now. Also, face tracking algorithms can already operate in far less optimal conditions than the ones presented here, so in _this_ case it shouldn't be a problem.
Forgetting that the algorithm was tuned to a specific color range, however, seems like an entirely plausible scenario.
I wonder how hard this would be to spoof. At 2:05 the camera keeps "following", edging to the left, even though Wanda's face has already exited the frame on the left. Also, the very last time Wanda comes into frame the camera doesn't pan anymore, but picks up a last-minute zoom.
I suspect that the tracking is 100% software-based. It would probably be prohibitively expensive to have optical pan and zoom. I'm so glad that modern cameras generally have an option to disable digital "zoom". It's really just a euphemism for premature cropping.
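For what it's worth, digital pan and zoom really is just a moving, scaled-up crop window. A minimal sketch (OpenCV assumed; the function and argument names are mine):

    import cv2

    def digital_pan_zoom(frame, cx, cy, zoom):
        """Crop a window centered on (cx, cy) and scale it back to full size.
        "Panning" is moving the window; "zooming" is shrinking it."""
        h, w = frame.shape[:2]
        cw, ch = int(w / zoom), int(h / zoom)
        x = min(max(cx - cw // 2, 0), w - cw)
        y = min(max(cy - ch // 2, 0), h - ch)
        crop = frame[y:y + ch, x:x + cw]
        return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)

Once the face tracker hands you a center point, that's all the "follow" effect needs; no optics involved.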
Seems like it would be really easy. Think about it: we don't see the actual computer or camera anywhere in the video. For all we know it's a guy with a handycam.
Going back about 5-8 years, it was a known issue in the face detection field that the faces of African-American people were not as distinguishable as others'. But for today's face detection systems, it should be no problem. My only guess is that the guy in the video is somehow a rare "false negative".
Off-topic, I know, but still: saying that face detection systems have issues with "faces of African-Americans" is really weird, and wrong. Face detection systems have issues with people's skin color, and not all people with dark skin are "African-Americans".
Please, don't be PC when there is no reason to. This is one case where it is more correct to say "black people" than anything else.
Many Indians are just as dark-skinned and would probably have the same problem. Do you also refer to them as "black people"?
In the U.S., "black" is a race, not a skin color. About 70% of "black" people in movies and music have a skin color light enough that they would probably not have the same camera problem.
So technically, saying "black people" would be wrong. The technically correct term in this particular example would be "dark skinned people".
It's something of a generational thing. I'm overgeneralizing, but self-identifying as black is more common in Generations X and Y and self-identifying as African-American is more common among Baby Boomers.
Presumably you mean "false negative," because true negative would seem to indicate that the software correctly recognizes that he does not have a face.
I really doubt this was the case. This is what is supposed to happen when we have AI, but then again, as you surely know, true AI will arrive ten years from now.
All we have are two data points, and while there is obviously a skin tone difference between the two samples, that's not the only difference. Without more information we can't even be sure that skin color is the issue.
For example, the algorithm might just fail for the black man's facial features, which differ greatly from those of the white woman.
Can anyone test this?
Edit: By "differ greatly" I mean from the point of view of the algorithm. We don't know what features (as in data extracted from the image, not facial features) they're using.
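If anyone wants to try, here's a hedged sketch only (whatever HP ships is presumably not this exact detector, the image names are placeholders, and the cascade file is the stock one bundled with OpenCV):

    import cv2

    # Stock frontal-face Haar cascade bundled with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    # Placeholder file names: use your own shots, ideally identical lighting
    # and pose with only the subject varying, so skin tone is the one variable.
    for path in ("subject_a.jpg", "subject_b.jpg"):
        img = cv2.imread(path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        print(f"{path}: {len(faces)} face(s) detected")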
Excellent point, though you might need to take other steps to deal with the increase in thermal noise (which lowers the color quality somewhat, not that anyone seriously uses a webcam for its photographic fidelity).
There's a market opportunity here if your theory is correct.
This is why we need diversity in the work environment. That face-tracking software was probably designed by a bunch of white kids who forgot about people of color.
This isn't the first time either: remember the voice recognition software that didn't recognize female voices at all?
"The xbox360's new system had the same issue, "Project Natal" that was the reason it's release got set back was I believe it was ying yang twins or one of them were testing it and it couldn't detect their movements."
>When game consultant and former Newsweek writer N'Gai Croal gave Paradise a test drive, however, the game had trouble reading his steering actions. The footwork (gas and brakes) worked fine, but Croal couldn't steer his car at all. It wasn't clear whether this was a problem of calibration differences between Tsunoda and Croal's very different body types, or if Croal's crazy dreadlocks threw Natal off. But it was working just fine when Tsunoda was at the "wheel."
Really? I thought that Natal was using infrared sensors to get a depth map. I guess dark skin does absorb more infrared energy, but I'd think they'd have already been able to calibrate it against dark clothing too.
If anyone has been following Natal development... apparently the infrared cameras that Natal uses for depth information have trouble with darker skin. I can't wait for the hilarity upon release if they don't fix it.
Could the problem be the exposure level? To get good contrast for the details in a face with darker skin you need a longer exposure, which would blow out light areas in the background.
(I agree it could also be that it was trained on white faces.)
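A minimal sketch of that trade-off (NumPy/OpenCV assumed; "webcam.jpg" is a placeholder for a frame with a dark face against a bright background): pushing the simulated exposure up adds detail to the dark regions but starts clipping the highlights.

    import cv2
    import numpy as np

    frame = cv2.imread("webcam.jpg", cv2.IMREAD_GRAYSCALE).astype(np.float32)

    for exposure in (1.0, 2.0, 4.0):                 # simulated exposure multipliers
        exposed = np.clip(frame * exposure, 0, 255)  # the sensor saturates at 255
        blown_out = (exposed >= 255).mean() * 100    # % of pixels fully clipped
        shadows = exposed[frame < 64]                # originally dark regions
        print(f"exposure x{exposure}: {blown_out:.1f}% blown out, "
              f"shadow detail std={shadows.std():.1f}")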
Never taken a photo in Photo Booth on the Mac? They mirror the image shown to you so it appears like your screen is a... well... mirror. I'm guessing that when they record they save it as you saw it on the screen.
The claim is obviously made in a tongue-in-cheek fashion rather than in anger, but it's effective at pointing out the net result of not testing against a wide enough sample of your potential customers: some of them don't get the functionality they paid for.
See? As if we didn't already have it, here's proof that diversity = better software :-)