Going back about 5-8 years, it was a known issue in the face detection area that the faces of African-American people were less distinguishable to the systems than others'. But for today's face detection systems, it should be no problem. My only guess is that the guy in the video is somehow a rare "false negative".
Off-topic, I know, but still: saying that face detection systems have issues with "faces of African-Americans" is really weird, and wrong. Face detection systems have issues with people's skin color, and not all people with dark skin are "African-Americans".
Please don't be PC when there's no reason to. This is one case where it is more correct to say "black people" than anything else.
Many Indians are just as dark-skinned and would probably have the same problem. Do you also refer to them as "black people"?
In the U.S, "black" is a race, it's not a skin color. About 70% of "black" people in movies and music have light skin color that would probably not have the same camera problem.
So technically, saying "black people" would be wrong. The technically correct term in this particular example would be "dark skinned people".
It's something of a generational thing. I'm overgeneralizing, but self-identifying as black is more common in Generations X and Y, and self-identifying as African-American is more common among Baby Boomers.
Presumably you mean "false negative," because a true negative would seem to indicate that the software correctly recognizes that he does not have a face.
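For anyone fuzzy on the distinction, here's a minimal sketch of the four outcomes, assuming OpenCV's stock Haar cascade face detector (the frame path and the helper function are hypothetical, just for illustration):

    import cv2

    # OpenCV ships this cascade file with the opencv-python package.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    def classify(image_path, face_actually_present):
        # Compare one detector run against the ground truth.
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        detected = len(cascade.detectMultiScale(gray)) > 0
        if detected and face_actually_present:
            return "true positive"    # face there, face found
        if detected and not face_actually_present:
            return "false positive"   # no face, detector fired anyway
        if not detected and face_actually_present:
            return "false negative"   # face there, detector missed it
        return "true negative"        # no face, correctly nothing found

    # Hypothetical usage: a frame from the video, which clearly contains a face.
    print(classify("frame.jpg", face_actually_present=True))

The guy in the video is the third case: a face is present, the detector just doesn't fire on it.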
I really doubt that was the case. This is what's supposed to happen once we have AI, but then again, as you surely know, true AI is always ten years away.