
The problem is that it isn't. The police have routinely conducted similar dragnets with racial profiling in the past.

Even in systems with less human intervention (e.g., automated loan approval systems), bias is still being built in.

If anything, this could provide people with the excuse of "see! it was the algorithm!".




>> Even in systems with less human intervention (e.g., automated loan approval systems), bias is still being built in.

Just because there is already bias in the system does not mean it is OK to open the floodgates and introduce more.

>> If anything, this could provide people with the excuse of "see! it was the algorithm!".

It doesn't work that way. First, the person has to come up with tens of thousands of dollars to defend themselves, hiring lawyers and expert witnesses. What you are describing is a luxury of the rich. If the new system is one where rich people get extra avenues of defense, then you are right.


> The police have routinely conducted similar dragnets with racial profiling in the past

How does a geofence-based system, which determines whether you were in the area when a crime was committed, know what race you are? I'm suggesting that in the absence of a tool like this, cops will use their eyeballs and knock on doors, and we already know that process is racially biased.
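
For illustration, here is a rough sketch (in Python) of the kind of check such a system would run. The record shape and function names are hypothetical, not any actual vendor's API; the point is that the query operates only on coordinates and timestamps, with no demographic field anywhere in the data.

    # Minimal sketch of a geofence/time-window check (hypothetical, for illustration).
    from dataclasses import dataclass
    from datetime import datetime
    from math import radians, sin, cos, asin, sqrt

    @dataclass
    class LocationPing:          # hypothetical record: an anonymized device location report
        device_id: str
        lat: float
        lon: float
        timestamp: datetime

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two lat/lon points, in kilometres."""
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))

    def in_geofence(ping, crime_lat, crime_lon, radius_km, start, end):
        """True if the ping falls inside the radius during the time window."""
        return (start <= ping.timestamp <= end
                and haversine_km(ping.lat, ping.lon, crime_lat, crime_lon) <= radius_km)
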



