
> When AI criminal risk prediction software used by judges in deciding the severity of punishment for those convicted predicts a higher chance of future offence for a young, Black first time offender than for an older white repeat felon.

Younger people are more likely to re-offend than older people. Remove race from the situation entirely, and this is still the expected result. There's zero reason to think this algorithm was biased with respect to race.




If you read deeper than a one-line description, you'll see:

1. Even after controlling for criminal history, future recidivism, age, and gender, black defendants were still scored as much more likely to re-offend.

2. It incorrectly predicted that black defendants would re-offend much more frequently than white defendants.

3. It incorrectly predicted that white defendants would not re-offend much more frequently than black defendants.

Considering those three things and that they have refused to give any details about how the algorithm works, I see zero reason to give them the benefit of the doubt. Fire them until they can demonstrate that the algorithm is not in fact biased like it appears to be.

The score also takes into account answers to questions like whether the defendant's parents were separated or whether their parents were ever arrested. Those things are completely out of the defendant's control and are highly correlated with race. Even including those things in their score is damning.


Phrasing it in the way you do is misleading. White and black defendants given the same risk label were just as likely to re-offend. To put this in simpler numbers:

* Out of 100 white people, 5 were labeled high risk.

* Out of 100 black people, 20 were labeled high risk.

* Out of the 5 white people labeled high risk, 4 re-offended.

* Out of the 20 black people labeled high risk, 16 re-offended.

In either case, someone labeled high risk had the same likelihood of re-offending: 80%.
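
In code, with those made-up counts (a sketch of the arithmetic only, not data from the article):

    # Hypothetical counts from the bullets above (not real data)
    high_risk  = {"white": 5, "black": 20}   # labeled high risk
    reoffended = {"white": 4, "black": 16}   # of those, re-offended

    for group in high_risk:
        # Positive predictive value: among those labeled high risk,
        # the fraction who actually re-offended
        ppv = reoffended[group] / high_risk[group]
        print(group, ppv)   # 0.8 for both groups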

"It incorrectly predicted that black defendants would re-offend more frequently than white defendants." This is technically correct, but not because the algorithm was bad a predicting rates if re-offending. It's because there was higher rates of re-offending. The likelihood of re-offending among someone labeled high risk is the same.

This kind of objection seems like a blanket rejection of any system that produces an inequitable outcome. But the reality is that rates of re-offending are not equal. Even a perfectly accurate prediction of re-offense is going to predict higher rates of re-offending among men, because men re-offend at higher rates. This isn't sexism.

That doesn't mean we shouldn't recognize the disparate impact of incarceration on underprivileged people. But simply concluding bias due to inequitable outcomes is simplistic.


Your analysis above is a prime example of how dangerous statistics can be when not properly considered. Even if we took your numbers above as true, they would not show that the system isn't biased. To determine that, you'd also need to know what happened to each group _not_ labeled high risk. You're only analyzing the positive predictive value (the share of high-risk labels that were correct) without taking into account the error rates: how many people who never re-offended were labeled high risk, and how many who weren't labeled went on to re-offend.

Thankfully, the article I linked to looked at both:

                                               White  African American
    Labeled Higher Risk, But Didn’t Re-Offend  23.5%  44.9%
    Labeled Lower Risk, Yet Did Re-Offend      47.7%  28.0%
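
To spell out what those two rows measure: the first is the false positive rate among people who did not re-offend, the second the false negative rate among people who did. Extending the made-up 100-person example from upthread with assumed outcomes for the people not labeled high risk (invented numbers, chosen only to show the mechanics, not the article's data):

    # Confusion-matrix sketch with invented counts (not the article's data)
    # tp: labeled higher risk and re-offended
    # fp: labeled higher risk but didn't re-offend
    # tn: labeled lower risk and didn't re-offend
    # fn: labeled lower risk yet did re-offend
    def error_rates(tp, fp, tn, fn):
        fpr = fp / (fp + tn)   # didn't re-offend, but labeled higher risk
        fnr = fn / (fn + tp)   # did re-offend, yet labeled lower risk
        ppv = tp / (tp + fp)   # labeled higher risk and did re-offend
        return fpr, fnr, ppv

    # white: 5 labeled high risk (4 re-offend); assume 16 of the other 95 do too
    print(error_rates(tp=4, fp=1, tn=79, fn=16))    # fpr 0.0125, fnr 0.80, ppv 0.80
    # black: 20 labeled high risk (16 re-offend); assume 24 of the other 80 do too
    print(error_rates(tp=16, fp=4, tn=56, fn=24))   # fpr ~0.067, fnr 0.60, ppv 0.80

Both groups get the same 80% ppv, yet the black group's false positive rate comes out over five times higher and its false negative rate lower. When base rates differ, an imperfect score cannot be calibrated across groups and have equal error rates at the same time, which is exactly what the two sides of this thread are each measuring.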


When you are doing crystal-ball voodoo based on stuff that has no connection to the defendant's choices and no direct connection to criminality, like whether or not their parents were separated or whether their parents were ever arrested, then you're damn right there should be a blanket rejection of any inequitable outcome.

I'm not even convinced that you should be allowed to make a decision based on stuff like that even if it is somehow equitable.


> no direct connection to criminality like whether or not their parents were separated and whether their parents were ever arrested

Again, not right at all. These things heavily correlate with crime. A stable home is the primary indicator of someone's future success, even over a 'better-off' but broken home.

I'm male, and not a rapist, but having a penis heavily correlates with rape. I suggest you do not pick me, or any male, to watch your children. Sure it's rude to the innocent, but oh well, a little rudeness vs potential harm.

> I'm not even convinced that you should be allowed to make a decision based on stuff like that even if it is somehow equitable.

Then nobody will ever follow the law. If it's illegal for me to reject a potentially bad babysitter over something I know about them that could risk my child's safety, I'll happily lie and say I didn't like their haircut.

You'd get further if you tried to identify these people and treat them better - grants to move out of bad neighborhoods, to get educated, to get pardons for unrelated crimes, to get counseling for abuse, etc - than with this "wrong unless it's 100% identical, equity-of-outcome" thing.


We're talking about an algorithm being used as part of the court system to determine whether or not someone rots in jail, not about how you choose your babysitter.

It needs to be held to a higher standard and punish people based on their actions, not based on what their parents did.


No, you're talking about making it illegal to use the things you know about someone or something to make a fully qualified decision. About a babysitter, a potential criminal, an immigrant, whatever.

> It needs to be held to a higher standard and punish people based on their actions, not based on what their parents did.

And that's not what actually happened. A guy got arrested on bad data and then released; we know some kids might be at risk because they're from broken homes; etc. Nobody was jailed for their parents' actions, and nobody even proposed it. Some trends are worrying, but you act like they're the intent of the entire system, not just some scammy products that a company is pushing.

Voting machines, for instance, should all be burned. But I understand that most people don't know why and support them for convenience. I think they're anti-democratic but I don't think people are evil for using them. You should try to get a similar perspective.


I am not talking about that. You are misinterpreting my statements. I said "stuff that has no connection to the defendant's choices", and defendants only exist within the court system.

> Nobody was jailed for their parent's actions

Not jailed for their parents' actions, but kept in jail because of them. That is the same thing.

I am not claiming that anyone is evil. Just that this is unjust and that they need to stop doing this. You are putting words in my mouth.


> Fire them until they can demonstrate that the algorithm is not in fact biased like it appears to be.

What do the non-black-box findings from the area show? If blacks are a poor demographic in the area, it might be true. (Crime tracks poverty, not race.)

But yeah, certainly don't pay anyone for, or use, a black-box algorithm.

> The score also takes into account answers to questions like whether the defendant's parents were separated or whether their parents were ever arrested. Those things are completely out of the defendant's control and are highly correlated with race. Even including those things in their score is damning.

Nope. That's perfectly fair to look at. For instance, a broken family is another predictor. It's not fair to the individual, but being a victim increases your chance of offending. Similarly, the number one predictor of committing child sex crimes is having suffered them yourself. If you have a child and are picking a babysitter, skip the one who was molested.

fwiw, those things don't correlate with race directly; they correlate with poverty, which correlates with race. Broken families are more likely to be poor and abusive.



