Hacker News
An Algorithm Told Police She Was Safe. Then Her Husband Killed Her (nytimes.com)
12 points by jryb 3 months ago | 15 comments




Humans just can't handle statistical pattern recognition.


An algorithm?!? Oh god, an algorithm!!

With all of these scaremongering headlines (paywalled article), I need to know: how bad are the humans (without algorithms) compared to the algorithms?


Reader mode let me read the article. It was actually pretty even-handed, and the police say that the algorithm is more accurate. The headline is just a bit click-baity.


Apparently police never made mistakes before algorithms were introduced. Everybody knows that, silly!


As usual, the concern should not be "algorithms make mistakes" but rather "who is accountable when the mistake is made by an algorithm?"

Police who make mistakes should be held accountable for them. They often aren't, but that's a slightly different problem.

With an algorithm that is "usually more accurate than an individual", it would be irresponsible to allow the individual to make the decision, right? But then you can't hold anyone or anything accountable.

I'm not sure we're quite ready to be living in a post-accountability world


What bothers me is what seems to be some sort of psychological distinction between "algorithm" and "process".

The cops* have had processes for years: some are sensible and useful, some are seemingly sensible but terrible, and some are nonsense. Over time, one hopes, the worst of the processes (or the worst steps in some processes) get winnowed out and/or replaced.

Also, machinery is integrated into processes and its efficacy is evaluated over time ("shot spotters": worse than useless; police in patrol cars: better in some ways, much worse in others).

But somehow when a mechanically computed algorithm is used as a step in a process, all of this sort of reasoning goes out the window and now it's scary.

* and everybody else


> But somehow when a mechanically computed algorithm is used as a step in a process, all of this sort of reasoning goes out the window and now it's scary

Humans following procedure can be analyzed in meaningful ways by non-technical third parties to figure out where a failure point is.

Black-box algorithms cannot. They require experts to analyze them, and those experts are often the same people who own the algorithm and wrote it in the first place.


These algos are part of a process, so they're subject to the same kind of treatment. The car was a "black box" as far as its social impact went when it was introduced, yet we managed to figure it out.

And a lot of forensic techniques, like fingerprints, flame spreading, graphology, and others, are epistemologically less sound than the kind of algo discussed in this article, yet people seem comfortable with them. At least phrenology has been abandoned (AFAIK... eek).


> when a mechanically computed algorithm is used as a step in a process, all of this sort of reasoning goes out the window

abdication-of-responsibility-as-a-service


Isn’t the same person held accountable both with and without the algorithm: no one? Unless the policeman makes a radically, obviously wrong decision, isn’t he protected by immunity anyway? If he properly weighs the options and makes a decision according to his orders, what would happen to him?


> If he properly weighs the options and makes a decision according to his orders, what would happen to him?

Right, but those are things that can be investigated. If it turns out the officer didn't use good judgment, was drunk on the job when they made the choice, or was otherwise negligent, there absolutely should be consequences, no?

But how do you do this with a black box algorithm?


Well, the requirement is that the AI was developed with the necessary care, and after that, what it says is not negligent by definition. If the developers mishandled something during development, they are responsible. If the police enter wrong data and that is found to be negligent, they are responsible. I don’t really get the ethical problem here.


If you make me responsible for what happens after someone is released, I'm just going to lock everyone up forever. "Sorry, can't be sure it's okay to release them. Better safe than sorry."


Oh God. Another "protect women" at all costs BS article.

These use tragic single-incident examples to legitimize laws that are then used to prosecute men in divorce.

Google "Silver Bullet Divorce Strategy"

Go kick rocks, NYTimes.



