[flagged] Man raped in jail after AI technology wrongfully identifies him in robbery (star-telegram.com)
31 points by landonxjames on Jan 30, 2024 | 19 comments


Unless EssilorLuxottica or Macy's developed all the "AI", "facial recognition", "biometric information", etc. tech in-house, are there missing company names here?

There's also this, which I'm not sure how to interpret:

> “So, when EssilorLuxottica and Macy’s compared unclear security footage to (the man’s) mugshots from the 1980s, these companies knew that there was an error rate of almost 90%. Yet these companies told (Houston Police) with absolute certainty that they identified the person who robbed the Sunglass Hut,” the lawsuit said.


Yup, I'd love to know the name of that wonderful software and the company that made it.

Also I'd love to know the names of the judge, prosecutor, and investigator who put a man in jail simply because a man from the store told them to.


There's not really any need.

The judge did it because the prosecution said he did.

The prosecution brought charges because the police said it was him.

The police arrested him because the corporation said it was him.

The corporation said it was him because the software said it was him.

The software said it was him because it was SHIT.


The whole point of those people's jobs is to do due diligence. If they're blindly trusting the next person/machine down the line, then they're abdicating their responsibility.


>There's not really any need.

No, there is a need.

We know what happened and why.

We also need to know who all these people were, because all of them failed at their jobs.

The last person in the chain using a shitty tool doesn't excuse anyone else. The store employee could've consulted astrological charts FFS; the use of astrology wouldn't be the problem in this scenario.


People must be seeing different reporting than I am.

Because otherwise it looks like some people are reacting to a headline and a vague statement (possibly morphed through the telephone game of reporting) about one piece of how the outcome happened, and are ready to extrapolate from that and be judge, jury, and executioner.

Which would seem ironic: making much the same error themselves that they believe they're correcting in someone else.


I feel like the software did its job.

It reported a 90% chance that this was not the suspect when asked to compare photos. Yet the company decided a 10% chance of being correct was good enough for them to report to the police that it was definitely this guy.
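
To make the arithmetic concrete, here's a minimal sketch of how a confidence score should gate such a report. This is purely hypothetical: the lawsuit names no vendor or API, and the threshold, function name, and ~10% figure (derived from the lawsuit's "error rate of almost 90%" claim) are all assumptions for illustration.

    # Hypothetical sketch -- the actual vendor, API, and numbers are not public;
    # the ~10% figure is inferred from the lawsuit's "error rate of almost 90%" claim.
    MATCH_THRESHOLD = 0.90  # assumed policy: act only on high-confidence matches

    def should_report_to_police(match_confidence: float) -> bool:
        # A match is only actionable if it clears the confidence bar.
        return match_confidence >= MATCH_THRESHOLD

    # Per the lawsuit's numbers, the comparison was ~10% likely to be correct:
    print(should_report_to_police(0.10))  # False -- nowhere near a certain ID

The real pipeline is obviously more complicated, but the point stands: a low score is a reason to stop, not a number to round up to "absolute certainty."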


How do you know all these things, about this specific case?


Can we learn that with an FOIA request?


Journalists might look into exactly what happened on that side of things, but this particular article doesn't seem to have done that, and some of the assertions in there might be incorrect or misleading.

Best not to feed Internet pitchfork villagers, who have shown countless times that they're collectively dumb as snot, with no sense of process, critical thinking, or decency.

I'd think the identities of the public servants aren't really relevant at this point; what matters is the evidence and the chronology.

As techies, we're best suited to tackle some of the evidence around tech, like what was the tech, how does it work, how was it represented, how was it used, what did it do, etc.


> As techies, we're best suited to tackle some of the evidence around tech, like what was the tech, how does it work, how was it represented, how was it used, what did it do, etc.

Go ahead then...


How many more cases will there be where data and AI people messed up? This is on us as a professional community of people who build the future. Is this the future we want? We seem to be actively working towards it.


As many as necessary. Late stage capitalism has entered its paperclip maximizer phase - AI will pervade everything, everywhere all the time and we will simply learn (or more likely be trained) to accept the failures as the necessary price we must pay for efficient markets.

Is this the future we want? For many people, the answer is a clear no. But like the Luddites of generations before (who were right about everything) we are not the ones who get to decide what the future will be. Maybe there's still time to go be a potato farmer in Idaho or something.


> “He was followed into the bathroom by three violent criminals. He was beaten, forced on the ground, and brutally gang raped. After this violent attack, one of the criminals held a shank against his neck and told him that if he reported the rape to anyone, he would be murdered,” the lawsuit said.

Every square inch of the prison should be covered by cameras. Of all the new powers and fancy equipment law enforcement seeks, this should come first.

Yes, put cameras in the bathroom. It's stupid that we wring our hands over bathroom privacy in jails while rapes are so common. "You'll get raped, but at least you'll have privacy". Stupid.


[dupe]

More discussion last week: https://news.ycombinator.com/item?id=39118534


And not a single person will face any consequences.


[flagged]


Please don't take HN threads into nationalistic flamewar, regardless of which country you have a problem with. It's not what this site is for, and destroys what it is for.

https://news.ycombinator.com/newsguidelines.html


With all due respect, I don't believe they were trying to turn this into a nationalist flamewar.

I agree with them, and as Americans we should be free to criticize our country. I'm ashamed as an American that these stories occur. "Don't drop the soap" didn't become a phrase on its own.


You may be right, but we can't moderate by intent—we have to moderate by effects. See https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor... for past explanations.

This is an important issue, no question. But on HN, if people want to post about important issues we need them to do so thoughtfully—not in the flamewar/internet style.

https://news.ycombinator.com/newsguidelines.html



