Hacker News

Idea: AI police sketch artist.

3D printed case. Resting on a table. Witness describes the suspect. 10 seconds later it prints out an AI generated ink sketch.

I mean, why not? Sell it to every police station in your state. You can even put a cute little police badge emblem on the case.



These ideas are what we should be worried about, not the paperclip thing.


How is that idea any more concerning than a regular human-drawn police sketch?


It might output a much more detailed image than a human-drawn sketch, which could be less useful, or more damaging, than a deliberately vague sketch.

Imagine a police officer looking for someone matching the image who doesn't know that it's hallucinated from a vague description: they could let the real suspect go, or incorrectly arrest someone who happens to look like the AI-generated image but otherwise has no reason to be a suspect.

Police are already greatly overestimating the accuracy of their own facial recognition tools because they don't realize the limits of the technology, and this would just be worse.


>It might output a much more detailed image than a human-drawn sketch

That's not a necessary property of AI image generation. You could just add an 'output as a sketch' system prompt.


Lack of accountability.


Accountability for what? I recall procedure already requires approval of the final sketch by the witness. Witnesses could always make mistakes, but that's true even in the current process. Or is your argument that sketches should never be used?

In fairness, with the ubiquity of cameras, sketches are much less required...


Police in your jurisdiction are held accountable?


Did you not see the comments yesterday where HN achieved consensus that racist AI might be real but isn't that bad if it is?


Our hypothetical AI won't make any decisions. It just makes sketches as described and approved by witnesses. The relevant racism here is the one any witnesses may have, that's true even with a human police sketch artist.


"as described" according to what? There is simply no way to create an image from words without something closely resembling decisions. Maybe "it" won't "make" those decisions, but they will be made somewhere.


Since you opened with passive-aggressive hints of racism, it's possible that you're not following the thread, or actually reading the replies.

Let me draw your attention to the discussion about the witness's role in the process of image generation. For example:

Officer: "Could you describe the man who attacked you, miss?"

Witness: "Well, he had ...eyes, a ... forehead, and ..."

<here's the important part for you, _lady>

Officer grabs the first rendering from the machine and shows it to the witness: "Did he look like this?"

Witness: "No, his eyes were set further apart."

Whir, whir, the machine prints another image.

Officer: "More like this, then?"

And so on...

In the scenario I described, I'm not sure where a new source of racism is introduced.

Help me see this differently.


Yeah, somebody will have to evaluate whether the image matches the words, and that is currently done by the witnesses themselves. How is it worse than the current state?


Not really sure you can say AI is "racist".

It can't think, or form opinions. It's not "intelligent" in any real sense.

It's just Eliza with a really, *really* big array of canned responses to interpolate between.


> Not really sure you can say AI is "racist".

> It can't think, or form opinions. It's not "intelligent" in any real sense.

Honest question, what is the purpose of this comment? What is the change you want to see coming out of this semantic argument?


Ideally, people will stop ascribing thought and intent to a clever IVR script.


In the racism-as-individual-intentional-malice framework sure. But I'm a consequentialist on this one. If it causes disparate & unjust outcomes mediated by perceived race then describing it as racist makes sense. No intent necessary.


No one is arguing that the AI has some sort of intentional racism and inherent real intelligence - they aren't trying to anthropomorphize it.

The argument is that the output is racially discriminatory for a variety of reasons and it's easier to just say "it's racist" than "Many of the datasets that AI is trained on under- or over-represent many ethnic groups" and then dive into the details there.


> It's just Eliza with a really, *really* big array of canned responses to interpolate between.

So, just like people, then.


businesses, states, markets, any organization or other system that incorporates super-human agency is already AI, so far performed manually

the progression of technological "AI" has just been the automation and acceleration of their logic and operations

what paperclips are the police maximizing?

everything the alarmists are afraid of has already happened


arrests


Well, maybe it will be a plus then for minorities that all of the training data is of white people. I only joke, as this is a horrible idea all around, but I appreciate your creativity.


Not an expert, but my intuition is that most of the sketch artist's job is asking the right questions. I would assume that most people would have trouble describing close friends or even their partners from memory.

Somewhat tangential: the "part of the brain" that is responsible for recognizing faces is incredibly well developed. That "peek-a-boo" game that you play with children? Every time you uncover your face, millions of neurons in the child's brain suddenly fire, giving them a jolt of "joy". Face recognition is so developed that we tend to see faces where there are none (face pareidolia).

... the point being that the brain does a lot of unconscious work recognizing individuals. Describing those individuals later consciously is pretty error prone.


As I understand it, most police departments already use some kind of computer-aided facial composite software instead of a traditional sketch artist. I can think of several dystopian reasons throwing AI into the mix might not be great, but the larger problem with this is why it needs to be sketched in pen and why it needs to be cute.

Might make a neat coin-op caricature thing though.


yeah, something I thought about before: all of a sudden you're the most wanted person, and police just comply because that's what the system says

would be crazy, probably a movie plot somewhere


I doubt it would change anything from what they do currently with police sketches; it would just be a faster, more accurate version. It's still just one piece of data they have to work from. The victim could describe the person to an AI, and it would update the 3D model on the fly.

"White Male, Curly hair, mole on face"

Generate.

"Good, but he had a larger nose, and blue eyes."

Generate.

"He was a bit more gaunt, and had some stubble."

Generate.

"Nearly there. More pronounced cheek bones, and make the jaw a bit softer."

Generate.

In 5 minutes or less, you could get a near exact picture of the potential criminal; something that might take up to an hour or more normally with a professional police sketch artist, and it could easily be in 3D too. There's tremendous value in that.
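The describe/generate loop above could be sketched as a tiny feedback loop. Everything here is hypothetical: `generate_sketch` stands in for whatever image model such a device would wrap, and the prompt format is made up for illustration.

```python
# Toy sketch of the iterative describe -> generate -> refine loop above.
# generate_sketch() is a placeholder for a real image model, not an actual API.

def generate_sketch(prompt: str) -> str:
    """Placeholder: a real device would call an image model here."""
    return f"<image rendered from: {prompt}>"

def composite_session(initial_description: str, refinements: list[str]) -> str:
    """Accumulate witness feedback into one growing prompt and re-render."""
    prompt = initial_description
    image = generate_sketch(prompt)
    for feedback in refinements:
        prompt = f"{prompt}; {feedback}"   # each correction narrows the prompt
        image = generate_sketch(prompt)    # whir, whir: print the next draft
    return image

final = composite_session(
    "White male, curly hair, mole on face",
    ["larger nose, blue eyes",
     "more gaunt, some stubble",
     "more pronounced cheek bones, softer jaw"],
)
print(final)
```

Note the design choice: the prompt only ever grows, so each "Generate." step conditions on the full history of corrections rather than just the latest one.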


So, this is pretty much backwards from how police sketches actually work and it would likely obliterate any reliability from the system (which, as I understand, is very low already - and even worse for computer-generated imagery).

People have bad memories and bad perception in stressful situations. They don't actually know what the person looked like; they don't have a strong model in their brains. Police sketchers use clever questioning techniques to get details about features that people wouldn't otherwise think to describe or even realize they have knowledge about. The truth is that there is an absolute limitation to the effectiveness of any facial image reconstruction, which is the limits of human memory. Adding AI to the mix can't change that, but it's extremely likely to influence the witness into describing a less accurate face with higher confidence. In other words, a disaster.


This implies there is such a thing as a reliable eyewitness.

Even victims themselves are famously bad at identifying criminals.


There is probably some "wisdom of crowds" for identifying a suspect. For example, one person usually can't estimate the number of gumballs in a jar, but some studies have shown that if you survey 100 people you get a very accurate number. Maybe you need far fewer than that... 2-3 people + AI perhaps comes up with a reasonably accurate estimation of reality.


Sadly, events are far more complex than counting items. For example, during the Columbine shooting, many students thought there were 4 shooters (while there were only 2), because at some point one of them removed their hoodie and the other turned their baseball cap backwards. The police thought there were shooters on the second floor because of an optical illusion.

These types of problems are very widespread: it's not rare for people to misremember details because of stress and trauma, and it's also well known that the process of describing/asking questions can introduce bias into the victim's account, as seen in the many cases of people admitting to crimes they didn't commit after long interrogations.

I've also heard that the quality of police sketches depends heavily on the person making the sketch; some artists have a high correlation rate, but that's not the norm, i.e. the average sketch artist might not be reliable.

JCS Psychology on YouTube is a great channel showing the processes happening during interrogations, if you're interested.


I think having the feedback throughout the process would probably taint the whole thing


This is basically the plot of "The Net" starring Sandra Bullock. A group of hackers steals her identity and creates a new one for her in various systems to cause the police to believe she is a wanted felon.


i love the idea! personally, i'd cast tom cruise in it cause he's just such a great actor.


Not too far from Minority Report


That opens up a whole new set of job opportunities for "prompt engineering"


Why not use an injection molded case?


hahaha that's not a bad idea at all


It's intriguing, because I wonder how this would affect police work. I'm imagining things here, but I assume that when a profile sketch is developed, all officers using that image know that it's "just a sketch" because it looks like a drawing, because it is one.

So what happens if you now generate a photorealistic "sketch" based on a description? Are officers going to be sufficiently aware to know that's not an actual photo of the guy they are looking for, and act accordingly? Or is it going to heavily bias a manhunt effort? Moreover, what happens when the photo randomly ends up close to someone present in the dataset?


The police already know those sketches are super fake lol. The point of those isn't to arrest the right person; it's to have an excuse to hassle, arrest, or maybe kill a minority.


make the system output a sketch. boom - problem solved.


Prints "<eye witness description of suspect> in <${art style}>"

Driven by a little knob selector.

Featuring 3 art styles: 1. Police sketch classic 2. Realistic photo 3. Manga character
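The knob-driven template above is trivial to mock up. The style names and the prompt format string are illustrative assumptions, not a real interface.

```python
# Toy version of the knob-driven prompt template above.
# Three knob positions map to three art styles; the format is made up.

ART_STYLES = {
    1: "police sketch classic",
    2: "realistic photo",
    3: "manga character",
}

def build_prompt(description: str, knob_position: int) -> str:
    """Combine the witness description with the selected art style."""
    style = ART_STYLES[knob_position]  # raises KeyError if knob is out of range
    return f"{description} in {style}"

print(build_prompt("white male, curly hair, mole on face", 1))
```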


I know that would solve the problem. What I'm curious about is what happens if that isn't done, because I can see promoting the idea of a "photorealistic AI sketch artist" as a very attractive proposal for a certain type of manager. It's just a thought experiment.



