Hacker News

One thing that comes to mind is this: imagine someone is found with CP on a device. They could defend themselves by saying it is AI-generated. Unless there is a reliable way to tell AI fakes from real material, people could plausibly use this defense.



If AI generated porn were indistinguishable wouldn't that almost totally eliminate demand for the real stuff?


But to generate faithful AI CP, the AI must have been trained on an actual CP dataset. So those who trained it would have some explaining to do.


You don't need to train on pictures of canine golfers to make highly convincing pictures of dogs driving golf carts on Mars. https://imgur.com/a/EIWUJYp The AIs are extremely good at mixing concepts.


I don't think that's necessarily true.

An AI can generate an image of a wizard with a frog on their head, and that doesn't imply the training set contained such an image.


Are you sure? I’d guess that AI can extrapolate from adult porn and non-sexual depictions of children.


So the AI will generate children with adult private parts?


Pretty sure there are non-sexual images of naked children too, such as in anatomy textbooks.


Unknown. For example, I have heard that most offenders abuse their relatives, and I don't expect synthetic material to have any impact in that category.

Also, the only way to find out whether this has any effect at all (positive or negative) would disgust and outrage many: the test would require a region where it's forbidden and a control region where it's allowed, then comparing which fares worse.

I'm not sure how many people would form a lynch mob against (let alone vote out) whoever tried to run that experiment, but I'm sure it's enough that the exact numbers don't matter.


My guess is that offenders abuse relatives because they are easier to access and manipulate, not because there is a true preference there. More a crime of opportunity than a pursued goal.


In that scenario, how tf would you know that "real stuff" was eliminated? Think, please.


Makes sense, but real people have real ages. Couldn't they just say the AI image is a rendition of an 18-year-old with some hypothetical developmental deviation? You'd have to ban all AI porn, because a depicted age can't be measured; it's non-existent.


That would indeed be the probable next step for government or intergovernmental organizations. Criminalize AI porn. Then criminalize regular porn.

The government is greedy in its lust for control and order in a chaotic world. It has a tendency to overreach, then overreach again (as we see in the overlap of privacy and counterterrorism).


Ah yes, the Japanese “1000-year-old dragon loli” gambit.

Which is actually a perfectly valid defense imo, as it's horribly dumb to incriminate real people over fictional characters. Should everyone who owns a copy of IT go to jail for child pornography? It makes no sense.


If the technology gets to that point, who needs the real thing?


People who are into it not because they like them young (i.e. "classic" pedos), but because they want to have (or feel they have) the power to cause pain. There is a real market for custom pedo videos; it's utterly insane.


> People who are into it not because they like them young (i.e. "classic" pedos), but because they want to have (or feel they have) the power to cause pain.

Stupid question, but why go after kids and not adult women? Why take the risk of buying CP if you don't prefer them young?


I'm guessing the market will just serve them fake custom videos then ...



