I once heard a devil's advocate say, "if child porn can be fully AI-generated and doesn't imply more exploitation of real children, and it's still banned, then it's about control, not harm."
If AI is trending towards being better than humans at intelligence and content generation, it's possible its AI-generated CP (child p*n) would be better too. Maybe that destroys the economics of p*n production such that, as with software development, it pushes people out of the profession.
I've been thinking about this for a while. It's a really interesting question.
If we expand to include all porn, then we can predict:
- The demand for real porn will fall; if the LLM can produce porn tailored to the individual, that's going to cut into demand for human-made material.
- Porn and real sexual activity will continue to diverge. If most people are able to conjure their perfect sexual partner and perfect fantasy situation at will, then real life is going to be a bit of a let-down. And, of course, porn sex is already not much like real sex, so presumably that gap will widen further [0].
- Women and men will consume different porn. This already happens, with limited crossover, but if everyone gets their perfect porn, it'll be rare to find something that appeals to all sexualities. Again, the trend will be to widen the current gap.
- Opportunities for sex work will both dry up and get more extreme. OnlyFans will probably die off. Actual live sex work will be forced to cater to people who can't get their kicks from LLM-generated perfect fantasies, so it will shift toward the more extreme end of the spectrum. This may all be a good thing, depending on your attitude to sex work in the first place.
I think we end up in a situation where the default sexual experience is alone with an LLM, and actual real-life sex is both rarer and weirder.
I'll keep thinking on it. It's interesting.
[0] though there is the opportunity to make this an educational experience, of course. But I very much doubt any AI company will go down that road.
That's not the gotcha you think it is, because everyone else reading this realizes that these models can combine concepts to produce something that never existed. The same technology that puts clothing onto people who never wore it can mash together the concepts of children and naked adults. I doubt a red panda piloting a jet exists in the dataset directly, yet the model can generate an image of one because those separate concepts exist in the training data. So it's gross and squicks me to hell to think too much about it, but no, it doesn't actually need to be fed CSAM in order to generate CSAM.
The counter-devil's advocate[0] is that consuming CSAM, whether real or not, normalizes the behavior and makes it more likely that susceptible people will actually act on those urges in real life. Kind of like how dangerous behaviors such as choking seem to have been popularized by trends in porn.
[0] Considering how CSAM is used as a pretext to argue against civil liberties, I'd say there are devils on both sides of this argument!
Attack away or downvote my logic.