
This is off topic, but horny people are by far the most interested in conjuring up custom images. A DALL-E trained on porn would be huge.


This is absolutely true. Look at text prediction models as an example (e.g. GPT-3). One of the biggest (if not the biggest) applications was story-generation tools like AI Dungeon. Guess what most people actually used AI Dungeon for? Erotica. Guess what happened when OpenAI cracked down on it? A huge portion of the userbase jumped ship and built a replacement (NovelAI) on open-source EleutherAI models that explicitly did support erotica, and it ended up being even better than the original ever was. I can tell you that there is very strong interest in NSFW image generation in those communities, as well as multiple hobby projects/experiments attempting to train models on NSFW content (e.g. content-tagged boorus) or to bootstrap/finetune existing models to get this sort of thing to work.
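
The finetuning half of that is less exotic than it sounds. A minimal sketch with HuggingFace Transformers, assuming one of the open EleutherAI checkpoints mentioned above and a local text corpus (the file names and hyperparameters are placeholders, not any real project's setup):

    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    model_name = "EleutherAI/gpt-neo-1.3B"  # one of the open EleutherAI checkpoints
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    tokenizer.pad_token = tokenizer.eos_token  # GPT-style tokenizers ship without a pad token
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Placeholder corpus: one training example per line in a local text file.
    dataset = load_dataset("text", data_files={"train": "corpus.txt"})["train"]
    dataset = dataset.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
        batched=True, remove_columns=["text"])

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="finetuned", num_train_epochs=1,
                               per_device_train_batch_size=1),
        train_dataset=dataset,
        # mlm=False -> plain causal LM objective: labels are the inputs shifted by one.
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
    trainer.save_model("finetuned")

The image-side experiments are conceptually the same move: take an open checkpoint, feed it tagged data, and train until it cooperates.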


This is one of the experiments with Progressive Growing GAN (ProGAN) technology from Nvidia:

NSFW: https://medium.com/@davidmack/what-i-learned-from-building-a...
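
For anyone who hasn't read the paper: the "progressive growing" trick is to start training at 4x4 resolution and periodically activate a new, higher-resolution stage, blending it in gradually so training doesn't destabilize. A rough PyTorch sketch of just the generator's growing mechanism (layer widths and the blending schedule here are illustrative, not Nvidia's actual configuration):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class UpBlock(nn.Module):
        """One generator stage: 2x upsample followed by two 3x3 convs."""
        def __init__(self, in_ch, out_ch):
            super().__init__()
            self.conv1 = nn.Conv2d(in_ch, out_ch, 3, padding=1)
            self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1)

        def forward(self, x):
            x = F.interpolate(x, scale_factor=2, mode="nearest")
            x = F.leaky_relu(self.conv1(x), 0.2)
            return F.leaky_relu(self.conv2(x), 0.2)

    class Generator(nn.Module):
        def __init__(self, z_dim=128, widths=(128, 128, 64, 32)):
            super().__init__()
            self.initial = nn.ConvTranspose2d(z_dim, widths[0], 4)  # latent -> 4x4
            self.blocks = nn.ModuleList(
                UpBlock(widths[i], widths[i + 1]) for i in range(len(widths) - 1))
            # A 1x1 "to RGB" head per resolution, so every stage can emit an image.
            self.to_rgb = nn.ModuleList(nn.Conv2d(w, 3, 1) for w in widths)

        def forward(self, z, stage, alpha=1.0):
            """stage = number of active up-blocks; alpha fades the newest one in."""
            x = F.leaky_relu(self.initial(z[:, :, None, None]), 0.2)
            for i in range(stage):
                prev = x
                x = self.blocks[i](x)
            if stage == 0:
                return torch.tanh(self.to_rgb[0](x))
            # Blend the new stage's output with the upsampled previous stage's.
            new = self.to_rgb[stage](x)
            old = F.interpolate(self.to_rgb[stage - 1](prev), scale_factor=2)
            return torch.tanh(alpha * new + (1 - alpha) * old)

    g = Generator()
    print(g(torch.randn(2, 128), stage=2, alpha=0.5).shape)  # [2, 3, 16, 16]

During training, alpha ramps from 0 to 1 each time a new stage is added, and the discriminator grows symmetrically in the same way.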


Interesting that they talk about how hard it is to make a business out of it:

>I felt at this point that I’d hit a dead end. Press and fundraising would be tough and require some extra creativity and force. I spoke to friends about hiring them, and had polarized answers. Overall, this project had become less appealing.

Seems to me there is a lot of money to be made here if it works. The interesting thing about porn compared to, say, TV is that people have very, very specific interests and basically just want an infinite amount of content within that interest. It's not like TV, where cooking shows become popular and the people who used to watch dramas start watching cooking shows instead.

So the ability to generate highly specific content tailored to an individual's very precise requirements seems potentially very lucrative.


>A day later I start to see detailed results from the model. Its generated images actually look like vaginas:

This doesn't say much about the number of vaginas this guy has seen.


This might be horrible to say, but could this be a solution to CSAM? From what I've seen, most people who enjoy CSAM do genuinely feel bad for the children, but they're sick and can't control themselves. Might they be willing to indulge in fake CSAM instead?


I don't think it's horrible, it just seems like a practical solution. It is such a taboo subject that nobody seems to really talk about the possibilities, but it's worth asking the question -- if someone so inclined can gratify their desires in private with fake imagery, will it prevent them from leaving their home and seeking out someone to hurt?

Or will it strengthen their need for 'the real thing' as someone else suggested in a sibling comment?

In any case, we still don't have a great answer for the legal question. Possession of realistic fake imagery is illegal, on the grounds that its very existence is a risk to children. There isn't any actual science behind that; it's just what the politicians have said to justify regulating what would otherwise be a constitutionally protected right. I imagine it will become a topic of discussion again (my quick research says the last major revision to US law in this regard was about 20 years ago).


It seems more misguided than horrible. I'm not a psychologist, but indulging in pathological behaviors would seem to strengthen them. Heroin addicts need to quit, not use methadone forever.


Is that true? Someone on HN once described it as "eventually you get tired of the addiction and want to quit" (if you live long enough, that is). I have no personal experience, but I have known a couple of former addicts, and this seems to reflect their reality.

Maybe an effective approach would be to maximize harm reduction until the addiction has run its course? That appears to be the Portuguese approach, and it seems to be working.


I don't think it's accurate to equate pedophilia with drug addiction. It's a sexual orientation, not a chemical dependence.

Do gay men eventually get tired of being homosexual and turn straight?


The truth is that addiction, like all mental illness, is complex and unique to each individual.

Diagnosing mental illness generally consists of identifying some number of symptoms out of a possible list - often something like five out of eight. That means two people can be diagnosed with the same thing while sharing as few as two symptoms (5 + 5 - 8 = 2).

So basically, don't listen to the guy who starts with "I'm not a psychologist" and then decides to play armchair psychologist.


> indulging in pathological behaviors

Let us take this lesson from "pray the gay away" camps and other types of "conversion therapy": you cannot take these preferences out of people; it just does not work. At best, you can make it clear that the way they are is considered a travesty, and a lot of them will successfully hide it for the rest of their lives. That is not a good solution compared to what you were talking about earlier, because it greatly increases human suffering and has no advantage over a suffering-free solution.

That said, I don't think I could justify to myself creating such an AI. I've quite simply been so disillusioned by the depravity of man that, especially in this case, I want nothing to do with it, even peripherally. Perhaps that makes me a little hypocritical. Philosophically it would still be nice to solve this issue, even if just so that no more children need to suffer (which should always be the main goal).


> I'm not a psychologist, but indulging in pathological behaviors would seem to strengthen them. Heroin addicts need to quit, not use methadone forever.

Yes, clearly not a psychologist, nor an addiction treatment specialist. Methadone is often used indefinitely as maintenance therapy for opioid use disorder.


IIUC, fake CSAM is also illegal.


I believe that as of 2003 it has to be a realistic fake, however. Obvious cartoons are no longer illegal.

I imagine it'll get challenged again at some point on constitutional grounds. It is illegal right now on a moral basis, which is probably the weakest argument over the long term.


> Obvious cartoons are no longer illegal.

AFAIK they are still illegal in Canada.


I'm aware that Japanese lolicon in anime, manga, video games, and other contexts is at least ... problematic ... in numerous jurisdictions. Several online sites have banned it, and on Mastodon and the Fediverse there are often peer-level blocks against sites on which lolicon is permitted.

The name itself is a portmanteau of "Lolita complex", after the Nabokov novel.

https://en.wikipedia.org/wiki/Lolicon#Legality_and_censorshi...


Correct. This differs from, for example, rules against cruelty to animals. You can fake such cruelty without consequence - as is done in movies regularly.

The more interesting question is this: is it a crime if you generate CSAM just for your own consumption?


> is it a crime if you generate CSAM just for your own consumption?

Yep. If it isn't obviously fake (i.e. a cartoon), possession is illegal whether you produce it yourself or not. Though it's probably safe to say that you're unlikely to get caught if you're not sharing those images with other people.


What if it's in the "uncanny valley"?

My point is that the courts are going to have a hard time with this.


Well, the US law says "[when it] appears virtually indistinguishable [from the real deal]" (insert appropriate legal terminology on either end, but the three quoted words are the relevant bit).

I think we're in agreement that the advancement of the technology is going to make this topic come back up for legal debate. When the gulf between CGI and real photography was large, it was pretty straightforward. Not so much now.


I think the hard part is that there is close to no way to know it's fake. But you could also argue that if it's so easy to fake photorealistic content, why would anyone ever make real content?


Depends on the jurisdiction. Not a huge number of places have outlawed fake CSAM yet, but the list is growing.


Porn seems to quietly power the Internet, in so many ways. I imagine people are already getting creative with fake porn, and it's only going to intensify over time - especially for the types of imagery that are illegal to possess.


It probably would be. Unfortunately, the DALL-E people have foreseen that use, and balked at it for some reason:

> We’ve limited the ability for DALL·E 2 to generate violent, hate, or adult images.


>for some reason

"Our investors include Microsoft, Reid Hoffman’s charitable foundation, and Khosla Ventures."


One of the primary features of DALL-E 2 is inpainting. Thus, if they allowed it, you could easily just paint out a celebrity's clothed figure and ask DALL-E to replace it with a nude.
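
Mechanically there's nothing clever to it: the edits endpoint takes an image, a mask whose transparent pixels mark the region to repaint, and a prompt. A sketch against OpenAI's current Python SDK (which postdates this thread; the file names and prompt are placeholders) - the prompt/content filter is essentially the only thing standing between the feature and that misuse:

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Transparent pixels in mask.png mark the region DALL-E is asked to repaint.
    result = client.images.edit(
        model="dall-e-2",
        image=open("photo.png", "rb"),
        mask=open("mask.png", "rb"),
        prompt="a red winter coat",  # placeholder; this is what their filter screens
        n=1,
        size="1024x1024",
    )
    print(result.data[0].url)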



