
>We’ve limited the ability for DALL·E 2 to generate ... adult images.

I think that using something like this for porn could potentially offer the biggest benefit to society. So much has been said about how this industry exploits young and vulnerable models. Cheap autogenerated images (and in the future videos) would pretty much remove the demand for human models and eliminate the related suffering, no?

EDIT: typo




Depends whether you think models should be able to generate cp.

It's almost impossible to even give an affirmative answer to that question without making yourself a target. And as much as I err on the side of creator freedom, I find myself shying away from saying yes without qualifications.

And if you don't allow cp, then by definition you require some censoring. At that point it's just a matter of where you censor, not whether. OpenAI has gone as far as possible on the censorship, reducing the impact of the model to "something that can make people smile." But it's sort of hard to blame them, if they want to focus on making models rather than fighting political battles.

One could imagine a cyberpunk future where seedy AI cp images are swapped in an AR universe, generated by models run by underground hackers who scrounge together what resources they can to power the behemoth models they stole via hacks. Probably worth a short story at least.

You could make the argument that we have fine laws around porn right now, and that we should simply follow those. But it's not clear that AI generated imagery can be illegal at all. The question will only become more pressing with time, and society has to solve it before it can address the holistic concerns you point out.

OpenAI ain't gonna fight that fight, so it's up to EleutherAI or someone else. But whoever fights it in the affirmative will probably be vilified, so it'd require an impressive level of selflessness.


I don't think villainy is necessarily certain for those who fight that fight, as long as they fight it correctly.

There's a huge case to be made that flooding the darknet with AI generated CP reduces the revictimization of those in authentic CP images, and would cut down on the motivating factors to produce authentic CP (for which original production is often a requirement to join CP distribution rings).

As well, I have wondered for a long time how AI-generated CP could be used in treatment settings, such as (a) providing access to victimless images in exchange for registration and undergoing treatment, and (b) exploring whether generated images could be manipulated over time to gradually "age up" attraction: learning what characteristics are being selected for and aging the others, until you end up with someone attracted to youthful faces on adult bodies, or adult faces on bodies with smaller sexual characteristics, etc. - ideally finding a middle ground that allows for rewiring attraction to a point where they can find fulfilling partnerships with consenting adults/sex workers.

As a society we largely just sweep the existence of pedophiles under the rug, and that certainly hasn't helped protect people - nearly one in four are victims of sexual abuse before adulthood, and that tracks with my own social circle.

Maybe it's time to all grow up and recognize it as a systemic social issue for which new and novel approaches may be necessary, and AI seems like a tool with very high potential for doing just that while reducing harm on victims in broad swaths.

I'd not be that happy with an 8chan AI just spitting out CP images, but I'd be very happy with groups currently working on the issue from a treatment or victim-focus having the ability to change the script however they can with the availability of victimless CP content.


Thought-provoking post, thanks.

Especially the part about maybe generating specifically tailored material to "train" folks. Although, while obviously moral (unlike "gay conversion therapy"), I wonder if it would be just as ineffective.

    and would cut down on the motivating factors 
    to produce authentic CP (for which original 
    production is often a requirement to join 
    CP distribution rings).
Hmmmmm. Will machine-generated "normal" (i.e., non-CP) porn really eliminate the motivating factors to produce normal porn?

I obviously can't speak for enjoyers of CP. But when watching normal porn, I think part of the thrill for many/most people is knowing that what's happening is real.

Another potential risk is that a flood of publicly available, machine-generated CP might actually help the producers and distributors of real CP by serving as camouflage. Finding and prosecuting the people who make real CP is difficult enough already. Now, imagine if the good guys couldn't even reliably tell what was real and there were 100000x as many fake images as real ones floating around.

Yikes.


> But when watching normal porn, I think part of the thrill for many/most people is knowing that what's happening is real.

I'm wondering how true that is.

Obviously, lots of people consume hentai, and platforms like Danbooru are immensely popular.

Also, speaking personally... some of the porn that I've consumed that felt the most "real" was 3D animations where the only real humans behind them were the SFM artists (and voice actors). These artists felt free to do scenes with, like, actual cinematography, with flirting and teasing and emotions between the characters, of a kind you never see even in softcore live-action porn.

So I do wonder how much potential AI generation has for completely substituting large parts of the porn industry.


> Finding and prosecuting the people who make real CP is difficult enough already.

Let's assume that AI-generated CP should be illegal. Does that mean possession of a model that is able to generate such content should also be illegal? If not, then it's easy to just generate content on the fly and never store anything illegal. But if we make the model illegal, how do you enforce that? Models are versatile enough to generate a lot of different content; how do you decide whether the ability to generate illegal content is just a byproduct or the purpose of a given model?


>> Finding and prosecuting the people who make real CP is difficult enough already.

> let's assume that AI generated CP should be illegal

Well that's a big assumption, lol. I definitely agree that it would be impossible to enforce, for the reasons you say.

I personally would not be in favor of such a law at all. Partially because it's unenforceable as you say, and partially on principle.

The argument against real CP is extremely clear: we deem it abominable because it harms children. That doesn't apply to computer-generated CP, or the models/tools used to produce it.


I think you might be able to argue AI generated CP could cause indirect harm by feeding those desires and making people more likely to act on them, but I agree that's a far more fragile argument.


I think there's a big range of possibilities there and they're not mutually exclusive.

There's the possibility that watching FOO directly encourages viewers to do FOO in real life. Like you said, this is the most fragile. I think clearly this is true in some cases -- most of us have seen a food commercial on TV and thought, "I could really go for that right now." I'm less convinced that it's true for something like pedophilia: the average person will be revolted by it, not encouraged, unless they already are into that kind of awful thing.

There's the possibility that watching FOO doesn't directly encourage viewers to do FOO, but serves to kind of normalize it. I think this happens a lot, but I think it takes a carefully crafted context and message.

There's the possibility that AI generated CP could actually help children, by providing a safe outlet for pedophiles so that they wouldn't need to do heinous shit in real life. I recall reading studies that instances of (adult) rape in societies were inversely correlated with the availability of (adult) pornography, with a possible explanation being that porn provided a safe outlet for people who weren't getting the kind of sex they wanted.


Most people are not developers and most people don't provide SaaS products. They are only consumers of existing technology.

In that sense, instead of enforcing the non-existence of models, enforcement could simply make it illegal to provide any service that processes inputs or produces outputs that are cp-like - e.g., by obligating people with the models to add filters on input and/or after a result is generated but before it is displayed or returned from the computation.


> But when watching normal porn, I think part of the thrill for many/most people is knowing that what's happening is real.

Unless you understand real to just mean that actual humans were involved, describing porn as real seems to be a bit of a stretch more often than not.


I gave my audience a bit of credit there.

I am assuming that any adult reading this understands that professional porn is quite different from the sex most of us experience in our private lives in a number of major ways, both emotionally and physically.[1]

But anyway, yes. By "real" I mean "real human beings, having real sex."

----

[1] There is a lot of homemade, amateur porn on the big well-known porn sites and it seems quite popular, and much of that is closer to what typical folks do at home. But that's beside the point.


> exploring if possible to manipulate generated images over time to gradually "age up" attraction

If people have already accepted that they need help, there are many good ways to treat people with unwanted sexual obsessions (trying to choose my words carefully here). I honestly don't think that serving them more content would help.

However, I'd love to see some research to explore the possibility of involving machine generated content in psychological treatment. The core of your idea is IMHO brilliant.


How do you suppose your CP generator will be trained without using authentic CP images? Not only will that require revictimization but you’ll also be downloading CP to train the model.


Did they need to put possums in space suits to tell DALL·E 2 how to render them?


No, because it has seen lots of photos of possums and of space suits.


No, but kid genitalia pictures would need to be provided, right?


There are tons of legal medical images of that content as well. Training wouldn't require any damaging material.


There are so many excellent, thought-provoking comments in this thread, but yours caught me especially. Something that came to mind immediately upon reading the release was the potential for this technology to transform literature, adding AI generated imagery to turn any novel into a visual novel as a premium way to experience the story, something akin to composing D-Box seat response to a modern movie. I was imagining telling the cyberpunk future story you were elaborating, which is really compelling, in such a way and couldn't help but smile.


In the same theme, I liked the comments of both of you.

Another use case could be making it easier/automatic to create comics. You describe the background, what the characters should be doing, and the dialogue. Boom, you have a good-enough comic.

-----------

Reading as a medium has not evolved with technology. The imagery is created in the reader's mind. It's no surprise that some people enjoy doing that (and also enjoy watching that imagery) and others do not.

This could be a helping brain to create those imageries.

-----------

Now imagine reading stories to your child. Actually, creating stories for your child, where they are the characters in the stories. Having a visual element to it is definitely going to be a premium experience.


I can also imagine the magical nature of a child being able to make up a story (as children are wont to do) and having Dall-E here generating a picture book as they go.


Please write it! I'd love to read one.


I've thought for quite some time that questionable AI-generated content will lie at the heart of a forthcoming 'Infocalypse'. [0] Given the 2021 AI Dungeon fiasco over text-based AI-generated child porn, I shall posit that it's already upon us.

Thirty years since the original encryption controversy, it looks like cp trumps the other Horsemen of the Cypherpunk FAQ, with drug dealers and organized crime taking the back seat. It's interesting that misinformation is a recent development they didn't anticipate; a Google search shows that the term 'Infocalypse' was actually appropriated by discussions of deepfakes sometime in mid-2020. That said, the crypto wars are here to stay—most recently with EARN IT reintroduced just two months ago.

The similar issue of 3D-printed guns has developed in parallel over the past decade as democratized manufacturing became a reality. There are even HN discussions tying all of these technologies together, by comparing attitudes towards the availability of Tor vs guns (e.g., [1]).

And there are innumerable related moral qualms to be had in the future; will the illegal drugs or weapons produced using matter replicators be AI-designed?

Overall, I think all of these issues revolve around the question of what it means to limit freedoms that we've only just invented, as technological advances enable things never before considered possible in legislation. (And as the parent comment implies, here's where the use of science fiction in considering the implications of the impossible comes in).

[0] https://en.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalyp...

[1] https://news.ycombinator.com/item?id=8816013


Of course some level of censorship is needed; otherwise it can be used to produce porn involving real people without their consent (e.g., celebs).


We already have a lot of fabricated content like that made with current photo-editing technology (Photoshop), and it's not causing many legal or moral issues.


Because it’s still pretty hard to make it and it’s bad - you can easily tell it’s fake.

This makes it as easy as typing a sentence - and the quality seems fairly realistic


Considering where the progress on deepfakes is at, I'm going to have to disagree on both counts of "hard to make" and "easily tell it's fake".


Would this not necessarily require training it on a large body of real CSAM? Seems like it would be a non-starter.


Surprisingly no. It knows what a child looks like, and can infer what a naked child looks like from medical imagery.

A child with adult body parts is a whole other class of weirdness that might pop out too.

Models want to surprise us all.


Relatedly, when checking for a related comment, I wanted to see what the current state of deep fakes progress was, so I went to the usual place where the bleeding edge for such things could be found.

First video clips were with the faces of your usual celebrities, but then suddenly I got "treated" to Greta Thunberg in the situations you might expect. I cut my exploration short.

Now, Greta Thunberg is actually 19 (how time flies!), except that the deepfake was most likely trained on her media appearances, which started when she was 15!

(I guess at least that she wasn't a child any more, which might explain why those clips hadn't been almost immediately flagged and removed?)


Religious people don't only believe that porn harms the models, but also the user. I happen to agree, despite being a porn user. Porn is a form of simulated, not-real stimulation. Porn is harmful to the user the same way that any form of delusion is: it associates pleasure with stimulation that does not fulfil any basic or even higher-level needs, and is unsustainable. Porn is somewhere on the same scale as wireheading.[1]

That doesn't mean that it's all bad, and that there's no recreational use for it. We have limits on the availability of various other artificial stimulants. We should continue to have limits on the availability of porn. Where to draw that line is a real debate.

[1] https://en.wikipedia.org/wiki/Wirehead_(science_fiction)


Iain Banks' "Surface Detail" would like to have a word with you.

This author's books are great at putting these sorts of moral ideas to the test in a sci-fi context. This specific tome portrays virtual wars and virtual "hells". The hope is to be more civilized than waging real war or torturing real living entities. However, some protagonists argue that virtual life is indistinguishable from real life, and so sacrificing virtual entities to save "real" ones is a fallacy.

Or some such, it's been a while.


No.

If people are exposed to stimuli, they will pursue increasingly stimulating versions of it. I.e., if they see artificial CP, they will often begin to become desensitized (habituated) and pursue real CP or even live children thereafter.

Conversely, if people are not exposed to certain stimuli, they will never be able to conceptualize them, and thus will be unable to think about them.

Obviously you cannot eliminate all CP but minimizing the overall levels of exposure / ease of access to these kinds of things is way more appropriate than maximizing it.


    If people are exposed to stimuli, they will pursue 
    increasingly stimulating versions of it.
This is not true in any kind of universal way.

If you enjoy car chases in movies, does that mean you're going to require more and more intense chase scenes, and then consume real-life crash footage, and ultimately progress to doing your own daredevil driving stunts in real life?

No, because at some point it's "enough."

Same with... literally anything we enjoy. Did you enjoy your lunch? Did you compulsively feel the need to work up to crazier and crazier lunches?

What about sex? Have you had sex? Do you feel the need to seek out crazier and crazier versions of it?


> What about sex? Have you had sex? Do you feel the need to seek out crazier and crazier versions of it?

For porn and sex it's different though. Some people are attracted to things that are deviant and taboo. That's the part they're looking for. As pornography has become more widely accepted, a market has developed for more and more extreme forms of it. This has been documented. It's not the content per se but rather the taboo nature of it that is found attractive. So the idea is to find a line that's reasonable, so the people who feel the need to get close to that line can have that urge fulfilled without damaging society.

A market will form for more and more extreme content as soon as the line moves and what was once taboo no longer is. An Overton window of sorts for pornography.


There seems to be a small issue in GP's logical inference, in that he treats artificial CP as a proportional and wholly inferior replacement for real CP. As if ham sandwiches and boiled sausages were _inferior_ replacements for blocks of animal body parts on a dish.

I don't think this is the case, from anecdotal experience; Hollywood chase scenes are much more exciting to me than real-life crash footage, and I've watched enough of both. These things need cooking, and if you are cooking anyway, mixing artificial and "natural" ingredients can even be more of a problem than a positive.

Truth is always boring.


> If people are exposed to stimuli, they will pursue increasingly stimulating versions of it. I.e., if they see artificial CP, they will often begin to become desensitized (habituated) and pursue real CP or even live children thereafter.

I have accumulated tens of thousands of headshots in video games but have yet to ever shoot a single real person in the face. More importantly, I have never had the urge to seek out same.

I am not sure that your initial premise has any truth to it.


The point is more "can you conceive of a headshot before you've ever witnessed one?" And the assertion is, no.

I should be explicit -- I am saying the exposure which makes one seek the stimulus is merely a catalyst for deeper urges, not a generator of them as such. A certain level of disinhibition (e.g. sociopathy) is required, but IMO so is a prior conception of the deed.

In your example, if someone is predisposed to wanting to shoot actual people in the head, exposing them to video game headshots may distract in the short term but desensitizes and entrenches the image in the long term, possibly making it easier to decide to pull the trigger later on if they are sufficiently uninhibited by social concerns. This does not happen for people with high inhibitions, or at least sufficient self-control.


> The point is more "can you conceive of a headshot before you've ever witnessed one?" And the assertion is, no.

I'm not sure that's true. Our brains can imagine a lot that we've never seen, though maybe not very accurately. Inventors and developers and artists do it all the time, if we are talking about the same thing.

I'm not sure that disproves your premise. Virtual experiences may make real ones easier, but some research and details about where it works, where it doesn't, would be helpful. Many training programs use virtual experiences, such as flight simulators.


> "can you conceive of a headshot before you've ever witnessed one?"

Am totally blind, have never been able to see, can still conceive of a headshot. So, yes?


"Conceive of" is a different idea than "visualize"...


By definition they cannot perceive a headshot; it is a visual thing. I'm not sure what point you're trying to make here; the difference is not germane to the conversation.


Is this just your own personal theory or opinion? Do you have some proof?

To put it as nicely as possible, this wildly contradicts reality as I have experienced it and observed others experiencing it.


I'm not sure I agree with the statement. You're putting forth a lot of assertions without actual quantitative data to back them up, and even though it sounds intuitive to you, that doesn't necessarily make it valid.

I'd actually argue the reverse, I think you see a lot more effort towards acquiring things that are illegal than you would otherwise.


It's documented well already. The Overton window for pornography has continued to move to more and more extreme forms as what was once considered unacceptable and taboo becomes socially acceptable. It's because there is a market for deviance. Some people are interested in what's taboo and off limits and so long as they are approaching or just crossing that line, they're happy. As we've moved that line these people are no longer happy with the status quo and want content that is taboo, so a new market forms around that.

Pornographers know this and talk about it. Read David Foster Wallace's essay on it.


Wow, I didn't even think of this, that people could use this for something so horrifying. I'm relieved that the geniuses behind this seem so smart that they even thought of this too and prohibit using the AI for sexual images.

> Our content policy does not allow users to generate violent, adult, or political content, among other categories. We won’t generate images if our filters identify text prompts and image uploads that may violate our policies. We also have automated and human monitoring systems to guard against misuse.


Please tell me you're being sarcastic…

This is arguably the most insipid and stupid crippling of a powerful tool for content creation I can think of. It’s worse than the adobe updates using every cpu core and locking up my machine once a week.

What counts as “political”, hm? Want it to look like that Obama poster, or perhaps you want a Soviet Union flag for your retro-80s punk… oops, sorry, “political”… let’s go to “adult”… hmm, that’s even dumber. Is the model showing too much ankle? What about the obvious fact that this is just designed with a heteronormative view of pornography and likely does nothing to stem the wildly varied fetishes and other sexual proclivities that exist in the world…

It is effectively “we got squeamish and have done a bunch of stuff to stop you doing stuff that makes us squeamish, please don’t make us squeamish, we’re so worried we’re even checking for it in case you sneak something past us”…

They should comply with the law, try to prevent and also check for child porn… but otherwise just let users use the damn tool, if someone wants an Obama hope poster of a sexualised Mussolini jerking off onto a balloon animal… why the heck do they feel the need to say no to that. It’s a deeply repressive instinct that should be fought against whenever people start to “police” what is acceptable in artistic mediums.

I look forward to the reimplemented versions of this from efforts like EleutherAI and others.


Nonsense, I think the opposite is true: if you can satisfy your urges in a way that doesn’t put you in jail for a decade, most people will take that route.


I suspect that if a free version of this comes out and allows adult image generation, 90% of what it will be used for is adult stuff (see the kerfuffle with AIDungeon).

I can get why the people who worked hard on it and spent money building it don't want to be associated with porn.


> I can get why the people who worked hard on it and spent money building it don't want to be associated with porn.

Why? Is there something inherently wrong with porn? Is it not noble to supply a base human need, or is that judgment based on some arbitrary cultural artifact that you possess?


The problem might be that people are simply lying. Their real reasons are religious/ideological, but they cite humanitarian concerns (which their own religious stigma is partly responsible for).


> Their real reasons are religious/ideological, but they cite humanitarian concerns

Are you asserting that nobody has humanitarian concerns? If so, that's quite a statement; what basis is there? I've seen so many humanitarian acts, big and small, that I can't begin to count. I've seen them today. I hear people express humanitarian beliefs and feelings all the time. I do them and have them myself. Maybe I misunderstand.


“Strawman argument”


It'd be ironic if we ended up destroying our planet, crypto-mining-style, by using so much electricity to train models to generate a maximally optimal version of the type of content you refer to.


When you combine advanced versions of this with advanced versions of GPT-3, you will not be able to tell the difference between AI and OnlyFans.

I'm not saying that AI will pass all Turing tests. But as far as having a virtual girlfriend/prostitute goes, it will.


> a virtual girlfriend/prostitute

I'm not picking on the commenter - by itself it's not a big deal - but look at the assumptions behind that comment, which I almost didn't notice on HN.


what assumptions? That there may be a market for AI girlfriends?


The assumptions that HN commenters find cis-normal females sexually appealing, and that they can't get a date.


Yeah, you will. It’s not going to be very good at reproducing the same exact thing each time. In some of the examples you can see the textures changing wildly, and it’s a classic problem with these models. The same input does not generate the same output, so it will be obvious that it’s generated when you can’t get the “model” to look the same between two photos in the same “photo shoot”.


Or maybe we don't want to encourage that behavior more.


People take their experiences of porn into real relationships, so I do not think this removes suffering overall, no.


When you put it that way… yes since no one is hurt in the process and people with pedophilic conditions may be deterred from doing something in real life.



