Reddit bans 'deepfakes' AI porn communities (theverge.com)
89 points by internetxplorer on Feb 7, 2018 | 79 comments



Fake porn of celebrities has been prevalent on the Internet for a long time. I don't have precise statistics, but I think it might predate the web. For the most part it hasn't been particularly controversial. Now that advances in technology have significantly improved the quality, everybody suddenly has a problem with it.

Let's back this up a few steps. You're an actor, and you appear on television. By being filmed and accepting the pay offered to you, you're agreeing to allow these images of you to be disseminated to the general public, for their enjoyment. But what if somebody finds you attractive, and looks at your picture whilst... you know. Can you sue? No, they're well within their rights to do so. What if they cut your face out and place it over a Playboy centerfold? Same deal. Several technological innovations later, here we are. Fundamentally, nothing has changed. Fundamentally, people are still 100% within their rights to combine images legally obtained in this way. And post them online. This may not be what the Internet was created for, but it has always been what it was used for.


> people are still 100% within their rights to combine images legally obtained in this way. And post them online.

Are they? If you distribute movies and media to the public, you are normally breaking copyright law. (Hence the infamous FBI warning on movies for a couple of decades now.) Likewise, you cannot just use people's likeness in marketing and other public uses without their permission. So I'm not at all sure that just because you have an image, you have the right to create and distribute derivative works from it.


If I film someone in public, even though I own the copyright on the film since I shot it, I still need them to sign a release to be able to utilize their likeness in my work. It's not a simple black-and-white issue, and we'll need laws to catch up to address it. Claiming that everything is "transformative" and that you therefore own the copyright, as some do, is silly. Otherwise, it would be legal for me to take everyone's Facebook pictures, "transform" them by feeding them into the AI, and sell the resulting AI porn.


> Likewise, you cannot just use people's likeness in marketing and other public uses without their permission.

Tell that to Prince and Peter Cushing. ;)


> Are they?

Yes.

> If you distribute movies and media to the public, you are normally breaking copyright law.

That's not what we are talking about. You can transform any public image for parody, criticism, etc. Porn is considered speech, so anyone can make pornographic parodies/etc. This is especially true if fans are doing so for fun and not for profit.

Of course Reddit has a right to ban it from their platform, but you as a fan can transform any public image to criticize it, parody it, etc.


> Of course Reddit has a right to ban it from their platform, but you as a fan can transform any public image to criticize it, parody it, etc.

Maybe, maybe not. Transforming generally creates a derivative work. On its face, that requires permission from the owner of the copyright in the work, but there may be an exception that allows it in particular cases. In the case of parody in particular, it MAY be covered by fair use.

A lot of people on the net think that parody is automatically fair use, usually from misunderstanding the Campbell v. Acuff-Rose Music, Inc., 510 U.S. 569 (1994) case.

Briefly: in the case, the district court said that the parody was fair use. The appeals court said that because it was commercial parody, it presumptively could not be fair use.

The Supreme Court said they were both wrong, that it might be fair use, and sent the case back down to the lower courts.

A lot of people just looked at it as the Supreme Court reversing the appellate court's reversal of the district court's ruling that parody was fair use, and took it as the Supreme Court therefore saying parody was fair use. (I don't blame people for misunderstanding--the press is generally terrible at reporting Supreme Court decisions. They often fail to interpret Supreme Court rulings in the context of the lower-court decisions that led to the case.)


> That's not what we are talking about. You can transform any public image for parody, criticism, etc. Porn is considered speech, so anyone can make pornographic parodies/etc.

This is true. It's free speech that can't be censored, meaning the government can't prohibit it ahead of time. But speech can be subject to tort claims despite surviving the First Amendment. A deepfake victim can sue for damages in civil court, and there's a 99.99999% chance they'd win massive damages every time.

> This is especially true if fans are doing so for fun and not for profit.

Profit motive may affect the amount of damages, but it doesn't affect whether or not the victim can sue and win in a court of law. Even in the US, you'd be paying out significant damages to someone for damage to reputation or use of likeness.


What would their suit entail that isn't covered by existing porn parodies? So long as the final composition is labeled as a fake, I don't see them winning on damage to reputation. Use of likeness might go through in a few states that cover non-commercial use, but in most states, as long as it's fan-made, they would probably be okay.


I'm sure that if you asked the people in those fakes, they'd say they had a problem with it from the beginning. "It has always been a thing" isn't an argument for something.


It's an argument for not overreacting about it.


Since it was only famous people, who have presumably already tackled the downside of public life, it may have been easily dismissed by the victims.


Usually people use the slippery slope to avoid things, not encourage a descent into depravity, but to each their own.

In any case, you shouldn't pretend as if this is morally okay just because you can focus in on each individual step and find a way to justify it. There are a lot of crimes that clearly fall into this bucket: stalking, for example. Any one interaction may be innocent, but it's the sum total of actions that completes the picture of abuse.

If you endorse distributing fake naked pictures of celebrities, you are a bad person, no matter your line of reasoning.


This is an interesting final opinion. Can you explain your reasoning? I’d like to understand more about why you dislike this.

This idea of altering images has always confused me. When I was a kid, I got a scanner and a printer. The first thing I did was take a picture of a Battletech Griffin 55 ton battlemech (a giant war robot) and replaced the robot head with that of my mom, titling it “Grifmom.” I thought it was awesome and she burst into tears and beat me terribly. I never understood why. And this didn’t have any nudity or offensive material, just a normal face on a robot body.

Perhaps this is related to your aversion?


I dislike this because I have respect for human beings instead of applying moral rules like a robot.


That’s interesting. How is editing an image of someone, and not even showing them, disrespectful?

Do you imagine others naked? Is that disrespectful?

It’s really interesting hearing about other people’s moral systems.


[flagged]


Please stop posting unsubstantively.

https://news.ycombinator.com/newsguidelines.html


[flagged]


Armchair-diagnosing someone you don't agree with as having psychological issues isn't very helpful to a conversation, you know.


Do you honestly think that using someone else's copyrighted material to transform yet another person's likeness into something that very few people would consent to (when you almost certainly didn't try to get consent) is a fair use right, legally or socially?


Well, it is definitely transformative. In fact, nothing of the original work really survives the transformation.

You can't merely copyright your likeness.

As for socially, consider this: They have used the likeness of Bruce Lee, Carrie Fisher, Prince, and Peter Cushing in commercial works, undoubtedly without prior consent. We also have "public" nude photos taken and distributed of celebrities on a regular basis (paparazzi). This appears to be quite socially acceptable, so I'll run with the idea that "socially acceptable" is pretty fluid when it comes to celebrities' likenesses.


> You can't merely copyright your likeness.

Which country are we talking about? Because that is not the case in the U.S. https://en.wikipedia.org/wiki/Personality_rights#United_Stat...

> They have used the likeness of Bruce Lee, Carrie Fisher, Prince, and Peter Cushing in commercial works, undoubtedly without prior consent.

Who is "they"? Who has been using Fischer's and Cushing's likenesses without contractual agreements?

> We also have "public" nude photos taken and distributed of celebrities on a regular basis (paparazzi)

Public photography has legal protections. Doesn't mean that if you took a photo of Obama at the gym you can use the photo in an ad campaign that suggests his approval of commercial usage.


> They have used the likeness of Bruce Lee, Carrie Fisher, Prince, and Peter Cushing in commercial works, undoubtedly without prior consent.

They licensed their likenesses from their estates, or used licensed footage (e.g., film clips of Bruce Lee).

> We also have "public" nude photos taken and distributed of celebrities on a regular basis (paparazzi).

Photos taken in public are not subject to the same tort concerns as private photos, or faked photos purporting to be of a celebrity.


I'm pretty sure celebrities have also been able to request the removal of fake porn featuring themselves, and online communities have been able to ban the posting of it. So we're not really in any particularly new territory here.


> And post them online.

Mass redistribution of somebody's image is not a "right".


Luckily, in the free world we don't need "rights" in order to do things. In third-world countries, people are often put in prison for photoshopping their leaders. Hopefully this trend towards that kind of authoritarianism doesn't continue in the West.


Did you really just conflate for-profit deepfake porn with the fight for political freedom?


Freedom shouldn't discriminate based on fields of endeavor.


My rights begin where someone else's rights end. Freedom is not "free to do whatever you want".


For profit? Who said anything about this being for profit? I don't think it is.


What do you think freedom of the press refers to? It's the printing press. Literally designed for the sole purpose of mass redistribution.


While I agree with you, it's much more invasive than a superimposed face on a still image.

A photograph of something is open to interpretation or dispute, but a video is a series of such stills, each one slightly different, each of which adds to the credibility of the whole.


Banning it doesn't make the underlying tech go away. You seem to imply a video is inherently more trustworthy; that is no longer the case, and we need to acknowledge it.


> What if they cut your face out and place it over a Playboy centerfold? Same deal. Several technological innovations later, here we are. Fundamentally, nothing has changed.

You have two images, one physically overlaying the other... it's not even remotely the same thing as a single integrated image, technically or legally. Integrating the images changes everything.

> Fundamentally, people are still 100% within their rights to combine images legally obtained in this way. And post them online.

Completely. False. You may have the right to combine images for your own personal use (in the US, ignoring discussions of CP), but you absolutely do not have the right to distribute those images, and the associated tort actions both pre-date and have survived the First Amendment.


There's no way of putting this tech back in the box. And it's only going to get more powerful.

What if someone combines it with some kind of "deep ageing", to create artificial child porn? Sexual representations of children, even if completely artificial and involving no victims, are illegal in many jurisdictions, but not all.

We're only scratching the surface of what semantic editing of video is going to be capable of. It's a very big barrel of worms.


Absolutely true, and not that you’re arguing for this but:

Just because the technology is out there doesn’t mean we throw our hands in the air and give up. Yes, it’s there, and it’s going to be used for most of the nefarious cases we can imagine, but that doesn’t mean we have to tolerate someone using our image against our will.

In fact, I would imagine that the data privacy advocates I often see on hn should see this as a logical extension of the privacy protections they want to see across the web.

No, Lyft employees should not be able to view our trip history willy-nilly. No, the NSA should not be able to gobble up all of our Google searches to profile us. And no, we should not have to suffer being put in porn against our will because we are a person in the public eye.

This stuff should be treated like revenge porn, IMO. Functionally it’s the same, even if the technical implementation is different.


I am on the fence. Using my face and making money off it without my permission seems wrong. However, what if the 'animated' porn star just happens to look like me? Inevitably, a generated porn star will look like SOMEONE. For me, this is getting murky.

Then there is the case where the program generates porn for me. If I privately generate porn using a famous person's image, to me that's my right. No different from fantasizing about them in my head.


Just to clarify the conundrum in your second paragraph: this is definitively privileging your privacy over someone else’s. What you do in private is your business, yes, but that doesn’t give you the right to be a voyeur simply because you are doing so in seclusion.

That’s why I think this is a poor argument.


voyeur: "a person who gains sexual pleasure from watching others when they are naked or engaged in sexual activity." is not illegal. its kinda the very definition of watching porn.

i wouldnt be violating their privacy any more than taping a picture to a pillow and humping it. are you claiming that is illegal? if not, then it feels like its the quality of the 'content' you are against, not the act.


Porn is consensual. Non-consensual voyeurism is usually a crime. (It generally turns on the expectation of privacy. If the watcher is observing from a non-publicly accessible location, they're probably guilty of voyeurism. Watching people go at it in a hotel room from the office across the street generally wouldn't be a crime. Watching a non-consensual video recording generally wouldn't be a criminal act for the observer, but would be one for the recorder, though this also depends on the content of the recording.)


Would you make that same argument toward someone fantasizing about someone in their mind? The two seem parallel to me; I would guess most people are either morally for or against both (even though, yes, one is way more extreme than the other), but I'm super curious about whether/why someone might think e.g. imagination is fine but private visualizations aren't (or vice versa).


Yep, I completely agree. We need to draw a line between what's real and what's not. I could easily see teenagers using images of their crushes in an "app" that makes their dreams come true. Let's not go there. I know you, tech community, drooling at the $$$ in this.


> No, Lyft employees should not be able to view our trip history willy-nilly. No, the NSA should not be able to gobble up all of our Google searches to profile us. And no, we should not have to suffer being put in porn against our will because we are a person in the public eye.

These are centralized entities. In theory it is very easy to implement sound internal controls around access to that type of data, given audit logs or whatever. The issue with deepfakes is that, given a reasonable stack of images, anyone can produce their own porn. If you're a woman with many, many photos on a public Instagram account, you're a likely target, given that anyone can go ahead and scrape your images. But if you're one of the millions of wives who don't have a large presence on social media, then a deepfake can easily be traced back to the spouse who has access to that private stack of photos.

The normal caveats apply: you need access to a GPU, probably access to Tor to download the source in case GitHub shuts down the repos, etc. But where there is a will, there is a way.
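
For the technically curious: the commonly described architecture behind these tools is a pair of autoencoders sharing one encoder, with one decoder per identity; the swap happens when you run person A's face through the shared encoder and decode it with person B's decoder. Below is a minimal sketch of that idea, assuming PyTorch and pre-cropped, aligned 64x64 face images. It is illustrative only, not the actual deepfakes code; all names and sizes are made up.

    # Sketch of the shared-encoder / two-decoder face-swap idea.
    # Illustrative only: assumes pre-cropped, aligned 64x64 RGB faces.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
                nn.ReLU(),
                nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
                nn.ReLU(),
                nn.Flatten(),
                nn.Linear(64 * 16 * 16, 256),  # shared latent code
            )
        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(256, 64 * 16 * 16)
            self.net = nn.Sequential(
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
                nn.ReLU(),
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
                nn.Sigmoid(),
            )
        def forward(self, z):
            return self.net(self.fc(z).view(-1, 64, 16, 16))

    encoder = Encoder()
    decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per person
    params = (list(encoder.parameters()) + list(decoder_a.parameters())
              + list(decoder_b.parameters()))
    opt = torch.optim.Adam(params, lr=1e-4)
    loss_fn = nn.L1Loss()

    faces_a = torch.rand(8, 3, 64, 64)  # stand-ins for real face batches
    faces_b = torch.rand(8, 3, 64, 64)
    for step in range(10):  # real training runs for many thousands of steps
        loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
                + loss_fn(decoder_b(encoder(faces_b)), faces_b))
        opt.zero_grad()
        loss.backward()
        opt.step()

    # The swap: encode a face of A, decode it with B's decoder.
    swapped = decoder_b(encoder(faces_a))

The shared encoder is pushed to learn identity-agnostic face structure (pose, lighting, expression) because both decoders train against it; each decoder then learns to paint its own person's face back on, which is what makes the swap work.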


> In fact, I would imagine that the data privacy advocates I often see on hn should see this as a logical extension of the privacy protections they want to see across the web.

The question is whether or not my likeness is my data, which I don't think has ever been settled. Anyone can take a photo of me in public, and the photographer owns the copyright. Are security cameras recording me violating my privacy? On the other hand, football players have to be paid to have their likenesses appear in games.

I think this might be the catalyst to resolve these issues once and for all.


Yes, the photographer owns the photo. They do not own your likeness or have the right to use it as they wish unless you signed a model contract with them.


The photographer owns the copyright in the photo, but not the right to use your likeness contained in the photo without your permission (likeness rights are also referred to as model rights).

This means that you can demand money if the photographer sells that photo to anyone, especially if the sale is for commercial use.


TMZ and other paparazzi would not exist if this were the rule. I think some states recognize a limited right to control aspects of your likeness, but not nearly as broad as this.


The tech is just a special effect, and I find it weird that so many people go straight to the most fucked-up use case, or even imagine new possibilities (deep aging? Jesus), or (elsewhere) start arguing for more laws because of it.

Meanwhile nobody fixates on all the horrible stuff that can be done with cameras or Photoshop or Poser.

You can, or soon will be able to, do previously Hollywood-standard effects at home. You could put Tom Cruise in your wedding video. You could put yourself in Mission Impossible. You could have the cast of The Magnificent Seven in the movie you created with a few mates. You could do tons of imaginative, funny, creative stuff.

Celeb porn, sticking the head of a baby on the body of an adult porn star, or any other sad/weird porn-related use - all of which has been done with images for a quarter of a century - are the least interesting things about this.

As to the reddit ban, I doubt anyone is surprised.


> What if someone combines it with some kind of "deep ageing", to create artificial child porn? Sexual representations of children, even if completely artificial and involving no victims, are illegal in many jurisdictions, but not all.

It's not even that complicated. Petite actress + child's face. I take solace in the fact that, from what I saw of the subreddit, it seems quite difficult to create something decent easily.


One day I imagine we'll practically be handing out nukes to everyone.

Actually I imagine that day won't come.


A 3D printer. The only problem is refining the fissile material, unless you have a fusion reactor lying around.


You’re seriously underestimating the precision involved in nuclear weapons. You’re not printing any of the explosive lenses, or any part of the physics package. You’re not printing neutron reflectors and absorbers either. Most of all though, assuming the ability to print some of what I’ve described emerges, it will not be in one compact, inexpensive device. Your boron composite printer won’t be your explosive lens printer, and so on.


I'm on the fence on this one. If I draw a stick figure and put your name above it, is that unethical? What if I add a speech bubble saying something risqué? What if I'm a talented artist and I draw something pretty life-like? I don't see the lines here being clear without clamping down on all expression.


None of these examples are really anywhere close to deepfake porn videos.

- Porn is done under consent, where participants should be reasonably aware that it will be published

- Porn tarnishes reputations

If you pasted Daisy Ridley's face into a crowd in, say, China, doing everyday stuff, no one would rightly care, because there's no real potential for harm unless you are doing it for some ulterior motive.


I think how convincing a piece of media may be is at the crux of this issue. Photoshopped heads of celebrities on porn actors' bodies have been around for some time now and aren't, to my knowledge, illegal. The key difference here is that people hold video to be more trustworthy. Given the advancements in CG, and now more recently with deepfakes, perhaps that's what needs to change: people should stop trusting video footage.


> Photoshopped heads of celebrities on porn actors' bodies have been around for some time now and aren't, to my knowledge, illegal

They have been, since inception, illegal, in the sense that they represent a tortious act. They might be illegal in the criminal sense, depending on the jurisdiction, even in the US, absent meaningful context bringing the fake under the protection of the First Amendment as a form of speech.


I agree, since the everyday person's idea of what is possible is entirely cultural, and that will change and advance. That said, it doesn't need to be illegal for it to be pushed out to the seedier parts of the internet and off the more public platforms where influence is easier to gain.


What if I paint a specific person being physically attacked?

What if I make a fake video of a specific person being physically attacked?

He didn't consent to be attacked and it may tarnish his reputation. Is it unethical to do any of the above?


I was too, but I now think I disagree with Reddit's decision. I do not think there's a good argument for it being in the same class of phenomena as nonconsensual porn.

Here's why:

1. r/deepfakes exists, draws novelty, and has appeal in part because it's explicitly identified as fake. It's part of the name. So not only is there no claim to it being real, it's explicitly identified as fake. It's hard to argue that there's anything nonconsensual going on when the parties involved in the deepfake porn all agree and understand that it's fake.

2. Let's say that someone posts deepfake porn as real porn. Now this is a different issue, one closer to libel. But that's where there's some misattribution of something to the potential victim. The victimization comes from the assertion that it really is the individual (as opposed to a deepfake creation, where the opposite is being asserted).

3. Let's say that, out of curiosity, you, in the privacy of your own home, on your own hardware, create deepfake porn involving your spouse (who fully consents and wants you to do so because it's arousing to them) and publicly available imagery. You do not distribute it. By the "nonconsensual porn" logic, though, you have now engaged in something akin to sexual assault, by engaging in nonconsensual photography. But this is absurd, because the public figure has suffered nothing, and nothing was obtained from them without their consent. It's your (and your spouse's) creation.

4. Let's take this a step further, and say that a year from now you create software that will create a simulation of a person solely from its knowledge of what humans look like. Let's say that you obtain something that looks like a celebrity. Have you now created nonconsensual photography?

5. Let's say you find a person who is the doppelganger of a celebrity--a dead-ringer lookalike. You film them in porn. (This has actually been done.) Is that nonconsensual porn, because the celebrity's likeness is being used without their consent? The porn actors/actresses consented, though--so why is deepfake any different, when there's nothing to consent to? Why should the consent of the celebrities whose likeness they resemble supersede the porn actors'/actresses' own consent?

The logic behind this Reddit (and Pornhub, etc.) decision is full of holes as far as I'm concerned, and it creates a very dangerous precedent concerning consent. It essentially gives people power over others' likenesses due to their popularity.


At point 3 you skip over the public humiliation that a published deepfake can cause. By creating something and keeping it to yourself, you are basically demonstrating the behaviour Reddit wants. They want you to keep that "content" offline in your basement, not on Reddit being spread and commented on.

If that fake sex gif you made of your wife gets spread around her workplace, you'd better believe that would be an extremely humiliating and emotionally traumatic experience. It is exactly the kind of experience these rules are meant to prevent.


> I'm on the fence on this one.

What's there to be on the fence about? It's terrible and an infringement of free speech. But Reddit has the right to ban it as a private company. It just makes Reddit look terrible and hypocritical.

But considering they are planning on IPOing, I guess they have to sell out.


What scares me the most isn't the porn, but the fact that this kind of tech is now more or less available to anyone with a GPU and a few hours of learning and training.

The only thing keeping this from being a threat to regular folks is that it takes a lot of reference images to build the model. But think of a politician or activist: they have plenty of images on the net, so this could take fake news to another level. Sure, if this happens, the news media and the legal system will probably not take shady videos seriously without verification (especially now, while the algorithm is still in the middle of the uncanny valley, so for the moment fakes are easy to recognize without needing experts). But what about your friend, uncle, or cousin who shares his "echo chamber" posts on Facebook?
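
To make the "lots of reference images" point concrete, here's a rough sketch of the collection step: turning a folder of publicly scraped photos into cropped training faces with stock OpenCV. The directory names and crop size are made-up placeholders, not part of any real tool.

    # Sketch: extract face crops from a folder of photos with OpenCV.
    # Directory names and sizes are illustrative assumptions.
    import os
    import cv2

    # Haar cascade that ships with the opencv-python package
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    src_dir, dst_dir = "photos", "faces"  # hypothetical paths
    os.makedirs(dst_dir, exist_ok=True)

    count = 0
    for name in os.listdir(src_dir):
        img = cv2.imread(os.path.join(src_dir, name))
        if img is None:
            continue  # skip anything that isn't a readable image
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
            face = cv2.resize(img[y:y + h, x:x + w], (64, 64))
            cv2.imwrite(os.path.join(dst_dir, "face_%d.png" % count), face)
            count += 1

    print("%d face crops extracted" % count)

A public figure trivially yields the hundreds of varied crops a model wants; a private person usually doesn't, which is the asymmetry described above.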

Or, who knows, maybe I shouldn't binge Black Mirror, and this will stay limited to porn and good uses, like a new era for stunts in movies.


> if this happens the news media ... will probably not take shady videos seriously without verification

Because mainstream media is great at fact checking all the bullshit they report on? Some do, sure, but there is a lot of shit on mainstream news.


I think 'Deepfake' is a great name for an episode of Black Mirror.


So, subreddits that do "safe for work" deepfakes are still around and allowed. This tells me that the technology will just get better and better, and its use to create NSFW deepfakes will likewise get better and better (while existing underground?) until it really is impossible even for an expert to tell that they are fakes. That is what I assume will happen, anyway.


> until it really is impossible even for an expert to tell that they are fakes

I hope that happens, and as soon as possible. Otherwise, there will be an 'uncanny valley' period of sorts, where people are able to mass-produce this video but not everyone is aware that it exists. When it really is flawless, or close enough to it, societal change will begin. This will also mean that video evidence will likely no longer hold up in court (which is good, because it was already possible to fake footage manually with video-editing tools).


Looks like voat.co/v/deepfakes has already existed for a while in anticipation.


To me it looks like Voat is just the ultra-toxic communities from Reddit joined together. I am a fan of real free speech, without banning stuff that could offend some people, yet Voat does not feel like a solution for that (same goes for Gab; it seems like people there are more interested in seeing Twitter fail than in coming together as a community and creating a viable environment).

That said, I think this will again drive some more people to Voat. What it definitely won't do is stop people from making deepfakes.


I feel similarly. I want free speech, but tempered enough not to bum me out. Reddit is too restrictive; Voat is almost as bad as 4chan.

I long for the olden days, when BBSes allowed for lots of diverse stuff without people being banned for mentioning a leaked Game of Thrones episode or a CGI boob.

Right now, I feel similarly toward Reddit as I do toward Google: moving in an anti-user direction, but with no viable alternative to quit to.


It's telling that the other replies to this were (justifiable) critiques of the Voat community. Another example of the effect of giant centralized communities (social networks) dominating online discussion: when one of them decides some content isn't acceptable, there are only fringe alternatives out in the Internet wilderness.


It's probably that most people hate the idea of making fake porn of people who don't want to be in porn, and if someone likes to pursue that idea, the communities they can find will also harbor other crazy ideas.

These ideas basically self-select fringe groups.


I'd imagine the huge variety of nude/sex mods in almost every video game ever made, or the idea of Rule 34 in general, indicates it is anything but fringe.


If only there were a site that wasn't as censorious and inconsistent in its banning as Reddit and didn't have as horrible a community of users as Voat.


Voat could create subs to give away free money, candy, and blowjobs and they’d still have more tumbleweeds than active users.


These bannings, and the process making the news, will definitely have a Streisand effect. The community might move on from Reddit, but so many people have been made aware of the possibilities that it's impossible to keep a lid on it.


So glad that they banned this crap. This is actually a fun project for getting started with deep learning, as you paste Nicolas Cage into every movie you can imagine. Unfortunately, it's been dominated by all this porn creepiness. Hopefully the discussion surrounding this will get friendlier from now on.


The decision that finally makes voat.co successful?


Ban all you want; it’s not illegal, and it’s silly to suggest as much.

Here’s a picture I drew of your mom. Look at those stink lines.

Arrest me.


It can actually be considered a form of harassment or defamation. While it might be a gray area in the US, it is likely not going to be the same in Europe.


[flagged]


Do you actually think a slave-driving genocidal America invented freedom?


It seems to have gone unnoticed, but for a week or so now GitHub has required sign-in to view the deepfakes repos (https://github.com/deepfakes). Note that these are public repositories.



