
> make some teacher’s life hell by faking them having sex with a student;

The harm comes from falsely accusing someone of rape, not the deepfake itself. This can be done with or without a deepfake.

As a counterpoint: if a group of boys find their teacher is really hot and they use Stable Diffusion to inpaint away her clothes to create a fictional naked photo, where's the harm? They could also just use their imaginations to fantasize about what she'd look like naked. I don't really see that as much different - they're essentially delegating the imagination to the computer.

The typical response is that if the teacher discovers these photos, that discovery could be disturbing for her. I do agree. But at the same time, if these boys walk up to her and tell her "I just jacked off to fantasies of you last night", that's probably disturbing too. Like the first scenario, the deepfake is tangential to the actual harm: the harm stems from telling another person that you've fantasized about them sexually - not the exact nature of whether that fantasy was produced by a human brain or a silicon one.



Women get fired from jobs for naked photos of them leaking - photos obtained through illicit means, photos taken consensually or even selfies not shared with anyone in particular.

As long as it looks convincing, when a deepfake like this starts circulating, the subject of the deepfake will face consequences far in excess of what you are suggesting. Hell, in some parts of the country even obvious deepfakes may get a woman in trouble. Any woman, not just a teacher.


Again, the harm comes from dishonesty, not the deepfake itself. Calling someone's employer and falsely accusing them of stealing from their past companies or abusing their co-workers achieves the same purpose. If a company fires an employee because of fictional content, the blame lies with the company.


> If a company fires an employee because of fictional content, the blame lies with the company.

You explain to me how to fix that. There's a whole culture around treating women (and their bodies) as a corruptive moral threat, which has proven completely intractable. God forbid a real photo leak somehow - the woman still gets fired. That shouldn't happen, but our culture seems to insist on it.

> Again, the harm comes from dishonesty not the deepfake itself.

This is a weird argument. "The deepfake is a thing; it has no agency and thus is not capable of any harm of its own accord" seems to be akin to the argument of "Guns don't kill people; people kill people". It similarly misses the trees for the forest.


> The harm comes from falsely accusing someone of rape, not the deepfake itself. This can be done with or without a deepfake.

The person I was responding to said they were not “a concern at all”, but clearly that can’t be true if they make harassment and false accusations easier and more damaging. Similarly, we shouldn’t dismiss the impact of making those attacks stronger - it’s like saying that giving a kid a gun is no big deal because they used to have a slingshot.

You missed the reason why I mentioned that scenario. It’s not just that the targeted teacher gets harassed — and, let’s be honest, that’d be a lot of aggression kept just below the threshold of serious disciplinary measures — but that something like that would, if found, trigger a mandatory law enforcement response. School employees tend to be mandatory reporters, so they don’t have the option of saying “that’s a fake, he’s probably lying”.

The real harm is that unless it’s a very obvious fake, the victim is going to have to go through a police investigation into their personal life. Want to bet that’d especially be used to target people who have reason not to trust the police because they’re gay, brown, not an evangelical Christian, etc.?


I'm not an evangelical Christian, and I trust the police. I think you have a very warped understanding of how the police would respond to such a video. Their first step would be to interview the student in the video. And the student would confirm that it's a fake. Unless the student was deliberately trying to frame the teacher. Which, again, could be accomplished by lying without a video.


I respectfully suggest that the world has fewer perfectly spherical cows than your dismissal depends on.


Your concern stems from spherical cows: you're treating law enforcement as automatons that can't think. In your scenario, the police will either interview the student depicted in the video, who will confirm that it's fake, or they'll fail to identify a victim at all and have no basis on which to proceed.



