It's "ban Photoshop" all over again.

It's just the new reality. Porn is not even the worst thing. If anything, it can make real photos/videos harder to go viral.

But augmentation will be (if it isn't already) used to make very tiny adjustments: make one candidate's or celebrity's face slightly more symmetrical and the other's less, change their gait just a little bit. These changes are imperceptible, not interesting enough to cause a scandal, and they can have a huge impact on how people perceive other people. I mean, I am not suggesting that it isn't a puppet show without those, but the game is evolving.



> ban Photoshop all over again. It's just the new reality.

We didn't solve teens doctoring inappropriate images of their classmates by banning Photoshop or by accepting it as the new normal. We solved it by passing laws that criminalise the behaviour and enforcing them to deter it.

Technology influences society. But plenty of social mores are almost universally respected despite being difficult to enforce. Being possible and difficult to enforce doesn't make something permissible.


I was a teen post-Photoshop and it was just... never an issue. It never occurred to me. No one ever mentioned it to me as a possibility. If someone had, it would have struck me as weird and as way too much work. Photography class in school (in which we learned to use Photoshop) didn't mention any acceptable-use guidelines or anything. As far as I know, no one in my school ever did anything distasteful with Photoshop.

I don't accept the idea that "passing laws and enforcing them to deter behavior" was the cause of the lack of issues, because to be a deterrent, us teens would have had to be aware of the laws.


> was a teen post-Photoshop and it was just... never an issue. It never occurred to me

Same. And same.

Drawing inappropriate pictures of your classmates was creepy before. Making Photoshop or AI porn remains so now. Most people won't do it. But there is active social reinforcement that keeps it from becoming "the new reality."

> don't accept the idea that "passing laws and enforcing them to deter behavior" was the cause of the lack of issue

Fair enough.

The jargon was cyberbullying, and there was absolutely legislative activity around horrific examples. But the principal mechanism of action probably wasn't a direct fear of getting caught and punished, but the prompted discussion reinforcing and extending the norm.


Did we specifically criminalize doctoring a photo?

I thought we had general laws around sexual harassment and revenge porn which criminalize at the point of harm.

I bring this up because most of the enforcement difficulty seems to stem from a desire to control behavior which would otherwise be undetectable.

I believe we already have means of enforcement for students sharing pornography of their classmates.


Harassment is a lot narrower than people tend to think. It restricts people from making communications that the subject does not want to be a recipient of. It does not restrict people communicating between each other about the subject. Abby following Jane around calling her fat is harassment if it is severe and pervasive. Abby and Betty texting each other, and talking to each other about how Jane is fat, is not.

At least as far as criminal harassment goes. Schools can adopt more restrictive policies but the consequences would only be suspension, expulsion, etc. not criminal charges.


> Abby and Betty texting each other, and talking to each other about how Jane is fat is not

What if they do so publicly?


The takeaway is that harassment laws restrict what people are saying to you, not what they're saying about you among each other.

If the police actually tried to prosecute them for harassment, the question the courts would be asking is, "could Jane reasonably avoid having to hear it?" Harassment is about being able to not be the recipient of communications you don't want to receive. Calling Jane fat on message boards, twitter, in a bar, etc. would all be fair game.

There are more restrictions if Jane is a captive audience. The courts have ruled that certain settings where people can't easily just leave - namely work and school - have greater restrictions. Again, the question the court would be asking is "can Jane reasonably avoid this speech", and it's a much bigger barrier to leave one's workplace than to walk out of a bar.


Laws can help, but I'm most concerned with states leveraging this to influence foreign campaigns.

There will be a time when a Trump "grab them by the you know what" style scandal will just be met with scepticism.

Where do you go when even video evidence can't be seen as the truth?


The devices need to cryptographically sign the original video with keys stored in some kind of Secure Enclave.

If the video is edited or comes from a source that doesn't have trusted keys, then it's no better than some random tweet.
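
For illustration only, here's a minimal sketch of that sign-at-capture / verify-on-view flow, using Python's cryptography package and an Ed25519 keypair as a stand-in for the enclave-held device key. Everything here (the function names, key handling, and trust decision) is an assumption made for the sketch, not a description of any real device or provenance standard.

    # Minimal sketch: the private key stands in for one held in secure hardware.
    # Real provenance schemes (e.g. C2PA) also sign metadata, edit history, and
    # a certificate chain, none of which is modeled here.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )

    def sign_capture(device_key: Ed25519PrivateKey, video_bytes: bytes) -> bytes:
        """Run on-device at capture time: sign the raw recording."""
        return device_key.sign(video_bytes)

    def verify_capture(device_pub: Ed25519PublicKey,
                       video_bytes: bytes, signature: bytes) -> bool:
        """Run by the viewer: check the bytes against a key they already trust."""
        try:
            device_pub.verify(signature, video_bytes)
            return True
        except InvalidSignature:
            # Edited bytes or an untrusted key: treat it like any random tweet.
            return False

    if __name__ == "__main__":
        key = Ed25519PrivateKey.generate()          # stand-in for the enclave key
        original = b"...raw video bytes..."
        sig = sign_capture(key, original)
        print(verify_capture(key.public_key(), original, sig))         # True
        print(verify_capture(key.public_key(), original + b"x", sig))  # False

Note that this only proves some holder of a trusted key signed those exact bytes, which is where the objections below come in.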


This is just a certificate of authenticity that one either trusts or not, based on whatever one knows about secure enclaves, cryptography, or the device/person issuing the key/certificate.

I think the reality is that photographs and video are now like text: you trust the source or you don't.

I can write about Cleopatra's ability to snowboard and play Xbox, or I can make a photo with the caption "Cleopatra doing a 1080", and now I can make a video of it (probably).

I can also give you a certificate/key/blockchain whatever to prove it's "valid".


And then someone takes the camera and points it at a screen showing an image of their choice. The analog hole cuts both ways.


Photoshop was never built to be a tool so easy and so powerful that even grandma could produce photo-realistic fake images in seconds. The new wave of AI-powered editing tools are clearly on a new level.

This is like bringing your howitzer to a gun debate because they are kinda similar.


Generative AI is much easier than Photoshop in the same way that Photoshop was much easier than altering photos any other way. In both cases it is a completely new level.


> new wave of AI-powered editing tools are clearly on a new level

For the time being, the easiest-to-use tools are centrally run. That makes monitoring and enforcement easier.


A centrally-managed app will always be easier to use than a local self-service tool. But those local apps will keep trying to converge towards the ease of the hosted apps. Over time I expect the difference will become negligible (i.e. a 1-click install that actually works).


Photoshopping something convincing has a much higher barrier to entry than (as far as I understand it) generating deepfakes does now. Lots of time and practice invested even for one convincing image.

From what I can see, the issue is that this problem can become much, much more widespread with AI, because the amount of work required is close to none and gets lower as models get better. Someone using deepfakes to, e.g., extort people can cast a much wider net by dumping masses of scraped images into a generator to create convincing pictures of many different people, instead of having to focus their time and effort on a single target and create fakes manually.


Probably half the shorts on Facebook are pretending not to be AI-generated now, and so are a similar share of the photos advertising groups in your feed.


The above sounds like the opinion of someone who doesn’t have a daughter.


Even trying to be helpful here you're still just centering men's feelings about "their" women.


Do you seriously think most women are on board with this?

Many do not know it’s happening. Those that do and aren’t chronically online find it horrifying and even traumatizing.


No, believe me, I know. I have heard women talking about this coming for years. It was already a very common harassment technique before AI made it free, easy, and accessible.

You can tell how few women are involved at the key stages of building new tech because of how rarely obvious harassment vectors are accounted for during development.

The problem I have is with trying to get men to care about it because it affects their daughter or whatever.


>You can tell how few women are involved at the key stages of building new tech because of how rarely obvious harassment vectors are accounted for during development

Interesting point to think about; it certainly would have helped. But I also genuinely don't see how someone developing this kind of technology, who has had even a brief glimpse of online culture at any point in the last 20 years, didn't immediately see what it would be used for. It almost feels like "progress" with malicious intent. Or an extreme case of apathy towards societal consequences.

>The problem I have is with trying to get men to care about it because it affects their daughter or whatever.

A lot of people are too self-centered to care about any issue unless it directly impacts them. If it gets people thinking about how it affects their loved ones, I don't think that's necessarily a bad thing.


> If it gets people thinking about how it affects their loved ones I don't think that's necessarily a bad thing.

It's just shitty both to men and women. Men don't care about an issue unless it affects their daughters? They don't have other women they care about? Mothers, friends, mentors? Or for that matter sons who are also vulnerable to abuse and manipulation? The men I know are better than this.

And it positions women as just a motivation for the actions of men, to be protected rather than to be actors in their own right. If you find this tactic sound then feel free to use it, but I think there are better ways.


I get where you are coming from. The original poster could have phrased it better.

But it's an issue that has, both then and now, predominantly affected women. Sadly, I reckon many men will not care about it, or even be aware of the issue, until it impacts a woman they know.


[flagged]


I do have a TBI but I don't see what that has to do with anything.


A. Idea, not person.

B. To restate, heaven forbid a man take action based on what people he loves would want. All motives must be purely self-satisfying, or purely driven by the affected.


I think most women aren't on board with people drawing erotic pictures of them with pen and paper either. I think most women aren't on board with their peers fantasizing about them - even if it's completely limited to their imagination.



