Hacker News

It's sad that people seem to be consistently incapable of understanding this nuance; same with the "but humans also learn by studying existing work!" argument that always comes up when discussing the ethics of training and datasets.



The nuance vs. Photoshop is completely unclear.

Back in 2005 or so, kuro5hin (a now-gone discussion site) closed signups because somebody photoshopped the founder's wife's head onto some porn. That was 18 years ago.

True, doing it with Photoshop took a bit of skill, but it is a skill a lot of people have, for whom it would be doable in minutes. For a newbie, figuring out Stable Diffusion is probably more work than figuring out how to do it in Photoshop.

And IMO the training argument is, long term, a pointless waste of time.


> The nuance vs. Photoshop is completely unclear.

It's perfectly clear if you realize similar is not the same.

> Back in 2005 or so, kuro5hin (a now-gone discussion site) closed signups because somebody photoshopped the founder's wife's head onto some porn. That was 18 years ago.

So what? Everyone here understands that's possible. If you think that example somehow addresses the concern, you missed the point.

> True, doing it with Photoshop took a bit of skill, but it is a skill a lot of people have, for whom it would be doable in minutes.

Also, IIRC, that particular Photoshop of Rusty's wife was terrible, as in obvious.

The skill "a lot of people have" is making bad photoshops. Generative AI can near-effortlessly make high-resolution ones that most people would mistake for a real photo.

> For a newbie, figuring out Stable Diffusion is probably more work than figuring out how to do it in Photoshop.

Again, what quality can a newbie achieve with Photoshop after a couple days' effort? And how long will Stable Diffusion be hard to set up? You do realize someone's going to come up with an easy-to-run "revengeporn.exe" sooner rather than later?


> Again, what quality can a newbie achieve with Photoshop after a couple days' effort?

The answer: better quality than if they were trying to figure out Stable Diffusion.


> The answer: better quality than if they were trying to figure out Stable Diffusion.

Prove it. Give some newbie Photoshop and a week of time, and show me how well they can Photoshop the face of a particular person (say Tom Vilsack, Secretary of Agriculture) onto some porn.


I've read many of your replies, and while I /think/ I understand your point, I am not sure what you're proposing.

Is this a reasonable summary?

> AI = easy, so it should be regulated. It has passed the "threshold of simplicity" (and realism) where new legislation should be enacted.

> Photoshop = harder, so it should not be legislated.

If so, what happens when Photoshop releases a "copy/paste a face" feature (a desired general photo editing capability) that uses GenAI to merge background, skin tone, lighting, etc.? That could easily be a beginner-level feature (ctrl-c/ctrl-v with auto-segmentation) and be used to create porn.

Are you proposing that the feature be regulated because it's become too easy? That artificial barriers of difficulty be implemented?

Again, what are you proposing be the outcome?


> Prove it.

Literally all some kid has to do is use the Photoshop magic wand feature to copy someone's face and paste it onto another image.

High schoolers don't care about the difference. They'll harass the target just as much if those kinds of pictures show up around school.

And that would only take a couple minutes to produce.

High schoolers don't need a $3,000 gaming computer, or the skills to figure out how a GitHub repo and its install process work, to harass their fellow students.

You are entirely confused about what the problem is here. Slight differences in technology are not the cause of the problem of sexual harassment.


> Literally all some kid has to do is use the Photoshop magic wand feature to copy someone's face and paste it onto another image.

Do it, show me the results.

> High schoolers don't care about the difference. They'll harass the target just as much if those kinds of pictures show up around school.

You're moving the goalposts. I don't care if high schoolers will run with obvious, low-quality crap. In fact, throughout this whole conversation, I didn't have high schoolers harassing each other in mind at all. I was thinking about the more general problem, which includes things like harassing exes and potential employers coming across pictures during a job search.


> I don't care if high schoolers will run with obvious, low-quality crap.

> which includes things like harassing exes and potential employers

You are completely missing the point. Also, your repeated requests that I engage in sexual harassment are weird.

Sexually harassing messages sent to your friends and family can be just as damaging regardless of whatever small quality issues you think exist.

You are confused about what the issue is. People are still significantly harmed by this harassment, even if there are quality issues. The "accuracy" isn't the determining factor here; it is the sexually harassing messages and images being spammed to the people surrounding the victim.


> It's sad that people seem to be consistently incapable of understanding this nuance...

It's a problem software engineers (or some superset that contains them) seem particularly prone to.


Something something binary thinking


It's not a matter of understanding; people just disagree.



