
> the watermark signal is detected by inverting the diffusion process

Which, by definition, requires you to know in advance the exact process and parameters that were used? That seems untenable.
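Concretely, the detection step would look something like this toy sketch (not any particular vendor's scheme; the function names and the 0.1 threshold are made up). Both halves of the test, inverting the sampler to recover an approximate initial latent and regenerating the keyed pattern, require the exact model, schedule, and secret key:

    import numpy as np

    def keyed_pattern(key: int, shape: tuple) -> np.ndarray:
        # The "watermark" is just a pseudorandom initial latent derived from a secret key.
        return np.random.default_rng(key).standard_normal(shape)

    def looks_watermarked(recovered_latent: np.ndarray, key: int, threshold: float = 0.1) -> bool:
        # recovered_latent: approximate starting noise recovered by running the
        # *same* diffusion model backwards (e.g. DDIM inversion) on the image.
        expected = keyed_pattern(key, recovered_latent.shape)
        corr = np.corrcoef(recovered_latent.ravel(), expected.ravel())[0, 1]
        return corr > threshold  # high correlation -> very likely our keyed noise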




That seems to be the point. Sure, anyone can inject their own noise into the initial latent, but the people best placed to do that are the ones hosting the model. A host could watermark its service and identify images it had produced down the line, in a way the end user can't remove and can't perceive.
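A hand-wavy sketch of the host's side, assuming the watermark is carried entirely by the choice of starting noise (the function names and the per-image record id are illustrative): seed the initial latent from a secret key instead of a fresh random draw, then run the ordinary sampling loop. What goes into the sampler is still Gaussian noise, so there's nothing for the end user to see or strip short of re-diffusing the image.

    import hashlib
    import numpy as np

    def watermarked_init_latent(secret_key: bytes, record_id: str, shape=(4, 64, 64)) -> np.ndarray:
        # Derive a reproducible seed from the host's secret and the image's DB record,
        # so the exact same latent can be regenerated later during detection.
        digest = hashlib.sha256(secret_key + record_id.encode()).digest()
        seed = int.from_bytes(digest[:8], "big")
        return np.random.default_rng(seed).standard_normal(shape)

    # latent = watermarked_init_latent(b"host-secret", "img-000123")
    # ...then hand `latent` to the sampler as the starting noise instead of a fresh draw.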


"I need to protect the copyright of my AI generated image" is some big lol thought process. If that's all this is then it's not nearly as useful as claimed.


Yeah, it seems like exactly the wrong thing to be watermarking. I'd much prefer to be able to verify that something was human-generated, but of course that ship has sailed.


It's to track down abuse.

People using products for fraud, defamation, illegal categories of porn, etc.


I would be very surprised if a company running a generative AI app wanted to be able to prove someone made illegal porn with it. I'd have thought they'd want people to believe that wasn't possible on their platform.


In a world where other startups can attribute outputs to particular tools, it'd be better to be able to produce the database record, payment details, and IP address of whoever created the bad content.

You can internally investigate and see what users are doing to escape the guardrails. There's probably lots of legal but definitely juvenile and borderline offensive content that could also be studied.

There are even non-abuse analytical reasons for wanting this. If you sample the social media deluge for your watermarks, you could see how far your tool spreads and in which user clusters.


Couldn’t you just hash your image and share the hash to prove ownership?
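e.g. a trivial sketch (path and function name are just illustrative):

    import hashlib

    def image_fingerprint(path: str) -> str:
        # SHA-256 of the exact file bytes: proves you held this file, but the
        # digest changes completely if the image is re-encoded, resized, or cropped.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()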


Didn't say anything about copyright; in the US, at least, it's already been ruled that the raw output of one of these models isn't copyrightable. Being able to trace something back to a service can be useful anyway. Imagine stuffing metadata in there, such as the prompt or the identity of the user who generated it: it'd be a powerful tool to combat everything from generated CSAM to political disinformation. If nothing else, it's a nice ghost story to tell the AI kiddies around the campfire.
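A rough sketch of the metadata idea (the 64-bit payload, field names, and server-side lookup are all assumptions, not anything a real service documents): rather than embedding the prompt or user identity literally, embed a short opaque token that indexes a record you keep yourself, since whatever multi-bit payload the watermark carries is readable by anyone who can extract it.

    import hashlib, json, time

    def payload_bits(user_id: str, prompt: str, nbits: int = 64) -> str:
        record = {"user": user_id, "prompt": prompt, "ts": int(time.time())}
        token = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
        # Store `record` keyed by `token` server-side; only the truncated bits
        # go into the image, via whatever multi-bit watermark channel exists.
        return bin(int(token[:nbits // 4], 16))[2:].zfill(nbits)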



