I saw a discussion a few weeks back (not here) where someone was arguing that SD-created images should be legal, as no children would be harmed in their creation, and that it might prevent children from being harmed if permitted.
The strongest counter-argument used was that the existence of such safe images would give cover to those who continue to abuse children to make non-fake images.
Things kind of went to shit when I pointed out that you could include an "audit trail" in the EXIF data for the images, including seed numbers and other parameters and even a description of the model and training data itself, so that it would be provable that the image was fake. Software could even be written to automatically test each image, so that investigators could see immediately that it was provably fake.
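To make the idea concrete, here is a minimal sketch of what embedding and reading such an audit trail could look like, assuming Pillow and PNG output (a PNG text chunk rather than literal EXIF, since SD tooling typically emits PNGs). The field names and the "audit_trail" key are purely illustrative, not an existing standard; a real verifier would feed the stored parameters back into the same model and compare the regenerated image against the file.

```python
import json
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def embed_audit_trail(src_path: str, dst_path: str, params: dict) -> None:
    """Copy the image, writing the generation parameters into a PNG
    text chunk so a verifier can later re-run the same generation."""
    img = Image.open(src_path)
    meta = PngInfo()
    meta.add_text("audit_trail", json.dumps(params))
    img.save(dst_path, pnginfo=meta)

def read_audit_trail(path: str) -> dict | None:
    """Recover the embedded parameters, or None if the trail is missing."""
    img = Image.open(path)
    raw = getattr(img, "text", {}).get("audit_trail")
    return json.loads(raw) if raw else None

# Hypothetical audit-trail fields: everything a verifier would need to
# deterministically reproduce the image with the same model checkpoint.
params = {
    "seed": 123456789,
    "steps": 30,
    "sampler": "ddim",
    "cfg_scale": 7.5,
    "prompt": "...",                 # the exact prompt used
    "model_sha256": "<checkpoint hash>",
}
embed_audit_trail("out.png", "out_with_trail.png", params)
print(read_audit_trail("out_with_trail.png"))
```

The automated check the comment describes would then be: read the trail, re-run the generation with those exact parameters on the referenced checkpoint, and flag the image as provably synthetic if the outputs match.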
I further pointed out that, on a purely legal basis, society could choose to permit only fake images with this audit trail intact, and that the penalties for a lost or missing audit trail could be identical to those for possessing non-fake images.
Unless there is some additional bizarre psychology going on, SD might have the potential to destroy demand for non-fake images, and protect children from harm. There is some evidence that the widespread availability of non-CSAM pornography has led to a reduction in the occurrence of rape since the 1970s.
Society might soon be in a position where it has to decide whether it is more important to protect children or to punish something it finds very icky, when just a few years ago these two goals overlapped nearly perfectly.
> I saw a discussion a few weeks back (not here) where someone was arguing that SD-created images should be legal, as no children would be harmed in their creation, and that it might prevent children from being harmed if permitted.
It's a bit similar to the synthetic rhino horn strategy intended to curb rhino poaching [0]. Why risk going to prison or getting shot by a ranger for a $30 horn? Similarly, why risk prison (and hurt children) to produce or consume CSAM when there is a legal alternative that doesn't harm anyone?
In my view, this approach holds significant merit, but unfortunately I doubt many politicians would be willing to champion it. They would likely fear having their motives questioned or being unjustly labeled as "pro-pedophile".