It's pretty funny that most people who hate AI art and want it banned over copyright violations don't understand that the only difference a ban would make is that instead of open models, you'd have only closed models owned by the likes of Adobe, who will license all the imagery from Getty, Disney, etc.
AI art isn't going away, and it had better be open rather than under the full control of just two or three companies.
It's baffling to me that they don't see this. Attacking Stable Diffusion will only harm open and free AI tooling; the megacorps will always just "unethically" train.
Also find it laughable that a lot of the most vocal artists against it make their money drawing characters they don't own anyway.
I'm approaching this as a designer, so obviously these tools will impact me, but as long as I can run them on my own hardware without paying a tax to Adobe/OpenAI, I can at least benefit.
The AI process isn't going away, but suits like these will preserve the opportunity for artists and coders alike to get paid for their work or choose the terms under which it is shared.
If copyright rules aren't enforced against AI, it may mean the end of copyright entirely.
I share a lot of images under CC BY-NC-SA and share code under the GPL. If someone can train a model on my work and disregard the terms under which I shared it, copyright is my only enforcement mechanism.
What do you mean by "enforcing copyright rules on AI"? Regardless of the tool used, if someone's code heavily copies from your code (against your license), then it's copyright infringement; the fact that the tool is an AI doesn't change that. But I guess you want to expand copyright law to consider more than just the final product (code, in this case), since that product would be original in the vast majority of cases.
The cat is already out of the bag on that one. You can download several open-source text-to-image AIs and run them on your own computer. You can train new styles and concepts on your own computer in about 20 minutes. You can throw up roadblocks for the big players, but it's far too spread out to ever be contained again.
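To make that concrete, here's a minimal sketch of running a text-to-image model locally, assuming the Hugging Face diffusers library, a CUDA GPU, and the public runwayml/stable-diffusion-v1-5 checkpoint (any checkpoint you've downloaded works the same way):

```python
# Minimal local text-to-image sketch using Hugging Face diffusers.
# Assumes a CUDA GPU; the checkpoint name is the public SD 1.5 release,
# but any locally downloaded checkpoint can be swapped in.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```

The "train new styles in 20 minutes" part typically refers to lightweight fine-tuning methods such as textual inversion or LoRA, which run on the same consumer hardware.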
Ditto for LLMs. Obviously GPT-4 is the best game in town, but I have several LLMs, and the code required to use them, installed on several machines. Fine-tuning them on new data can close the gap quite well, at the expense of making them less general.
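The local LLM setup can be just as small; a sketch assuming Hugging Face transformers, with the tiny public gpt2 checkpoint standing in for whatever open model you actually keep on disk:

```python
# Hedged sketch of local LLM inference via Hugging Face transformers.
# gpt2 is used only because it is small and public; substitute any
# open checkpoint you have downloaded.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
out = generator("Open models matter because", max_new_tokens=40)
print(out[0]["generated_text"])
```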
We can’t regulate our way out of this one. That window closed a while ago.
If I rob a bank then you can tell. If I make an AI model on my own computer and train it, how would you ever know? It won’t have a “signature” that can be detected by any kind of AI detection system because I’m the only person who knows it exists.
You can make it illegal if you like, but you’ll never be able to enforce it. And other countries are not obligated to follow suit, and will have significant competitive advantages as a result.
There are plenty of things you could do 'on the sly' that are illegal. That doesn't mean they're not illegal, and it doesn't mean that if you are found out (which is hard, but not impossible) you won't be punished. Given that there are plenty of legal ways of making money, why would you pick an illegal one just because you (think you) can?
The boundaries of the law are rarely set at what you think you can get away with; they are usually set on some other principle, and between that line and the point where you operate there is plenty of room for prosecution. The jails contain lots of people who thought they 'could get away with it'. Don't be one of those.
There's plenty of public domain material out there to train on. The problem is just that having to filter out copyrighted material from the training dataset would be prohibitively expensive.
These AIs do not need to learn from your work in order to function, they just need to learn from some work. All that extending copyright in this manner would do is create a bunch of unnecessary busywork and slow the progress of technological development. You're not going to get paid $5 for letting a proprietary AI train on your drawing, blog post, or GitHub repo when they could get the same benefit from paying someone else $0.05 to find a public domain image, blog post, or repo elsewhere.
Basically, the amount of value your individual work contributes to generative AI is minuscule, but the amount of effort the entire industry would need to expend in order to compensate you for that value is massive. Unless stifling progress is your explicit goal, it's not worth it.
The difference is that the open systems will be for personal use, and the closed models will be used by companies for profit. And the reason Adobe and others would be used in for-profit settings is that they pay the artists. Which is what's intended by the lawsuits, isn't it?
I suppose you could argue the beneficiaries are the shareholders of Disney et al. rather than their artists. At the same time, maybe this ends with open-source systems that artists can host themselves.
Wouldn't that then produce exactly what the detractors want: either a situation where content creators are compensated for providing input to the models, or one where the models aren't economically feasible at all?
> but having your IP ripped out of your own blog feels like being raped, by open-source software or not
That's most likely an irrational feeling. "Adobe AI" will most likely work just as well without including your IP in their dataset, and be able to output something in your style with the right prompt (and other knobs we'll see in the future) without ever having seen your IP.
Potentially you could even run some kind of CLIP on your image and then feed that into "Adobe AI". And I really see no reason to think something like that would ever be considered copyright infringement.
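As a rough sketch of what "running some kind of CLIP on your image" could look like, here's how one might extract an image embedding with the public openai/clip-vit-base-patch32 checkpoint via Hugging Face transformers; the step of feeding that vector into an "Adobe AI" is hypothetical:

```python
# Embed an image with CLIP: a compact vector a generator could be
# conditioned on instead of the original pixels. The checkpoint is the
# real public CLIP release; the downstream "Adobe AI" step is hypothetical.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("my_artwork.png")
inputs = processor(images=image, return_tensors="pt")
embedding = model.get_image_features(**inputs)  # shape: (1, 512)
# `embedding` summarizes style/content; the source pixels never need
# to be stored in, or reproduced by, the downstream model.
```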
People have the idea that their IP is very meaningfully influencing these AIs, while in reality their art styles aren't that unique.
First of all, it is pretty arrogant to dismiss someone else's feelings right out of the box. I am an artist, and if someone came up with the same art style I produce in Blender, at industrial-scale speed, from a text prompt that took 15 seconds to type, and MY data was in the model, it would feel like a rape.
And I don't accept the "I am too insignificant to make an impactful change in society" line... if everyone thought like that, life would probably be worse, no?
Do you also hate Human Artists who trained by viewing copies of your artwork and incorporating elements of your style into theirs? Do you sue all Human Artists that create art vaguely similar to yours?
So if a Human Artist did this on an "industrial scale", you would sue them for having a similar style? How do you think that lawsuit would turn out for you? Are you actually mad at the moral implications of Capitalism rather than the legality of AI art?
> So if a Human Artist did this on an "industrial scale", you would sue them for having a similar style? How do you think that lawsuit would turn out for you?
when it becomes a realistic possibility that a single human can ingest every work ever created, and then output a derived work in a second to thousands of people at a time: I'll let you know
So, reading your comment, I can infer that the main problem is not that the AI was trained on copyrighted data, but that it outperforms a human artist.
But the question wasn't stupid; it underlines the actual problem with AI: the scale at which it operates, not the fact that a machine can learn similarly to a human. I found the question pretty clever.
I'm trying to point out the flaw in your argument, not the realistic possibility of a human doing this. You have unrealistic expectations of AI art, and such a lawsuit against a human would obviously not go in your favor. Five seconds of googling will tell you that style is not protected by copyright, so your entire premise is flawed.
There's a parallel in policing: driving around and looking up plate numbers is perfectly fine in the opinion of some, but automating the same task by fixing cameras to police cars and using computer vision to do the task makes it qualitatively different. Scale can have an impact on the way tasks are viewed.
Computers aren’t humans. Software isn’t a person. If you’re going to argue as if computers are now people and deserve having laws apply to them, think about the implications of you owning a human.
> Can they even license those works to train a model?
Adobe is already creating such AI with these kinds of partners, so yes
> And if they can, then everyone can. I can buy the rights to a single photograph of Darth Vader and create an entire TV series about his life using AI.
No, because a single image of Darth Vader alone won't work. You need a pretrained base model, trained on millions of images. Only then can you feed that base model a single image and get something decent.
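As a hedged illustration of "feeding a pretrained base model a single image", here is what img2img looks like with Hugging Face diffusers; the base checkpoint carries the millions-of-images prior, and the single input image only steers it (file names and the prompt are made up):

```python
# img2img sketch: the pretrained base model supplies the prior learned
# from millions of images; the single reference image only steers it.
# File names and the prompt are illustrative.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

init_image = Image.open("reference.png").convert("RGB").resize((512, 512))
result = pipe(
    prompt="the same character walking through a snowy forest",
    image=init_image,
    strength=0.6,  # how far the model may wander from the reference
).images[0]
result.save("variation.png")
```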
> Adobe is already creating such AI with these kinds of partners, so yes
So, nobody has contested them yet. I smell a class-action lawsuit headed their way from the original makers of their clip-art library.
The base model can be trained on openly licensed images. Or people may use Adobe's software for that. From just a few images of Darth you can make basically anything, so yes, it's quite doable already.
It seems to me there is no way to put this cat back in the bag ever again.
In the case of Darth Vader, there is probably a trademark involved as well, but the vast majority of photos, celebs, etc. don't have that.
> It seems to me there is no way to put this cat back in the bag ever again
Patents. Copyrights. Franchising?
Artists are now experiencing the joy of getting beaten in the market by someone else selling the product they created. That wasn't allowed to stand in other industries.
Copyright is a necessary evil. Disney may well be the savior of artists in the age of AI. There is no incentive to produce anything original if someone else can just gank it and drown you in volume.