While I partially understand (but do not support) the hostility toward AI due to possible plagiarism and "low-effort generation" of works, think about the whole process: if model providers become liable for generating output that resembles lyrics or very short texts covered by copyright, they will simply change their business model.
E.g. why offer lame chat agents as a service when you can keep the value generation in-house? Have a strategy board that identifies possible use cases for your model, then spin off a company that just does agentic coding or music generation. Just cut the end users/public off from model access, and flood the market with AI-generated apps/content/works yourself (or with selected partners). Then have a lawyer check everything right before publishing.
So this court decision may make everything worse? I don't know.
The fact that they don't already do that suggests to me that the things produced by AI are not worth the investment. Especially since the output is not copyrightable, right?
If there was a lot of gold to find they wouldn't sell the shovels.
There is a lot of value in specialization. It allows capitalism to do its magic, elevating the best uses of your technology without you taking on any of the risk. Trying to in-house everything often smothers innovation and leads to poor resource allocation. It can be done, but in fields with a lot of ongoing innovation it's extremely hard to get right.
There is a reason that Cisco doesn't offer websites, and you are probably actively ignoring whatever websites your ISP has. ASML isn't making chips, and TSMC isn't making chip designs.
But think of the Apple approach. And while all cloud providers started with mainstream hardware, they evolved toward proprietary systems. The current AI phase may just be the "good old days," with access limited only by financial power, poised to be cut off once the dust settles and some model vendors lose.
If there is such immense value in spinning off and selling models separately, you can bet that will happen - without a court saying so. In the end, running these models is a costly job and you'd want to squeeze out every bit of value.
A media generation company that is forced to publish uncopyrightable works, because it cannot offer public access to its media generators without violating copyright - that does sound like a big win for everyone but that company.
Because that's the only business model that the management of these model-provider companies suspects has a chance of generating income, given the current state of the technology.
> While I partially understand (but not support) the hate against AI due to possible plagiarism
There's no *possible* plagiarism; every piece of AI slop IS the result of plagiarism.
> E.g. have a strategy board that identifies possible use cases for your model, then spin off a company that just does agentic coding, music generation.
Having lame chat agents as a service does not preclude them from doing this. The fact that they are only selling the shovels should be somewhat telling.