> Because some people just don't care where their code ends up.
Yes, take me for example.
> Many people release code to the "public domain" (or under very liberal licenses).
In my case, the MIT license, because I saw it was popular, and I was afraid that in some places, "public domain" might cause unexpected legal issues for whoever wants to "play by the book" and use my code.
> if LLM chews on it and regurgitates it out
As work coming from a machine does not have copyright protection, whoever gets an LLM to spit my code back out can then claim it as their own, under whatever terms they like.
If this person wants to contribute to a free software project and release the code under the GPL v2 or v3, good: it may help create a new feature that users will enjoy!
If this person wants to contribute to their company's private software that's only available on a subscription basis (and let's say the subscription is sold at an eye-watering price), good: it means whoever pays for this subscription will get more for their money, and whoever uses the software may get a new feature they will enjoy!
Software has nearly zero marginal cost. An LLM is the closest thing we have to a Star-Trek level "replicator", getting everyone everything they want.
On which moral grounds would you object to a Star-Trek level replicator for physical goods? (please make them good, as offering any food anyone may want would fix world hunger once and for all)
Then why object to that for virtual goods?
Maybe I'm reading too much into your reply, but I don't see it as trolling or bad faith.
I see variants of it in many places, and they all look to me very close to Luddism: rejecting a new technology because you fear for your own work, while ignoring what that technology will enable in the bigger picture. In the original case of Luddism, that meant reducing the price of clothing for everyone by increasing production and decreasing labor, freeing workers to move into other fields where they could try to satisfy other human wants - some that would be inconceivable to the original Luddites, like video games.
We should feel grateful we get more technology, as it removes constraints and makes more people happy.
I don’t think fearing for one’s job is necessarily a bad reason, because as much as I love the idea of a Star Trek utopia, real and present people have real responsibilities, like children who are cared for with money generated by their careers.
This is particularly relevant in societies which take a dim view of their social responsibilities (I’m looking at you, America), which means there’s less of a safety net should that career disappear.
We are already seeing more developers than job vacancies in the tech market, so this isn’t a theoretical concern either.
That all said, I don’t think hiding our valuable code for fear of LLMs is the right solution either. If your code is really that good, then you’ll be more likely to secure your career by sharing it, because doing so builds a visible reputation that extends further than any verbiage on a CV might.
So while I don’t agree with the LLM excuse I can still completely understand why someone might cite it as a reason not to open their source.
Another valid reason is that some people have been completely burnt out dealing with entitled complaints from users. Thankfully I’ve had a mostly positive experience personally, but I’ve read that others haven’t been so fortunate.
> We are already seeing more developers than job vacancies in the tech market, so this isn’t a theoretical concern either.
Agriculture also employs far fewer people than a few hundred years ago, yet we have more food in quantity and diversity, so I see that as a good thing.
I suppose we just have very different beliefs and values.
Thanks for your answer, as it helped me understand your perspective.
I think you’ve misread my comment. I’m neither the GP nor against LLMs. I’m just offering a counterpoint that a fear for one’s job isn’t an unreasonable perspective.
> On which moral grounds would you object to a Star-Trek level replicator for physical goods? Then why object to that for virtual goods?
This just made me realize a distressing thing: if we ever built a replicator, a lot of people might then want to destroy it. For the same reason I believe they object to LLMs - greed and entitlement. Because they don't get to benefit personally, and they don't get the right of first refusal, the instinct is to deny the value to others. The Dog in the Manger.
I use LLMs and consider them quite useful, but I think that characterization of detractors is very disingenuous. People don't object to LLMs out of greed and entitlement. People object to LLMs because the copyright and IP systems in most of the modern world have equated copying with theft for so long, complete with the threat of legal action and even prison sentences. This system was said to be there to keep people fed and employed. Suddenly, when giant companies have billions of dollars to gain by ignoring copyright, they are allowed to. We've lived through a couple of generations in which giant companies have been able to completely own and control our culture, which should belong to the people.
People object to modern AI because it's another glaring sign that capital doesn't care about human life, and the people who own the capital largely don't either. They will use that as a PR angle until it's not useful anymore, and then proudly say the opposite when it suits them. It's flagrant hypocrisy.
I believe that if we had sane copyright terms and limits so we were actually entitled to use and share our own culture and media as we see fit, and better social safety nets so people whose jobs become outmoded didn't have to worry about losing their homes and having their families go hungry, very few people would be actually against LLMs.
as a detractor, yes, partially (LLMs are also overblown marketing hype BS, which is one of the many other reasons).
> I believe that if we had sane copyright terms and limits so we were actually entitled to use and share our own culture and media as we see fit,
i agreed with everything except this.
to me this feels like you’re saying “if only we were allowed to murder people then we’d have less crime” (not exactly what you are saying and a bit hyperbolic, but hopefully it helps highlight my perspective on what you said?).
existing copyright laws are the copyright laws. we all have to follow them or face penalties/consequences. just like the laws on murder/homicide etc.
it’s the fact these companies are being allowed to flout the law with zero repercussions that i have a problem with (on this specific problem, among others).
if they’d licensed all the content and it was opt-out — i wouldn’t give a shit about this part.
having worked in copyright, i feel very strongly about it. as do most people who have their works protected by it. it’s very easy to argue against copyright protections when your livelihood does not depend on it (note: i’m not arguing against you here, you mentioned social safety nets etc which is one direction to go i suppose, i’m just venting somewhat at the oft-repeated opinion here on HN that copyright is evil and should be completely abolished… good luck listening to any decent music in ten years’ time if that happens!!).
edit — i know there’s nothing i can do about this. which also contributes to the misanthropic attitudes towards magic LLMs.
FWIW, my perspective and descriptions of the detractors are aimed primarily at the detractors I see - which are notably not the artists or other people whose livelihood depends on copyright protection. Instead, the loudest voices are those of bystanders who imagined a possible windfall if only OpenAI et al. had to pay them for using their Reddit comments and a few blog articles written half a decade ago. These are the "I won't write comments on public forums anymore, nor will I blog, because LLMs" voices.
I fundamentally believe that people are not entitled to 100% of the value created by their labor - in fact, society can only function if there's surplus that others can build upon, and when companies try to capture 100% of their output, we call this out as extreme greed and a symptom of "late stage capitalism".
I do agree that people who are directly affected by LLMs basically replacing their job have a valid argument, though I don't think the core of it relates to copyright anyway.
As for the laws:
> it’s the fact these companies are being allowed to flout the law with zero repercussions that i have a problem with (on this specific problem, among others).
It's not been determined that they actually broke the law on this. AFAIK it's still an open question, pending (in the US) court verdicts or (elsewhere in the world) updates to IP regulation. Morally, I personally don't think using data to train AI really violates copyright - it's closer to a person consuming content and learning from it. But more importantly, I think preventing this use would deny humanity great value for no good reason. Almost all content in the training set, on the margin, contributes an infinitesimal bit to the resulting model, and at the same time it provides much more value this way than it ever did before.
> Maybe I'm reading too much into your reply, but I don't see it as trolling or bad faith.
Maybe you are. All my repos are either MIT (where I'm a little proud, and would appreciate the acknowledgement - though realistically, I'd never sue anyone over it) or MIT-0.
So yeah, if it ends up in an LLM, and people copy it, great. Fewer "please give me free support" requests coming my way.
> On which moral grounds would you object to a Star-Trek level replicator for physical goods? (please make them good, as offering any food anyone may want would fix world hunger once and for all)
Unfortunately, this is one topic in which my philosophy qualification comes in handy: "moral grounds" vary so much from person to person that they're almost useless as an argument.
Consider the following list of examples. I expect most people in the world will object to at least one of these arguments, but which one(s) they object to will vary wildly:
1. Kantian Ethics: Replicators risk devaluing human labor by reducing work to a mere means, thereby undermining the inherent dignity derived from effort.
2. Aristotelian Virtue Ethics: By eliminating the need for craftsmanship and effort, replicators could impair the cultivation of virtues essential for personal and communal flourishing.
3. Marxist Ethics: The obsolescence of traditional labor due to replicators may intensify alienation and disrupt social structures central to class solidarity.
4. Existentialism: Removing material struggle through replicators might strip life of the challenges necessary for authentic self-creation and personal meaning.
5. Confucian Ethics: Such technology could erode the social harmony built on mutual effort and well-defined communal roles, destabilizing moral and familial bonds.
6. Environmental Ethics: Unlimited production enabled by replicators may encourage overconsumption and waste, endangering ecological balance and sustainable resource management.
7. Amish Ethics: Replicators could undermine the values of simplicity, humility, and communal labor by promoting dependence on technology instead of human effort and cooperation.
8. Consequentialism: While replicators, as defined by your question, can solve world hunger, they're also demonstrably able to build weapons (as can current 3D printers), and can function as von Neumann self-replicating machines. A literal minefield made of these appeared in DS9, and giving such tech to the world almost unavoidably means also giving it to every psychopath. Also, grey-goo/paperclip scenarios become plausible.
> Then why object to that for virtual goods?
You can't eat virtual cheese, and unlike in The Matrix, if you die in a video game you don't die in real life, so the arguments for/against AI don't even need to be the same as those for/against Trek replicators.