LLMs don't have rights. LLMs are tools, and the state can regulate tools. Humans acting on behalf of these companies can still, if they felt the bizarre desire to, publish assembly instructions for bioweapons on the company blog.
You're confused about whose rights are at stake. It's you, not the LLM, that is being restricted. Your argument is like saying, "Books don't have rights, so the state can censor books."
> if they felt the bizarre desire to, publish assembly instructions for bioweapons on the company blog.
Can they publish them by intentionally putting them into the latent space of an LLM?
What if they make an LLM that can only produce that text? What if they continue training so it contains a second text they intended to publish? And continue to add more? Does the fact that there's a collection change things?
These are genuine questions, and I have no clue what the answers are. It seems strange that choosing one implementation of text storage over another should make you lose all rights to that text.
I have rights. I want to use whatever tool or source I want - LLMs, news, Wikipedia, search engines. There’s no acceptable excuse for censorship of any of these, as it violates my rights as an individual.
More and more people get information from LLMs. You should be horrified at the idea of giving the state control over what information people can access through them, because going by historical precedent there's a 100% chance that the state would use that censorship power against the interests of its citizens.
Are you also horrified by how many people get their facts from Wikipedia, given its systematic biases? All tools have their strengths and weaknesses. But letting politicians decide which information is rightthink seems scary.