That's already the case, minus the last ~70 years or so. The overwhelming majority of our knowledge, cultural artifacts in particular, is in the public domain.
It's a nice sentiment, but people can already go to gutenberg.org and download most of the important works of literature in existence, and most of those books have only around 5k downloads, so there's that.
I assume they are referring to the fact that books older than about 70 years are in the public domain. The rest are protected by copyright and not free.
Project Gutenberg is missing a lot of content that is public domain, and the oldest entries are often pretty poor in quality (and some of the oldest entries aren't lucky enough to get redone like Carroll's work has been).
Which just goes to prove the parent's point. As much as it might seem otherwise, IPR restrictions are not the main bottleneck to widespread availability of content (at least book-like content); actually making the content available matters far more!
(Also, keep in mind that content is now entering the public domain every year, and projects like PG are nowhere close to keeping up with that flow of newly-unrestricted stuff. So this dynamic is becoming more extreme over time, not less.)
I really don't think it proves any point aside from Project Gutenberg being a bad example, as it doesn't really contain the sum of human knowledge in the sense that a pirate library does.
In a lot of countries it's life plus 70 years. If it was only 70 years, we'd already have everything from 1951 and earlier. However, we have to wait for the authors to die before we even start counting, and good luck if it's somebody obscure and you can't find their date of death.
It's actually fairly unlikely that the bulk of published knowledge is in the public domain.
By copyright expiry, the US public-domain cutoff is 1927: all works published prior to 1927 are in the public domain in the United States, and some later works are as well.
(This may not be the case in other countries.)
There were not many published books prior to the invention of the printing press, and many of those didn't survive. The total number of books (not individual titles, but actual bound volumes) in Western Europe as of 1400 may have been as few as 50,000.
By 1800, about 1 million titles had been printed.
Over the course of the 18th century, presses became vastly faster, as they evolved from hand-operated wooden screw-press to iron-frames to steam and electric-powered rotary and ultimately web presses. Paper became much cheaper (and less durable --- a factor commented on at length in the Librarian of Congress's annual reports to Congress in the late 19th century). Literacy exploded from ~25% to 95%+ over the 19th century (and probably accounted for numerous revolutions and political upheavals).
Through much of the 20th century, certainly by 1950, US publishers were issuing about 300,000 new titles per year, a rate which stayed remarkably constant through the early 21st century. By the aughts, "nontraditional" self-publishing (a/k/a "vanity press") was nearing or exceeding 1 million titles per year --- more than had been published in all of history up to 1800.
Reports that all recorded data was doubling every few years date to at least the 1960s. Taken at face value, that would mean that at any point in time, half of all recorded information was less than two years old.
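The arithmetic behind that claim is worth spelling out. A quick sketch (assuming, for illustration, a clean doubling every period):

```python
# If recorded data doubles every period (say, two years), then the data
# created in the most recent period equals everything that existed before
# it -- so half of all data is always newer than one doubling period.
total = 1.0  # arbitrary starting amount of recorded data
for _ in range(10):  # ten doubling periods
    new_in_period = total  # doubling: as much new data as already existed
    total += new_in_period

fraction_recent = new_in_period / total
print(fraction_recent)  # 0.5 -- half of all data is from the last period
```

The exact doubling interval doesn't matter for the conclusion: any sustained exponential growth keeps the bulk of the total concentrated in the most recent few periods.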
The catch is that not all recorded data is published. So I'm not sure what the time-distribution of all publishing looks like. But I'm pretty confident it's skewed far more recently than 1927. And would thus tend to be copyrighted rather than uncopyrighted.
If you want to measure works by significance, you might make a different argument --- there are many great works of literature, philosophy, history, and religion which were first published before 1927. But ranking and tabulating these is more challenging than a simple enumeration.
The lawsuit has been settled, and Gutenberg has been unblocked in Germany since late October. The site had been completely blocked for three years, since 2018. Now only a total of 19 German-language books are blocked; the terms of the settlement are confidential. Everything else works.
I have information that the abysmal filth is still in place: the block page still reads "Sito sotto sequestro".
I wonder how your provider gets around it. Is it too much to hope that some operators (behind the DNS) responded to the requests of the vacuously appointed low scum* with disdainful neglect? As if to say: "If you had any credential for existence beyond some forms of physical coercion, we would assume you were joking - before filing the proof of your vileness in the trash archive it deserves."
*May I remind you that they appended the apex of civilization, Project Gutenberg, to a list of a dozen pirate sites in their "operation". No, really, I cannot find words for those lowest abysses.
Just tried it at home and it loads fine from broadband too. That "Sito sotto sequestro" literally means "the site has been seized", which of course isn't the case here; the page suggests the carrier has been forced to resolve that domain elsewhere, so all it takes to get around the block is using an external name server.
I'm currently using one of Cloudflare DNS addresses and it loads just fine.
I don't use the (ex) national provider (Telecom Italia / Tim) however. Crappy service aside, they're well known for jumping every time the government tells them to, and happily filter a lot of stuff.
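For anyone wanting to do the same, a minimal sketch of the workaround described above, assuming a Linux host whose resolver config isn't managed by NetworkManager/systemd-resolved (in which case the equivalent setting lives in that tool instead):

```
# /etc/resolv.conf -- point lookups at a public resolver
# (here Cloudflare's, as mentioned above) instead of the ISP's,
# sidestepping a DNS-level block that only the ISP's resolver enforces
nameserver 1.1.1.1
nameserver 1.0.0.1
```

Note this only defeats DNS-based filtering; a carrier blocking by IP or via deep packet inspection would need a VPN or similar instead.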