Your post was generated using GPT-3 and 175 billion parameters of pre-existing human writing, contextualized, distilled, and cross-referenced with terminology we've agreed on for centuries. It's a parrot, and I remain unimpressed.
Take the learned knowledge of GPT-3 (because it must be so smart, right?) and have it actually do something. Buy stocks, make chemical formulas, build planes. If you are not broke or dead by the end of that exercise, I'll be impressed and believe GPT-3 knows things.
What's unimpressive about a stunningly believable parrot? I think, at the very least, that GPT-3 is knowledgeable enough to answer any trivia you throw at it, and creative enough to write original poetry that a college student could have plausibly written.
Not everything worth doing is as high-stakes as buying stocks, making chemical formulas, or building planes.
Sigh. When DNA becomes a human, it doesn't have a priori access to all the world's knowledge, and yet it still develops intelligence without it. And that little DNA machine learns and grows over time.
When thousands of scientists, billions of human artifacts, and 1000X more compute are put into the philosophical successor of GPT-3, it won't be as impressive as what happens when a 2-year-old becomes a 3-year-old. (It will probably make GPT-4 even less impressive than GPT-3, because the ratio of inputs to outputs will be that much further removed from what humans already do.)
> That post was generated using GPT-3 and 175 billion parameters of pre-existing human writing, contextualized, distilled, and cross-referenced with terminology we've agreed on for centuries.
DNA is nothing like the training of GPT. DNA does not encode a massive amount of statistics about words and language and how concepts, words, etc., relate to one another.
All DNA does is encode how to grow, build, and maintain a human body. That human body has the potential to learn a language and communicate, but if you put a baby human inside an empty room and drop in food, it will never learn language and never communicate. DNA isn't magic, and the "millions of years of evolution" encoded in DNA is nothing like the petabytes of data that GPT-3 needs to operate.
Again, DNA has no knowledge embedded in it: no words, no data in the sense that we imagine Wikipedia stored in JSON files on a hard disk. DNA stores an algorithm for the growth of a human, that's it.
The GPT-3 model is probably > 700GB in size. That is, for GPT to be able to generate text it needs an absolutely massive "memory" of existing text which it can recite verbatim. In contrast, young human children can generate more novel insights with many orders of magnitude less data in "memory" and less training time.
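For what it's worth, that 700GB figure is consistent with simple arithmetic (a back-of-the-envelope sketch; the bytes-per-parameter assumption is mine, since the on-disk format isn't public):

```python
# Rough estimate of GPT-3's weight storage, assuming 32-bit floats.
params = 175e9           # 175 billion parameters
bytes_per_param = 4      # fp32; halve this for fp16
size_gb = params * bytes_per_param / 1e9
print(f"~{size_gb:.0f} GB")  # ~700 GB (about 350 GB at fp16)
```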
"Knows things" is kind of vague. I'm pretty sure GPT-3 would obliterate all traditional knowledge bases we have. Even bert could achieve state of the art results when the questions are phrased as the cloze task.
If you mean that anything except full general intelligence is unimpressive, then that seems like a fairly high standard.
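For readers unfamiliar with the cloze framing: the question is rewritten as a fill-in-the-blank sentence and the model predicts the masked token. Here's a minimal sketch using HuggingFace's transformers library (the model name and example sentence are my own illustration, not something from the thread):

```python
from transformers import pipeline

# A factual question, rephrased as a cloze (fill-in-the-blank) task:
# "What is the capital of France?" -> "The capital of France is [MASK]."
unmasker = pipeline("fill-mask", model="bert-base-uncased")

for prediction in unmasker("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
# BERT's top prediction is "paris", recovered purely from its
# masked-language-modeling pretraining, with no explicit knowledge base.
```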