While I agree with the general thrust of what you're saying, I do think there is something subtly different about the AI hype cycle: it's so accessible. The core concepts are the fundamental, inextricable frame of daily experience. The metaphors we use to talk about it borrow so heavily from lay terminology that it's so easy, so automatic, to believe we understand the state of modern AI because we understand the common definitions of the words that tend to show up around the matter. Do some semantic arithmetic, sum up a few words like 'training' and 'learning' and 'intelligence', and all of a sudden we're all walking around with what we think are approximate models of what everyone else is talking about, or, y'know, close enough to make a consulting business out of.
The real kicker, though, is that at its heart we're all pretty convinced that we understand what it's like to be intelligent. I mean, how could we not be? Never mind the agonies of ten thousand years of philosophers and clergy. And so it must be pretty straightforward, with a little bit of introspection, a crash course in statistics, and maybe a TED talk or two, to map that to the artificial side too, right?