I got a chuckle out of Apple touting the M1 chip's shared video memory as a big step forward. (Which it is, but it's still amusing to me how it might have sounded like a groundbreaking innovation to a layperson.)
In the PC industry, that's called UMA (unified memory architecture) and has been around for a few decades, synonymous with ultra-low-cost (and ultra-low-performance) integrated graphics. Hyping it so successfully as a good thing is a real stroke of Apple marketing genius.
Yes. I have thought about this a lot. There are cycles...
Like thin client (VT100), to thick (client/server desktop app), to thin (browser), etc.
Similarly, console apps (respond to a single request in a loop), to event-driven GUI apps, to HTTP apps that just respond to a simple request, back to event-driven JS apps.
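The contrast between the two styles can be sketched in a few lines. This is illustrative Python, not any particular framework; all names here (`console_loop`, `on`, `dispatch`) are made up for the sketch:

```python
# Style 1: console/HTTP-style app -- take one request at a time,
# produce one response, repeat. Control flow lives in the loop.
def console_loop(requests):
    responses = []
    for req in requests:  # stand-in for `while True: read input / accept request`
        responses.append(f"handled {req}")
    return responses

# Style 2: event-driven app -- register handlers up front; a dispatcher
# (the GUI toolkit, the browser) inverts control and calls you back.
handlers = {}

def on(event, handler):
    handlers[event] = handler

def dispatch(event, payload):
    return handlers[event](payload)

on("click", lambda p: f"clicked {p}")
```

The inversion of control is the whole difference: in the first style your code drives the loop; in the second, the environment drives it and your code is a bag of callbacks.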
It depends on how you define the boundaries, but history rhymes.
Is it really a pendulum, or is it more that this was always an idea with merit that's now finally seeing wider adoption because it's become more widely available? (In part, I understand, due to some IBM patents that expired 10 or so years ago.)
The greatest trick the devil ever pulled is convincing people that shared hosting is preferable to dedicated, and then charging them way more money for it.
We started out on mainframes.
Then things moved to the desktop, with some centralized functionality on servers (shared drives, batch jobs).
Then processing moved back to centralized web servers via the web, SaaS, and the cloud.
Then more moved into the client through React & similar.
And now things are moving back to the server.
Tick. Tock.
These changes are not just arbitrary whims of fashion, though. They're driven by generational improvements in technology and tooling.