
I'd take it back even further:

We started out on mainframes.

Then things moved to the desktop, with some centralized functionality on servers (shared drives, batch jobs).

Then processing moved to centralized web servers via the web, SaaS, and the cloud.

Then more moved into the client through React & similar.

And now things are moving back to the server.

Tick. Tock.

These changes are not just arbitrary whims of fashion, though. They're driven by generational improvements in technology and tooling.



I got a chuckle out of Apple touting the M1 chip's shared video memory as a big step forward. (Which it is, but it's still amusing to me how it might have sounded like a groundbreaking innovation to a layperson.)


In the PC industry, that's called UMA (unified memory architecture) and has been around for a few decades, synonymous with ultra-low-cost (and low-performance) integrated graphics. To hype it so much as a good thing, Apple's marketing is really genius.


Apple Takes Cue From Original Xbox With Latest Chipset.


Or: Apple takes a cue from its own Macintosh IIsi from three decades ago?


Yes. I have thought about this a lot. There are cycles...

Like thin client (VT100), to thick (client/server desktop app), to thin (browser), etc.

Similarly: console apps (responding to a single request in a loop), to event-driven GUI apps, to HTTP apps that just respond to a single request, and back to event-driven JS apps.

It depends on how you define the boundaries, but history rhymes.
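The two program shapes in that cycle can be sketched in a few lines of Python (toy illustration only; the handler names and events are hypothetical). The first style blocks in a loop and answers one request at a time; the second registers handlers and lets a dispatcher decide when each one runs:

```python
# Style 1: console app — a loop that handles one request at a time.
def console_app():
    while True:
        line = input("> ")          # block until a request arrives
        if line == "quit":
            break
        print(f"you said: {line}")  # respond, then wait for the next one

# Style 2: event-driven — handlers are registered up front,
# and a dispatcher (the "event loop") calls them as events occur.
handlers = {}

def on(event):
    def register(fn):
        handlers[event] = fn
        return fn
    return register

@on("click")
def handle_click(data):
    return f"clicked at {data}"

def dispatch(event, data):
    return handlers[event](data)    # control flow is inverted: we get called
```

The key difference is who owns the control flow: the console app calls the system and waits, while the event-driven app hands control to a dispatcher and waits to be called, which is the same inversion that happened again when HTTP request handlers gave way to event-driven JS.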


Virtual machines and containers are very similar to partitions and spaces on mainframes as well.


Virtual machines are not a recent invention. They were already being used on IBM mainframes starting in the early 1970s:

https://en.wikipedia.org/wiki/VM_(operating_system)

Notably, the VM operating system could run an instance of itself in one of its own virtual machines.


Is it really a pendulum, or is it more that this was always an idea with merit that's now finally seeing wider adoption because it's become more widely available? (In part, I understand, due to some IBM patents that expired 10 or so years ago.)


And serverless is an anemic CICS executing non-transactions.


Serverless is kind of like Apache running PHP scripts in virtual hosts.


The greatest trick the devil ever pulled is convincing people that shared hosting is preferable to dedicated, and then charging them way more money for it.


The cloud is just someone else's computer in a data center in New Jersey about to be hit by a hurricane.


> They're driven by generational improvements in technology and tooling.

I'd say they're driven by corporate greed. Cloud computing is basically renting time, so the more you use it, the more $$$ they make.




