
He's not criticizing cloud computing; he's saying "cloud computing" is an idiotic idiom. And it is. "Cloud computing" is just distributed computing, which has been around for decades. (See: http://en.wikipedia.org/wiki/Distributed_computing)

But it's the latest tech word soup. Just as everything Javascript became "Ajax" in 2005, everything distributed is becoming "Cloud".




As far as idioms go, it's as good as any. "Cloud computing" != "distributed computing" -- it's on-demand virtual hardware and storage. That's how most users of the cloud think of it, in my experience.

I was hugely skeptical of the cloud concept -- sneered at it, even -- until I started doing the cost/benefit analysis.


On-demand virtual hardware and storage is Infrastructure as a Service. I'd be willing to bet that most developers think of that when they think of the cloud (since, as you point out, they're actual users of the cloud).

But then some marketer somewhere (I'm guessing Salesforce, but that's my own bias :) decided that any web application could call itself "Cloud" (since "Software as a Service" was no longer cool. Again, I'm blaming Salesforce for flogging that horse to death...). This way, even normal people can use the cloud/buy cloud services, and it's just as easy as using the Web since it's really just the Web! (Way to go, smart and savvy Internet^H^H^H^H^H^H^HCloud user—you're so much smarter than everyone else and can expect to get promoted over/laid more than your coworkers still using that ancient Web 2.0!)

The Wikipedia article on Cloud Computing is a good illustration of how vague the term has gotten, especially how ill-defined all of the sub-classes are: Platform as a Service, Application as a Service, Service as a Service. Pretty much anything on the Web can be made into a definition of "cloud" somehow.

I'd really like to see Cloud Computing only applied to things involving immense, elastically scalable computing and storage that can be provisioned and taken offline immediately, but I don't think that's going to happen. The meaningfulness of the term has been destroyed by the rampant "me-too"-ing of the industry.
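For what it's worth, that capability is easy to see in code. A rough sketch, assuming Amazon's aws-sdk client library and a made-up AMI id -- one call provisions a machine, another takes it offline again:

    // Sketch only: "provisioned and taken offline immediately".
    // Assumes credentials are already configured; the AMI id is a placeholder.
    var AWS = require('aws-sdk');
    var ec2 = new AWS.EC2({ region: 'us-east-1' });

    ec2.runInstances({
      ImageId: 'ami-00000000',   // hypothetical machine image
      InstanceType: 't2.micro',
      MinCount: 1,
      MaxCount: 1
    }, function (err, data) {
      if (err) return console.error(err);
      var id = data.Instances[0].InstanceId;
      console.log('provisioned', id);
      // ...and take it offline again just as quickly
      ec2.terminateInstances({ InstanceIds: [id] }, function (err2) {
        if (err2) console.error(err2);
        else console.log('terminated', id);
      });
    });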


Not everything in Javascript became Ajax.

The field developed and it became useful to distinguish between different uses of Javascript, so new names were invented to do that.

Javascript that changes the page and calculates on user input is still 'Javascript'. Javascript that sends requests to a server and updates the page based on the result became 'ajax'; Javascript that does processing as a small, page-independent tool became 'bookmarklets'; Javascript that does page processing as an after-market modification became 'userscripts'.
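To make the 'ajax' case concrete, here's a minimal sketch -- the /api/status URL and the 'status' element id are made up for illustration:

    // Ask the server for data and patch the page in place, no reload.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/status');   // hypothetical endpoint
    xhr.onload = function () {
      if (xhr.status === 200) {
        document.getElementById('status').textContent = xhr.responseText;
      }
    };
    xhr.send();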

Likewise, distributed computing just refers to using multiple computers. They could be yours, rented, local, remote, clustered, desktops, servers, using any kind of framework. Cloud computing generally refers to renting remote servers from a third party that provides a particular software stack and interface and a restricted set of services. Yes, the edges blur, but having 5 Linodes to store data on is not really cloud computing -- that's just renting servers -- while having an S3 account is.
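The difference shows up in the code you write. With S3 you only ever talk to the provided service interface; a rough sketch with the aws-sdk client library (the bucket name is a placeholder, and credentials are assumed to be configured):

    var AWS = require('aws-sdk');
    var s3 = new AWS.S3({ region: 'us-east-1' });

    // No server to administer -- you just call the storage service's API.
    s3.putObject({
      Bucket: 'example-bucket',        // hypothetical bucket
      Key: 'hello.txt',
      Body: 'some data worth keeping'
    }, function (err, data) {
      if (err) console.error(err);
      else console.log('stored, ETag:', data.ETag);
    });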


I think that was his point. Even though JavaScript had been used in web apps for years, as soon as "AJAX" caught on people (who didn't know better) started referring to anything that used JavaScript as "AJAX". I still get people asking me if such and such website is written in "AJAX".


Cloud Computing is about how you manage your infrastructure, not how you use it.



