"Consider the shift to cloud computing. From 2000 to 2005, electricity use by data centers in the U.S. increased 90 percent. From 2005 to 2010, the gain was 24 percent. As of 2014, data centers accounted for 1.8 percent of U.S. electricity use, according to a 2016 Lawrence Berkeley study, but their electricity demand growth had slowed to a crawl (4 percent from 2010 to 2014). What happened? The nation outsourced its computing needs to cloud providers, for whom cutting the massive electricity costs of their data centers became a competitive imperative. So they innovated, with more-efficient cooling systems and new ways of scaling back electricity use when servers are less busy."

I am torn: centralization leads to more R&D, cost savings, and better security, and I just wish those innovations would propagate to everyone. Decentralization is much better for everything except the things that benefit from economies of scale.

The R&D being done here doesn't only apply if you run extremely large data centers. Facebook and Google are contributing back to the Open Compute Project [1], and anyone can use this knowledge to build a small data center with things like the Open Rack [2].

[1] http://www.opencompute.org/
[2] https://code.facebook.com/posts/1687861518126048/facebook-to...

But it seems to me that when it comes to security, efficiency, etc., only large centers can get it right, because they have more at stake. The majority of small providers will mess up. Compare AWS's uptime and security to a regular host's, for example.

And sadly, this explains the rise of centralized everything, including Gmail, Facebook, and the iOS App Store, and, for thousands of years, centralized cities, states, and federations.
