Google flushes heat from data center with toilet water (arstechnica.com)
125 points by shawndumas on March 16, 2012 | 39 comments



Wow, what is up with data center news? I mean seriously.

I joke with folks who think running a datacenter is 'exciting' that it's like watching grass grow, except it's dark and there is no grass. Maybe I should blog about this stuff, but at social events it seems impossible to maintain a conversation about apex cooling vs planar cooling vs directed airflow or conduction, or 208 V three-phase vs 120 V single-phase vs DC. Perhaps I go to the wrong types of parties :-)


In secondary school, I had a chance to visit a supercomputer center with a Cray Y-MP (like this one: http://planetromero.com/2010/01/gametales-cray-ymp). Turns out it's just a big cabinet. Hooked up to other big cabinets. Oh, and when you sat down at a login terminal, it was just a Unix box. Coolest thing about it was the attached bench (Cool! I sat on a supercomputer! Hmm. Not very comfy.)

sigh

From the outside, the datacenter is an inaccessible realm of high technology, carefully engineered to maximize various important features. From the inside, it's cabinets and wiring and plumbing.


Except the Barcelona Supercomputer Centre, in a church...

http://degiorgi.math.hr/~vsego/phun/beautiful_supercomputer/


That shit cray!


On the surface a data centre is, as you say, dull. They're just big noisy sheds full of racks and wires. Some things that make them interesting:

1) Some people don't know just how noisy a data centre can be.

2) Some people just like seeing a rack with a caption saying "48 TB across X drives in RAID Y"; or lots of Cat 5 / 5e / 6.

3) Innovative solutions for heat, aural noise, electrical noise, power, security, bandwidth, etc.

Most people on HN aren't going to be in group 2. But there are so many smart people on HN, and I'm surprised that this stuff feels so (and I know this is the wrong word) "backwards". (Maybe "stagnant" is better?)

Energy has cost, and reducing that cost is a space for making money.


I worked at a well known government agency for a year, and spent some time in one of the data centers. It was poorly designed so the main air conditioning vent was directly above... the cubicles and desks. Instead of the servers.

So it would be blasting full cold all day and all night, and we were miserably cold. Nothing like wearing three jackets to work in the middle of August in Maryland.

So one day, we brought in a few refrigerator boxes, attached them to the ceiling to make a makeshift vent, and then we shunted all the cool air to the "server" side of the room.

We were much happier after that.


You should hear the stuff soundsystem heads talk about. To be honest it's very similar to the things you just listed, power distro and heat dissipation, plus lots of stuff about managing phase, gain structure, the relevance of SPL metering, scoops versus horns, price/performance ratios for different types of plywood, and so on.

To be fair, I expect it helps that after all that stuff is dealt with, the equipment turns all those watts and volts into waves of pounding bass and hordes of dancing people, as opposed to just pushing bits around.

BTW for anyone who does want to debate pro audio, head over to http://forum.speakerplans.com/


Oooh, wye or delta? Tell us more!

I look forward to the fun stories we'll hear when data centers save a power conversion step by boosting line voltage to around 300VDC (necessary for PFC), and then put say 25 servers in series (floating from ground).


That would probably make a funny Tumblr ... "Things people say about data centers" or maybe "Data Center Ryan Gosling"


Here's a youtube video about the system: http://www.youtube.com/watch?v=lJnlgM1yEU0

Basically they're taking pre-treated water from the city sewage plant and running it through their own mini-treatment plant to a level that allows them to use it in their cooling system. They do mention filtration and sterilization with chlorine, so I am wondering how different the cooling water really is from potable drinking water.

The water they're getting from the sewage treatment plant is definitely not raw toilet water. I imagine Google is getting a big discount by treating their own water.


Sewage waste tends to have high amounts of pharmaceuticals excreted in urine, which are hard to filter but biodegrade. I wouldn't drink the Google water, but it probably has no immediate toxicity.

Google must treat the water with chlorine, because of risks like Legionnaires' disease.


Wouldn't the dirty water clog the pipe? Via mineral deposits etc?


In the video, they say that the water is filtered and chlorinated before going into the cooling system.


I am wondering whether this water circulates within their datacenter through some cooling system, or whether they have a constant stream connected to the community water pipe? Can someone explain? E.g. how big might the daily consumption of this facility be? Is there a serious "green" difference, or is it mostly a PR trick?


I'm not sure exactly how much water these data centers use, but it's large enough that Microsoft built a water treatment plant next to its DC in Quincy, WA, which it now leases back to the city for $10/year in exchange for reduced water rates. In that scenario the water is reused, with treatment at each stage, as it passes between the fruit processors, the data center, and the city.

http://blogs.technet.com/b/msdatacenters/archive/2011/10/13/...


It occurs to me that one way to save power would be to specifically design your servers to operate at significantly higher temperature, that way cooling is much cheaper and less power intensive.


Very true, search for "server inlet temperature" if you want to learn more.


This is not as green as it sounds.

This area of the United States suffers from huge water shortages, and the river they are using is a major source of water for Atlanta (I'm not sure if they are upstream or downstream of Atlanta).

As a result by the time the river flows out of Georgia into Alabama and Florida it is very diminished. This makes those states pretty upset since they also want to use the water.

So what does Google do? They evaporate the water! What a waste.

Water cooling is not really a very clean thing in this part of the country.

I would like to see some numbers: How much energy does it cost to desalinate the amount of water they are wasting? If it's more than the energy they are saving then they do not have a net win.
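
(For concreteness, a minimal sketch of the comparison in Python, using assumed round numbers for reverse-osmosis energy and chiller efficiency rather than anything Google has published:)

    # Rough sketch (assumed round figures, not Google's): heat carried away by
    # evaporating one cubic metre of water vs. energy to desalinate/treat one
    # cubic metre by reverse osmosis.

    LATENT_HEAT_J_PER_KG = 2.257e6   # heat of vaporization of water
    KG_PER_M3 = 1000.0
    J_PER_KWH = 3.6e6

    heat_removed_kwh = LATENT_HEAT_J_PER_KG * KG_PER_M3 / J_PER_KWH
    chiller_cop = 4.0                # assumed chiller coefficient of performance
    chiller_kwh_avoided = heat_removed_kwh / chiller_cop
    reverse_osmosis_kwh = 4.0        # commonly cited ~3-5 kWh/m^3 for seawater RO (assumed)

    print(f"heat removed per m^3 evaporated: ~{heat_removed_kwh:.0f} kWh")    # ~627
    print(f"chiller electricity avoided:     ~{chiller_kwh_avoided:.0f} kWh") # ~157
    print(f"energy to desalinate 1 m^3 (RO): ~{reverse_osmosis_kwh:.0f} kWh")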


"So what does Google do? They evaporate the water! What a waste."

Quite literally a drop in the bucket. The back of my envelope has too much written on it at the moment, but if you look at the total water flow in that watershed and multiply it by the heat of vaporization of water, you get a staggeringly large energy flow. There's no way, absolutely no way at all, that a few megawatts of server heating would be measurable. None.


I happened to have a free envelope:

Google's total worldwide power usage is about 220 megawatts. Calculate the amount of water, starting at about 20 deg. C average, needed to evaporate to carry away all that energy, and it works out to about 700 million gallons of water per year. That is about one or two days of water usage for Atlanta. Keep in mind that this is for all of Google's data centers worldwide; for only the servers that are actually in Georgia, the amount of water carried away is incredibly small compared to total usage.
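
(A minimal sketch of that envelope math in Python, using the same simplified model of warming the water from 20 deg. C and then boiling it off; the 220 MW figure is the one stated above, and the Atlanta comparison isn't recomputed here:)

    # Water needed per year to carry away ~220 MW entirely by evaporation,
    # using the simplified model above: warm from ~20 C, then vaporize.

    POWER_W = 220e6                       # stated worldwide figure
    SECONDS_PER_YEAR = 365 * 24 * 3600
    SPECIFIC_HEAT = 4186.0                # J/(kg*K), liquid water
    LATENT_HEAT = 2.257e6                 # J/kg
    DELTA_T = 100.0 - 20.0
    LITRES_PER_GALLON = 3.785

    energy_j = POWER_W * SECONDS_PER_YEAR
    heat_per_kg = SPECIFIC_HEAT * DELTA_T + LATENT_HEAT   # ~2.6e6 J/kg
    water_litres = energy_j / heat_per_kg                  # 1 kg of water ~ 1 litre
    gallons = water_litres / LITRES_PER_GALLON

    print(f"~{gallons / 1e6:.0f} million gallons per year")  # ~700 million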

Also I didn't account for radiative cooling which is probably fairly significant but likely not more than, say, 50% of the necessary cooling load.


1 megawatt * 1 hour = 3,600,000,000 joules.
Water takes 2,257,000 joules/kg to evaporate.
Which works out to ~1,600 kg, or 422 gallons, of water per hour per MW.
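
(The same arithmetic as a quick Python sanity check:)

    # 1 MW-hour of heat carried away entirely by evaporating water.

    J_PER_MWH = 3.6e9            # 1 megawatt * 1 hour
    LATENT_HEAT = 2.257e6        # J/kg to evaporate water
    LITRES_PER_GALLON = 3.785

    kg_evaporated = J_PER_MWH / LATENT_HEAT        # ~1,595 kg
    gallons = kg_evaporated / LITRES_PER_GALLON    # 1 kg of water ~ 1 litre

    print(f"~{kg_evaporated:,.0f} kg (~{gallons:.0f} gallons) per MW-hour")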

However, I don't think they actually want to evaporate all that much water which leaves a lot of residue issues and instead focus on heating a much larger volume of water.

PS: The article is also wrong in that they don't directly cool hot air from their servers with this water. They are cooling air heated by their cooling system, and depending on how hot the air from their servers is, they may simply vent it if it's warmer than the outside air.


Heh, the gauntlet has been thrown, so I need to show numbers too.

I see that Atlanta rainfall is about 48" (122cm) a year.

422 gallons * 24 * 365 (* 3.785 litres/gallon) * 100 cc/L == 1.4e9 cc/year.

So converting to rainfall area: 1.4e9cc/122cm = 1148 m^2

Basically a 100' square catch basin on the roof of the building would provide all the water they need. Noise.


> Basically a 100' square catch basin on the roof of the building would provide all the water they need. Noise.

Which raises the question: Why don't they just do that? I'd imagine it would be less effort to process, and even though you do need to have a buffer against low rainfall periods, water is one of the easiest things to store.


Oh, I agree it's noise. However, the reason they don't use a catch basin / just pay for it is Atlanta's high average humidity. There are going to be days when little water evaporates in those cooling towers and most of their energy simply raises the temperature of a large volume of water. So, having access to, say, 100 times the flow they need is very helpful.


This is an absurd argument. Evaporative cooling is green and efficient, but problematic during droughts. Your argument is like saying wind turbines are pointless because sometimes the wind doesn't blow.

If Google cools with fossil-fuel power only during droughts, that is still much better than using fossil fuels all the time.


>(I'm not sure if they are upstream or downstream of Atlanta).

The two reservoirs which provide water for Atlanta are located far north of Douglas County (I'm a resident).

I do find it odd that they are using evaporative cooling, since humidity is so high here.


I don't really know anything about this, but how does water cooling consume water? Is it not just for heat transfer?


Not with evaporative cooling: the water is consumed by evaporating it.


I would like to know why Google had to open a datacenter in that specific region. It gets incredibly hot and humid.


Presumably to reduce latency for nearby users. People in warm climates deserve fast Google results too! :)


The amortized CAPEX (cost of building the data center) is still greater than the OPEX (cost of keeping it running). If Google can find cheap land and labor, heat might be irrelevant.


Google, please... Facebook has been pumping loads of shit through its data centers for years!


High five to google for a green solution!


Not sarcasm, folks -- unless people don't like green solutions???


THAT explains a lot!!!


Two Google datacenter articles in as many days, both including quotes from the chief of DC ops. Why the big PR push all of a sudden from what's normally a quiet division?

I'm guessing Google has a big datacenter PR disaster that they think is going to leak soon, or else Mr. Kava is looking for a raise.


I think it's more that it's technically kind of interesting, and keeping them in the press (which is cheap) lets them recruit more easily (across all infrastructure, not just datacenter ops), and more importantly, retain staff.

"Green Google" also is a halo for the company overall, and might be heading off negative press about how Google alone uses more electrical power than all the households in (several lightly populated states, chosen to mislead).


But what PR disaster would be negated by this type of data center news?


It took a lot of convincing to get them to switch from using Brawndo: The Thirst Mutilator.



