So ... any actual measurements of how much energy was saved? Neat hacking for sure, but it seems that somewhere along the line, thinking clearly about the goal of "saving energy" got lost in the details.
Although I'm not exactly experienced in the field, I'm going to guess it's quite unlikely the water heater's energy consumption was reduced by anything close to 2/3, or even by a significant amount.
Most obviously, the author completely overlooked that there are two types of energy use by a tank-style water heater:
1) Heating incoming water to the desired temperature.
2) Holding the tank at the desired temperature.
This setup does nothing to address the first energy need, which is the larger of the two.
And, unless the timing is spot-on, it will heat a tank which is not immediately used -- which then cools off, and finally needs reheating later.
If you make the (incorrect) simplifying assumptions that (a) a tank loses energy at the same rate regardless of its current temperature, and (b) the burner turns on and off instantly at full efficiency, then my pre-coffee mind arrives at the conclusion that reheating a stale tank to a target temperature would use exactly the same energy input as holding it at the target temperature the whole time anyway.
Without those simplifications, it still seems likely that the 'stale tank re-heat' will use a decently large fraction of (2)'s energy.
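For what it's worth, here's a toy sketch (Python, all numbers invented) of that pre-coffee reasoning under assumption (a): if the loss rate is constant no matter the water temperature, holding the setpoint and letting the tank coast before reheating come out identical.

```python
# Back-of-envelope comparison under the (incorrect) assumption that the
# tank loses heat at a constant rate regardless of its temperature.
# All numbers are made up for illustration.

LOSS_RATE_W = 100.0          # assumed constant standby loss, watts
HOURS = 8.0                  # overnight window
COOL_HOURS = 6.0             # burner off for the first 6 hours

def hold_at_temp(hours):
    """Burner replaces losses continuously."""
    return LOSS_RATE_W * hours                          # watt-hours

def cool_then_reheat(hours, cool_hours):
    """Burner off while cooling, then it pays back the lost heat and
    covers losses for the remaining time."""
    lost_while_cooling = LOSS_RATE_W * cool_hours       # must be paid back
    losses_after_reheat = LOSS_RATE_W * (hours - cool_hours)
    return lost_while_cooling + losses_after_reheat     # watt-hours

print(hold_at_temp(HOURS))                   # 800.0 Wh
print(cool_then_reheat(HOURS, COOL_HOURS))   # 800.0 Wh -- identical, as claimed
```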
tl;dr: To take large chunks out of your consumption of heating energy, insulate your home better instead. Bonus: it saves you cooling energy during the summer, too!
> reheating a stale tank to a target temperature would use exactly the same energy input as holding it at the target temperature the whole time anyway
Essentially this same line of reasoning was used during the 1970s energy crisis to argue that night setback of a thermostat wouldn't save energy: "You spend as much energy to reheat the house as you save by letting it cool down."
Unfortunately, that argument is incorrect.
Think about the overnight cycle, during which no water flows in or out of the tank, and the water ends up at the same temperature as it started. The total energy you have to put into the tank equals the heat lost through its insulation. That heat loss is driven by the temperature difference between the hot water and the surrounding air. Lowering the water temperature means the tank loses less energy over that time. Therefore you have to put less energy in.
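Here's a rough numerical sketch of that argument (Python; the loss coefficient, cooling constant, and temperatures are assumptions for illustration, not measurements of any real tank). With heat loss proportional to the temperature difference, the coast-and-reheat cycle puts in less energy than holding the setpoint.

```python
import math

# Newton's-law-of-cooling sketch; all parameters are invented for
# illustration. Reheat is treated as instant, per assumption (b) above.

T_SET = 60.0      # degC, thermostat setpoint
T_AMB = 20.0      # degC, ambient air around the tank
K = 0.03          # 1/hour, assumed cooling constant of the tank
UA = 5.0          # Wh per degC per hour, assumed loss coefficient
HOURS = 8.0       # overnight window with no draws

def energy_holding(hours):
    """Loss (and therefore burner input) while holding the setpoint."""
    return UA * (T_SET - T_AMB) * hours

def energy_with_setback(hours):
    """Burner off the whole time, then one reheat back to the setpoint.
    The energy put back in equals the total heat lost while coasting."""
    # Integrate UA*(T(t)-T_AMB) dt with T(t)-T_AMB = (T_SET-T_AMB)*exp(-K*t)
    return UA * (T_SET - T_AMB) * (1 - math.exp(-K * hours)) / K

print(round(energy_holding(HOURS), 1))       # 1600.0 Wh
print(round(energy_with_setback(HOURS), 1))  # ~1422.5 Wh -- less, because the
                                             # cooler tank loses heat more slowly
```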
In other words, you were right to identify your assumption (a) as incorrect. The tank's heat loss does, in fact, depend in a very direct way on the water temperature.
Assumption (b) is probably incorrect too, you know. ;)
That section could probably have been omitted entirely, since my point lay more in the following sentence:
> it still seems likely that the 'stale tank re-heat' will use a decently large fraction of (2)'s energy.
In other words, this project tackles the smaller energy user (water heating is only 25% of all heating), then addresses the smaller energy sink within that (far more energy goes into heating fresh cold water than into standby losses), and does so only partially (vs., say, switching to a fully on-demand heater).
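To put rough numbers on how those fractions compound (Python; the 25% figure is from the comment above, the other two fractions are placeholders I'm assuming, not measurements):

```python
# Compounding the fractions in the argument above.

share_of_heating   = 0.25   # water heating as a share of all heating (from the comment)
standby_share      = 0.15   # assumed: standby losses as a share of water-heating energy
fraction_recovered = 0.30   # assumed: portion of standby loss this hack actually avoids

overall_saving = share_of_heating * standby_share * fraction_recovered
print(f"{overall_saving:.1%} of total heating energy")   # ~1.1%
```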
I looked into a project like this a while back and my research came to the same conclusion: holding a modern insulated tank at temperature takes a minimal amount of energy compared to heating the water. A water heater's recovery isn't analogous to air conditioning a house, where it's genuinely useful to run the house at a higher temperature while nobody is home.
Unless you radically change your consumption of hot water, you're gonna need the same amount of energy to heat it no matter what temperature control method you use.
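A back-of-envelope comparison along those lines (Python; the daily draw, inlet/setpoint temperatures, and standby figure are assumptions meant to look typical, not measurements of any particular heater):

```python
# Rough comparison of (1) heating incoming water vs (2) standby loss.

SPECIFIC_HEAT = 4.186      # kJ per kg per degC (water)
DAILY_DRAW_L = 200.0       # assumed litres of hot water drawn per day
T_IN, T_SET = 10.0, 60.0   # assumed inlet and setpoint temperatures, degC
STANDBY_KWH_PER_DAY = 1.5  # assumed standby loss of a modern insulated tank

heating_kj = DAILY_DRAW_L * SPECIFIC_HEAT * (T_SET - T_IN)   # 1 L of water ~ 1 kg
heating_kwh = heating_kj / 3600.0

print(round(heating_kwh, 1))   # ~11.6 kWh/day just to heat the fresh water
print(round(STANDBY_KWH_PER_DAY / (heating_kwh + STANDBY_KWH_PER_DAY), 2))
                               # standby is only ~11% of the total
```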
Something that is possibly way more efficient is waste-heat recovery from a shower/tub drain. There are devices that do it, but they aren't cheap (the price of copper is the killer).
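A toy estimate of what such a drain heat exchanger might recover per shower (Python; the effectiveness, volume, and temperatures are all assumptions, not specs of any real device):

```python
# Toy estimate of drain-water heat recovery per shower.

EFFECTIVENESS = 0.4        # assumed fraction of drain heat transferred to incoming cold water
SHOWER_LITRES = 60.0       # assumed warm water down the drain per shower
SPECIFIC_HEAT = 4.186      # kJ per kg per degC
T_DRAIN, T_COLD = 35.0, 10.0   # assumed drain and incoming cold temperatures, degC

recovered_kj = EFFECTIVENESS * SHOWER_LITRES * SPECIFIC_HEAT * (T_DRAIN - T_COLD)
print(round(recovered_kj / 3600.0, 2))   # ~0.7 kWh of preheating per shower
```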
Low-flow showerheads could have a decent impact as well. So would other ways of reducing hot-water use (turn the shower off when soaping & shampooing, etc), though I doubt they'd see widespread adoption.
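And a similar ballpark for a low-flow head (Python; flow rates, shower length, and mix fraction are assumed):

```python
# Ballpark savings from swapping in a low-flow showerhead.

OLD_FLOW_LPM, NEW_FLOW_LPM = 9.5, 5.7   # roughly 2.5 gpm vs 1.5 gpm
SHOWER_MIN = 10.0                       # assumed shower length, minutes
HOT_FRACTION = 0.6                      # share drawn from a 60 degC tank for a ~40 degC shower
SPECIFIC_HEAT = 4.186                   # kJ per kg per degC
T_IN, T_SET = 10.0, 60.0                # assumed inlet and setpoint temperatures, degC

litres_of_hot_saved = (OLD_FLOW_LPM - NEW_FLOW_LPM) * SHOWER_MIN * HOT_FRACTION
kwh_saved = litres_of_hot_saved * SPECIFIC_HEAT * (T_SET - T_IN) / 3600.0
print(round(kwh_saved, 2))   # ~1.33 kWh of water-heating energy per shower
```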
Quick answer: Because in a water heater, (2) is very small compared to the energy cost of (1). In a home, which is far less insulated than a good tank, maintaining temperature is a lot more expensive.
Edit to add: I'm not claiming zero energy savings -- I'm just saying that, compared to the other opportunities for savings available, this project's energy savings are likely to be tiny.
It's a big risk because most people who get Legionnaires' disease get pneumonia. A few years ago my neighbor got pneumonia and almost died. He was 41, fit, and otherwise healthy. It's nasty business, and you do not want to risk it just to save a few bucks a year.
If you want to do your part, go out and buy a new efficient tank, or even a tankless heater. Both of them will save you a bundle over that older model.
Yes, this is true. Another thing people don't usually consider is that any hot-water system actually contains a gradient of temperatures, all the way from whatever your water heater is set at down to almost room temperature at the faucet. Somewhere along that gradient is the temperature range in which Legionella will thrive, so it's a good idea to let the hot water run until it comes up to temperature before using it.
This is really interesting. I'm assuming this fine gentleman (or lady) reads this thread. Dude, you should totally post back in a month or two and report your actual energy savings.
I'm wondering if it's actually less efficient to turn the setting lower and then heat the water back up later. Does anyone know?
Disclaimer: This comment, and my other one, are the first time I've actually thought about this.
The amount of tank heat loss would depend on the temperature difference between the water & the environment; a tank that's kept hot will lose more heat than one that's warm.
So I think it's a marginal gain to let it cool, but only marginal. And: the better-insulated your tank & plumbing are, the less you gain.