No, you can’t save £30 per year by switching off your “standby” devices (shkspr.mobi)
263 points by tosh on April 27, 2022 | 411 comments



Strange, I've had a very different experience from the author of this. I've actually got a similar spreadsheet I may clean up and post to show another viewpoint, but in summary:

A couple of months ago I went on a little mission around the house testing the power consumption of devices on standby, as I noticed through my energy monitor that idle power consumption was about 0.4-0.5kW while everyone was sleeping at night, which seemed high compared to my research of national averages.

To my surprise, a number of fairly modern devices turned out to have utterly terrible standby power consumption: the new buttonless electric hob in my kitchen, the two monitors for my PC on standby, the TV & Blu-ray player and an aircon unit, just to name a few.

Since then I've used a few Sonoff relays around the house to add these to Alexa. The results were actually quite amazing: about £10 per week of electricity saved, so £30-£40 per month...

So in short, if you're similar to me you can actually save £300 a year. I think the key takeaway here is to clamp/monitor your own devices and make a judgement from there; no two houses/setups are the same.

---

Edit, here's a very quick screenshot of the spreadsheet and data I've got to hand at the moment, in time I'll get something better cleaned up and together.

https://imgur.com/a/l4xMwbo


I have some questions about your figures:

- You've included the fridge; that's not really "standby" power in the sense OP means. That's active power consumption. Fridges don't have a standby mode.
- As above, you've included air conditioning. That looks like active power with a pump running, not standby power. I cannot envisage any way you could get that consumption in standby. I think you've made a mistake there.
- Why did you measure the Alexa with a toaster and a clock connected to it? The Alexa is 2W max. Is this toaster some kind of smart device? A normal toaster is either on or using zero power. Do clocks have a standby mode? I don't understand that.
- What does the iMac power consumption figure mean? It uses 6.6W when shut down or in hibernate? That seems unlikely.
- Are you sure you've switched those speakers into standby? It looks like they're switched on and active.
- What are the lighting figures? Regular lights are zero when switched off. Are these remote-controlled or wi-fi lights, maybe?
- Your doorbell is seriously using 10 watts? Are you sure? Is this actually a camera or something?
- You've got eight devices involved in delivering your wi-fi and network (four access points, two switches, a router and a controller). That's probably more than average, and it's arguable whether that's really standby power in the sense OP means.
- It was measured with some form of clamp meter. Have you checked that against a multimeter to see if it's really accurate? A consumer one may not be that good.


I would interpret these devices as being in a low-power usage mode where we wouldn't expect high power draws, rather than literally a 'standby mode'. A clock certainly falls in this category. A fridge if it hasn't been opened in hours and has good insulation could even fall here as well overnight, etc. Sure there are power spikes but the important thing is the amortized consumption.


But you can't turn off your fridge at the wall, so what's the point in measuring it?


Measuring your fridge is important as it could point to a bad compressor or bad seals.

Your fridge compressor shouldn't be running 24/7, and it should mostly be on standby until temperature rises.


The point is measuring the consumption of a device while you do not need it and it is not performing any meaningful work.

Your fridge is never in that state. Even when the compressor is not working, the fridge is still functioning and performing work. It is sensing temperature and pressure and time and then making decisions based on those inputs.

Your laptop is not performing meaningful work when it is updating while you are away. It can do that in the background while you look at cat pictures.


> the fridge is still functioning and performing work. It is sensing temperature and pressure and time

I believe most fridges use bimetallic thermostats to control a relay to power the pump. And they don't sense pressure or keep track of time.


Have you ever tried to start a super old fridge or AC that you unplugged without waiting the full 3 minutes before plugging back in? It sounds like a diesel truck shutting off in your kitchen.

Modern fridges absolutely keep track of pressure and time.


No, I admit I've never powercycled a fridge nor ever seen a reason to.

Can you explain the mechanism by which a fridge unplugged then replugged a minute later sounds like a diesel truck? The compressor pumps aren't continuous duty, they spend most of the time powered off anyway.

edit: I did some web searching and the only mention of wait times before plugging in a fridge I can find is waiting a few hours before plugging in a fridge if it was stored on its side. Unless you're also flipping your fridge over when you power cycle it, I don't understand why there should be any problems.


It's not specific to fridges, you'll find more results looking for AC/compressors in general: https://skeptics.stackexchange.com/questions/4695/do-air-con...


Depends on whether or not you have a fancy 'smart' fridge, in which case it could very easily be in a state where it is consuming power without performing any meaningful work. A lot of these IoT devices, while low power vs a traditional PC, are not efficient.


Measuring standby power matters where the alternate option would be to unplug the device or use an extension cord with a power switch instead of relying on an active but low-power state.


Sure, for things that you have the option to do that with and you don't mind the hassle. In cases where that isn't an option (i.e. in the case of a smart fridge) or you're not going to do that, it's still useful to know how much idle/sleep power a device consumes as you might want to either modify the device, replace or get rid of it.


Yeah. The author is way off here.

Another way you can quickly find big offenders is by using an infrared camera. I found various things in my house that are putting off a ton of heat when doing nothing (and using electricity to do it). My 10-year-old speakers with a built-in amp, for example, absolutely glow on infrared even when sitting there inactive 98% of the time. Now they are on a switch.


While an IR camera is a good tool (and terribly fun to use!), not everyone has one. Almost everyone has another tool that can be nearly as efficient though: a hand.

It's quite easy to use too: if something is warm when you touch it, it's burning through watts. If you expected it to be sleeping, it's either not sleeping or it's very bad at it. Don't forget the AC adapters, they can be worse than the devices they supply.


My local library has them available to check out for free! The payback period for buying a Flir camera to do your own energy audit is likely long, but checking one out is pretty reasonable.


In a net heating environment, this is less of an issue than it might immediately seem. A small idle electronic device making a consistent amount of heat just helps heat the house. If electricity is very expensive, it would still make sense to try and track down the offenders, however.


Hopefully you have another heating system that's more efficient than direct electricity, though!


How can any electric heating system be more "efficient" than an inefficient appliance? Inefficiency in an electrical appliance is nothing more than a measure of how good that appliance is at turning electricity into heat.


I had this same thought at one point and eventually learned of heat pumps. Heat pumps move heat from one spot to another, ie from outside your house to inside your house, or the other way. The point is, they move more energy than they consume. So that 1kwh of energy turned into heat by an inefficient electrical appliance could have instead been used to move 5kwh worth of heat into your home by a heat pump.

Heat pumps are more efficient than they used to be (at some point the outside air is too cold for them to work efficiently), so the cold climate where I'm from is just now starting to see them included in new homes.


Yes, I've considered getting a heat pump; for one reason and another, I can't.

It doesn't feel right to talk about the "efficiency" of a heat pump system; talk of "1W in gets you 3W out" isn't right, because you didn't magic those 3W out of thin air. It's more like "4W in gets you 4W out". It seems wrong to say that a heat pump system is "400% efficient".

I thought efficiency was a measure of how much of the work[0] going into a machine is transformed into the intended output work. I suppose you could say that this heat delivery machine is inefficient if a lot of the input work is expended on producing torque, or raising heavy stuff. But even that eventually turns into heat; just in the wrong place.

[0] I hope I'm using "work" and "machine" in the technical sense, so that "work" means displacement against a force, and a "machine" is any device for converting work from one form to another. Efficiency is well-defined for that kind of machine: it's just the ratio of useful work out to work in, the output being what you want; so "efficiency" intrinsically depends on what your intentions are for your machine.


> It doesn't feel right to talk about the "efficiency" of a heat pump system; talk of "1W in gets you 3W out" isn't right, because you didn't magic those 3W out of thin air. It's more like "4W in gets you 4W out". It seems wrong to say that a heat pump system is "400% efficient".

Those 3W didn't come out of thin air, they came from the outside air.

1W of energy going into the heat pump causes 4W of heat energy to get moved from outside your house to the inside of your house.

That's the key thing to take note of. Traditional resistive heating works by turning electrical energy into heat energy directly. Heat pumps move heat energy the same way an A/C unit does, it just moves it the other way.

Essentially, a heat pump is just an A/C unit with the evaporator and the condenser flipped.
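
To put rough numbers on it (purely illustrative figures, not measurements from any real unit), the coefficient of performance (COP) is just heat delivered divided by electricity consumed:

  # Illustrative COP arithmetic; the numbers are assumptions, not measurements
  electrical_input_kw = 1.0   # what the compressor draws from the wall
  cop = 4.0                   # coefficient of performance: heat delivered per unit of electricity

  heat_delivered_kw = electrical_input_kw * cop                   # 4.0 kW arrives inside
  heat_from_outside_kw = heat_delivered_kw - electrical_input_kw  # 3.0 kW was pumped in from outside air

  print(f"{heat_delivered_kw:.1f} kW delivered, {heat_from_outside_kw:.1f} kW of it from the outside air")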


All heaters are 100% efficient in a closed system. However, your house is not a closed system. It’s a spatial region of an open system. In this context it makes sense to talk about efficiency as being lower or greater than 100%.


I wish this were made evident to the hoteliers with bar fridges kept in sealed cupboards...


> you didn't magic those 3W out of thin air.

No magic, but you are pulling those 3W out of the outside air.

You can use the industry term 'Co-efficient of performance' to avoid any possible ambiguity, but it's just the same thing with a different name.


The output is higher than the electrical input because the electricity is only used to transfer heat from one spot to another: this is when the gas gets compressed to reach the condensing stage at high pressure (+ afterburn for optimal efficiency).


Another one of my "came across this in YouTube" things (started with the design of a hurricane lantern) Why Heat Pumps are Immensely Important Right Now - https://youtu.be/MFEHFsO-XSI

The key is the Coefficient of performance https://en.wikipedia.org/wiki/Coefficient_of_performance


If your electricity comes from natural gas, it would be more efficient to burn the gas on-site for heat than to burn it to generate electricity, distribute the electricity, and then turn it back to heat.


Most efficient to burn the gas for electricity and use that to power a heat pump.


It'd be far simpler and more efficient to use a reciprocating engine to drive a heatpump via a rotating shaft. As a bonus, you could capture the exhaust from the engine with a heat exchanger and use that to heat the house as well.


Yes...ish.

However, consider that the greater efficiency of large-scale generation, combined with the amount of useful heat one gets out of a heat pump, can be better than converting the natural gas to heat locally, even with losses on the grid.

The key is that a heat pump doesn't convert electricity to heat (as a resistive heater) but rather moves heat from one spot (outside) to inside with great efficiency.


Yes, but the comment I'm replying to is about resistive heating from inefficient appliances.


A heat pump. Instead of using power to heat a room, you're using a (relatively) small amount of power to move heat from outside into the room.


The others mentioned heat pumps but I'll mention that the person you replied to never said "electric" and is likely referring to gas heating.


A heat pump?


In addition to what other people are saying, there's also district heating: https://en.wikipedia.org/wiki/District_heating Heat can be generated efficiently in some centralized location (from geothermal heat, power plant waste heat, solar, or whatever) and then distributed to nearby homes.

There's lots of ways to heat homes in ways which use way less than 1 joule of electricity for 1 joule of heat.


If you had a heat pump, it would be more effective to run that for sure.


How can heating be inefficient?


Besides the matter of heat pumps which others have covered, there is also the matter of how good your heating system is at getting heat to where you actually need it.

A PC under your desk will keep your toes warm, and maybe with warm toes you don't really mind the rest of the house being at 10 C / 50 F. With such a space heater, you can turn down the thermostat for the entire house and only heat the parts you care about most. Furthermore, there are radiative space heaters which direct infrared energy to a specific part of the room using a reflective dish, instead of heating the entire room by convection. In large rooms, a radiative space heater can keep you feeling warm while using much less energy than a heat pump would require to heat the whole room.


By generating only 1 joule of heat per 1 joule of electricity used. There are lots of much more efficient ways to heat homes.


In most of the world electrical heat is about 3x as expensive as the next cheapest heating (gas, coal, oil, heat pump or wood)


In the summer though heating the house is a bad thing. Either you are making your house hotter or you're making the air conditioner work harder to remove it.


Even better, just go to a pawn shop or ebay and get a used current clamp meter for cheap. Then pull the cover off of your circuit breaker box in your house and check the current draw on each and every wire.


> Almost everyone has another tool that can be nearly as efficient though: a hand.

I've found the IR camera works much better than my hand for things like switching power supplies. They'll be glowing in IR while being cool to the touch even when no device is plugged into them. While each might not be a ton of power I tended to just leave them plugged in all the time.


Speakers are notoriously a specific case where power actually does get used in standby, so it doesn't really contradict the original article. For instance, an old BBC News article (not the one published today that probably triggered this submission) listed "TV on standby" as using lots of power but was actually including various peripherals including powered on external speakers. I would expect turning those off and leaving the TV on would use almost no power. If, as parent comment says, even the TV/monitor is using significant power, then that's a lot more interesting.


It may be a specific case, but then the article title should be "No, you can’t save £30 per year, unless you happen to have speakers like most people..."

But there are other things around as well. My old laserjet printer was another offender.


> I would expect turning those off and leaving the TV on would use almost no power

That is going to vary a lot depending on the TV. I suspect a lot of modern "smart" TVs consume a measurable amount when idle.


Idle or in standby? EU regulations state the maximum standby power is 0.5w.


You do, however, typically need to set “low power standby” or whatever in the options somewhere - it’s usually not enabled by default - the default is usually to turn the screen off but leave everything else running. Mine drew about 5W in “normal” standby, and is immeasurably low (from our household inverters - I live off grid) in low power standby. It takes about 10 seconds to boot - no big deal.


I believe the low power mode is required (by UK law, at least) to be enabled by default. It's pointless otherwise as almost nobody will go and enable that mode.


> You do, however, typically need to set “low power standby” or whatever in the options somewhere - it’s usually not enabled by default

i had the opposite, i had to disable the low power standby (which was on by default) so i could wake it over the ethernet or via HDMI-CEC


I've thought about buying an infrared camera, but I've found them to be terribly expensive. Not in scientific terms, as they are precision instruments, but in the sense of 'I have some money to spend on Amazon but this thing costs high three figures, so I'll buy something else'. Specifically talking about FLIR and their direct competitor whose name I forget.

Are there any affordable cameras that are not completely trash? Low resolution is ok.


Mine is an HTI-19 320 x 240 resolution one that was $233 on Amazon. It's not cheap, but I've certainly saved more than that in electricity from problems it identified in the 2.5 years since I bought it. It works just fine.

An infrared camera is great for tracking down and fixing insulation problems in your house, if you own a home. And fixing insulation issues really brings down your heating/cooling bills.

If you don't own a home the value proposition is a lot worse, but $200 or so still isn't bad for something that amounts to a fun toy.

EDIT: Sorry, it was $380 before Amazon points. I should have clicked into the invoice.


The FLIR One or Pro are in the $220-300 range if I remember correctly. They attach to your iPhone and use that as the screen and maybe for some processing. In any case it works pretty well for most personal applications you might run across.


Hypothetically speaking you could buy one, find the hot spots, send it back to Amazon.

Morally it’s not exactly solid but at least you’re not standing on the backs of the poor to be a billionaire and all that


Really depends where in the world you are. In the UK / EU there are rather strict rules on how much power devices can use in standby.


I have electrical heating, so until the summer there's no such thing as wasted electricity.

Many people struggle to understand this concept. In some climates and energy grids, switching from things like incandescent to LED causes a greater environmental impact. It may shift (and may have already) as LED production gets more efficient.


You should read up on heat pumps, they are several times more efficient than electrical resistance heaters.


26% of heating in Canada is electric baseboard, much of it in buildings that can't or won't allow heat pumps. https://www150.statcan.gc.ca/t1/tbl1/en/tv.action?pid=381002...

Even for those who can, it would take several years for the investment in a heat pump plus the waste of the old heating system to become carbon neutral, let alone cashflow positive.

Don't feel bad. This entire thread is filled with people who think they know better. Must be why UX is in such high demand these days.

I get the enthusiasm though, heat pumps are really neat.


Depends on what other value you get from your “heater”. My PC, running at full whack, draws about 700W, which all gets radiated as waste heat. It’s enough to maintain the temperature in my (well-insulated) living room when it’s 20C colder outside.


Still inefficient, but I guess that's power you were going to draw anyway.

You sure it's drawing 700W? Are you running multiple GPUs or what? Is that a server? Have you measured?


That's kinda the point though, that heat is doing work anyway, and they may not have the ability to use/install a heat pump. Of course it would be more efficient, but for example, my condo has three separate electric wall heaters. Instead of using them, we just rely on our computers to heat our space enough. Works until it is about 4-5C outside.

Someday, if I ever get to build a custom house, I plan to install a full 42U rack that is tied into the house's central air to optionally heat the house with the waste heat.


Also, if your rig goes over, say, 400 watts of power, it is definitely worth it to get an 80+ efficiency-rated power supply. There are quite a few tiers (bronze, silver, gold, platinum, titanium, etc.), but those are the equivalent of adding extra 9's of uptime, where the base 80+ rating is the first 99%.


I doubt they use a non-80+ power supply with their modern high end CPU and GPU seeing as the standard was introduced in 2004, with Energy Star certification requiring it in 2006.

But not only is the efficiency rating important, so are the specific model's efficiency curves at cold and hot loads and at 20%, 50%, and 100% load. Paying attention to your components and the power level they will sit at for most of their usage will dictate which specific model of power supply is the most efficient (at a reasonable price point for you). This of course makes reviewing and purchasing power supplies quite complicated.


My personal experience has been that people skimp on the power supply, it's the one area where direct performance isn't affected so the features that the power supply provides are not directly correlated to the overall performance of your machine.

Many people, especially in mid-tier or lower builds where every dollar for price vs performance has been calculated to the decimal point will opt for the cheapest power supply that meets the wattage and connector requirements their particular build requires, efficiency and longevity be damned.

My opinion is that getting a guaranteed 80% power conversion efficiency rating (as compared to, say, a 75% conversion efficiency rating) on a 500 watt PC will save you 25 watts of power over the years that you use it. On top of that, a manufacturer that actually goes through the process of getting the approval will probably have put a little extra care into their manufacturing process, and so probably has a better reliability rating than one that did not.

Unless you are scraping to get any PC together, it may well be worth an extra $10 in price variance, all other things being equal, for that peace of mind and minor electricity savings to boot.


My point was, in the past few years I have been buying computers, I have never seen a non-80+ psu. Not that they do not exist, certainly there are unlabeled white boxes for dimes and nickels on the dollar compared to a quality one.

I went out of my way to research my current power supply to make sure that given my usage and components, it would operate in peak efficiency (I believe mine is at nearly 95% efficiency at average load, would need to measure it more thoroughly, it has been a few years).

I feel a power supply is like tires. They are the only thing that actually touches the road(electricity). They are worth paying for something halfway decent, not bottom of the barrel.

I do agree, but I would be shocked to see a non-80+ psu at a reputable store like Micro Center.


True. It's been about 5 years since I built a new computer and so my information apparently is a little bit outdated, but a quick search through Newegg's cheapest power supplies shows things like this:

https://www.newegg.com/p/1HU-0027-00010?Item=9SIAXE5EGG6813&...

650 watt, no-name non-80+ power supply for $40. That's the sort of thing I'm talking about. There are 80+ rated power supplies in the same price range or 80+ from reputable companies for $10 more but it's still a thing worth looking for when you're a newish builder or if you have any concern for the overall efficiency of your computer.

After all, a base 80+ rating still means that for every watt your PC actually gets to use, you're converting about 0.25 watts directly into heat. That's not a big deal for a workstation that may only draw 100 watts under load, but if you have a beefy PC with 3090s and an overclocked 12th-gen i9 processor with a lot of fans, you could be burning 100 watts or more just to power the thing, so it would definitely be worth it to go for a gold or platinum 80+ rating just to save the money in waste heat and in cooling that waste heat after the fact.
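
A back-of-the-envelope sketch of what the tier is worth; the load, hours and electricity price here are assumptions, not measurements:

  # Sketch: wall-power cost of an 80% vs 90% efficient PSU
  # Assumed: 400W average DC load, 4 hours/day of heavy use, $0.20/kWh
  dc_load_w = 400
  hours_per_year = 4 * 365
  price_per_kwh = 0.20

  def annual_cost(efficiency):
      wall_watts = dc_load_w / efficiency              # power actually drawn from the socket
      kwh = wall_watts * hours_per_year / 1000
      return kwh * price_per_kwh

  for eff in (0.80, 0.90):
      print(f"{eff:.0%} efficient: ${annual_cost(eff):.2f}/year")
  # ~$146 vs ~$130 a year, i.e. ~$16 saved, plus less waste heat to cool away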


RTX3080 and a 12 core i9-12. Yeah, I’m obsessive about power consumption as we’re off-grid - it’s more like 650W at peak, when running a modern game at 4K on maximums. Doing nothing much it draws about 50W.


Got it. Only reason I asked is that I've seen crazy figures thrown around (especially when PSU sizing) that never matched reality, especially when you look at the claimed cooling solutions. Dissipating 700W is no small feat.


Particularly as I have it on a mini-itx, in a cupboard - the radiator lives outside the case, outside the cupboard. It’s actually proven dead handy, as the weather gets too warm for running the log stove, but too cold at night to not have any heating - so using the waste heat from entertainment to keep the place cosy seems like an efficient approach.


> In some climates and energy grids, switching from things like incandescent to LED causes a greater environmental impact.

But for a huge number of people (probably the majority) who live in more temperate climates (most of Europe and the US), they are a net win both in their energy bills and environmentally. That's the people whom LEDs target, not the people in marginal extreme cold climates running high power-draw servers in their homes.


> In some climates and energy grids, switching from things like incandescent to LED causes a greater environmental impact

How would that be the case?


If you live in a climate where it's too cold for heat pumps to function, and rely on electrical (resistive) heating instead, the "waste" heat from the incandescent bulbs offsets the power needed for heating almost 1:1. Replacing those with LED bulbs wouldn't reduce your power requirements at all (you'd just run the heaters more), and the LED bulbs require more energy and exotic materials to manufacture.


Is there such a thing as too cold for heat pumps though? They drop in efficiency, sure, but still beat resistive heating. Modern ones are still going to be 300% efficient down to -20C or so. With a resistive pre-heater I don't see how they'll ever not be more efficient than resistive heating.


My heat pump is a Trane heat pump from 2018 and it goes into emergency heat mode when the temps are below 36°F, which is off and on for most of the winter. I wonder if I should have it serviced to make sure it's working correctly.


It's partially a function of your heating demands. My sister's emergency heat mode would run all the time in her town house. But she installed a giant insulating curtain over one of her leaky sliders and now it almost never needs to run.


That seems more like she improved her insulation, decreasing her heating needs rather than addressing the temperature cutoff for a heat pump.

I still like the heat pump because in the summer my power bill for cooling a 2600 sq ft home is about $50/month, and in the winter I do have a wood burning stove that I can run to heat the house when the temps are below 36°F. That and recirc mode on the fan unit can keep my power bills to a minimum.


A heat pump with a sufficiently powerful resistive preheater will be at least as energy-efficient in operation as the resistive heater alone, but it's also much more complex, with a correspondingly higher up-front cost and ongoing maintenance requirements. A resistive heater can consist of little more than a solid-state heating element, a blower fan, and a mechanical thermostat; radiative heaters can even dispense with the fan, which is the least reliable part of the system. A heat pump, by contrast, additionally involves pumps, heat exchangers, and volatile fluids under pressure, generally under the management of some moderately complex logic.


Naturally a heat pump is more complex and thus more maintenance intensive, but it's worth the investment/cost in energy savings in any calculation I've seen.


Cost is one factor, but if you live in, say, rural North Dakota, you may not be able to afford the risk that your main source of heat breaks down in the middle of a blizzard and you can't get anyone in to service it for several days… or weeks. Don't discount the value of predictable reliability.


> If you live in a climate where it's too cold for heat pumps to function

Year round? So you mean the polar regions? Because they can work even at -10F. Efficiency is reduced, but they can work. Crappy ones will cut off earlier though. If you are not at one of the planet's poles, you have seasons, so it's better to use LEDs for at least 3/4 of the year (maybe more, as even northern latitudes only have a few really severe winter days).

Modern LED lamps are tiny and use very little material. Incandescent bulbs are no longer considered 'exotic' because of how old the tech is, but tungsten was a very exotic material at some point.


> If you live in a climate where it's too cold for heat pumps to function, and rely on electrical (resistive) heating instead

That's the point where I'd strongly consider cryptocurrency mining.


Reminds me of the story of the LED traffic lights that freeze in winter, thus needing an additional heating solution (making them more expensive / error-prone and even less energy-efficient during cold weather): https://www.popularmechanics.com/technology/infrastructure/a...


That was once true but it's no longer a real issue: https://youtu.be/GiYO1TObNz8


5x 100W light bulbs is 0.5kW of heat output which your heating system doesn't need to supply.

Switch to LED and your heating system will need to supply that extra 0.5kW.

In a climate where you are almost always running heating, this makes the benefit of switching to LED much smaller, and considering that they are much less environmentally friendly to produce, it may even tip the scale the other way.
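
Here is the rough energy balance behind that, as a sketch; the bulb wattages and the heat pump COP are assumed round numbers:

  # Sketch of the heating-season trade-off; all figures are assumed round numbers
  incandescent_w = 5 * 100                    # 500W of bulbs, essentially all of it ends up as room heat
  led_w = 5 * 10                              # LEDs giving similar light
  heat_shortfall_w = incandescent_w - led_w   # 450W of heat the heating system must now supply

  for name, cop in [("resistive heating", 1.0), ("heat pump", 3.0)]:
      heater_w = heat_shortfall_w / cop       # extra electricity needed to replace that heat
      net_saving_w = heat_shortfall_w - heater_w
      print(f"{name}: net electrical saving {net_saving_w:.0f}W while the heating runs")
  # resistive heating: 0W saved   -> the LED swap buys nothing during the heating season
  # heat pump: 300W saved         -> the swap still pays off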


Yes, but. There are few places that need heating 100% of the year and fewer people living there. During the non-heating hours/days, that's just waste heat.

Depending on where the bulbs are located, they may be emitting heat into a ceiling cavity from which it escapes the building's envelope more quickly than heat from the heating coil does. Some bulbs are actually outside the building.

And of course, the heating system can supply 0.5kW for less than 0.5kW of power with heat pumps or (net including generation and transmission) natural gas.


This really only applies where temperatures are frequently so low that heat pumps cannot function. Antarctica, or maybe Siberia or northern Alaska.

Beyond that, heat pumps are still much more efficient than direct electric heating.

Also, if you have air conditioning at any point in the year, those incandescent lights are just going to make your air conditioner work that much harder.


Resistive electrical heating is typically the most expensive heating option available, at least here in Canada. A heat pump uses much less power to provide the same amount of heat, and fossil fuel heating (gas in the west, oil in the east) is often much less expensive than electric.


It used to be the case. The energy cost of production was massively higher in their early days and if the waste heat was used, there wasn't a net energy savings. Not so much anymore, especially when increased lifespan is taken into account (it's at least at parity at this point).


Your amp isn’t “doing nothing” it’s just… on. Just because you’re not feeding it a signal doesn’t mean it’s not amplifying.


500W of "idle" power consumption is waaaay too high, in my experience. That's my average rate when I have everything on.

Granted, I live in a place where AC is not needed and use laptops instead of PCs to work on. ~$40/month is my total electricity bill (I'm on the lowest end, I know).


I agree, 500W is when my 2 TVs, 2 desktop PCs, 2 32 inch monitors, NAS, router, 3 wifi APs and 3/10 lights are on and my fridge, freezer and other miscellaneous non-smart devices are plugged in.

Idle power with NAS, router and wifi APs are on, and TVs, PCs, Monitors are in standby is about 150W. It can be lower, but it is convenient and with $0.10/kWh it is just $10 a month.


We are a family of 5 in Germany, we have a 70€/month electricity bill. 2500kWh per year with one person (me) in home office. 3000kWh with two persons and the kids being more at home.


My house uses about 1500-1800kWh per month. We just bought it though and are working to get it efficient.


How is that possible? Electrical heating / AC?


Unclear. We just got solar panels and are auditing all the energy users now. There were some resistive heaters running we didn't know about that pushed an extra 500kWh before. I think we got it down to about 1600 kWh. I think with all the mods we are doing we can get it down below 1000 kWh, but it will take time and more efficiency improvements.


I would suggest to get a smart electricity meter so you can see "live" power usage. I'm not familiar with resistive heaters, but it appears they are purely electrical, which means they are only 100% efficient. Heat pumps (you can use AC for heating) are >300% efficient.

For reference: my 2 person household uses less than 3000 kWh / year (gas heating and electrical cooking), so even 1000 kWh / month would be a lot imo.


The solar system provides a whole house real time meter so we should be able to track down usage easier. We’ll see. We have two teenagers and a larger house and workshop, so, quite a few variables to figure out. We are converting the whole house to heat pumps. These units are state of the art and hit 370% efficiency.


I’m in the same boat. Electric baseboard heaters are what’re killing us.

To the 500W figure mentioned above, I plugged our little space heater into a kill-a-watt meter and it measured it at 500W/day.

Resistive electric heating is brutally expensive. I’m looking into heat pumps before next winter.


We have a boiler and some resistive heating. We are going with whole house (6500 sqft total) heat pump system. Ends up being several large commercial Fujitsu units that are efficient down to -20F (Colorado weather). We expect to dramatically cut our energy consumption with this move and be totally off natural gas. Combined with the solar and battery system we are getting we should be completely energy independent and net producing electricity which is cool :)


500W/day is a nonsensical unit; I assume you either mean 500W or 500Wh/day (different by a factor of 24)


Definitely. Electricity at the current UK price cap is pretty much bang on £2.50 per watt-year, so there are some eye watering annual standby costs in that spreadsheet.


>I noticed through my energy monitor that idle power consumption was about 0.4-0.5kW while everyone was sleeping at night, which seemed high compared to my research of national averages.

>Edit, here's a very quick screenshot of the spreadsheet and data I've got to hand at the moment, in time I'll get something better cleaned up and together.

Many of the items on the list stretch the definition of "idle". For instance, network equipment. Sure, I guess if you're not using the internet at a given time it's "idle" and therefore can be turned off, but your use of it is so random that you effectively can't turn it off. Same goes for "clock and alexa". The networking equipment category alone makes up 78W, or 20% of the total. Also, listing fridge as "idle"? You effectively need it on 24/7.


Sure, I guess my definition of idle is everyone in the house asleep not using any appliances and having them all in the generally accepted idle state.

Network equipment: I actually swapped a decent amount of it. The testing showed the Netgear managed switches were using around 1W each, as opposed to 20W for the D-Link switch and another 20W for a UniFi switch.

The hob/microwave I just manually power off now, as I don't need them on all day long, and the devices I put Alexa-controlled relays on were the aircon units, my PC & monitors and the TV & surround (other than the Sky box).


Definitions are being stretched all around for this.

Vampire/phantom power was a specific problem where devices that were turned off by the user would still consume 5-10w. The only way to remove that power use was to completely cut the electricity to the device. From testing I did a while ago, devices made after around 2010 don't have this problem.

Deciding not to turn off a device can certainly be a problem. But it's a relatively new one.


> Also, listing fridge as "idle"?

The refrigerator isn’t constantly cooling. You could consider the time when it is cycled off to be idle. Of course it is constantly monitoring for when it needs to start cooling again. If it’s well insulated and your home is a reasonable temperature you can probably turn it off at night when you know you won’t be opening it. I don’t have any expertise. I’m just speculating.


> you can probably turn it off at night when you know you won’t be opening it

The compressor is going to run in the morning though to make up the lost cold unless you want your ice to melt. Seems the marginal benefit would be tiny. It probably messes up the cool/defrost cycle that modern fridges go through too.


I'm pretty sure refrigerators have electric heaters in the door gaskets as well.


How did you get 500W of idle use? That does not make sense. Could you please post your table? I have my PC on all the time, plus some other things, and I'm not even close to that.


I'm wondering too. I actually tested almost all the devices at home and the idle power consumption was quite different from my previous perception. Here are a few I tested:

1. Microwave oven: ~3W
2. DECT phone base: ~2W; extension unit: ~1W
3. IP phone: ~1W
4. Cable modem: ~10W
5. Routers: ~3W to 20W
6. Switches: ~1W to 30W
7. NAS (4-bay): ~40W
8. Raspberry Pi 3B: ~3W
9. Chargers/power adapters: ~0W to 3W
10. Wi-Fi smart plugs: ~1W when switched off, ~1.5W when turned on, both with no load at all
11. Set-top box: ~15W, on or off
12. PCs/laptops: ~3W off, ~5W to 15W on

To me the DECT phone, set-top box and cable modem consume much more than I thought. Routers/switches are more or less in line with what I thought. What surprised me was that chargers/power adapters from decent brands do much better than I expected: most of mine consume ~0.1W when idle, and some even show 0W. The exception is one dodgy adapter that came with a cheap toy PC and behaves pretty weirdly: it peaks at around 10W when just plugged in, then gradually drops to ~1.5W. Might be because it is a traditional power supply with a transformer in it.

So the standby power consumption of my household should be just below 100W and I don't see how I can get it close to 500W.
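
A rough tally of the list above; the per-device values are my own rounded picks from the ranges I measured, so treat them as assumptions:

  # Rough tally of the list above; values are rounded picks from the measured ranges
  standby_watts = {
      "microwave": 3, "dect_base": 2, "dect_extension": 1, "ip_phone": 1,
      "cable_modem": 10, "router": 5, "switches": 10, "nas_4bay": 40,
      "raspberry_pi_3b": 3, "chargers": 3, "smart_plugs": 3,
      "set_top_box": 15, "pcs_laptops_off": 6,
  }
  total_w = sum(standby_watts.values())
  kwh_per_year = total_w * 24 * 365 / 1000
  print(f"~{total_w}W of standby, ~{kwh_per_year:.0f} kWh/year")
  # ~102W here, in line with "just below 100W" -- nowhere near a constant 500W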


Agree, 500W of nothing is INSANE in 2022, are you sure? Please list brands.


I don't think it's any one thing that's 500W, it's an accumulation of many devices all doing 20-100W. Lots of modern PCs are 30W+ even when off— like not even on sleep, but on off/standby.

So these numbers don't surprise me in the slightest.


Which modern PC draws 30W when off? When mine is off, the electricity calculator states 0W (the draw is less than it can register).

Even for sleep mode that would be huge. 30W is about the normal modern PC idle (on, but not doing anything) consumption, not any sleep / standby / off state consumption.


My ASRock B450 Pro4 draws about 3W from the wall if WOL is enabled, which in ASRock's firmware seems to keep all PCI devices (and attached USB devices) powered. 30W seems impossible though, anything not specifically designed to be fanless will almost certainly need to turn on the fan at that power usage (and even fanless might have difficulty dissipating a constant 30W).


Monitors can dissipate a huge amount of power without a fan. The OP blames mostly them, and their stand-by power consumption usually isn't even available to a buyer.


Surely the display panel itself is switched off when on standby, though? Is this a case of not including a separate standby power supply circuit, so that it's the "main" one inside there running at a fraction of its capacity, where it is least efficient?


The specific case I'm thinking of was an Advantech AIMB-274 with a Mini-Box PSU (I don't remember which one). I was evaluating this setup for inclusion in a battery powered mobile robot; in particular we needed to decide whether the computer could be hooked up to the battery full time or if there needed to be a relay in between them in order to preserve the battery when the robot was "off". The answer turned out to be that we very much needed to disconnect the computer.

Admittedly this is not a consumer PC or a "modern" gold/bronze-designated power supply, but I still thought it was pretty significant.


Mine is <100W now. Including a big monitor.


30W with a high end CPU and graphics card in a desktop, tons of plugged-in peripherals and a big screen? They have good power management these days, but not that good.

Plus many people use electronics built 5-10 years ago, which are generally bigger offenders.


It doesn't matter if it is a computer with multiple GPUs for cryptomining. If it's 'off', PCI should be off, etc.


I worked for an electricity meter maker and measured consumption for things in my house, and now have meters on most sockets and so on. 500W idle is huge.


How much is the draw of all those ammeters? i.e. the "burden". I'm having trouble finding what that would be for a typical household plug ammeter without wading through all the "power meters are great" spam sites. I assume it is very small.


I'm using Shelly EM at the entry (for the entire phase), relays with PM at some sockets with interesting devices and shelly plugs at some sockets that need to be easily serviceable (think behind kitchen cabinets, for the built-ins).

The consumption of each of them is below 1W.


Nobody is missing 50w any more because that goes on their BOM as a bigger power brick. I can see this stuff where someone missed an incandescent or hasn't replaced a device in 10 years but really stuff has moved on and cost optimisation is relentless.


I had access to industrial power meters rated to 120A / 400V, so that is what I used, but a typical household here would not go over 4-7kW, and you can buy cheap digital meters with a display and so on from AliExpress; they can measure up to 4kW I believe.


Yes it's huge, but not out of the question for some people who go all in on some things in their house.

For example, in my place right now:

  - Unifi rack (some switches, cameras, other PoE stuff) - 335W
  - Servers (home automation, NAS, etc) - 426W
That's all the time, Unifi actually goes up at night when the IR LEDs come on in the cameras.


That's still insane. I have a Hue bridge and server at home that together draw less than 50W. Network equipment is also in the range of max 10W per device.


That's not "idle" use... somebody wasted good money on those labour-saving devices. And they'll write a blog post about it too once they get around to fixing them!

(I write this lovingly yet seriously as absolutely a victim of this exact same dynamic, sitting next to a pile of expensive fans I'm going to use to reduce the noise from my rack... any day now...).


Sure, it'll be a while before I have something cleaned up nicely, however I've added some screenshots to my original post.


Just having air conditioning in the UK clearly puts you in a position with electricity use outside of the normal.


Sure, I completely agree: with the air conditioning or heat pump running there would be higher power consumption compared to the average UK property; however, this is about idle consumption.

When these devices are in "sleep" the unit is doing nothing more than listening for an IR signal from the remote control telling it to turn on, just like a TV does, so it's odd to pull such a large draw in this sleep state.


Why doesn't every home have a monitor right off the main panel recording current total power draw in a format that is easy to read and interpret? Seems like such an obvious way to control usage. Running around with a power consumption monitor for every wall socket is all well and good, but might miss a lot of silent leaks.


Careful what you wish for ...

In Germany, this thinking has led to one of two things: mandatory smart meters, which may spy on you and require an Internet connection, beefing up the cost by a factor of, I don't know, a lot.

My local provider (luckily) opted to install "semi smart" meters right before the smart meter became mandatory. Those do not need Internet connectivity (and thus are only somewhat more expensive than the old ones). The "semi smart" part is that they allow renters to read some power consumption figures. How, you ask? Do you press a button? Ask again. The meter has an encapsulated (notoriously shitty) photo diode. You need to flash a code at it with a flashlight! Only then will it display the power figures for the last week on the LCD display. Madness.

In the same vein, landlords will soon be required to provide a monthly overview of heating costs, e.g. by email. Guess who pays dearly for this service with no choice? Renters.

Bottom line: Those of us who are already on top of costs get expensive, mandatory and ultimately useless services. Those who didn't care about running the heater while the window is open probably won't care still.


> In Germany, this thinking has led to one of two things: mandatory smart meters, which may spy on you and require an Internet connection, beefing up the cost by a factor of, I don't know, a lot.

My house (in CA, but not PG&E) has a 'smart meter', which means that they don't have to send people to every single residence every month to collect usage data for billing anymore. Cost has decreased for them. They do require internet connectivity, but it's the power company's responsibility (they have spread wifi access points across the city for this purpose, which also double as free wifi).

What 'spying' are you worried about with your meter? What can it report, other than your power usage, that the power company would know anyway? Presumably it even is located outdoors?

> In the same vein, landlords will soon be required to provide a monthly overview of heating costs, e.g. by email.

Well, too bad they have decided on a crappy implementation.


> What 'spying' are you worried about with your meter? What can it report, other than your power usage, that the power company would know anyway?

Traditional electro-mechanical meters keep a running sum of power used, but can't tell the power company exactly how much power you were using at every minute of the day. Such a log, although still crude, gives the power company much more insight into your daily routine than they ever had before.


> Traditional electro-mechanical meters keep a running sum of power used, but can't tell the power company exactly how much power you were using at every minute of the day. Such a log, although still crude, gives the power company much more insight into your daily routine than they ever had before.

Using traditional summing electro-mechanical meters assumes that electricity is priced uniformly at all times of day, and encourages over-consumption during times when spot market electricity is most expensive, the cost of which ends up being shifted to those who consume energy at lower spot market price times.

Smart meters allow for time-of-use rates, which are a more economically (and in CO2 emissions) efficient way of pricing power.

They do expose more information to electric utilities about the times of day that individual consumers use electricity, though. A consumer who is concerned about this and is also a proponent of market-based pricing mechanisms could install battery storage to mask their usage profile, and even use it to do energy arbitrage.


That is convoluted. Then again, it's Germany. Here in NL most places have a meter that beams usage to your utility, which sends me handy monthly reports. By default it's only allowed to transmit twice per month, but you can allow it to be done daily or in near real time (some energy companies require it; they'll charge you different hourly rates). The privacy around these numbers is guarded: they may only be used to calculate your bill.


> The meter has an encapsulated (notoriously shitty) photo diode. You need to flash a code with a flashlight to it! Only then will it display the power figures for the last week on the LCD display. Madness.

That's simultaneously terrible and quite neat.


Well, it's bullshit; there are some shitty ones but also some really good ones.

Besides, a smart meter is only required for certain people; only a digital meter will be required for most people. A digital meter becomes smart if it gets a communication unit, however most digital meters lack these. (You need the com unit if you use more than 6,000kWh/year, feed more than 7kW into the grid, or have a wallbox/heat pump/etc.)

However, some power providers will force a smart meter (digital meter + com unit) on you and you can't refuse it (you can't refuse digital meters either); however, they are not as smart as they could be, and most of the time you need to send your total kWh reading to the provider anyway...

(Btw, your power provider and the grid provider are not always the same, so it might be that you send the reading to the grid provider, or you have a smart meter and the grid provider forwards it to the power provider. The grid is basically municipal, so each county/city might have a different grid provider; my parents live 30km away and have the same power provider, but we have a different grid provider. Whether you get a smart meter or not is determined by the grid provider, so you can't just switch it, since you would need to relocate...)


They do. All new smart meters come with an IHD (In Home Display). It shows real-time use in kWh or £.


> All new smart meters come with an IHD (In Home Display).

Not necessarily. My local utility (Xcel Energy in Colorado, US) is in the process of moving to new smart meters, and as far as I'm aware, there is no IHD. My apartment complex was converted a couple months ago; I can get hourly usage via the utility website on about a 3-4 hour delay, but no realtime usage.


I'm assuming, if you're in the US, your IHD wouldn't be displaying in £ either :-)

Let me be clearer. Every new install of a smart meter in the UK is supposed to come with an IHD.


Mine didn't, the meter was installed as part of the house build last year.


That would work for the total consumption, but if you notice that that is too high, you will still have to run around with a power consumption monitor to identify the individual culprits. Of course, a display saying "you are currently using 1300 Watts" would be easier than having to estimate it from how fast the wheel on the meter is turning, but replacing every meter everywhere is not the most environmentally friendly thing to do either...


> if you notice that that is too high, you will still have to run around with a power consumption monitor to identify the individual culprits

Or just the good old method of turning things off individually, one at a time.


I’d think panel support would logically have a probe per circuit, so you could get the complete breakdown if and when necessary.

Though obviously it’s still only on per-circuit basis, and if an idiot wired your house that may be less useful than the platonic ideal of house wiring.



There's an energy monitor with 16 CT clamps included; you put one on each of the circuits coming from the fuse board. 16 is more than enough for most UK houses, which would at least let you know which circuits are consuming the power.
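
If the monitor only reports RMS current per clamp, a crude per-circuit estimate looks something like this (the mains voltage and power factor here are assumptions, and the circuit names are hypothetical; better monitors measure voltage and real power directly):

  # Crude per-circuit estimate from CT clamp current readings
  # MAINS_V and ASSUMED_PF are assumptions; decent monitors measure voltage and real power
  MAINS_V = 230.0          # UK nominal mains voltage
  ASSUMED_PF = 0.9         # power factor guess

  circuit_amps = {         # hypothetical RMS current readings, one per clamp
      "kitchen_sockets": 0.8,
      "lounge_sockets": 0.4,
      "lighting": 0.1,
      "network_cupboard": 0.35,
  }

  for circuit, amps in circuit_amps.items():
      watts = MAINS_V * amps * ASSUMED_PF   # apparent power (V*I) scaled by power factor
      print(f"{circuit}: ~{watts:.0f}W")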


Yeah I'm confused by this article. I've got a Virgin Media TV box which used 25w while on standby. Maybe I'm doing the math wrong but:

25w * 24 * 365 = 219000w or 219kw per year.

At my current electricity rate that's £39.42 per year at a very minimum just for the TV box.


The math is right but the units are messy. Watts are already energy/time, which is correct for the 25 W figure, but the 219 should be kWh/year.

25 W * 24 h/day * 365 days/year = 219000 Wh/year = 219 kWh/year

Which makes me notice that 1 kWh / year = 1000 Wh / year = 1000 Wh / (365 * 24h) ≈ 0.114 W. So you can quickly estimate that a device with x watts of constant consumption will have you paying for roughly 10x kWh in electricity a year. With electricity costs on the order of 20 cents/kWh, that means a rule of thumb is "double the wattage, that's how many $ it'll cost you to have it running all year".
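
A quick script makes the same point (the 20p/kWh price is an assumption):

  # Sanity-check of the rule of thumb; the 20p/kWh price is an assumption
  def annual_kwh(watts):
      return watts * 24 * 365 / 1000

  def annual_cost_gbp(watts, price_per_kwh=0.20):
      return annual_kwh(watts) * price_per_kwh

  print(annual_kwh(25))          # 219.0 kWh/year for the 25W TV box
  print(annual_cost_gbp(25))     # ~£43.80 at 20p/kWh, near the £39.42 quoted upthread
  print(annual_kwh(1))           # 8.76 kWh/year per constant watt, i.e. roughly 10x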


Your math is right, if you're not recording at night then you can switch it off at night (we switch one of ours off at night on a timer)


A good approximation is that 1W of power running continuously costs you one dollar (or euro) per year.


Those cable TV boxes that they give out are terrible outdated cheap pieces of shite and have notoriously bad power consumption.

Considering it is impossible to know what kind of appliances folks have in their homes, the article is odd.


Please do post your full experience, I'd be interested. I saw an article this morning and it spurred me to buy some smart plugs to cut power overnight to standby devices (https://www.telegraph.co.uk/bills-and-utilities/gas-electric...)


Come to think of it, the smart plugs probably draw similar power to the appliances on standby..??


Most are similar to well-behaved appliances - so low it's difficult to measure. Even more so if they are not wifi.

I have one in my desk that's then plugged to a power strip so it can cut power to multiple devices at once. It can even measure power consumption – sitting at around 80W as I type this, with a laptop, 34" monitor, Echo, wireless charger, USB switcher, USB hub, a desktop computer currently 'off', standing desk, laptop dock and the list goes on.


It's OK, there are smart plugs you can use, to cut the power to them.


Smart plugs all the way down.


Sonos devices are also quite power hungry in standby, in the 4-6W range (230v)

https://support.sonos.com/s/article/256?language=en_US


I've had pretty much the same experience as you. Some devices had excellent standby power use, others were really bad (the speakers on my sound system being the worst offenders, using exactly as much power on standby as when operating, and more than my fridge…). The key takeaway here imho is: n=1, monitor your own installation and devices and don't expect there to be either an untapped pool of savings or a well optimised system based on blog posts by other people.


The Keurig is the worst offender... Enormous energy is expended to keep your water hot 24/7.


Very true. To be honest, at the point where I knew the house consumption but hadn't yet clamped individual devices, I had expected to find a single culprit like that, or even a faulty device with earth leakage that was effectively dumping the current for no real purpose.

I should have also pointed out, I really do wish more devices had physical power switches where it makes sense. As an example, the washing machine has a touch-sensitive main power button, yet our tumble dryer (same manufacturer & design) has a physical on/off switch. The former uses a decent amount of standby power, the latter uses none; the only real purpose I see is a slightly more "premium" feel to the washing machine.


Mains power switches are a wear item and can withstand like 10-100x less on/off cycles than a simple button that controls some 3.3V gpio on a microcontroller.


A cheap mains power switch from 30-40 years ago had no problem being cycled once or twice a day for 30-40 years without ever failing. This is not a binding constraint.


Either there's not a lot of current going across that switch, or it's just a little lucky; just because that one is going strong doesn't mean over half of the same switches haven't failed in the same application.

Or, if it's really 40 years old, "cheap" could be "really fat and expensive" today.

If you pick up an old "broken" power (audio) amp, the most likely reason for it to be broken is pots - switch - caps - FETs in that order.


Do you have stats on this? I've developed coffee makers and other appliances. Yes, there's often a mode where the boiler keeps warm for some amount of time after the first cup is made. The first cup can take minutes to heat up, and the second is often much shorter. Then there's a certain amount of time where the preheat is left on, and eventually it's turned off if the coffee maker isn't used again. Different models/makes have different algorithms and modes, but this is a common pattern. There are energy conservation standards which also come into play, so unless we're talking about a commercial machine, I don't think anything current will just keep the preheat on forever.


I wouldn't have believed it either, and yet:

"Keurig brewers – unless turned off – continuously heat the water inside the heating tank even when the brewer is sitting idle. Some Keurigs such as K-Cafe and K-Elite feature an auto-off function that switches off the brewer two hours after its last brewing to save on energy."

I guess if your machine is trashing the planet with plastic waste anyway, why not also trash it on the energy consumption side? What a disaster.

ref. https://kahawaplanet.com/does-keurig-boil-water-how-to-get-h...


I keep an electric kettle and Melitta pour-over on my desk at work. It is on when I turn it on, then it is off. The Keurig has always seemed like kind of a bad deal.


I guess you could make a case for this kind of thing in a breakroom or something where it's going to be in use all day, but then again, that's also the scenario where it would make the most sense to just be continuously brewing whole pots of coffee in a conventional drip machine.

Either way, a Keurig in a home, only brewing once or twice a day? The worst.


My keurig would keep it hot for weeks. How much energy that actually wasted, I'm not really sure.


I just got one but the Zojirushi hot water dispenser can keep 4 liters of water hot for 31 - 48 watts or so. What does the Keurig use?

https://www.amazon.com/gp/product/B00R4HKIV8/ref=ppx_yo_dt_b...


Zojs are well insulated hot water dispensers, a keurig is a pod-type coffee machine, so I assume it’s just an un-insulated plastic reservoir (though hopefully it doesn’t keep the entire reservoir hot, only the water for the “next dose” for people who can’t cope with waiting 30 seconds for the machine to heat up)


I really wish keurig would just have a big capacitor inside to insta-heat the water beyond what’s possible from mains in real-time. But I guess there’s too much risk of shocky shock.


> I really wish keurig would just have a big capacitor inside to insta-heat the water beyond what’s possible from mains in real-time. But I guess there’s too much risk of shocky shock.

I'm not sure that's an option unless you have a capacitor larger than the device. Heating water takes a lot of energy: let's say we're talking 200mL of water (0.2kg), heated from 20C ambient to 100C (an 80C rise); water has a specific heat of 4.2kJ/kg·°C, so that's 0.2 × 4200 × 80 = 67,200J.

A Maxwell K2 3000f 3.0v would store a hair under 13500J, which isn't even remotely close, and these things are huge (https://youtu.be/y8tQesYvCig?t=10), heavy (>500g), and each would be a significant fraction of the keurig's existing price.

If you need to strap 5 capacitors to the keurig, danger aside, you've added 2.5kg to the thing, you've strapped a pack twice the size of the brewer, and you've at least doubled and probably close to tripled its price.
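
For anyone who wants to replay the arithmetic above (same figures as the comment, conversion losses ignored):

    mass_kg = 0.2                    # 200 mL of water
    delta_t = 100 - 20               # heat from 20 C ambient to boiling
    c_water = 4200                   # J/(kg*K), specific heat of water
    energy_needed = mass_kg * c_water * delta_t   # 67,200 J
    cap_energy = 0.5 * 3000 * 3.0**2              # 1/2*C*V^2 = 13,500 J per 3000F/3.0V cap
    print(energy_needed / cap_energy)             # ~5 capacitors, before any losses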


My Quooker does too, keeps a reservoir permanently at 110C. It only uses 10W thanks to its insulation though. Surprising Keurig didn't think of that. Then again, they mostly seem to have thought about taking your money.


You have measured the apparent power (volt-amperes), not the active power (watts) that you pay for. There can be quite a big difference, especially at standby.


I had a printer that was just spooling up every minute. Was doing some stuff and looked at my power consumption and noticed it.

Super weird, seems like a job didn't finish or something and it was trying to finish printing and just couldn't


I use air purifiers at home and I am trying to do the same. The issue is that a lot of electronics like these Honeywell purifiers require user input to turn back on after power is cut and restored. Do you know of any tricks around this?


No good solution I've found other than being careful in product purchases.

I recently got some Blueair air purifiers and was careful to check the manual for power off functionality first. ("The autostart feature automatically restarts the air purifier at the last set speed if it has been unplugged, used with a power switch timer, or if a power failure occurs.")


Other than modifying the device to provide the missing input? No.

You could also use some ridiculous solution like the switchbot.


Those are some atrocious computer numbers, though. I have a desktop PC with 5.1 speakers and 2 Monitors. Standby is less than 10W (10W being the lowest my device can recognize). 36W? What?


what clamp meter are you using?


I’m using an Owl Intuition, however please don’t take that as a recommendation. I only got it because it was cheap on eBay with the network bridge included, but the website UI is truly terrible.

I’m currently building one myself using the common SCT-013 current sensor and an ESP32 (Arduino), and I plan to push the data into either a little React app or perhaps Grafana.


> idle power consumption was about 0.4-0.5kW while everyone was sleeping at night

That's quite high; I'll be checking our inverter now as it's an air-source heater and air-con unit rolled into one.

One of our biggest energy savers has been one of these (https://www.screwfix.com/p/lap-digital-immersion-timer/1804r...) mounted at shoulder height in an easy-to-access location. It's on for 1.5hrs during the last part of the Economy 7 night-time tariff, then switched on whenever hot water is needed on a demand-driven basis and switched off when we're finished with it - although it cuts out after 1 or 2hrs anyway if we forget.

Thing is, most appliances have a timer and heat water when they need it, so they go on during Economy 7 anyway.

Talking to a washing machine/tumble dryer engineer the other day: the condensing tumble dryers seem to be energy saving because they don't heat the clothes up much, so you save by not shrinking clothes exposed to too much heat, and they run at ambient air temperature extracting moisture, drying clothes much like hanging them out on the line.

The other major energy saver, which means we don't need the air con on during the summer months (we're in a desert-climate location in the UK - yes, one exists, believe it or not), is these blackout blinds: https://www.blocblinds.co.uk/window_blinds/blocout_info?sour...

They sit flush with the internal wall so you can still have stuff on your window ledge, but being flush with the internal wall makes it more like triple glazing. They keep the warmth in during the winter, because it's very cold overnight in the countryside, and they keep the heat out during the summer - so much so that on some days when I would have put the air con on, I don't need to, or I only put it on for the afternoon (it's temperature controlled anyway so it cuts in when it wants). And when the air con is on, the whole house is at 18 degrees C according to the downstairs central heating thermostat. The unit is at the top of the stairs keeping the bedrooms cold, and the cold air sinks down the stairs which keeps the downstairs cold - that works brilliantly, especially during heatwaves!

You also get fantastic sleep during the summer months as well! Can't stress that enough! We all have individual bedrooms due to different work schedules and I love my sleep, but you can't see anything in the room at midday in the middle of summer - you can't even see your hand in front of your face, they block out that much light. It's bliss!

Those have been our two biggest energy savers, and we have solar panels, a loft full of insulation, insulated cavity walls, fancy gas in the double glazing, etc. Short of dumping some of the time-saving gadgets, I don't think we could save any more energy.


Damn; and I felt gaudy with my phone, personal desktop, single smart speaker and laptop through work being my only gadgets at home.

How much money would be saved if we didn’t fetishize owning a bunch of stuff that idly costs more money? How much electricity, since that matters in real terms to me more than your finances.


I heard this £30 saving stuff on the radio this morning, and it made me shake my head a little. I remember when I got a socket adapter that measures power consumption, and I went around checking all my appliances, eager to find out which are drawing lots of power, excited at the money savings. Started with my old, huge, plasma TV, that is the best part of 15 years old - surely chugging loads of power in standby? Nope, next to nothing. Monitors, same. Network switches, same. Essentially I didn't find anything in my property that was doing anything unexpected. Having hot water (which I get via electricity as my flat doesn't have gas) for a single day will use up more energy than I'd save unplugging all my devices. Similar for heating (again, electricity based) on a colder day in winter would cost more than all the standby devices.

I remember a long time ago (25 years?) someone 'invented' a device where you plug your TV into it, and it has an IR receiver to go with it, you program it to respond to your TV's power button (on the remote control), and then when you go to put your TV into standby, the plug then turns off too, and when you switch it on it does the opposite. At the time it was being touted as the money saver, and I got the impression (although never actually tested!) TVs back then did draw a lot in standby, so this 'invention' was a legitimately good one. But it didn't seem to really take off, I assume because standby was rapidly implemented in a way where there is such minimal electrical draw that it would be pointless.


A nice side effect of people doing this is that, depending on where you live, you might be able to obtain these older, less power efficient devices for relatively cheap on the used market. Of course if you can barely afford electricity buying them is probably not a bright idea, but for someone with e.g. their own PV system it can be a great way to save some money while also reducing the amount of e-waste going to landfills.

Even better, this is not limited to large power hungry appliances. You can build a surprisingly decent gaming handheld out of an old flagship phone with the help of rooting and some hardware mods [1], basically taking advantage of the low resale value of Android devices.

[1] https://youtube.com/watch?v=px1A6XptqhQ


And if you’re in an area that gets cold, waste heat isn’t so bad - in the winter.


True, it's not a full loss. A heat pump can easily be 400% efficient, though, so I'd rather lean on that except in extreme cold.

Still, not a bad idea to run Folding@Home all winter.


IIUC you need a higher temperature source from which to pump heat, so in the winter, where do you get that temperature differential?


Thankfully, you are wrong about that! Just like you don't need a cooler source _from which to source cold_ in the summer with AC. Technology Connections has a fun (well, to me) video on how heat pumps work and are great. Literally just AC units running in reverse, thanks to having a valve.

https://www.youtube.com/watch?v=7J52mDjZzto


Yes, Technology Connections is a great channel and added a couple of new videos about heat pumps in the past month.

https://www.youtube.com/c/TechnologyConnections/videos


At a certain temperature differential you need to source "heat" from elsewhere, which is why up north you either need a geothermal heat pump (underground piping) or you supplement with gas or electric heat.

Some very efficient pumps can work down to 0 degrees, but most start losing efficiency at 25 to 40 (Fahrenheit).

For some parts of the world, a heat pump is perfect - pump in heat in the winter and heat out in the summer, but when the outside temperature can hit -40º Fahrenheit (which is -40º Celsius), you're gonna need something else.


For the overwhelming majority of the world they are perfect. Even in high latitudes, they might only reach those extremes for a few days in a year, that's when you would require a backup.

Losing efficiency is fine since they are so ridiculously efficient to begin with.


This is not true, any more than your fridge needing a source of cold from which to cool itself down.


The temperature differential is between the outside temperature and -273°C. There's still heat energy to be captured even on a cold day.


100% efficiency is rather low for heating. Heat pumps easily give you 400%. Efficiency in this case is calculated as heat energy delivered / electrical energy consumed (the coefficient of performance, COP). It goes beyond 100% because heat is extracted from the outside air or water rather than generated from the electricity alone.
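
A minimal sketch of what that means for running cost (the COP of 4 and the tariff are illustrative assumptions, not measurements):

    cop = 4.0                   # assumed heat pump coefficient of performance
    price_per_kwh = 0.30        # assumed electricity price
    print(price_per_kwh / cop)  # ~0.075 per kWh of delivered heat
    print(price_per_kwh / 1.0)  # resistive heating (COP ~1): the full 0.30 per kWh of heat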


A good rule of thumb, and I hope someone more knowledgeable can correct me if I'm wrong, is that stuff that generates more heat will consume more power. So hot showers, irons, curling irons, hairdryers, rice cookers, that kind of stuff. Electronics that generate minimal heat will consume minimal power in general.


In general, all the electricity consumed by an electrical device gets converted into heat. If your GPU uses 250W of power, it generates 250W of heat. If your whole computer draws 1kW from the wall, it generates 1kW of heat. If your little Raspberry Pi and its power supply together draw 15W from the wall, they generate 15W of heat. A 1500W electrical space heater converts 1500W of electricity into 1500W of heat.

So yeah, you're not wrong.


Only if it's not doing useful work.


It's just basic energy physics. Making something hot or cold is real work. Same with moving something or producing light. But light is extremely "cheap" compared to the rest. Computing data or rendering a video game isn't real work and would consume no power if it wasn't for the inefficiency of current flowing through the chip making it hot.


If I remember my long ago physics classes correctly, irreversible computation inherently costs energy: you can't get its energy "consumption" down to zero because of entropy, no matter how efficient the technology. Our computer architectures are far from reversible in a physical sense. If they were, you could run programs in reverse. That said, the minimum amount of energy required is an absolutely tiny rounding error compared to the waste heat of today's technology.
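
The bound being referred to is Landauer's limit, k*T*ln(2) per bit erased. Roughly (the per-operation figure for a modern chip below is a loose assumption just to show the scale):

    import math
    k_boltzmann = 1.380649e-23            # J/K
    temp = 300                            # K, about room temperature
    landauer = k_boltzmann * temp * math.log(2)
    print(landauer)                       # ~2.9e-21 J per irreversible bit erasure
    # A 100 W chip doing on the order of 1e18 bit operations/s spends ~1e-16 J
    # per operation, i.e. several orders of magnitude above the bound.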


And if the heat goes into water, it takes even more energy, because water has a notoriously high heat capacity.

At one point I owned many appliances that leaked heat, and I think I learned to estimate how much power they drew simply by putting my hand on them and feeling how hot they were. I'm not sure I have that superpower anymore. (Obviously it was never that exact, because it depends on many other things like volume, material, isolation, etc. But you can get fairly close for common household things – between a finger and a few heads in size, surrounded by air, plastic case.)


Or to put it another way -- if hot water goes down the drain (shower, clothes or dish washing, etc.), then that's another place the waste energy is going.


Better down the drain than radiating it back into your air. Then you've got to spend more energy pumping that heat outside (if you live in a hotter climate like I do).


Alternatively, use it to warm up the cold water coming into the shower (I'd guess the largest use of hot water in a home). Warmer "cold" water mixing with the hot water means less hot water used out of the hot water heater. My understanding is that, by placing this only on the cold supply to the shower, it'll only affect your showers. https://ecodrain.com/en/


Correct, and the difference is not even close - anything to do with temperature manipulation consumes orders of magnitude more. Literally, not figuratively

The laptop I'm writing this on consumes around 10W. Kettle 2100W

Making one coffee in the morning consumes enough electricity to power the laptop for the whole workday

The idle power consumption of my home is dominated by the fridge and freezer (around 150W combined). Idle mode of any other devices is a rounding error
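
Putting rough numbers on the coffee example (the three-minute boil time is an assumption):

    kettle_w, boil_minutes = 2100, 3
    kettle_wh = kettle_w * boil_minutes / 60   # ~105 Wh per boil
    laptop_w = 10
    print(kettle_wh / laptop_w)                # ~10.5 hours of laptop use per boil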


My power company sent me a power strip that has a somewhat similar feature. It constantly powers the TV, but the rest of the strip is turned off when the TV is off. It's so you won't waste power having Roku, etc., always running.


> I remember a long time ago (25 years?) someone 'invented' a device where you plug your TV into it, and it has an IR receiver to go with it, you program it to respond to your TV's power button (on the remote control), and then when you go to put your TV into standby, the plug then turns off too, and when you switch it on it does the opposite. At the time it was being touted as the money saver, and I got the impression (although never actually tested!) TVs back then did draw a lot in standby, so this 'invention' was a legitimately good one. But it didn't seem to really take off, I assume because standby was rapidly implemented in a way where there is such minimal electrical draw that it would be pointless.

I had a couple of these (in fact I probably still have them), from the 2000s - an Intelliplug and Intellipanel - the latter being an 8-way plug. I used it for a TV, 5.1 amp/receiver, blu-ray player, Xbox360, Sky+, Wii, and Squeezebox Radio. The Sky+ box was in the "always on" socket so it could record stuff at all times, while the TV was in "if this comes on, turn on everything". I convinced myself I was doing a good thing for the planet and my finances, but I never measured it. Blindly believing it was far preferable to spending any effort on proving I'd been a fool, especially since I needed the sockets anyway - I don't recall the price being that much different to any other 8-way I looked at.


>if this comes on, turn on everything

Eco-friendly or not, I'm impressed; detecting power draw and using that as a signal is actually a legitimately clever solution to the problem of wanting to conveniently switch multiple things at once. The only caveat I can think of is that some devices probably don't like being switched at the wall.


I've found these devices to be finicky. You have to set the threshold of current flow that matches your TV, and sometimes switching from one input device to another will drop power consumption low enough to turn things off in a way you don't want.

But it is a very cool idea.


I thought the same, and my smart Sony TV claims very low standby consumption, but when it is on the WiFi and sat there ready to AirPlay to, it consumes something close to 27W all night long. I set up a smart plug to turn it off at night on a schedule.

Also seen that my Hifi Amp draws a good few watts on standby.

We have recurring argument in my house where my partner tells me I should have turned the LED lights off to save electricity and I tell them to close the curtains to keep the heat in. In truth I have no idea which wastes more energy/money


They said £150 on Radio 4 this morning!



There are rules in the EU that allow newer devices only 0.5W or 1W standby power, so basically we have to thank the EU that standby devices don't kill us by a thousand cuts. I recently bought a device to measure my power consumption and most devices (e.g. monitor, TV, receiver, coffee machine) only use 0 to 0.7W when in standby, so definitely negligible. However, my DVD player uses 3W in standby because it's from the 2000s (before this EU law). Btw, next to my washing machine and dishwasher, the device that uses the most power over time is my 42" plasma TV (from ~2008) with 150-200W. My 32" monitor only uses 22W. My PC uses 40-60W when browsing or watching movies, 200W when playing Elden Ring.

I think it's worthwhile to check your power consumption. If anyone still has not changed their lightbulbs to LEDs, please do it yesterday (a change from 40W to 10W is HUGE).


And despite these rules my Sony TV (KD-65XG9505) bought in 2019 uses 20 to 30 watts in standby. Only disabling the internal Wi-Fi and restarting it gets this down to about 1 watt. This seems to be a very common problem with Sony TVs and I have no idea how they don’t get into any kind of trouble with this.


I have a Sony too; standby is around 1W unless I enable some features like AirPlay, then it jumps to 20W. Enabling the feature will show a warning in the UI that "this might increase power consumption". I think this is how they get away with it. They show this warning with a lot of settings. Also, a lot of TVs have an energy saver picture profile which is horrible to look at for movies, but it's included to pass the rating I guess.


Tried a lot of settings, but with the network enabled I never got the standby power consumption down. It should also be expected that the TV uses the advertised amount of energy in standby mode with the default settings, which was really not the case for me. And don't get me started on this „might increase power consumption" message. This might actually not be Sony's fault, but the EU's, but it really gets annoying when it pops up all the time.


For me (KD-55X9000H) the default settings give low power consumption. Enabling WiFi does not make a difference. Enabling HomeKit Wakeup (not Airplay, as I mentioned before) gives a huge power usage. Disabling it again doesn't bring it down. Only after a factory reset I got it down again. Took me a while to figure this out. Also, it might depend on your firmware version.


> it's included to pass the rating I guess.

I see the automobile industry taught well.


Indeed, they even have their own dieselgate scandal where TVs recognise reviewers' test patterns and adjust settings accordingly: https://www.youtube.com/watch?v=rhto9MmiExE


In the US it was in 2001: https://georgewbush-whitehouse.archives.gov/news/releases/20...

Don't know if that was before or after the EU.


The author acts as if it's all bullshit, but actual measurements in my house disagree. I have a new (2021) 4k "sky box" from my ISP, a relatively new Android TV (2020), a PS4, a Logitech Harmony hub and a lan switch plugged into one socket. The measurement device shows a total of 35W continuous power usage when they're all in standby.

Programmed a rule in my home automation to turn off that whole socket if it's after 11pm and the lights are off, then back on in the morning. It's around 100 kWh per year, which is more than 30 euros at current energy prices over here.
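
Roughly how that works out, assuming the socket is cut for about 8 hours a night and a 0.30 EUR/kWh tariff (both assumptions):

    standby_w = 35
    hours_off_per_day = 8
    kwh_per_year = standby_w * hours_off_per_day * 365 / 1000
    print(kwh_per_year, kwh_per_year * 0.30)   # ~102 kWh, ~31 EUR per year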


It's a good thing if you already have home automation set up. But installing home automation probably costs more than the 30 euros/year you're saving on electricity.


Absolutely, and my home automation setup in total also uses more power than what this trick is saving. But it's there anyway because I want an automated home, and then this is a simple rule to save some otherwise wasted energy that takes only 5 minutes to set up.


I use a ZWave light switch in our bedroom that can trigger "scene" commands on a double/triple tap. A double tap at night fires a scene that tells Home Assistant to shut all the things off (and make sure the doors are locked, garage door is closed, HVAC is set correctly, etc). Double tap up in the morning turns stuff back on.


Have you broken them out individually?

I'd imagine your router is probably the primary contributor to that 35W.


I've done a bit of unscientific plugging and unplugging ;-) The TV box is using 15ish watts so that's a big part of it.

My router + fiber termination point are using 18W continuous, but that's not included in this 35, they're somewhere else in the house. The box here is only a TV decoder.


Does standby for the sky box, the lan switch and the hub just mean normal operation?


Yes, with the TV thing set to "off" which really isn't that off.


Hmm. I think I need to do some measurement on my TV setup.


Definitely annoying when "official" looking stats are not just a bit out of date but sometimes are completely wrong. Then a tonne of the uneducated take these things as gospel and either tout them to their friends or perhaps start breaking switches by turning everything off all the time.

How much to replace a broken socket? Probably about £100 by an Electrician!

The guy from the energy savings trust also mentions "Even the standby light on your TV takes up energy"! An LED? Seriously? So the crazy advice is switch the TV off at the wall instead. What happens then? Every time you switch it on you have to wait for longer for the PC inside it to boot properly and connect to the network, using much more energy than a tiny LED!


That LED by itself obviously doesn't use a lot of power, but it does imply more usage by other components.

Last year I got a thermal camera to help with some insulation in my old house, but it's great for literally seeing this standby usage. The TV, when off, is around 2 degrees above ambient.

The author of the article laughs at the mention of the VCR, but doesn't acknowledge that we might be wasting more energy with newer gadgets.

Every "IoT" device that runs of mains power seems to have very inefficient power supplies, because they are regularly warm to the touch even when doing nothing: Smart bulbs (by far the worst), Alexa, Fire/Google TV sticks, etc.


To my mind, advising people to "turn them completely off" is only part of the story. The other part is questioning why they are that wasteful in the first place.

One explanation might be that it's a side effect of designing them to make their development and production cost effective.

In that regard, I wonder to what extent consumer protection regulations would be helpful, e.g. clamping down on the production of consumer electronics that continuously draw more than a certain wattage without any proper justification.


> Fire/Google TV sticks

Are you saying the device itself is warm or the power supply? If it's the device it can't be caused by an inefficient power supply because the supply is external.


I meant power supplies both in the colloquial sense and in the sense of the power-supply circuitry - I know things like the ESP chips themselves don't use much power.

As for my Google TV: the PSU is integrated, so I can't tell.


A device that's designed to handle 200W power peak might have a power supply which is 95% efficient, but it might be that you're paying that 5% all the time.

(If, for instance, there is a big-ass transformer with a lot of iron in it you are making magnetic domains flip back and forth whenever the device is plugged in.)

The standby mode might use 0.2W but the device burns 10W in the power supply.

Old power supplies used ideas from analog electronics: if the voltage regulator was reducing the voltage from 10V to 9V then you are directing that extra volt (10% of the energy) into a resistor. More modern power supplies use digital ideas such that transistors are fully on or fully off. Because they waste less energy they have an easier time getting rid of heat, so they can be dramatically lighter and smaller... but more expensive.

Other answers would be to add a second smaller power supply for the standby mode or run the standby mode off a battery or super-capacitor that charges intermittently.

The cost of those improvements would definitely be noticed by the manufacturer and the consumer at the time of purchase. The consumer might not notice a small increase in the electric bill, even if it adds up to a lot over time, because it is against a high and noisy background.


You have it right.

Many cheap always-connected smart devices use a low-dropout regulator (LDO). What does wiki say about these?

> The advantages of an LDO regulator over other DC-to-DC voltage regulators include the absence of switching noise (as no switching takes place), smaller device size (as neither large inductors nor transformers are needed), and greater design simplicity (usually consists of a reference, an amplifier, and a pass element). The disadvantage is that, unlike switching regulators, linear DC regulators must dissipate power, and thus heat, across the regulation device in order to regulate the output voltage.

See for example: https://templates.blakadder.com/athom_PG01EU16A.html

You can clearly see an AMS1117 (a super popular LDO), right next to the ESP board on lower right.

Most LDOs have a chunk of metal tinned to the board so that the copper layer can help with heat dissipation. You can quickly identify them by this design.

Here is another one on a dev board: https://hallroad.org/images/watermarked/thumbnails/465/465/d...

More on the topic: https://www.digikey.com/en/articles/use-adjustable-low-leaka...


> Smart bulbs (by far the worst)

Maybe some. On the other hand, the Hue sitting here is cold to the touch when not emitting light and is drawing about 0.4W right now--I checked.


0.4W is a lot.

WiFi-connected ESP32 draws about 140 mA. esp8266 will be similar. They are both 3.3V devices.

3.3 * 0.140 = 0.46

Your light bulb's esp8266 (or a similar chip, zigbee is 2.4 GHz as well) is fully powered.

With a switching power supply and proper light sleep you should see about 1/10 of that.

API reference: https://docs.espressif.com/projects/esp8266-rtos-sdk/en/late...

Sleep impact: https://github.com/chaeplin/esp8266_and_arduino/tree/master/...
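
To make the "about 1/10" figure concrete, the average draw is just a weighted mix of awake and sleep current (the 10% duty cycle and ~1 mA sleep current below are ballpark assumptions):

    awake_ma, sleep_ma = 140, 1          # WiFi active vs. light sleep, ESP-class chip
    duty_awake = 0.10                    # radio on ~10% of the time
    avg_ma = awake_ma * duty_awake + sleep_ma * (1 - duty_awake)
    print(avg_ma, 3.3 * avg_ma / 1000)   # ~14.9 mA, i.e. ~0.05 W average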


Huh--I was certain it could be better, though I didn't realize it was that high! On the other hand, my electricity is hydro and it costs like $0.009 per kilowatt-hour.


It's easy enough to plug a power strip into a wall socket and simply turn off the power strip when not in use. I'm not suggesting that the passive power draw of these devices is enough to warrant the effort. However, it's also effortless just to turn off the power strip when not watching TV. We only watch TV once or twice a week, and have a pretty unusual setup:

- large CRT

- VCR

- DVD player

Given the ancient appliances, and how infrequently we watch TV, it does actually pay off to disconnect them via a surge protector.


There is something to be said for disconnecting or hibernating devices, either because they do consume a fair bit, or because lots of devices consume a lot more than you’d expect on standby.

The proper thing to do is to check them with a kill-a-watt or similar, and make the decision on a case-by-case basis.

Plus it’s fun checking your various devices.

E.g. a desktop PC might clock in at 100W when unused depending on the components, so if it’s not necessary hibernating it at night is a significant gain (if it has a working hibernate mode) for very little inconvenience.

There are also devices which you only use once a month but which sleep at a few watts, completely unnecessarily.


> E.g. a desktop PC might clock in at 100W when unused depending on the components

100W when unused! What on earth causes that? My laptop draws 100W under full load.


My 9700K desktop draws 45-50W (excluding monitor) when idle. Main usage is:

- GPU at about 11-15W

- Fans on motherboard and case 7-10W

- Disks (1 HDD, 2xSSD): 5-10W

- Peripherals (keyboard, mouse, headset, ...) 2-5W

- CPU 5-15W

I can imagine a scenario with additional peripherals, beefier GPU, more PCI cards, more disks could absolutely reach upwards to 100W.


What power level is the system at? That sounds reasonable for a system that's in its active mode (ACPI S0) but not under load, but would be extremely high for any other sleep mode (S1-S5).


I have never been able to consistently get Windows 10 to enter deep sleep for more than about 30 minutes, there is always something waking it up. So I have given up and try to remember to shut the computer off when not using it.


That sounds like you've got some scheduled task firing too frequently. These commands may help diagnose the system wakes:

- powercfg /lastwake (report what event/device last woke the system)

- powercfg /waketimers (tasks set to wake the system)

If those don't report the culprit, it could also be a hardware device:

- powercfg -devicequery wake_armed (enumerate devices able to wake the system)


Gaming desktops are extremely inefficient. My new AMD GPU claims to be using 30W while at 0% utilization. You also have all the fans, which don't turn off when they really could half the time, as well as often multiple hard drives plugged in.

It's just that no one really cares about power usage on a desktop because it's plugged in and doesn't cost all that much if you turn it off while not using it. I think modern Windows may even try to automatically suspend the system to actually turn the GPU and fans off.


Utilization is never 0% though. There's a lot of housekeeping activities that prevent components from being powered down.

They do care about desktops, but not as much as laptops obviously. It's all about compromises.

Fans don't draw very much at low speeds, but you can turn most off (except the CPU fans unless you have a large heatsink).


Ok, so we're talking more like on-but-idle rather in sleep mode? Still seems like a lot, but more reasonable.


Hard drives and monitors, mostly.


Also RAM. 32 GiB of (non-LP) DDR4 will consume around 10 watts constantly, regardless of load, and DDR3 was twice that.


Ever since Apple standardized having SSDs in their computers, I've been trying to be better about shutting off my computer at the end of the day. It boots up so quickly compared to HDDs that it's hard for me to justify not doing it. I don't think my laptop or mini take more than 15 seconds to be up and running.


If you don’t have a Kill-a-Watt device, you would at least feel 100W waste heat.

Does your PC get that hot?


You might not. Given enough fans on the case they basically always feel cold. Mine isn't particularly crazy but at 100% utilization it would be using about 400w and it feels cold on the fan exhausts. You'll notice the room heat up though.


> If you don’t have a Kill-a-Watt device, you would at least feel 100W waste heat.

You would not; 100W is not that much heat unless the room is very hot or it's blown straight into your face (but even middling desktops tend to have very large fans, and will generally send the heat away from you, either towards a wall or to the ceiling directly).

100W is about what a human produces at rest.


> 100W is about what a human produces at rest.

Which is why humans are quite "warm to the touch", so yes, you'd definitely feel that.


> Which is why humans are quite "warm to the touch", so yes, you'd definitely feel that.

The original comment didn’t specify “to the touch”. Furthermore a (again) well-ventilated case will quickly expel heat, the case of my desktop PC is colder than ambient temp (that it’s aluminium and steel doesn’t help with heat retention either). I might feel some heat if I put my hands in the exit airflow, but even that is not a given: at a mere 100W the windchill of the back fan is likely sufficient to counteract the actual heat.


If your PC doesn't need to boot as long as you never disconnect it from the grid, that means there is more than an LED being powered


This. Some things have a kind of "hidden cost" that aren't considered when doing those savings maths.

The cost of waking up some "micro CPU" vs. the cost of the LED + sleep cycles. The pollution cost of slowing down, stopping and driving off again vs. slowing for speed bumps. The cost of keeping the same old refrigerator vs. the energy savings over 5 years.


If you think that a PC maintains a network connection only by keeping an LED lit then either your analytic abilities aren't up to this discussion and/or you're being purposefully disingenuous.

>How much to replace a broken socket? //

So, waste electricity because the 5¢ socket switch might break?

What's wrong with turning things off when you're not using them.


If you live in the Netherlands (or Belgium/Luxembourg/Sweden) and have a smart meter, it is fitted with a P1 port according to the DSMR (Dutch Smart Meter Requirements). If you connect it to an RS232 port (and pull some pin high, depending on the version) it will dump an ASCII "telegram" every 1 or 10 seconds (depending on the version) with all meter readings, current usage/delivery, gas meter readings and some other stats like frequency and phase voltage. It's very useful in determining your current power usage and "debugging" vampire drain. A good starting point is the HA DSMR component page[0].

I personally log the metrics to Prometheus so I can graph them over time with Grafana. For groups/devices that don't have a dedicated power meter attached, I sometimes do a run where I switch off all the breakers and then turn them on one by one. The live power usage output is really useful to quickly determine how much each group is using.

[0] https://www.home-assistant.io/integrations/dsmr/
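
If you'd rather read the port directly than go through Home Assistant, a minimal sketch with pyserial is below. It assumes a DSMR 4/5 meter at 115200 8N1 and that the P1 cable shows up as /dev/ttyUSB0 (older DSMR 2/3 meters use 9600 7E1 instead); there are also ready-made parser libraries if you want every field rather than just the instantaneous power.

    import re
    import serial  # pip install pyserial

    # OBIS 1-0:1.7.0 is the actual power delivered, e.g. "1-0:1.7.0(00.468*kW)"
    POWER_RE = re.compile(r"1-0:1\.7\.0\((\d+\.\d+)\*kW\)")

    with serial.Serial("/dev/ttyUSB0", 115200, timeout=10) as port:
        while True:
            line = port.readline().decode("ascii", errors="ignore")
            match = POWER_RE.search(line)
            if match:
                print(f"current draw: {float(match.group(1)) * 1000:.0f} W")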


I'm doing something similar although I don't think smart meters in the UK come with a debug port sadly - so I'm scraping a third party API instead.

Wrote a small script to regularly ingest the data into SQLite, then use Grafana to plot the results :)
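
The ingest script is nothing fancy, roughly this shape (the endpoint URL and JSON field names here are made up for illustration - substitute whatever your provider's API actually returns):

    import sqlite3
    import requests  # pip install requests

    conn = sqlite3.connect("energy.db")
    conn.execute("CREATE TABLE IF NOT EXISTS readings (ts TEXT PRIMARY KEY, kwh REAL)")

    resp = requests.get("https://api.example.com/consumption?resolution=30m")  # hypothetical endpoint
    for point in resp.json()["readings"]:                                      # hypothetical field names
        conn.execute(
            "INSERT OR IGNORE INTO readings (ts, kwh) VALUES (?, ?)",
            (point["timestamp"], point["consumption"]),
        )
    conn.commit()

Run it from cron every half hour; Grafana can then read the database via a SQLite data-source plugin, or you can export into something it supports natively.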


Check your meter to see if it has a blinking LED. Most meters have a pulse LED which blinks X times per kWh consumed. Before smart meters were a thing I used to have a light-sensitive transistor taped to my meter and just count the pulses. Even older models might have a digit replaced with a mirror on the last indicator dial, which can also easily be read. Another option is a magnetic pulse output, which can be read with a Hall sensor, but those are mostly found on water meters. Don't be satisfied with aggregated low-resolution scraped metrics is what I'm saying. Per-second real-time power measuring is the thing you want :).
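
The pulse-counting trick is easy to reproduce today with a phototransistor on a Raspberry Pi GPIO pin. A rough sketch (the pin number is arbitrary wiring, and the 1000 imp/kWh figure is an assumption - use whatever is printed next to your meter's LED):

    import time
    import RPi.GPIO as GPIO

    PIN = 17                # hypothetical wiring
    IMP_PER_KWH = 1000      # check the marking on your meter

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(PIN, GPIO.IN)

    last = None
    while True:
        GPIO.wait_for_edge(PIN, GPIO.RISING)
        now = time.monotonic()
        if last is not None:
            # each pulse = 3,600,000 / IMP_PER_KWH joules; power = energy / time
            print(f"{3_600_000 / IMP_PER_KWH / (now - last):.0f} W")
        last = now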


Hahah that's amazing.

My meter is one of the blinking ones yeah.

I'm not sure if I have the patience to add sensors though.

One interesting aspect of the meters is I think they talk over zigbee protocol, but I'm not sure how to tap into that. I think the in-home display uses it though to display "real time usage".

The API I use gives 30m resolution which seems ok enough.


Which provider / API? I'm on EDF w/ a smart meter but haven't seen anything that looks coding accessible


I signed up to this app https://play.google.com/store/apps/details?id=uk.co.hildebra... and they have an API https://docs.glowmarkt.com/GlowmarktApiDataRetrievalDocument...

Other people have suggested https://data.n3rgy.com/consumer/home as an alternative as well

As long as your smart meter is DCC connected they should work


Every article that states you can save a small amount of money by doing something is tacitly saying it's your own fault if you're poor. You ate too many avocadoes, you didn't make your own coffee, you didn't turn your TV off at the wall. It's corporate PR to shift responsibility for the cost of living from billion dollar company profits to individuals who are already struggling.

Journalists who write those articles should be banned from being published forever.


>Every article that states you can save a small amount of money by doing something is tacitly saying it's your own fault if you're poor. [...] It's corporate PR to shift responsibility for the cost of living from billion dollar company profits

That's reading way too much into it, don't you think?

>You ate too many avocadoes, you didn't make your own coffee, you didn't turn your TV off at the wall

The "article" literally says the savings is £30 a year. Nobody thinks that's the reason they're poor.

>Journalists who write those articles should be banned from being published forever.

In other words, money saving advice should be banned because they might be spewing "corporate PR to shift responsibility"?


That's reading way too much into it, don't you think?

I don't think that, no. Or I wouldn't have posted what I posted. Corporations spend an immense amount of money on PR, and a lot of it goes on "perception shifting". As an example, in the early 2000s Shell spent $100m inventing the idea of a "personal carbon footprint" so that people would focus on the environmental impact of you using 20 plastic straws a year instead of Shell burning 400m barrels of oil. It worked brilliantly.

The "article" literally says the savings is £30 a year. Nobody thinks that's the reason they're poor.

£30 a year on this. £50 a year on something else. £20 a week on a phone. £5 a week on coffee. And suddenly the papers are full of "Poor people can't afford a house because they waste their money on things!" People do believe it.

In other words, money saving advice should be banned because they might be spewing "corporate PR to shift responsibility"?

Yes. Money saving advice when it's a paltry amount that won't cause a shift in your circumstances is essentially useless, especially if the root cause is some corporation trying to get out of being responsible for their actions. In this case, electricity companies are making record profits and increasing prices far in excess of inflation. The media should focus on that, not whether or not someone is using 20kWh on making LEDs stay on.


>I don't think that, no. Or I wouldn't have posted what I posted. Corporations spend an immense amount of money on PR, and a lot of it goes on "perception shifting". As an example, in the early 2000s Shell spent $100m inventing the idea of a "personal carbon footprint" so that people would focus on the environmental impact of you using 20 plastic straws a year instead of Shell burning 400m barrels of oil. It worked brilliantly.

I'm not denying "perception shifting" doesn't exist, just that claiming that "Every article that states you can save a small amount of money by doing something" is "corporate PR" is a little too broad. This article has problems for other reasons, but claiming that an energy conservation NGO is acting in bad faith by telling people to conserve energy is a stretch.

>£30 a year on this. £50 a year on something else. £20 a week on a phone. £5 a week on coffee. And suddenly the papers are full of "Poor people can't afford a house because they waste their money on things!" People do believe it.

So you're against money/resource saving techniques being promulgated because it's ammo for the "millennials are poor because they're eating avocado toast" crowd? That seems like an overly "we must win at all costs, anyone that helps the enemy is an enemy" way of looking at things.

>Yes. Money saving advice when it's a paltry amount that won't cause a shift in your circumstances is essentially useless, especially if the root cause is some corporation trying to get out of being responsible for their actions.

I agree this article is dumb because the savings is only £30, but jumping to the conclusion that it must be part of some PR campaign seems premature.

> In this case, electricity companies are making record profits and increasing prices far in excess of inflation.

source?

> The media should focus on that, not whether or not someone is using 20kWh on making LEDs stay on.

In this case "the media" isn't even involved at all. The OP is a blog post from a private citizen, and the "article" it's referencing is from an energy conversation NGO. While I think it might be worth pointing out energy companies profits, that's not really part of the NGO's mission.


So much this. Right behind you there.


Some advice, though not exactly ideal for people on low incomes, is to replace old appliances.

I have an old TV that consumed ~290 watts. Similar TVs today consume around 70 watts, say 25% for simplicity.

Variable rate electricity tariffs are ~30p/kWh at the moment.

At ~4 hours of TV a day (I guess more common for older households)

290W * 4 * 365 = 423.4kWh = £127 @ 30p/kWh

vs

70W * 4 * 365 = 102.2kWh = £30 @ 30p/kWh

So a £400-500 TV pays for itself in 5 years.
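
Filling in the last step (the £450 below is just the midpoint of the £400-500 range):

    old_cost, new_cost = 127, 30    # GBP/year at 30p/kWh, 4 h/day
    saving = old_cost - new_cost    # ~97 GBP/year
    print(450 / saving)             # payback in roughly 4.6 years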


My 70" OLED draws 130 watts and is probably on 10 hours a day. Thanks streaming!


Yes, you can save more than 30 pounds per year by switching off standby devices. I just calculated it for my TV: https://www.wolframalpha.com/input?i=0.01+kwh+*+24+*365*++0.... which draws 10W in standby (which I measured). It results in 25€ per year (I pay ca. 30 cents per kWh). I have lots of other standby devices.


That's a crazy amount of standby power consumption! That's more than my laptop under light usage and way more than a mobile phone even under heavy usage. I really hope this isn't common.

As an aside, WA is awesome at units. You can simplify to: 10W years * 0.3 €/kwh.


> That's more than my laptop under light usage and way more than a mobile phone even under heavy usage.

Battery-powered devices (which include both laptops and mobile phones) are notoriously frugal with their power usage, since they have a severely limited power budget (even the biggest laptop batteries are limited to 100Wh because of the FAA, and mobile phone batteries are also limited by their physical size), and they are also compared on that power usage (you see on review sites things like "with the screen at half brightness and playing a nice movie, the battery lasted N hours..."). Wall-powered devices, on the other hand, are much less limited; the biggest restriction would be in places with 10A wall plugs at 110V (that's the case where I live, though "110V" is what the public calls it but it's actually 127V), and that's already over 1kW with no time limit. So wall-powered devices tend to not optimize their power usage (standby or otherwise) as much as battery-powered devices.


So I have the same (9W). It is crazy, especially since the usage when the TV is on is only 35W.


Many TVs have a small ARM computer running constantly in the background, regardless of whether the screen is on or not.


Google can do it, too; try it with your example query. I find that more convenient because I can just type it into the address bar and it loads lightning fast, whereas WA is kinda slow. For simple math and unit conversions like this, WA's full power isn't necessary.


I measured almost all appliances in my house. And my TV uses around 9W in standby, so consistent with your finding. We don’t have a switch yet and I’m too lazy to (un)plug every time. So currently that is all wasted energy.

Mobile phone chargers really don’t use much energy at all. So these can be left plugged in.


> And my TV uses around 9W in standby

How old is your TV?

EU regulations state that devices made after 2013 must not consume more than 0.5 watts in standby or in off mode

9W is an egregiously high amount, maybe there's some setting or something that's keeping your device awake to be sapping 9W constantly?


We bought it last year. I think it is a model from 2019 or so. Perhaps something is keeping it awake indeed (it is connected to WiFi, and that is also how we use it: it is not connected by any cable). But there is not much to set up, unfortunately.


Said this in another comment replying to another commenter whose TV sapped 9W in standby: how old is your TV?

EU regulations state that devices made after 2013 must not consume more than 0.5 watts in standby or in off mode


Also, even if your devices on standby draw the equivalent of £30 of energy per year, and even if you turned them off completely, you would still not save £30 a year if you live in a country where you need heating in the winter.

Because apart from light escaping through your windows, almost all that energy is turned into heat, and if it wasn't there, you'd have to heat up your home more.


You are right, but your heating system is probably cheaper (unless you have simple electric radiators).

If you have a heat pump, it will generate about 3W of heat from 1W of electricity, so it's about 3 times more expensive to heat with standby devices than with your heat pump.

If you heat with gas, it's much cheaper per kWh than electricity, but I'm not sure what the ratio is.


Exactly! Some people look at me with disgust for running an inefficient old AMD PC as my home file/print/backup server, but it's made the coldest room in my house very comfortable during the winter months. There are no central heating vents in this room so I would be using electric heat regardless, it just so happens that my heater is also filling an important technical role as well.


IIRC, a kWh of electricity is about 3x more expensive than a gas kWh. You’re correct if your heating is electric (resistive) though.


And if you use air-conditioning for cooling?


Yep, your device heat adds to the load on your AC. So if you're going to bother with unplugging devices, it is more beneficial on those hot summer days.


Then I guess you don't care about either the environment or money, so it's kind of a moot point.

(I am not talking about using it in reverse / heat-pump ONLY)


Perhaps related or not, I was wondering about the energy loss of AC/DC converters that are often used to power the device. This clearly happens when the device is switched on - the converter is a bit warm - but maybe it also happens when it is off ?


The BBC ran an interesting piece of advice this morning on saving money from "vampire devices":

"There are smart plugs you can buy which will let you check everything is turned off from your phone".

How much power do the smart plugs use to listen on wifi?


I have a smart plug flashed with Tasmota firmware; it uses 0.3W continuously (£0.73/year) and 0.7W with the relay activated. I'm sure there are smart plugs which use more efficient power supplies and/or MCUs. The esp8266 isn't exactly known for being power efficient.

I suspect that it would be more cost effective to worry about the energy consumption of Windows update than to worry about a smart plug.


Good ones should manage below 0.1W. But I don't think many are that efficient. So <1W.


It's interesting - someone looked at their smart plug and it was pulling 1.2W, the cost of which comes to more than some of the devices it might be attached to. Assuming saving electricity is your only goal, that's not necessarily the best bet.


Out of interest, I just checked my Philips Hue smart switch (Zigbee) using my measuring plug. It reported 0.2W.


My takeaway from this is that while you may not be able to save £x specific amount, you can make not completely insignificant savings in energy usage. While the claims aren't completely accurate it is probably a good thing for us to be more mindful of our usage.


These are completely insignificant savings. The source aligns with a piece published this morning on the BBC which is very obviously a Govt-encouraged piece attempting to deflect attention from the Govt's critical failings in protecting the most vulnerable families from price gouging and speculation in the energy markets.

It's just a replay of the carbon footprint playbook.


This was my first thought when I saw that too. It's a hamfisted attempt to turn a systemic problem into a "a problem of personal responsibility".

Same way that companies that produce a lot of waste are weirdly keen on "everybody doing their bit" with recycling: https://www.treehugger.com/shell-oil-preaches-personal-respo...

Or the way that jaywalking was turned into a crime by the motor car companies: https://www.grunge.com/721704/the-truth-about-how-jaywalking...

Or the way poverty/homelessness is framed: https://www.marketplace.org/2012/10/05/personal-responsibili...


I do think there is something to be said for creating a culture of sustainability. The values then become reflected in a society’s laws. If we as a people have even just a baseline consideration of our waste and energy consumption, then it stands to reason we’ll apply pressure to the truly responsible parties over time.

It also helps remove personal politics from the equation if everyone already sort of agrees we should be mindful as a standard sentiment. The fact that we have folks whose identity is tied to pollution, to the point where we had people at rallies going “drill baby drill!” barely more than a decade ago, says a lot. Remember: the GOP created the EPA. They used to be more openly supportive of conservation.

The truly responsible parties love the fact that millions of Americans are going to bat for their “right” to trash and pollute our country.


I agree. In fact I'm insulted at the suggestion of £30 savings in YEAR, and I even live in a country where the purchasing power of £30 is much higher than in the UK.


On a societal level, once heating, industry, public services and transportation are accounted for, we consume around 500-1000W per person. Saving a few Watts per family is a rounding error. We're talking about the power output of a solar panel the size of an A4 paper

This is a failure of the state's energy policy, trying to pass the blame onto regular citizens


Especially true since costs such as carbon emissions aren't internalized yet into the electricity cost.


One of my old coworkers actually went around with a plug-in power meter and measured what all of his appliances were using.

Ended up knocking something like $800/year off of his energy bill.

That's not to say that certain modern devices aren't highly efficient in standby, but to call this a fraud just doesn't seem right at face value.


It highly depends on what devices you have.

With recent Danish prices fluctuating between €0.5 and €1 per kWh, I have been going through my devices, and there were a few of them that surprised me.

I have a couple of Sonos speakers, and apparently every Sonos speaker draws 3-6W while idle/standby (https://support.sonos.com/s/article/256?language=en_US). Having 4 speakers means about 20W idle power consumption, which amounts to 175 kWh/year (14.5 kWh/month), and even at €0.5/kWh, that's €7 per month just keeping speakers on standby.

Another "hidden source" of power draw is a PlayStation. While it's fairly cheap keeping it on "standby" (6-8W), streaming media from it takes about 90W. For comparison an Apple TV uses between 3 and 7W (https://www.apple.com/environment/pdf/products/appletv/Apple...) for the same.


Hue lightbulbs use about 0.5W in their powered-off state. So 10 of those is another ~44 kWh per year in idle consumption alone.


€1 per kWh? Is that a typo?


Sadly no.

Normal prices are around €0.35/kWh, but since the Danish electricity grid is based on renewables, with backup power being generated mainly by natural gas, that means when natural gas prices go up, so do electricity prices.

Combined with freak occurences of low rainfall in Sweden and Norway during 2021, meaning less hydroelectric power, as well as an unusual period of very little wind along the Danish west coast in august/september, meant that most of the winter has been using a lot more "backup power" than usual.

Solar is of course an option, but considering that the southern parts of Scandinavia is about as high up north as Canada, the sun usually doesn't shine when we need it (the sun rises around 9:00am and sets around 4pm during winter, with an impressive 19 hours of sunshine during all of December)

As I'm typing this, 57% of our power is being generated by renewables, and the current price is €0.46/kWh. The highest price in April has been €0.63/kWh, but on March 23rd it was €0.71/kWh, on March 14th €0.79/kWh, on March 9th it was €1.00/kWh, and on March 8th it was €1.12/kWh. It was even higher in February.

All of the above prices are "hourly spot prices" from 18:00 - 19:00, so the average 24 hour price will be lower, usually around the €0.5/kWh mark, though in february that was closer to €0.7/kWh.


Yeah, this particular myth is common in many EU countries as well. Apart from the minimal amount reportedly saved (30 GBP per annum in real currency is 37 EUR/USD per year, less than 10 cents a day), it does not seem to be particularly true.

I measured this (at the wall socket) over the years, and my findings are below.

For battery-powered devices, disconnecting the charger once the battery is full does nothing, other than to cause the battery to be discharged more rapidly than it would be otherwise. For battery health, it's best (if possible) to set a 'start recharging' threshold at 90% or so, but that's mostly a device-lifetime issue, not a power consumption issue.

For 'mains-powered' devices with a 'soft power-off', like many modern coffee machines, microwaves, etc. etc., the power draw in 'idle' mode is truly insignificant. You may have an atypical (broken?) device, but other than avoiding some transistor whine, you truly don't gain anything by powering these off. For devices like printers, monitors and TVs, I've never seen any 'idle' power consumption that was even noticeable.

Some 'always-on' devices do have significant power requirements. Like: your set-top box (since it needs to record the programs you scheduled), your modem and/or media converter, and your fridge. With these, it's always measure, inquire and replace as needed/possible.

For me, my fridge is as efficient as it gets, the fiber-to-Ethernet box from my ISP draws minimal power anyway, and my Mikrotik router and APs are pretty power-efficient as well (like, 4 hours of runtime on a tiny UPS). The rest goes mostly to my heat pump (which also powers my boiler), and in the summer months this is always offset by my solar panels, unless cooling requirements get way out of hand.


Kinda depends on the device. My desktop is at 5W in its sleep state, which equals around 10 quid per year if my napkin math isn't failing me (5 * 730 * 12 * 0.23 / 1000). So £30 for a household doesn't seem impossible.

Then again, that is acceptable to me. I recently realised that leaving it on overnight at 150W idle isn't the best of moves though, so now I'm using sleep. Still not quite clear to me why it is idling that high for relatively modern gear.


My 'base' load in the house is about 100W, covering 3 servers (a 10W-idle Skylake i5 NUC, a 5W-idle Atom-based box, and a Raspberry Pi 4), one router, one fibre modem, one 4G backup modem, a security system, 5 network switches around the house, and 2 access points. Oh, and one light (NUC) desktop I run 24/7 but in low power at night (with most processes suspended). It adds up to about 100W. All in all that's a lot more than 30 bucks/year, it's about 1 euro per day at current prices. But I need all this stuff.

Everything that drains considerable power is already on a switchbox. It's just the way it is, a smart house saves a lot of power but it costs some too.

Still, we've come a long way. 100W used to be one lightbulb. Now I can run my whole house on that.


Seems underestimated. Did you measure it? No fridge? No TV?


I didn't count the fridge, no. I took the base load from the times the fridge compressor is off.

The TV in standby is included though. And yeah it's measured. I have a Shelly EM2 connected to home assistant.


Ok, asking because I'm getting around 150W measured base load with way less computing devices but two fridges. I need to start monitoring single appliances to see who's guilty.


Yeah if you have two it's a lot harder to see. I don't have sensors on a lot of individual devices, definitely not the fridge.

But on my house power graph it's really clear when the fridge is on because I see a huge spike when the compressor starts up and then about 50W constant load while it cools. Also, I have temperature sensors in both the fridge and freezer compartments so I can see exactly when it's cooling (which coincides with that power draw).

But almost all my computing devices are low-power ones and that really helps a lot. Some are in the living room so noise was also a consideration. The only big exception is my game PC - I always shut that down fully and only really use it for gaming because even idle it will take 200W or so. I have an RTX3080Ti in it so it's not power-efficient :)


Just yesterday I noticed that a device installed by the cable TV company in my house (a WISI VX 2030 amplifier, according to the sticker) was warm to the touch. Measuring its power consumption, it uses a constant 15 watts, so about 130 kWh/year.

I don't use cable.


Cable Modems are notorious energy hogs. I'm pretty sure some of the ones I've seen run their CPU at 100% all of the time in busywait loops. They are always warm to the touch.


Did you turn it off?


Yes, I unplugged it.


It seems like it's part of the cable plant?

https://www.wisial.com/product/vx-2030-065/

"The VX 2030 is a location feeding in-house amplifier"

Did your service stay the same quality after you unplugged it? Did that of other people in your building?

It seems weird that they would "install" this in your residence.


The actual sad thing here is that incomes have stagnated so much in the UK and the cost of living has skyrocketed so quickly, that people are desperate to save even £30/year

That's approximately equal to 3hrs worth of minimum wage work (£9.50/hr), and people are that desperate? But it makes sense when you realise that the median wage in the UK is ~£24k/yr...


> incomes have stagnated so much

I don't think it's about income change so much as what income bracket you fall into to begin with. An upper-middle-class person with a stagnant income won't worry about £30/year. But people living in relative poverty (a huge segment of the population) will regardless of how the economy is going.


Median incomes really are stagnating. See e.g. https://blogs.lse.ac.uk/politicsandpolicy/real-wages-and-liv...

That's until 2015 and doesn't even take the recent energy hikes into account.


Upper-middle class incomes are not stagnating. That leaves 90% of people with stagnating incomes.


So your position is that a majority of people in the UK are so desperate for an extra £2.50 a month that they're willing to follow asinine advice like this?

People living under bridges would scoff at that, let alone the population of an affluent, technologically and image-obsessed culture like the UK.


If you make, say, 10 relatively simple adjustments to your life like this, then (at the risk of being boringly obvious :) ) it would be £300 a year, which becomes much more significant.

So learning about them is important or you can't make any adjustments.


If you made 100 such adjustments it would be 3000!

And so on... but really: how many such adjustments can you make? And how much inconvenience will that cause?


If you have a low income, then potentially much less inconvenience than not being able to afford things that you need.


Well if you want to be silly about it, sure. (I even put in a friendly note to hopefully avoid such comments). But if we remain sensible then there is quite likely room for a dozen or so such adjustments in many lives where the inconvenience is negligible in proportion to the potential gains.


> But if we remain sensible then there is quite likely room for a dozen or so such adjustments in most lives where the inconvenience is negligible.

I doubt that. Which is my point. You can try to forestall 'such comments', but then you probably should show where you believe 10 (or even more) such adjustments with negligible inconvenience can be made in the lives of the people for whom this sort of advice would matter. Because I don't believe that to be the case.


Could you name, say, two more? Reducing your use of heating or AC is the only other similar adjustment I can think of. I don't think 10 such adjustments exist.


I’ll name the ones I do myself

- Reduce heating as you suggested (we have a crisis going on in Europe and I reduced it even more recently)

- Drive a little slower

- Make sure I check for discounts at the supermarket and plan accordingly

- Try to fill the dishwasher optimally

- Cook more vegetarian meals

- Check used goods online markets before buying new

And that’s just a quick list


Those go far, far beyond the negligible impact of turning off your devices when you're not using them. Nobody is claiming that it's impossible to save money. We're saying there are a limited number of options with negligible-to-no impact to you. Most of the things you mentioned are absolutely not worth it to me, I'd rather not save the money.


It's not about you. It's about people for whom £300 a year is not negligible. For those people, the things I listed are worth doing and negligible effort compared to the reward.

I don’t understand why this is so hard to get


Again, nobody is saying it's impossible to save money. That would be ridiculous. We're saying that there are few low-hanging fruits that have negligible or no impact on your life. The word "negligible" here is applying to the impact on your life by doing the cost-saving thing, not on the dollar amount savings that you get by doing it. Turning off devices when you're not using them is one; you're literally not using it, it doesn't impact you by being off instead of on standby. It's like free money. Switching to a more vegetarian diet is, and I hope this is obvious, much more impactful and more of a life change. Obviously you can save money with life changes.


I'm not saying you are saying it's impossible to save money! Please, no straw men.

I started this discussion by making a specific but limited claim: learning about £30/year money-saving tips that are easy to adopt is worth it for some people, because they learn several and they add up. You are somehow trying to claim that if the tips are not identical in character to turning off devices, then my claim is invalid. This makes no sense.

Regarding the more-vegetarian-meals point: no, it's not obvious to me that it is life changing. Having enough vegetarian meals to save you £30/year is about as effortless as it gets. Substituting meat with something plant-based maybe 10 times in an entire year is far, far less effort than turning devices on and off almost every single day. Nothing life changing there. Just prepare a handful of meals a little differently.


Because most people who need to make such adjustments have already made them.


I'm not sure what your point is. Yes, people are doing the thing I explained from the very start of this conversation. To recap, OP was curious why people would do it for such a small amount of money, and I pointed out, in what I incorrectly assumed was a non-controversial way, that they don't just do one thing; they do several and it adds up.


If you cut out the take-away coffee and avocado on toast, you will have enough to buy many houses.

Apparently going without 24,499 times would mean you have enough to put down a deposit for the average first time buyer in London.


I remember my father powering off the TV. It was this old SONY TV with an actual power-off switch at the front, the kind that goes click/clack and physically breaks the power.

He said he would save some money, let it be those £30 a year. I asked how much it would cost to replace the switch when it actually breaks from this constant clicking.

He stopped turning off the TV.


So you used FUD to convince him to waste electricity.

The MTBF of the switch was probably 30k cycles.


I can't remember the exact model of course but it must have been some 2000s SONY WEGA Trinitron. Quick search says it uses 0.3W in standby.

30k cycles / ~4 cycles a day (morning news, evening) / 365 days is ~20 years of clicking.

Let's say the TV is in standby 20h per day. So ~146,000h * 0.3W is ~44,000Wh, or 44kWh.

44kWh * £0.16 or so equals ~£7.
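Putting the same estimate into a few lines of Python, with the 30k-cycle MTBF and 0.3W standby figures taken as the assumptions from this thread:

  # Standby cost accumulated over the switch's assumed lifetime of clicks
  CYCLES = 30_000              # assumed switch MTBF (from the comment above)
  CYCLES_PER_DAY = 4           # morning news, evening, etc.
  STANDBY_W = 0.3              # claimed WEGA Trinitron standby draw
  STANDBY_H_PER_DAY = 20
  PRICE_GBP_PER_KWH = 0.16

  years = CYCLES / CYCLES_PER_DAY / 365                      # ~20.5 years of clicking
  kwh = STANDBY_W * STANDBY_H_PER_DAY * 365 * years / 1000   # ~45 kWh
  print(years, kwh, kwh * PRICE_GBP_PER_KWH)                 # ~20.5 years, ~45 kWh, ~7 GBP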


Turning off modem/router for the night saves a bit of energy though (about enough energy to boil the water for my morning tea), with the added benefit of preventing me from compulsive internet browsing on my tablet at night.

This obviously won't save the planet or my wallet, but helps me with being mindful at least.


Based on your described power usage, I assume your modem/router is actually a regular x86 PC


15W * 8h = 120Wh. For that, you can run a 2000W kettle for 216 seconds, or 3.6 minutes.
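The same unit conversion in Python, in case anyone wants to plug in their own router's wattage:

  # 15W router switched off for 8 hours, expressed as kettle run-time
  saved_wh = 15 * 8                        # 120 Wh
  kettle_w = 2000
  seconds = saved_wh / kettle_w * 3600     # 216 s
  print(saved_wh, seconds, seconds / 60)   # 120 Wh, 216 s, 3.6 min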


OK, I stand corrected. 0.12kWh is a somewhat significant amount, enough to boil a cup of tea.


Bought in the Windows 95 era...


I don’t know if it was the same in the UK, but growing up in the 70s and 80s here in the US it always felt like the future was going to be plentiful - think “The Jetsons”. I always figured that electricity would be cheap and ubiquitous. Weird that didn’t turn out.


In that time the world population has increased from 3 billion to 8 billion. Things like energy, food and construction are our most basic needs. In the past, production of these were limited by available labor. Today that's not the case. So the same resources (eg. productive land and mining resources) have to be divided between more people.


This is what everyone consistently misses. Many people approach shortages and price increases as if we're suddenly running out of something. What no one really addresses is that the population has more than doubled in the same time period, and that consumption of goods and resources has increased per person as well.


I'll ignore "cheap" as it's subjective, but where do you live that electricity isn't ubiquitous? Are you still waiting for that new-fangled indoor plumbing too?


I have no authority in this at all, but my guess is it's some sort of path dependence. I think the problem is that we modelled our economy to rely on fossil fuels instead of trying to diversify way earlier. A finite resource under increased demand (the total population increased and living standards rose rapidly in the last decades) quickly turns from abundant to contested.


It is cheap and ubiquitous, and getting cheaper.


> and getting cheaper.

Ahh, my electricity bill in Germany must be lying then. Good to know!


Same here in Canada. There may be some reduction in rate here and there, but the overall trend is always up.


To be fair we could generate a lot more power a lot cheaper with circa 1962 (year The Jetsons began) environmental legislation.

I mean you would have to compromise on the environment a bit but it's by no means impossible to generate a ton of cheap electricity. It's not some lost art like Greek fire or Damascus steel.


That's the mindset that led people in the US to buy giant trucks and SUVs that do 20mpg on a good day, and now you see them at the gas pump, mad that gas is at $6 because it wrecks their budget. Pretty sad that people don't plan for prices to get higher.



> Weird that didn’t turn out.

Decades of a green energy policy being developed by what are, apparently, human-shaped ostriches have led us to being partially dependent on an autocratic regime. Other things such as pipeline (US) and fracking (UK) projects being shut down by a bizarro left-green lobby mean a real difficulty in substituting the energy input that our entire economy runs on.

Bizarro because the left desperately wants to redistribute the very thing it's attempting to shut down, as they probably haven't considered that energy underpins all the wealth we have. And the greenies - who are nearly universally left-wing too - will probably shout loudly when people are shivering in their homes and not getting enough to eat. And never blame themselves.

But the real blame for this lies at the feet of politicians, blue and red, who have allowed these loud eejits to dictate energy policy: shutting down nuclear plants, subsidising intermittent energy without the necessary (and even more significant) investments in energy storage. Using green credentialism as a vote winner and nothing more.


Uh, this seems incoherent. You're calling the group that's calling for avoiding the worst consequences of climate change "ostriches". Aren't the ostriches the people who advocate keeping current emissions trajectories while ignoring the consequences that'll have on the lives of billions of people?


The _politicians_ are the ostriches. The group calling for "avoiding the worst consequences of climate change" are, through ignorance, calling for war and mass starvation.


The politicians who want to avoid the worst consequence of climate change are ostriches, and the politicians who advocate in favor of worsening climate change are the non-ostriches?

We know that climate change is threatening to make massive areas of land uninhabitable, change weather patterns in ways which will massively increase water scarcity in large parts of the world, cause droughts which will be devastating for food production, put coastal regions under water, etc. Don't you think that these effects would cause war and mass starvation?


I have a Shelly Plug S connected to my PC speakers, which reports to my InfluxDB 2.0 via MQTT. The speakers use 11 watts on average, even when turned off, which amounts to 96.10 kWh per year. In Germany, that would be €37.48 (at 39 cents/kWh), or £31.51 - from only one device.

Now, I still leave it on, because turning a power adapter on and off 365 times a year may also render it useless after a while. I was actually thinking about this recently, and whether I should use the Shelly to turn it off automatically, but I haven't reached a conclusion so far.


People like Amory Lovins have long identified cases where it seems that you could spend $A today and then save $B a month thereafter, and after n months $nB >> $A. There is a long list of these where there seems to be a business case to invest $A in energy conservation, yet people don't act.

One trouble is that energy consumption is noisy. I remember replacing all the incandescent bulbs in my house (circa 2002) with compact fluorescent bulbs, expecting a few dollars a month savings, but given the fluctuations of the bill those savings were invisible.

A few years later I was involved in a project evaluating the effectiveness of improvements in building energy efficiency at a large university, and that turned out to be pretty difficult. We tried modelling the energy consumption of buildings based on the weather and found there was still a large residual of what seemed like random variation, which typically dwarfed the effects of energy-efficiency upgrades. It could be that ours wasn't a good model, but it turns out to be very hard to prove that the energy and money savings advocated by the likes of Amory Lovins are real, and that's one reason why people don't invest in them.


> No, you can’t save £30 per year by switching off your “standby” devices

I suspect this is probably true of modern devices with modern power supplies.

No doubt if you've got a house full of 10+ year old gadgets then they possibly have more inefficient power supplies.

It's likely worth considering modernising old kit, because it'll be power-hungry during active use too, which will likely cost you a lot more than the £30 a year you'll save by turning it off overnight!


I will happily pay my local equivalent of £30 each year to not have to wait for every device I use to start from a cold boot every time I want to use it. My TV takes ~2 minutes to be usable after booting, I use that every single day more or less - if I wait for a cold boot each time that's HOURS of time lost every year.

Not to mention things like games consoles which update themselves and their installed software while in standby mode - that's more hours of time saved. All for the low low price of £30pa.

Realistically I could buy smart plugs or ones with a timer, but that's more cost (probably more than £30), more complexity, and they'll probably cause at least one ungraceful shutdown which I'd rather not have to deal with. And the last thing I want is more discount IoT garbage using my internet to participate in a DDoS because the now bankrupt manufacturer left telnet on and forgot to change the password from "admin/admin".


I think it's hilarious that the cable box (or satellite box) is worse than the others, in fact most of the others put together.

The cable box uses more power when idle than a smartphone uses peak, and that doesn't include the display!

As important as they make the DVR sound, I don't think I've known anybody with cable who really uses it.


Long-lived hi-fi equipment can be a huge offender; plenty of stuff is still around from the decades between the broad introduction of remote controls (before that, devices were usually hard-off) and the broad introduction of switching PSUs, which aren't quite as bad as transformers (though early and/or cheap ones can certainly be bad as well).

But I think "everyone" already knows that standby devices are a convenience that one should strive to avoid, the current fight is between people who insist on unplugging the USB charger vs those who refuse to do so. On this battleground, I think I see a strong case of perfect as enemy of the good: "refusers" are likely to react to pressure by ignoring the issue completely and might easily miss out on meaningful low hanging fruit.


My one-occupant apartment currently averages a draw of about 0.15 kW during the hours when I'm asleep. With my current electricity price, 0.1043 €/kWh, I could perhaps save €11.30 a month by completely eliminating that 0.15 kW draw. Obviously, that's not realistic, since I want my fridge to stay cool, I want my Wi-Fi beaming across the apartment, I want my NAS running services, and I want my air purifier circulating air, whether or not I'm conscious or even in the apartment.

I think I'm currently on pace to save about 2 kWh/day with my new computer though, so that's something at least.


Standby power use varies widely by device.

Of 100 devices around your house, there might be only two with unreasonable standby power. But those two will be costing you significant amounts of money.

For example, I found my oven clock was using 15 watts. The oven is 20 years old, so over those 20 years, the clock (which I never use anyway) has cost £762 (at today's electricity prices). For that I could have bought a new oven!

Other surprising high usage devices:

* Audio Amp (on, but silent - 45 watts).

* A power strip - 10 watts.

Nearly all other appliances were < 1 watt on standby/idle, as is required of many new devices by European law.


What the hell is a power strip doing that needs 10w!?!? Did they find some outrageously inefficient bulb to put behind the switch?


Power strips often have 'power surge protection' built in... That usually consists of a metal oxide varistor. When they have taken a few lightning strikes at the other end of the street, they start leaking power and getting warm. A few more strikes and it'll be drawing 100 watts and catching fire...


As a rule of thumb, one watt of standby equals about 9 kWh per year. If you have expensive electricity like we do in Germany, at 35 cents per kWh, that is roughly €3 per year.

10 watts of standby does not sound implausible. I found that my subwoofer from a very well-known brand consumes about 6 watts in standby, and our old Roomba is very hungry on standby, too.

Get a device to measure power consumption for about 15 euros/dollars, find the energy hogs, and then lend the device to friends and relatives. This will not save the world, but you WILL find some potential for saving energy.
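For anyone who wants to plug in their own numbers, the rule of thumb translates to a tiny Python helper (the 0.35 €/kWh is the German tariff assumed above):

  # 1W of constant standby is 24 * 365 h = 8.76 kWh/year, rounded to ~9 kWh
  def yearly_cost(standby_watts, eur_per_kwh=0.35):
      return standby_watts * 8760 / 1000 * eur_per_kwh

  print(yearly_cost(1))   # ~3.07 EUR/year per watt of standby
  print(yearly_cost(6))   # ~18.40 EUR/year for the 6W subwoofer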


Be very careful about some of these articles. The general ideas are basically correct, but the exact numbers are subject to huge error bars. Your typical clamp-type meter will easily be off by 5-10%, possibly much more at the very low amperages found in a household environment. So articles written by well-meaning tech writers measuring their computer with a $50 clamp meter should be taken with a grain of salt. Those meters are really not designed to properly measure anything less than a few amps.


I did this in two different houses, and it does depend on whether you have bad culprits. By far the worst for me was a Zanussi washer/dryer with an on/off switch that went "clunk/click" and popped in/out - but it would still use 10W (about £10/year back then). I just ignored that switch and used the wall socket switch instead.

A few timers on other devices to turn off bluetooth speakers overnight. Over the ten years I lived there it was a decent saving.


Even if you could, would it be worth the aggravation of the on/off wait, every time you finished with these devices, to earn an extra £2.50 a month?


I had a single AV receiver that was using over 100w in its "off" state, one device alone easily blowing past that £30 figure.


Ha; I've always found these nickel-and-dime savings efforts laughable, as if waste creation at scale is offset by shaving a few dollars.

Nah; while the species as a whole is fine to grind forward without a care, I’m all in full speed ahead.

Leaving my lights on in empty rooms, generating all the one time use waste; front row seats to the apocalypse sounds metal af.


When I was staying at my parents' over the winter holidays, after they'd gone to bed at 9pm, my wifi went down and I had to troubleshoot what on earth was up with my laptop. Turns out my dad turns off the router every night to save energy; that was pretty low on my list of things that might be wrong with the wifi.


Let's not focus solely on standby devices. How much energy could be saved by simply disabling Windows Update or discontinuing use of other CPU-inefficient software? I suspect that across the entire population, it would add up to a not-insignificant number.


And what is the cost of unplugging devices after every use?

Not just replacing sockets or cables, but isn't it true that (electronic) devices have a longer life if they are always plugged in?

Are electronic components really happy with constant power on/off cycles?


> And what is the cost of unplugging devices after every use?

You really think your time is worth so much that taking 10 seconds extra for plugging/unplugging a device is worth even considering?

> Not just replacing sockets or cables

Never happened to me once in over 3 decades. Both sockets and cables usually have a longer lifespan than any of your appliances.

> Are electronic components happy now with the constant power on/off cycles?

Depends entirely on the design of the device and the components used.

The notion stems from the olden days of analogue devices, which indeed often didn't like being power cycled. Turning appliances on and off twice a day, though, doesn't affect the lifespan in the slightest.


Unless you're using something with tubes in it, there is basically no penalty to unplugging. I use a power strip with a switch so I have the option to turn off everything when I want to.


If you use 5 devices a day and need 30 seconds per device for plugging and unplugging, this accumulates to about 15 hours per year. If you make 15€ per hour, the opportunity cost would be roughly 228€. Ouch!
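For reference, that arithmetic in Python (the 30 seconds per device and 15€/hour are this comment's assumptions):

  # Opportunity cost of plugging/unplugging 5 devices a day
  devices = 5
  seconds_per_device = 30                          # plug + unplug combined
  hours_per_year = devices * seconds_per_device * 365 / 3600
  print(hours_per_year, hours_per_year * 15)       # ~15.2 h/year, ~228 EUR at 15 EUR/h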


The solution is to lower the cost of electricity. Why is nobody talking about this?!? I jest...

16 pence/kWh is pricey, I think. That's around $0.20/kWh, whereas the US average is around $0.13... so around 50% more expensive in the UK than the US.


Wait, so your solution to wasting resources is to make resources cheaper so that more can be wasted? Got it...


As someone living in an electrically heated and well insulated house, all that vampire power is heat+convenience. My heat pump is surely 4x more efficient, but that 50W of standby power is not really wasted.


Why would microwave oven clocks, tumble-dryers, dishwashers use a considerable amount of power when not in use? Surely there is a LED light but that should be tiny, and keeping the time should be even tinier.


There are considerable losses on the power delivery side.

The appliances don't have very efficient power supplies that convert the input voltage (e.g. 230V or 120V) down to the single digit voltages used by the clocks and LEDs. It's simply not a concern for devices that usually consume power in the kW range when operating.

The simplest way to save power is to literally just pull the plug after use. It's not even much of an inconvenience either if the plug is reasonably close to the appliance.


A lot of household appliances like this are hardwired at the breaker panel. I'm having a hard time believing that running to the panel before and after each use of a dishwasher, oven, washer, dryer, etc. is something anyone would actually do.


Most ovens are hardwired, true. Washers, dryers, and dishwashers, however, usually aren't hardwired (at least in countries with 240V AC) - it might depend on the country, though.

Personally, I've yet to come across a washing machine, dish washer, or dryer that's hardwired. Typical appliances in my region look like this [1], [2], [3], [4]. Note the cords and plugs.

[1] dryer: https://elektricks.com/Stromanschluss_022.jpg

[2] washing machine: https://image.architonic.com/img_pro2-4/117/4868/waschgeraet...

[3] dishwasher back: https://www.kuechenjournal.com/wp-content/uploads/2018/07/Ku...

dishwasher front: https://img.moebelplus.de/xlarge/gaggenau_df251160_szene.jpg

[4] dryer: https://image.coolblue.de/840x473/content/5a1961ba7ab2e4818d...


My oven and microwaves will refuse to function unless you set the clock. It would be an inconvenience in my case.


Sounds like bad design to me. But yeah, if that's the case I agree that's indeed a major inconvenience.


Unfortunate shortened url in that tweet: "Find out more here: http://ensvgtr.uk/pS1ut"


OLED TVs run pixel-refresh compensation cycles while in standby. These cycles are essential for preventing burn-in.

So don't turn off your OLED when you find it using some power in standby.


I literally read an article on this earlier claiming I can save £147: https://www.telegraph.co.uk/bills-and-utilities/gas-electric... I am now regretting my decision to buy some smart plugs so that I can cut power to appliances overnight.


I lived with my mother-in-law for a short time when I was between apartments. She would occasionally go around the house and unplug all of my devices in the early mornings before I got up. This was originally just the Xbox connected to the living room TV, but if I had my laptop plugged in (work or personal) it would get unplugged too. I don't know if it was a petty "power move" (pardon the pun) or if she genuinely thought the standby-mode appliances were bleeding her dry, but she's a senior electrical engineer, so she should really know better.

I was considering confronting her, putting in a Kill A Watt, and paying her directly the couple of bucks a month my standby devices used in power, but I figured that wasn't worth it.


So I got into this a bit over the past year, and put a SmartThings socket behind just about everything in the house. Not so much for the cost saving (which will take forever at £29 for each of the SmartThings sockets), but more to understand where the power is being used so that we can run for longer on solar + batteries.

There's a range of devices that consume practically nothing when they are in standby mode, these include:

  - Phone chargers (0 - 1W)
  - Projectors (0 - 1W)
  - TVs (0 - 3W for the ones we have)
  - Laptops (3 - 4W for the ones we have)
  - AV amps e.g. Denon (2W)
  - Coffee Machine e.g. Jura (4W)
  - Printers (0 - 1W)
There's a bunch of low powered devices that stay on all the time

  - Amazon Echo Show (4W)
  - Presumably the Echos without a screen but I haven't tested them
If I turned off everything in these two groups when we are not using them, then we might save close to the £30/year, but we have a lot of them and it's not worth the effort.

There's some devices that you have to keep running, but will cycle, these are typically fridges and freezers.

  - Pretty much all around 1W sleep
  - 2 year old full height fridge - peaks at 49W for 7 minutes every 30 mins or so
  - 2 year old full height freezer - peaks at 75W for 10 minutes every 45 mins or so
  - 1 year old chest freezer - peaks at 76W for 12 minutes every 30 mins or so
  - 10 year old half height fridge - peaks at 66W for 30 minutes every 120 mins or so
  - 2 year old half height wine fridge - load isn't smooth, spikes and goes back down, average of about 100W for 10 minutes every 60 mins or so
  - 10 year old combined full height fridge and freezer - 110W for 17 mins every 50 mins or so
There's not much you can do about these (well I could remove some) but they need to stay on.

Then there's some things that can be reduced:

  - I usually walk away from my laptop and leave it running. The external monitor goes to sleep, but the laptop itself continues to use 33W when left like this. If I put it to sleep instead it would go close to 0. Assuming that I leave it like this for 12 hours each evening, that's 0.033kW * £0.22/kWh * 365 days * 12 hours = £31.80 - with this change alone I can hit the headline figures (the formula is sketched in code at the end of this comment)
  - We have a Virgin Media Box and an XBox One X. I haven't separated their power usage, but together they draw 38W in their sleep/standby mode. Using the formula above, switching them off for 12 hours a day would save £36.61
  - Lights, yes, they are only 0.5W each for the Philips Hue, but we've 32 of them in the kitchen alone. That's 16W in standby, for say an average of 16 hours a day over the year when they aren't being used. £20.56 for the kitchen lights to be "off".
The problem with these articles is that some older folk in older houses go round switching off the telly at the socket rather than using standby on the remote. That's fine, you might save £1 or £2 over the course of the year, but the house is old, the sockets are close to the floor, and the risk of tripping or falling is much higher. It's never worth it to save £1.

Where it does make sense is people who have a few consoles in the house, leaving them in standby mode (particularly the one where it can download updates in the background), or just walking away from their computer and not putting it to sleep properly. Switch it off instead and you'll save the headline figures easily.

(edited for formatting)
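In case it's useful to anyone, here's the formula I used above as a small Python helper (the wattages and the £0.22/kWh tariff are just my own measurements and assumptions):

  # Annual saving from switching a device fully off for N hours a day
  def annual_saving(watts, hours_off_per_day, gbp_per_kwh=0.22):
      return watts / 1000 * gbp_per_kwh * hours_off_per_day * 365

  print(annual_saving(33, 12))   # laptop left idle in the evenings -> ~£31.80
  print(annual_saving(38, 12))   # Virgin Media box + Xbox standby  -> ~£36.61
  print(annual_saving(16, 16))   # 32 Hue bulbs at 0.5W "off"       -> ~£20.56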


One note about the consoles.

The Xbox Series X has two modes

- Instant On (uses about 10-11W in standby)

- Energy Saver (uses about 1-2W in standby)

I think it defaults to "instant on" which is ludicrous. Energy saver mode is better, and still supports downloading updates if required.


I am using a monitor light bar (Xiaomi) instead of a ceiling light when working at night. It gives a very nice atmosphere and saves a few watts, but it will take some years until the energy savings compensate for the price of the light bar.


I switch them off to prevent large corporations from spying on me.


Imagine how much global power would be saved if NOBODY ran standby devices overnight. Not practical, but an interesting thought.


Just today the BBC said you can save £100 plus. Sigh. Is there an executive summary of which claims are fake?


How come tweets like these aren't tagged as misinformation?


Because telling people to turn off their devices when not in use is not harmful.

I mean, I can tell you I drank orange juice this morning when in fact I didn't. So it's misinformation. But it's not harming you to believe me.


Because they aren't considered political. Platforms are mostly only looking for political deviations, something that a politician or lobbying group could complain about.

If you tweeted that turning off your Apple devices would save 30£ a year, for example, and other people started picking up on it, Apple would complain and it would become political. Twitter exclusively hunts political deviation; they're not even nipple-hunters like the other platforms.

edit: even its search for threats/abuse seems to be modeled on the "Zero Tolerance" policies of suburban schools in the '90s-'00s. Posting finger-guns might get you deleted like it would have gotten a second-grader suspended.


Probably because it's contradicting asinine advice based on outdated data and flawed assumptions.

A better version might be to have people look at what they have in their home that's using electricity, and why.


> tweets like these

A quick way to #saveenergy at home is to turn off tablets, laptops and consoles as soon as you stop using them, and ideally unplug them.

This is a big potential #energy saver and could save you up to £30 a year.

> aren't tagged as misinformation

The only misinformation I see here is "up to £30 a year", as households would save more than £30 a year by doing this.



