
Yeah I'm confused by this article. I've got a Virgin Media TV box which uses 25 W while on standby. Maybe I'm doing the math wrong, but:

25w * 24 * 365 = 219000w or 219kw per year.

At my current electricity rate that's £39.42 per year at a very minimum just for the TV box.




The math is right but the units are messy. Watts are already energy/time, which is correct for the 25 W figure, but the 219 should be kWh/year.

25 W * 24 h/day * 365 days/year = 219000 Wh/year = 219 kWh/year

Which makes me notice that 1 kWh/year = 1000 Wh/year = 1000 Wh / (365 * 24 h) ≈ 0.114 W. So you can quickly estimate that a device with x watts of constant consumption will have you paying for roughly 10x kWh in electricity a year. With electricity costs on the order of 20 cents/kWh, that means a rule of thumb is "double the wattage, that's how many $ it'll cost you to have it running all year".
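
If you want to plug in your own numbers, here's a rough Python sketch of the same arithmetic (the 25 W and 20 cents/kWh are just the example figures from this thread, not anyone's actual tariff):

    HOURS_PER_YEAR = 24 * 365  # 8760 hours

    def annual_kwh(watts):
        # Constant draw in watts -> energy used per year, in kWh
        return watts * HOURS_PER_YEAR / 1000

    def annual_cost(watts, price_per_kwh):
        # Flat tariff: yearly energy times unit price
        return annual_kwh(watts) * price_per_kwh

    print(annual_kwh(25))                      # 219.0 kWh/year, the figure above
    print(round(annual_cost(25, 0.20), 2))     # 43.8, roughly "double the wattage" in dollars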


Your math is right. If you're not recording at night, you can switch it off overnight (we switch one of ours off on a timer).


A good approximation is that 1 W of continuous power costs you about one dollar (or euro) per year.



