
You COMPLETELY missed the elephant in the room : 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!) Think 8 cores of the fastest arm 64-bit processors available plus extra hardware accelerators! They need this extra processing power to handle the 8K television load, such as upscaling and color transforms - which never happen when you are using them as a monitor!

So, 8K TVs are a big energy-suck! There's a reason why European regulations banned 100% of 8K TVs until the manufacturers undoubtedly paid for a loophole, and now 8K TVs in Europe are shipped in a super-power-saver mode where they consume just barely below the maximum standard amount of power (90w) ... but nobody leaves them in this mode because they look horrible and dim!

If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...




Anecdotally my house draws 0.4 kW when idle and 0.6-0.7 kW when both my 8K screen and my computer are on. Since my computer draws 0.1-0.2 kW, I surmise that the QN800A doesn't draw 300-400 W total --- maybe 100-200 W.

I run my screen on a brightness setting of 21 (out of 50) which is still quite legible during the day next to a window.

Also, I have solar panels for my house (which is why I'm able to see the total power usage of my house).
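For anyone checking the subtraction, here's the arithmetic in Python, using only the household figures above (a rough sketch, not a direct measurement of the screen itself):

    # Back-of-the-envelope from the household figures above (watts)
    idle_house = 400                    # house at idle
    active_house = (600, 700)           # house with screen + computer on
    computer = (100, 200)               # computer alone

    screen_low = active_house[0] - idle_house - computer[1]    # 0 W
    screen_high = active_house[1] - idle_house - computer[0]   # 200 W
    print(f"screen draw roughly {screen_low}-{screen_high} W")

That brackets the screen somewhere between 0 and 200 W, consistent with the 100-200 W estimate and well under the claimed 300-400 W.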


RTINGS reviewed the Samsung QN800A as consuming 139W typical, with a 429W maximum: https://www.rtings.com/tv/reviews/samsung/qn800a-8k-qled#tes...

The parent comment is completely wrong on nearly every point it makes. I don't know why it's so upvoted right now.

It doesn't even pass the common sense test. Does anyone really think TVs have 200W CPUs inside just to move pixels around? That's into the territory of a high-end GPU or server CPU. You don't need that much power to move some pixels to a display.


Surely they'd be using image processing ASICs instead of CPUs anyway, hence why they don't draw that kind of power!


I didn't smell anything. A 200W PSU isn't terribly expensive and being cheaper than more efficient processors seems reasonable. I also only run a single 4k monitor so haven't thought about driving 4x the pixels recently.


> I didn't smell anything. A 200W PSU isn't terribly expensive and being cheaper than more efficient processors seems reasonable

200W is the realm of powerful GPUs and enthusiast/server CPUs.

Common sense would rule out an 8K TV requiring as much power as an entire gaming GPU just to move pixels from the HDMI port to the panel.


And certainly it would require active cooling. A TV with audible fans could be a hard sell!


Maybe these TVs are using salvaged Pentium 4 CPUs...


That's a facially absurd statement. Just on the numbers:

The US consumes 500 gigawatts on average, or 5000 watts per household.

So if every household bought an 8K TV, turned it on literally 100% of the time, and didn't reduce their use of their old TV, it would represent a 10% increase in power consumption.

The carbon emissions from residential power generation have approximately halved in the past 20 years. So even with the wildest assumptions, it doesn't "throw away all the progress we've made on Global Warming for the past 20 years ...".
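Spelling out that arithmetic (a rough sketch; the 500 W always-on figure is a deliberately pessimistic assumption, roughly the QN800A's measured peak rounded up):

    # Pessimistic scenario: every US household runs an 8K TV at peak power, 24/7
    household_avg_power_w = 5000    # rough US average quoted above
    tv_always_on_w = 500            # ~ the 429 W measured peak, rounded up

    increase = tv_always_on_w / household_avg_power_w
    print(f"~{increase:.0%} increase in average household power")   # ~10%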


How does it compare to working from home as opposed to driving to the office?

E.g. let's say I drive 10 miles a day to get to the office vs use an 8k TV at home?

If I go out of my way to work from home, would I be ethically ok to use 8k monitor?

Back of the napkin it seems like 8k monitor would be 10x better than driving to the office?


The parent comment is wrong. The Samsung QN800A uses 139W typical, with a peak around 400W, according to reviews: https://www.rtings.com/tv/reviews/samsung/qn800a-8k-qled#tes...

To put it in perspective, an electric car might need 350 Watt-hours per mile. A 10-mile drive would use 3.5 kWh. That's equivalent to about 24 hours of using that monitor at normal settings, or about 8 hours at maximum brightness.

The comparison doesn't make sense, though, because if you drove to the office you'd still be using a monitor somewhere. A single 4K monitor might take around 30-40W. Using four of them to equal this 8K display would come in right around the 139W typical power consumption of the 8K 65" monitor.
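Worked out explicitly (the 350 Wh/mile EV efficiency is the assumption from above; the TV numbers are the RTINGS measurements):

    # Commute energy vs. monitor energy
    ev_wh_per_mile = 350
    commute_miles = 10
    commute_wh = ev_wh_per_mile * commute_miles     # 3,500 Wh = 3.5 kWh

    tv_typical_w, tv_peak_w = 139, 429              # RTINGS figures for the QN800A
    print(commute_wh / tv_typical_w)                # ~25 hours at typical settings
    print(commute_wh / tv_peak_w)                   # ~8 hours at max brightness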


I don't think this is an honest question.

There's no "fixed budget" of energy that is ethically ok to use. The parents point was that these devices are woefully inefficent no matter which way you look at them.

The "best" thing to do would be neither, and is usually to just use the device you have - particularly for low power electronics as the impact of buying a new one is more than the impact of actually running the thing unless you run it 24/7/365


> There's no "fixed budget" of energy that is ethically ok to use.

Not even 0.00001 W? How is it ethical to live in the first place in such case?

> The parent's point was that these devices are woefully inefficient no matter which way you look at them.

It's always a trade-off of productivity and enjoyment vs. energy efficiency, isn't it? If I find a setup that allows me to be more productive and enjoy my work more, certainly I would need to balance that against how much potential waste there is in terms of efficiency.

> The "best" thing to do would be neither, and is usually to just use the device you have

That's quite a generic statement. If my device is a budget android phone, do you expect me to keep coding on it, not buying better tools?


FWIW (just to clarify on this one area if I may)

>> There's no "fixed budget" of energy that is ethically ok to use.

> Not even 0.00001 W? How is it ethical to live in the first place in such case?

The idea of no fixed budget is that there is no binary threshold where above is bad and below is okay. It's just a spectrum.


> the impact of buying a new one is more than the impact of actually running the thing unless you run it 24/7/365

This is wrong for basically everything we ever use - from a house, to an electric car. And especially for small items like electronics


I’d be interested in hearing you reconcile your statement by making an ethical case for the energy use of video games and the hardware that runs them.

Is there a level of energy that is ethically okay to use for video games?


No more or less than any voluntary activity. What about baking for pleasure?

My point isn't that all energy use is bad, it's that there's no line that says "use up to here as you wish, but everything after this is wrong".


Read up on the 2000-watt society.


> You COMPLETELY missed the elephant in the room : 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!)

RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.

Your numbers aren't even close to accurate. 8K TVs do not have 200W CPUs inside. The entire Samsung QN800A uses less power during normal operation than you're claiming the CPU does. You do not need as much power as a mid-range GPU to move pixels from HDMI to a display.

> There's a reason why European regulations banned 100% of 8K TVs

This is also incorrect. European regulations required the default settings, out of the box, to hit a certain energy target.

So large TVs in Europe (8K or otherwise) need to come with their brightness turned down by default. You open the box, set it up, and then turn the brightness to the setting you want.

> until the manufacturers undoubtedly paid for a loophole

This is an unfounded conspiracy theory that is also incorrect. Nobody paid for a loophole. The original law was written for out-of-the-box settings. Manufacturers complied with the law. No bribes or conspiracies.

> If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...

The Samsung QN800A 8K TV the author uses, even on high settings, uses incrementally more power than other big screen TVs. The difference is about equal to an old incandescent lightbulb or two. Even if everyone on Earth swapped their TV for a 65" 8K TV tomorrow (lol) it would not set back 20 years of global warming.
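Rough numbers on that last point (the ~80 W typical draw for a comparable 65" 4K TV is my ballpark assumption, not a measurement; the 139 W is the RTINGS figure):

    # Extra draw of the 8K set versus an assumed comparable 4K set
    qn800a_typical_w = 139       # RTINGS measurement
    assumed_4k_65in_w = 80       # assumption for a typical 65" 4K TV

    extra_w = qn800a_typical_w - assumed_4k_65in_w
    print(f"~{extra_w} W extra, i.e. roughly one old incandescent bulb")   # ~59 W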

This comment is so full of incorrect information and exaggerations that I can't believe it's one of the more upvoted comments here.


> RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.

Can you explain why a TV's power fluctuates so much? What does peak load look like for a TV? Does watching the NFL draw more power than playing Factorio?


Most likely brightness. Turn the brightness to the maximum value and power will go up a lot.


Power consumption varies significantly based on what's being displayed, on top of brightness settings.

I have a 42" 4k LG OLED. With a pure black background and just a taskbar visible (5% of screen), the TV draws ~40W because OLED pixels use no power when displaying black.

Opening Chrome to Google's homepage in light mode pulls ~150W since each pixel's RGB components need power to produce white across most of the screen.

Video content causes continuous power fluctuation as each frame is rendered. Dark frames use less power (more pixels off/dim), bright frames use more (more pixels on/bright).

Modern OLEDs use Pulse Width Modulation (PWM) for brightness control - pixels switch rapidly between fully on and off states. Lower brightness means pixels spend more time in their off state during each cycle.

The QN800A's local dimming helps reduce power in dark scenes by dimming zones of the LED backlight array, though power consumption still varies significantly with content brightness. It's similar to OLED but the backlight zones are not specific to each pixel.

Dark mode UIs and lower brightness settings will reduce power draw on both QLED and OLED displays.

Traditional LCDs without local dimming work quite differently - their constant backlight means only brightness settings affect power, not the content being displayed.

This explains those power fluctuations in the QN800A measurements. Peak power (429W) likely occurs during bright, high-contrast scenes - think NFL games during a sunny day game, or HDR content with bright highlights. For gaming, power draw is largely influenced by the content being displayed - so a game like Factorio, with its darker UI and industrial scenes, would typically draw less power than games with bright, sunny environments.
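To put the dark-mode point in numbers, a small sketch using my ~40 W (black desktop) and ~150 W (white Chrome page) measurements above, assuming 8 hours of use a day (the usage figure is an assumption):

    # Rough annual energy difference between a mostly-dark and mostly-white desktop
    dark_w, light_w = 40, 150        # measured above on the 42" OLED
    hours_per_day = 8                # assumed usage

    kwh_per_year_dark = dark_w * hours_per_day * 365 / 1000      # ~117 kWh
    kwh_per_year_light = light_w * hours_per_day * 365 / 1000    # ~438 kWh
    print(kwh_per_year_light - kwh_per_year_dark)                 # ~321 kWh/year saved by dark mode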


Thanks for taking the time to write this.

I was under the incorrect impression that the power consumption would be related to the rendering of the image (a la CPU/GPU work). Having it related to brightness makes much more sense.


Great aspect to consider, thanks for raising it.


To be fair it's not the energy that you're concerned with; it's the source of that energy.

Private jets can't run off nuclear power grids. Also the real problem-child of emissions is not America. China has a billion more people, what are their TVs like?


Good points. I would go further and say it is the integral of emissions over time that we would be most concerned with. From that perspective, over the last 200 years, it is problem children and rising problem children.


This is a bit much.

The average American household uses about 29 kilowatts of power per day (29,000 megawatts).

The difference between using a 4K screen and 8k screen for 8 hours a day is about 100 megawatts difference.

I wouldn't get overexcited about something increasing overall energy usage by ~2.5%.


> The average American household uses about 29 kilowatts of power per day (29,000 megawatts).

Ignoring the megawatts error that the sibling pointed out, it's 29 kilowatt hours per day. Watts are a unit of power consumption -- joules (energy) per second.

One kilowatt hour is the energy used by running something at 1,000 Watts for one hour.
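As a quick sanity check against the numbers upthread:

    # 29 kWh/day as an average power, and the 8K-vs-4K difference in context
    household_kwh_per_day = 29
    avg_power_kw = household_kwh_per_day / 24            # ~1.2 kW average draw

    extra_tv_w = 100                                      # the ~100 W difference claimed above
    extra_kwh_per_day = extra_tv_w * 8 / 1000             # 8 hours/day -> 0.8 kWh
    print(avg_power_kw)                                   # ~1.21 kW
    print(extra_kwh_per_day / household_kwh_per_day)      # ~0.028, close to the ~2.5% mentioned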


I think you mean 29,000 W (watts), not megawatts (MW), which are 1000 kilowatts (kW).



