
The LG G3 has a 1440x2560 QHD screen. It takes more power to drive more pixels.



So Android pushed 77% more pixels using only 50% more power? Sounds like Android is more efficient!

Obviously power consumption is more complex than just that, but seriously, Android's reputation for being inefficient really isn't justified when it's built on comparisons like this, where resolution differences are completely ignored.
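
For a rough sanity check, here's the pixel arithmetic behind that "77% more pixels" figure, comparing physical panel resolutions only (a minimal sketch; the 50% power figure comes from the article under discussion, and the variable names are mine):

    # Physical panel resolutions, from the thread above
    lg_g3 = 1440 * 2560        # QHD: 3,686,400 pixels
    iphone_6p = 1080 * 1920    # 1080p panel: 2,073,600 pixels
    print(f"{lg_g3 / iphone_6p - 1:.0%} more pixels")  # -> 78%, the ~77% cited above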


A minor correction: The iPhone 6+ renders at 1242 x 2208 but scales it down to 1080 x 1920. So ~34% more pixels at 50% more power?
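
Redoing that arithmetic with the render-buffer numbers from this comment (GPU workload rather than panel power; a quick sketch, values as stated above):

    # Render-buffer comparison: what each GPU actually has to draw
    qhd_render = 1440 * 2560      # LG G3 renders at native QHD
    iphone_render = 1242 * 2208   # iPhone 6+ internal render buffer
    print(f"{qhd_render / iphone_render - 1:.0%}")  # -> 34% more pixels rendered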


It's the physical pixels that suck most of the power, so "77% more pixels" is more accurate.

Certainly the GPU on the iPhone 6+ is going to take more power because it's rendering at 1242x2208 rather than 1080x1920, but that's a minor effect compared to the physical screen power draw.


That presupposes that lighting up and drawing pixels is most of the overall power budget. Moreover, the biggest component of screen power usage is the backlight, which doesn't scale linearly with the number of pixels.


> That presupposes that lighting up and drawing pixels is most of the overall power budget.

That very much IS most of the overall power budget. That was not a guess on my part.


See: http://www.displaymate.com/Smartphone_ShootOut_3.htm. At the same brightness level (AnandTech uses 150 or 200 nits, I think), the FHD displays use only 15% more power than the iPhone display despite having almost 3x the pixels. And total display power is probably about half the overall power budget.
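
For context on the "almost 3x" figure: assuming the iPhone panel in that DisplayMate comparison is a 640x1136 iPhone 5-class display (an assumption on my part; the linked page names the actual models), the ratio works out like this:

    # Assumed panels: iPhone 5-class vs. a 1080p Android flagship
    iphone5_class = 640 * 1136   # 727,040 pixels (assumed model)
    fhd = 1080 * 1920            # 2,073,600 pixels
    print(f"{fhd / iphone5_class:.2f}x the pixels")  # -> 2.85x, i.e. "almost 3x"
    # yet per DisplayMate only ~15% more power at equal brightness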


Read how that test is run. It's showing a static image. There's nothing changing; there is no rendering in that test at all. So of course resolution didn't have an impact: the test is built to measure purely the display's power draw, and it does exactly that.



