Apple MacBook Pro lasts 135 minutes longer on battery in full screen mode (notebookcheck.net)
182 points by g0xA52A2A on Aug 7, 2017 | 137 comments



If you're a gamer who plays on sub-optimal hardware, chances are you already know this. In general, full screen mode tends to perform better than windowed mode. Windowed mode means the OS has to render and handle logic for all the window chrome: shadows, borders, close buttons, menus. In Windows 7, I recall that turning off the Aero translucent window decorations was a perf win, and simply going full screen with your game was an even bigger one.


Not only that but you're guaranteeing yourself one or two extra frames of input lag due to the double/triple buffering.


Double buffering is always enabled; there is no such thing as “single buffering” in modern computer graphics.

What you are talking about is enforced v-sync to prevent tearing on the normal desktop environment.


> Double buffering is always enabled; there is no such thing as “single buffering” in modern computer graphics.

Technically there is; it's called front-buffer rendering, and there are specs for it (look up EGL_SINGLE_BUFFER or EGL_KHR_mutable_render_buffer). The only practical usage of it, and the only reason the spec exists at all, is for time-warping in VR, where the warp is done race-the-beam style, so to speak, on a per-eye basis (as in, while the left eye is being scanned out, do the time warp for the right eye).
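
For the curious, a rough sketch of what requesting front-buffer rendering looks like at the EGL level. This assumes the driver actually advertises EGL_KHR_mutable_render_buffer; display/surface creation is omitted:

    /* Hedged sketch: flip an existing EGL window surface to front-buffer
       ("single buffer") rendering via EGL_KHR_mutable_render_buffer. */
    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <string.h>

    int enable_front_buffer(EGLDisplay dpy, EGLSurface surf) {
        const char *exts = eglQueryString(dpy, EGL_EXTENSIONS);
        if (!exts || !strstr(exts, "EGL_KHR_mutable_render_buffer"))
            return 0;  /* extension not available: stay double buffered */

        /* Per the extension, the new mode takes effect at the next
           eglSwapBuffers on this surface. */
        return eglSurfaceAttrib(dpy, surf, EGL_RENDER_BUFFER, EGL_SINGLE_BUFFER);
    }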

Granted this is unrelated to OS desktop composition, but hey, the more you know :)

In terms of OS desktop composition, windowed vs. fullscreen is actually something mobile solved, or at least Android did; I don't know whether iOS is the same or not. By shifting final composition to the hardware compositor and using extensions such as EGL_EXT_swap_buffers_with_damage, only the area of the screen that was damaged gets updated and re-composited. Desktop has been pretty stubborn and slow to adopt similar systems; Intel only recently added a HWC fast path at all, and only for video playback.
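
For a rough idea of the app side of that, here is a sketch of handing the damaged rectangle to EGL so the compositor can limit its work. The function-pointer lookup follows the EGL_EXT_swap_buffers_with_damage spec; error handling is omitted:

    /* Hedged sketch: present only the rectangle that changed. */
    #include <EGL/egl.h>
    #include <EGL/eglext.h>

    void present_damaged(EGLDisplay dpy, EGLSurface surf,
                         EGLint x, EGLint y, EGLint w, EGLint h) {
        PFNEGLSWAPBUFFERSWITHDAMAGEEXTPROC swap_with_damage =
            (PFNEGLSWAPBUFFERSWITHDAMAGEEXTPROC)
                eglGetProcAddress("eglSwapBuffersWithDamageEXT");

        if (swap_with_damage) {
            EGLint rect[4] = { x, y, w, h };  /* one damage rect */
            swap_with_damage(dpy, surf, rect, 1);
        } else {
            eglSwapBuffers(dpy, surf);        /* fall back to a full swap */
        }
    }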


Cool, thanks!

As far as I know, windows on iOS are similar to fullscreen windows on Mac, as in each app has its own surface where it renders, and SpringBoard has a window which serves as a host for BackBoard to draw.


And triple buffering never means more latency than double buffering with vsync. Unless you're talking to Microsoft, who didn't have real triple buffering for a very long time so they took to using the term to refer to a FIFO queue of rendered frames.


Samsung Galaxy phones are single buffered when in VR mode. They 'race' the beam by rendering to the left-eye buffer while the right eye is being scanned out, then switch.

Just a neat fact.


"They 'race' the beam by rendering to the left eye buffer while the right eye is being scanned out."

Doesn't that mean they have no framebuffer?

My understanding of "racing the beam" is that it is only required when you have no framebuffer at all - you are literally cramming the bits, in real time, onto the display circuitry.

If you have even a single framebuffer, you write to that and it handles painting the actual pixels to the display circuitry.


"Race the beam" can have multiple usages. Its usage makes sense here, IMO.

For the Gear VR, the scan-out circuitry operates independently and reads the bits out of a buffer in memory. There is still a buffer.


So if rendering takes too long, you see a partially rendered frame?


In theory, yes. In practice, they detect that and quickly show the previous frame reprojected.


Some things that are specific to laptops are tested and well covered in this YouTube video by LinusTechTips: https://www.youtube.com/watch?v=1z01hT2yy0g


I wouldn't be surprised if there's a "fast path" for this in the hardware - "decode H264 video to the entire screen" seems like a logical target for optimization.


Apple specifically documents this for iOS.

https://developer.apple.com/library/content/documentation/Pe...


There used to be a function where you paint an area with a color key - bright magenta or nasty cyan - and the graphics card would directly render decoded mpeg2 into that region. If you moved your media player fast enough, you'd see magenta seams, as the rendering would lag one frame. At least with some cards or drivers, the area was not clipped properly, and you would see artifacts if the mask color appeared in other windows (or a window using the color was on top of the video).

I think this fell out of use when people moved to the "compose everything" rendering scheme.


Yup, I remember on Win98 when I was playing video I could open Paint's color picker, target it at the Windows Media Player window, move Paint over the top of WMP, do a flood-fill with my new color, and voila, Paint was playing my movie :D


Hardware accelerated decoding is enabled even in windowed players. On Macs, especially when using QuickTime, the only difference between fullscreen and “windowed” windows is the size of the surface and the hidden Dock and menu windows.


Full screen video could skip most of the other composition that needs to happen, though. So while the video decoding itself may not be any different, it certainly won't have to go through drawing everything else that would otherwise be on the screen.

Edit: as the sibling noted too, they could also de-prioritize other apps since presumably you're not really using them.


Yes, that is what I said in this thread in the first place; in fullscreen, the window manager needs only draw the player window.


I wouldn't be surprised if the trick is the resolution it is decoding to. You could test this by picking window sizes related to the full-screen resolution.

It could also be something that simply gives the movie larger slices of CPU time and clocks down when in full screen.


That makes little sense. When decoding to a fullscreen canvas on a retina display, the 1080p video needs to be rescaled from 1920x1080 to the retina resolution, whereas in windowed mode it is 1:1 pixel mapped (in the case of QuickTime X).


I was expecting they'd optimised there. I was also just throwing out ideas. :)

I assumed conveniently sized windows for the viewer, not native-resolution ones.


Actually, the size does not really matter, since the scaling will be done in hardware on the GPU if possible, and scaling in hardware (or shaders) is not a very intensive task.

For video playback you have a first step on the GPU, which is decoding the video (e.g. from H.264 into uncompressed video). From there you can go several ways, e.g. read the video back into main memory, do post-processing on the CPU and blit it back onto a surface somewhere. This is not really what one wants for good performance. Instead, all necessary post-processing (including scaling and other things like colour-space conversion) can also be done directly on the GPU after decompression. Again, after that step the output can be read back, or it can be forwarded directly towards the video output. For the best performance you go all the way on the GPU. This works in fullscreen mode as well as in windowed mode (you can instruct the GPU to use only a part of the screen for the video).

I had a task last year where I had to use GStreamer for video rendering, and using various configurations I could see their impact quite clearly. E.g. CPU-based post-processing (through the videoconvert pipeline plugin) was very slow compared to the GPU hardware-based one (vaapi) on the Intel Atom box.
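
Roughly, the two kinds of configuration look like the pipelines below (just a sketch via gst_parse_launch, not the exact setup; element names are the stock GStreamer/gstreamer-vaapi ones, "input.mp4" is a placeholder, and error handling is omitted):

    /* Hedged sketch: CPU vs. GPU (VA-API) post-processing paths in GStreamer. */
    #include <gst/gst.h>

    int main(int argc, char **argv) {
        gst_init(&argc, &argv);

        /* CPU conversion path (videoconvert), slow on an Atom-class box: */
        const char *cpu_pipeline =
            "filesrc location=input.mp4 ! decodebin ! videoconvert ! autovideosink";
        (void)cpu_pipeline;  /* swap in below to compare */

        /* All-on-GPU path: VA-API decode, post-processing and output: */
        const char *gpu_pipeline =
            "filesrc location=input.mp4 ! qtdemux ! h264parse ! vaapih264dec "
            "! vaapipostproc ! vaapisink";

        GstElement *pipeline = gst_parse_launch(gpu_pipeline, NULL);
        gst_element_set_state(pipeline, GST_STATE_PLAYING);
        g_main_loop_run(g_main_loop_new(NULL, FALSE));  /* Ctrl-C to stop */
        return 0;
    }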


Again, I was mainly throwing ideas out. But you are also conflating "does not" with "should not." Clearly something is different. What?


There are many possible reasons for the lower power consumption. E.g. the OS could schedule processes other than the foreground window less often in fullscreen mode. Or it even detects that the user is watching video and puts the CPU in a very low power (low frequency) mode. Or it's the overhead of the desktop environment and compositor. I have no idea.

I just wanted to give an explanation of why I don't think that the video scaling makes a huge difference.


And that makes perfect sense. These are all hypotheses to be ruled out at this point, though. Indeed, your "schedule processes" one is essentially the other idea I threw out at the beginning. My hunch is that that's more likely the deal. I'd have to eliminate both through testing, not reasoning, though. Reasoning alone cannot, and will not, explain why something behaves against expectations.


Desktop composition.


I remember being amazed the first time you could have a QuickTime player window playing live video while being slowly genie-effected to the Dock if you held down the shift key as you minimised the window.


Yes!


"OS X v10.6 and later automatically optimize the performance of screen-sized windows, allowing your application to take complete advantage of the window server environment on OS X. For example, critical operating system dialogs may be displayed over your content when necessary."

https://developer.apple.com/library/content/documentation/Gr...


My guess would be that macOS shuts down all the additional composition features when full screen is activated so no need for live previews (or even rendering at all) of other windows, all those pretty bokeh effects, etc. Plus it probably means they can switch to a more efficient hardware decoding configuration as they can dump directly to the screen rather than into an application window?

I am just guessing though. Interesting that the difference is over 2 hours though, I wouldn't have guessed it would be that much.


Two best explanations:

1. Screen brightness can be dimmed in full screen mode without a perceivable difference.

2. App nap. It's a macOS feature that dials down the power consumption of apps when their windows aren't visible.


App nap is amazing.

Mac OS X reached an astounding level of technical goodness in Snow Leopard. Downhill since then. When my Macbook died I bought a refurbished, shitty Chromebook. Living off Azure Notebooks for any kind of computing...


What? App nap was added many releases after Snow Leopard.

Edit: 3 releases actually. (Mavericks)


Sorry, I meant memory compression.

Memory compression is just awesome.


Memory compression was also added in Mavericks.


Then... what was it that dramatically increased my battery life from Leopard to Snow Leopard?

(I might be misremembering everything.)


OSX is still pretty great. After Snow Leopard, for a while it looked like OSX was going to turn into "iOS Desktop Edition", starting with Lion. But that never really happened; Sierra is a very nice OS, and High Sierra is basically the same thing with the OS using Metal.

(Now if only Apple would sell a MacBook or iMac with a 512 GB SSD & 16 GB RAM for a reasonable price.)


App nap is really great, I agree. But you know you can still run Snow Leopard, right? You could get an older pre-retina MBP for like $400 to run it at about the same speed as a Chromebook will run ChromeOS.


Much less secure though.


I'm wondering whether this live blur effect might have played a big part here. Since it's full screen, macOS doesn't need to render all these fancy effects. I'd like to see similar tests with live blur turned off.


Of course. If you disable it in settings, you should see a noticeable boost in battery life (~1–1.5 hours in my testing).


I wonder if turning off the blur/transparency effects in iOS results in any substantial battery difference...


It does, yes, but less substantially than on the Mac, due to tighter vertical integration between software and hardware.


Can you be more specific? What exactly do I need to disable to see a battery life boost?



You need to enable the linked options which in turn disable some visual effects.


Forgive my ignorance here (new to Mac), but which setting is this specifically?


In System Preferences, under Accessibility -> Display, there are two settings of interest: “Reduce motion” and “Reduce transparency”. The latter is the more important.


Try System Preferences -> Accessibility -> Reduce transparency. This will also make the OS more responsive (minimizing, switching windows, ...).


Are these effects active also in "steady state use"?


Normally, yes -- a bit like when a game is displaying a seemingly 'static' image (when you don't move your character), the GPU is still rendering at X frames per second.


What?! That sounds like something I would write as an amateur. I thought that modern graphics systems would be smart enough to only render when the screen has actually changed. I.e. a tick still comes 60 times a second, and it checks whether a region is dirty, but the whole copying of textures and re-compositing (which practically doesn't take time - or rather takes fixed time - and runs on the GPU, but costs power) is only invoked when something is dirty.

Actually, I remember reading somewhere that Windows doesn't do any blitting / buffer swapping if you just leave it idle and no updates happen on screen. However, there are many ways to throw this off and force drawing every frame if you are not careful.
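
What I had in mind is conceptually something like this per-vsync check (a purely illustrative sketch, no real window-server API):

    /* Hedged conceptual sketch of a damage-driven compositor tick:
       skip all GPU work and the buffer swap when nothing changed. */
    #include <stdbool.h>

    struct frame_state {
        bool dirty;   /* set whenever a window reports new content */
        /* ... damage rects, window list, etc. ... */
    };

    void on_vsync(struct frame_state *fs) {
        if (!fs->dirty)
            return;   /* idle screen: no compositing, no power spent */

        /* composite_damaged_regions(fs);   placeholder for the real work */
        /* present(fs);                     swap/flip the final frame     */
        fs->dirty = false;
    }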


Yikes, that would indeed be the first thing to fix on a battery-powered platform then.


That's pretty neat.

The thing that stands out to me like a sore thumb though is the 'load' numbers. You only get 1 hour of battery life.

I know modern chips can be hogs, but that's terrible, especially compared to the 11 hours of light surfing on Wi-Fi.

This is the problem with optimizing for power sipping all over the place. As soon as you push things, none of that matters and your battery time PLUMMETS.

I don't know what they could do though outside of doubling the battery capacity.


Stop optimizing for pretty and add more battery. My laptop is a tool, not a piece of art.


How many users want their laptop to be less portable? Every quarter pound does make a difference in real world convenience.

You're already giving up performance if you choose a laptop over a desktop computer as your "tool", and probably using some form of cluster if you're doing any heavy computation.

Sometimes reading pros talking about how they need to do pro computing on their pro computer, you wonder whether they would get even more satisfaction from a bag of external batteries and lots of serious-looking accessories.

If they're running complex models somewhere off the grid, a large solar array and some car batteries would look even cooler!


Aside from the tone, the thought that the 'pro' series should follow the older body type and optimize for battery and performance is not the craziest thing. If you want a console to hit your cluster or work machines, get the non-pro line.

There are worlds where people need self-contained mobile compute.


How portable is your laptop if you can't use it for more than 30 minutes without an electrical socket? A quarter pound is an insignificant amount of weight. Most people would probably quickly get used to having to carry around 5 extra pounds of weight in their daily bag/backpack/whatever. Also, there is a massive range of computing power between "I can barely use gmail because my browser is laggy" and needing a compute cluster; and a massive difference between being able to perform your daily computing tasks smoothly vs. with noticeable lag that adds friction to everything you try to do. Your dichotomies are false and hyperbolic.

I think a significant proportion of people who are even moderate computer users would prefer increased battery life and snappy performance at the expense of a slightly bulkier laptop.


There are FAA limits on how big a battery can be if you want to take it on a plane. Your best bet is probably an external battery.


The FAA limits allow up to 160 Wh within a device.

A ThinkPad T430 with the 9-cell plus a Slice battery pushes that to 188 Wh, and no one seems to question it. You can also bring spare batteries (not the same as an external battery) to plug into it.

The MacBook Pro is not pushing these limits. They have a glued-in, non-replaceable 49.2 Wh battery.

Your best bet at this point may be an external, but users would be better off if manufacturers offered machines with large, replaceable batteries.


Apple used to stick 99 Wh batteries in the 15" Pros, but with the last 2 generations they downgraded that to 76 Wh batteries (presumably since their battery life targets are based on Safari browsing and QuickTime playback - or at least that's all they advertise).


Yes, I'd be curious to know why they don't push the limit anymore.

But in any case, we're talking a maximum of 30% improved battery life that you can get by going bigger. Better efficiency when consuming power is likely to make a larger difference.


Yeah, but not everyone flies, not everyone flies into/from the US, and the current MacBook, at 76 Wh (IIRC), is below both the rumored 99 Wh and the actual 160 Wh limit.

And for the love of god I don't understand why a pro notebook does not have swappable batteries [1]. It would add a millimeter and a few dozen grams, but allow you to choose between slim and fat batteries, and to replace them when they lose capacity. You can get rid of ports and tell people it is for the best, and they will buy it. Surely you can make it a bit thicker and use the same explanation.

And here is an idea for a business. You know how companies like Westfalia and Karmann take VW buses and modify them? Take a new MacBook, saw open the case at the bottom, add an aftermarket battery, and, while you're at it, more RAM. Impossible for an individual, but maybe not impossible if you have a factory with all the necessary equipment. It's going to be expensive, but just sell those modded MacBooks to people who buy business class tickets and have enough money.

[1] OK, I know why - so people have to buy a new one when the batteries go bad.


The limit on li-ion batteries is 100 Wh as standard, and 2x160 Wh in carry-on with airline approval.

The 2017 15" MBP has a 76 Wh battery.


It seems like the issue with manufacturers doing this is that optimizing for battery has diminishing returns (see the drop from 11 hours under light load to barely one under heavy load), and, most importantly, people buy pretty.


You'll have to buy something other than Apple if you want that.


I just buy only older refurb Apple gear.


At some point, the hardware is no longer supported by the OS though.


My 2010 15 inch MBP is still getting OS updates. Installed 8GB of memory and an SSD a few years back. Other than the fact that 1.5 hours is about all I can get from the battery, I haven't been compelled to upgrade.


Sure, a 2-year-old MacBook isn't that old. I think they support them for 4 or 5 years, so you're about halfway through its lifetime there. Good for now, but not really sustainable long term.


Keep this battery trickle-charging in your car?

$699, 412 Wh / 116,000 mAh / 11 pounds

https://amzn.com/dp/B071FG4G13


This. Four years or so ago, the MacBook Pro was my go-to laptop for serious work.

Note the "Pro" in the title (and, yes, I bang on about this a lot when provoked).

Apple, please: I work on batteries a lot. Start putting in BIGGER batteries, along with better connectivity (yes, I would like standard USB 2 and 3 connectivity without requiring dongles for, like, most things I want to plug in; yes, I would like an HDMI port or two built in so I don't need to carry dongles; yes, I would like a wired ethernet port because it's still a lot better than WiFi and, again, doesn't need yet another dongle; yes, I do require line in audio built in to the damn box, not just line out, and don't be forcing me to use a dongle to plug in a pair of headphones or I will become militantly angry).

Also, whilst I'm at it, more cores, please: 8, minimum. More memory as well. 32GB minimum, ideally 64GB. And 2 - 4TB storage.

I know, it'll weigh more: I don't care, I'm not puny. Even still, you used to be able to build decently powerful machines with good battery life without building the kind of ridiculous lumps that Dell, et al., shovel out the door as "desktop replacements". Please, remember how to do this, and stop building toy computers for undernourished hipsters with chronic muscle atrophy.


The thing is - they could do all this and ... about 250 people would buy it. Maybe 1000. To break even at that level of production, they'd probably have to charge 5 times what people would want to pay. Meanwhile they have tens of millions buying their "toy computers". Where would you focus your efforts?


> about 250 people would buy it. Maybe 1000

Lolwut, companies would INSTANTLY go back to the older MBPs if they could. Do you know how much replacing lost/misplaced/stolen/broken adapters costs?!

The only thing companies can save on with the new USB-C MBP is power bricks, because once the cable inevitably fails you only have to get a new cable instead of a whole new brick.


Indeed. I'm not sure why my comment is getting so heavily downvoted to be honest.

The fact is that a new hire has specifically asked me to buy him an older Macbook Pro in preference to the new touch-bar variant primarily because of the lack of decent ports on the new model and the hassle this causes with dongles (especially when they go missing, which they invariably do).


If that's the case, then who are all these people buying the new MBPs in record numbers?


Fake users!


Hipsters buying MBPs because they're cool. Or those who had to upgrade from their beloved 2012+ machines due to HW failure.


Really? You can't really believe that that's true...


It's cool, they don't have to: they can do whatever they want obviously, but I'd argue that whilst abandoning the professional market will undoubtedly bolster profit now, in the long term... well, those chickens are going to come home to roost.


Seriously? Again with the downvoting? Can you at least add a comment as to why, please?


It might be because, historically, claiming that Apple are doomed in the future because they're not servicing $SOMESMALLGROUP has been a losing strategy.


And that's a point of view with some merit. However, Apple's strategy at the moment (and it's obviously been going this way for a long time) appears to be somewhat predicated on being a fashion brand. And that's OK, but fashions change. They've been doing a pretty good job so far, but let's say they become unfashionable, whilst already having alienated a market sector that values a laptop as a tool first and an accessory second.

Owning Apple has become a social signal. The problem is these things lose their power when a large number of people are sending the same signal.

Don't get me wrong: I'm not above buying expensive fashion items, but it doesn't change the fact that to me a laptop is primarily a tool. The reason I switched to Apple 6 years ago is because it was a better tool for what I wanted to work with (audio), but I got suckered in by the power, stability and battery life, and found I could run Windows as much as I needed it using Parallels... so most things have moved to OSX.

The thing is, 6 years is a long time and that situation is changing. Apple are no longer releasing laptops that have scaled in capability to meet the demands of my workload because of their increasing focus on, well, being a fashion item I suppose.

So, anyway, in the next year or two I'll probably be looking for a high-specced Linux laptop to replace my Macbook Pro, which doesn't sound like a heck of a lot of fun to me.


> However, Apple's strategy at the moment [...] appears to be somewhat predicated on being a fashion brand.

See, I think if that's where you're starting from, you've already lost the point. Their strategy appears to me to be "stop making gadgets for technoweenies and make them for the other hundreds of millions of people". If 99% of your potential customer base are never going to plug a HDMI cable into a machine, why bother with the machining and cost of having a HDMI port?

> Apple are no longer releasing laptops that have scaled in capability to meet the demands of my workload

Because there's orders of magnitude more money in servicing everyone else's needs, unfortunately (or fortunately, I suppose, if you have shares in Apple).


It is not really surprising. All the window composition with frame-buffer effects (such as multiple layers of blur) takes a toll.

Enabling “Reduce transparency” and “Reduce motion” in accessibility preferences also causes a considerable battery gain on Macs.


I think the snark is unnecessary.

This is surprising because

1) The time - it's a full 2 hours and change - is not nothing.

2) It's not immediately obvious to many active computer users.

I know it's cool to pretend like nothing is surprising these days, but this is something that's legitimately surprising to many people.


As they note at the end, this only applies to OS X with QuickTime, but not to Windows 10 on the same hardware.

So while it may be an obvious optimization to some people it doesn't seem to be implemented everywhere it could be. Or the DWM is more efficient than OS X's window composition stuff and there's no difference between windowed and fullscreen modes. Either option probably will annoy some fanboys one way or the other.

One thing that they may have done is to simply change the refresh rate of the display to match the played video's (akin to GSync/Freesync) in fullscreen mode. You can't do so in windowed mode since it will affect other applications, but when a fullscreen video is shown you can safely do so and it may save a bit of power.


> As they note at the end, this only applies to OS X with QuickTime, but not to Windows 10 on the same hardware.

> > Using the Windows Media Player in Bootcamp, power consumption was at a much higher 28.2 W, and we found absolutely no difference between regular and full screen playback.

Windows was less efficient overall, just with no difference in full-screen.


They should have tested with the 'Movies and TV' app that is now the default video player in Windows 10. From benchmarks I recall seeing, it's by far the most battery-efficient video player on the Windows platform.

edit: http://www.pcworld.com/article/3023430/hardware/tested-vlc-v...


Windows Media Player is actually deprecated on Windows 10 (2015), and you have to make special efforts to use it.

Some info on UWP power usage: https://www.reddit.com/r/Windows10/comments/4iol5z/do_uwp_ap...


Interesting, and I'm curious why it's like that. The only thing being tested is 4K video, though. More tests are needed beyond 4K.

Case in point: I don't own a mobile Windows device so it doesn't matter to me. On my MBP (2880 x 1800) 4k wouldn't apply either. I'd run either XviD or DVD.


DWM is gimped compared to Windows Vista/7 for battery life reasons. That’s why blur (Aero theme) went away in Windows 8.

Apple started implementing variable refresh in iPads, but not on their Macs as far as I know.


Well, yes, it really is surprising that you can go from 8.5 hours of battery life to almost 11.5 hours just by going from a windowed video to full screen.

Apparently only you and Apple know the secret to this because similar battery improvements weren't seen when trying this trick in Windows on the same machine. Microsoft should hire you.


Microsoft crippled their DWM and cut out (as in, removed the code) all the Aero effects (blur). Compare how Windows 7 looks and how Windows 8 and 10 look, and you will see. It was done, partly, for battery life reasons.

So… it seems, in addition to Apple and me, Microsoft has also figured out the secret to having a battery life–conscious window manager.


"Crippled" is a weird word to choose here. They made a design decision to prioritize battery life over fancy graphics. I don't particularly care about the fancy graphics, so that's a clear win for me. Is there something more I'm missing to justify the claim of "crippled" software?


Giving a choice is always best. A lot of people wanted this effect, including enterprising developers who ported portions of Win7’s DWM.EXE & DLLs to make it work.


Not only that, but one that can handle HiDPI scaling in a way that doesn't make you want to rip your hair out.


Actually, on a technical level, at anything but the native scale (in display settings), Apple's approach is more computationally heavy, as they render a larger texture and then rescale it to the native pixel size. On Windows, in a perfect world, a non–multiple scale should still render at native resolution.


It's worth the trade off.


Ancient Unix Secret[1].

Maybe it is related to "unredirect fullscreen" optimizations many Linux desktops employ. The people saying they don't get similar improvements in Windows by going fullscreen on the same hardware should run the test again with all the Windows 10 graphical improvements disabled.

[1]http://losca.blogspot.com/2012/11/game-performance-improveme...


It takes a toll, but not a 2 hour difference kind of toll. The OS has to be equipped to take advantage of the state for it to be that effective.

Does the same test run on W10 or any Linux distro show the same? I highly doubt it.


On Linux, if you use a desktop environment that has many GPU-heavy effects, you will see a worse drop off.

On Windows 10, it doesn’t compare, as the most heavy effect DWM has to deal with is alpha blending (status bar transparency). Microsoft dropped all of Aero to save battery life. We’ll see how it fares once the new design elements (with blurred backgrounds) start appearing in future betas.


So is there a guide to battery-optimal MacOS settings? That would be useful.



> Use Invert Colours to save battery life. It takes less battery to display black pixels than white ones.

There's a lot of junk and mostly obvious stuff in the second article.


The tips above (reduce motion and transparency) also work great on iOS devices. Disabling AppNap is also important on battery. On iOS, the system does a lot when entering battery saving mode.


Exactly. I would be disappointed if this wasn't the result.


>Is it really that surprising?

Yes.

>All the window composition with frame-buffer effects takes a toll.

Which would be relevant if they were moving windows around, instead of just watching a movie in full screen mode vs typical window mode. So not much (if any) composition and no frame-buffer effects in place. At worst, the tiny rect with the clock on the menubar was changing.


What I would love to see is whether this feature is unique to the QuickTime player, and if it is, whether it can be ported to something like VLC.


Would this not be down to the GPU? If it's told to use full screen without windowing, then it'll be more efficient.


Yes, this isn't the only bloated window manager that gets more efficient when it only has to draw one app without a window.


Do you see the same or similar results browsing in windowed vs fullscreen modes?

(I rarely watch video on battery in a window, but I might start browsing fullscreen on battery if there's a benefit)


This is a feature that has been called out by Apple in the past with regard to battery life. Even iPhones get better battery life when just doing video as opposed to other things. In part, it's because of specialized hardware for decoding. Even the headline here is misleading. It's not full screen mode. It's explicitly using features Apple has made available for some time now.

That this is surprising is really interesting. I honestly thought this was a known thing.


The 2+ hour difference is comparing playing video windowed and fullscreen, not video to other tasks.


Yes, I know (and it's not just video fullscreen, it's video in QuickTime). And as I said, I thought this was known. Apple has called this out. I mean, did people really think that merely playing a video extended battery life? Or did they just forget about those features that Apple talked about?


I am absolutely not surprised that almost all Apple users have failed to notice the massive CPU usage of the window manager.


This might sound like a plug, but I found it very useful/related: when searching for a utility to control App Nap (suggested here) and actually get more control over that, I found the App Tamer utility. It looks like I can now throttle the CPU usage of background tasks like Chrome/Slack, etc.

I do a ton of road-warrior stuff on my old MacBook, and I've been struggling with a battery that craps out in under 2 hours due to all the background junk in macOS. This looks to be super useful to me; hopefully others will find the same.

https://www.stclairsoft.com/AppTamer/


This isn't news; App Nap existed in macOS back when it was still called OS X.


I read a fair number of the comments. Most of the guesses/explanations on why there's a difference make sense. But over 2 hours? That's a significant dip in power consumption that doesn't seem to be explained by most of the comments. Of course, I could be wrong. I'm not an internals junkie. (Sorry?)


Should be noted that this testing appears to have been done playing video in full screen mode. macOS has a full screen mode that any app can take advantage of; it would be good if someone could test that as well.


Very click-bait title. It sounds like a gain in any use case, whereas in reality it's just video playback :((


That strongly suggests that Apple's window manager is suboptimal. Is it not?


OK, so they squeezed out some more battery life for watching Jackass: The Movie.

How about they figure out the combination that leads to consistent kernel panics when plugging in/unplugging external monitors?


> Exciting: the MacBook Pro's video playback runs 135 minutes longer in full screen mode - the length of an entire feature film.

A feature film is anything longer than 40 minutes. This idea that movies need to be 90 or 120+ minutes long is absurd.


Dude, don’t be that guy. You know what they meant and everyone understood it.


I just made a film, it's a really common misunderstanding that films need to be 90+ minutes long to be a feature film.


Ok, right, I know. I've made films myself - shown them at festivals around the world, including Cannes - and I'm aware. It's true; you're correct on the definition.

The reason people are reacting negatively is not because of the truth value of your point: it’s because your point was pedantic and irrelevant.

Is that time the length of a feature film? Yes. Many of them. Does that mean all feature films have to be that long? No. Are most feature films the general public encounters ~90-120 mins? Yes.

So you’re not adding to the discussion. You’re being that “well, actually...” guy.

I’m not trying to rag on you to be mean, I’m trying to help you understand why it came across poorly.


I thought it was an interesting and relevant comment. I guess people disagree. However, all the people replying telling me such just makes this place feel hostile.

> I’m not trying to rag on you to be mean

Then downvote and move on, don't reply "Dude" :)

I'd be curious to see your work if you have an IMDb link though. If you're going to reply with anything other than a link to your work, please don't.


I prefer trying to be constructive to downvotes, YMMV.

The only thing that made it into IMDb was the animated short that got into Cannes. I’m well out of that area now.

It’s called “Ignorance is Bliss”, if you’re interested.


I apologize for accusing you of downvotes. I'll check it out.


The reason you are being downvoted is that you focused on the one thing nobody could give a crap about in the article. We are talking about battery life, not the definition of what a "feature film" is or isn't.


I'm aware, but it's not something nobody could give a crap about.


You cut off the end of parent's sentence. It is "something nobody could give a crap about in the article."


It was something in the article I could give a crap about, and apparently enough for everyone else to give me crap about, so...


I just made a 40 minute film of my dog, is it a feature film?


Yep. For a series of shorts (less than 40 mins) starring just dogs look at The Dogville Comedies -- https://en.wikipedia.org/wiki/Dogville_Comedies




