If you're a gamer who plays on sub-optimal hardware, chances are you already know this. In general, full screen mode tends to perform better than windowed mode. Windowed mode means the OS has to render and handle logic for all the window-y stuff, e.g. shadows, borders, close buttons, menus. In Windows 7 I recall a perf win was to turn off the Aero translucent window decorations, and an even bigger perf win was to simply go full screen with your game.
> Double buffering is always enabled; there is no such thing as “single buffering” in modern computer graphics.
Technically there is; it's called front-buffer rendering and there are specs for it (look up EGL_SINGLE_BUFFER or EGL_KHR_mutable_render_buffer). The only practical usage of it, and the only reason the spec exists at all, is for time-warping in VR, where the warp is done race-the-beam style, so to speak, on a per-eye basis (as in, while the left eye is being scanned out, do the time warp for the right eye).
Granted this is unrelated to OS desktop composition, but hey, the more you know :)
In terms of OS desktop composition, windowed vs. fullscreen is actually something mobile solved, or at least Android did; I don't know whether iOS is the same. By shifting final composition to the hardware compositor and using extensions such as EGL_EXT_swap_buffers_with_damage, only the area of the screen that was damaged gets updated and re-composited. Desktop has been pretty stubborn and slow to adopt similar systems; Intel only recently added a HWC fast path at all, and only for video playback.
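To make the damage idea concrete, here's a toy sketch (purely illustrative; `composite` and its pixel-dict representation are invented for the example and look nothing like Android's real HWC) showing how re-composition cost scales with the damaged rectangle instead of the whole screen:

```python
# Toy model of damage-aware composition: only pixels inside the
# damaged rectangle are re-composited; the rest of the previous
# frame is reused as-is.

def composite(dest, src, damage):
    """Copy only the damaged region (x, y, w, h) from src into dest.

    dest and src map (x, y) -> pixel value. Returns how many pixels
    were actually touched.
    """
    x0, y0, w, h = damage
    touched = 0
    for y in range(y0, y0 + h):
        for x in range(x0, x0 + w):
            dest[(x, y)] = src[(x, y)]
            touched += 1
    return touched

# A full 1920x1080 re-composite touches ~2 million pixels; a damaged
# 200x100 region (say, a blinking cursor) touches only 20,000.
```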
As far as I know, windows on iOS are similar to fullscreen windows on Mac, as in each app has its own surface where it renders, and SpringBoard has a window which serves as a host for BackBoard to draw.
And triple buffering never means more latency than double buffering with vsync. Unless you're talking to Microsoft, who didn't have real triple buffering for a very long time so they took to using the term to refer to a FIFO queue of rendered frames.
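The difference can be shown with a toy simulation (a hypothetical model, not any vendor's actual code): with a FIFO queue the display shows the oldest pending frame, while real triple buffering shows the newest completed frame and drops stale ones, so its latency can never be worse:

```python
from collections import deque

# Toy simulation: frames finish rendering at the listed timestamps;
# at each vsync the display picks one completed frame. A FIFO queue
# (the Microsoft flavor) shows the oldest pending frame; real triple
# buffering shows the newest and discards stale frames.

def display_latencies(frame_times, vsyncs, policy):
    frames, pending, latencies = deque(frame_times), deque(), []
    for v in vsyncs:
        while frames and frames[0] <= v:
            pending.append(frames.popleft())
        if not pending:
            continue
        if policy == "fifo":
            shown = pending.popleft()   # oldest frame wins
        else:
            shown = pending.pop()       # newest frame wins
            pending.clear()             # stale frames are dropped
        latencies.append(v - shown)
    return latencies

# Renderer finishing frames faster than the refresh rate:
frames = [1, 5, 9, 13, 17, 21]
vsyncs = [10, 20, 30]
fifo_lat = display_latencies(frames, vsyncs, "fifo")      # [9, 15, 21]
triple_lat = display_latencies(frames, vsyncs, "triple")  # [1, 3, 9]
```

With the FIFO queue the backlog keeps growing and so does latency; with latest-frame-wins it stays bounded by one frame's render time plus the vsync interval.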
Samsung Galaxy phones are single buffered when in VR mode. They 'race' the beam by rendering to the left-eye buffer while the right eye is being scanned out, then switch.
"They 'race' the beam by rendering to the left eye buffer while the right eye is being scanned out."
Doesn't that mean they have no framebuffer?
My understanding of "racing the beam" is that it is only required when you have no framebuffer at all - you are literally cramming the bits, in real time, onto the display circuitry.
If you have even a single framebuffer, you write to that and it handles painting the actual pixels to the display circuitry.
I wouldn't be surprised if there's a "fast path" for this in the hardware - "decode H264 video to the entire screen" seems like a logical target for optimization.
There used to be a function where you paint an area with a color key - bright magenta or nasty cyan - and the graphics card would directly render decoded mpeg2 into that region. If you moved your media player fast enough, you'd see magenta seams, as the rendering would lag one frame. At least with some cards or drivers, the area was not clipped properly, and you would see artifacts if the mask color appeared in other windows (or a window using the color was on top of the video).
I think this fell out of use when people moved to the "compose everything" rendering scheme.
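The color-key mechanism described above can be sketched as a toy model (the `scanout` function and the pixel lists are invented for illustration, not any card's real behavior):

```python
MAGENTA = (255, 0, 255)  # a typical overlay key color

def scanout(desktop, overlay, key=MAGENTA):
    """Toy model of key-color overlay: at scanout time, wherever the
    desktop framebuffer holds the key color, the hardware substitutes
    the decoded video pixel instead."""
    return [video if pixel == key else pixel
            for pixel, video in zip(desktop, overlay)]

# Any region painted with the key color, in any window, shows the
# video, which is exactly why the magenta seams were visible when a
# player window moved faster than the overlay tracking.
```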
Yup, I remember on Win98 when I was playing video I could open Paint's color picker, target it at the Windows Media Player window, move Paint over the top of WMP, do a flood-fill with my new color, and voila, Paint was playing my movie :D
Hardware accelerated decoding is enabled even in windowed players. On Macs, especially when using QuickTime, the only difference between fullscreen and “windowed” windows is the size of the surface and the hidden Dock and menu windows.
Full screen video could skip most of the other composition that needs to happen, though. So while the video decoding itself may not be any different, it certainly won't have to go through drawing everything else that would otherwise be on the screen.
Edit: as the sibling noted too, they could also de-prioritize other apps since presumably you're not really using them.
That makes little sense. When decoding to a fullscreen canvas on a retina display, the 1080p video needs to be rescaled from 1920x1080 to the retina resolution, whereas in windowed mode it is 1:1 pixel-mapped (in the case of QuickTime X).
Actually, the size does not really matter, since the scaling will be done in hardware on the GPU if possible, and scaling in hardware (or shaders) is not a very intensive task.
For video playback you have a first step on the GPU, which is decoding the video (e.g. from h.264 into uncompressed video). You can from there go several ways, e.g. read the video back into main memory, do post-processing on the CPU and somewhere blit it back on a surface. This is not really what one wants for good performance. Instead of that all necessary post-processing (including scaling and other things like color-space conversion) can also be directly done on the GPU after decompression. Again, after that step the output can be read back or it can be directly forwarded towards the video output. For the best performance you go all-the-way on the GPU. This works in fullscreen mode as well as in windowed mode (you can instruct the GPU to use only a part of the screen for the video).
I had a task last year where I had to use gstreamer for video rendering, and using various configurations I could see their impact quite well. E.g. CPU-based post-processing (through the videoconvert pipeline plugin) was very slow compared to the GPU hardware-based one (vaapi) on the Intel Atom box.
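The two paths described above can be sketched with stand-in functions (illustrative only, not a real API; `decode`, `convert`, and `scale` are toy stand-ins for the actual pipeline stages):

```python
# Toy stand-ins for the playback pipeline stages: decode,
# color-space conversion, and scaling.

def decode(bitstream):            # e.g. H.264 -> raw frames (toy unpack)
    return list(bitstream)

def convert(frame):               # color-space conversion (toy offset)
    return [p + 16 for p in frame]

def scale(frame, factor):         # upscaling (toy pixel repeat)
    return [p for p in frame for _ in range(factor)]

def gpu_path(bitstream, factor=2):
    # Everything stays on the GPU: decode -> convert -> scale.
    return scale(convert(decode(bitstream)), factor)

def readback_path(bitstream, factor=2):
    # Same result, but with two extra full-frame copies, modeled
    # here as list copies: GPU -> main memory for CPU work, then a
    # blit back to a GPU surface before scaling.
    frame = decode(bitstream)
    cpu_copy = list(frame)        # read back into main memory
    cpu_copy = convert(cpu_copy)  # CPU post-processing
    uploaded = list(cpu_copy)     # upload back to the GPU
    return scale(uploaded, factor)
```

Both paths produce identical pixels; on real hardware the two extra full-frame copies (plus doing the conversion on the CPU) are what make the readback route so much slower.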
There are many possible reasons for the lower power consumption. E.g. the OS could schedule processes other than the foreground window less often in fullscreen mode. Or it even detects that the user is watching video and puts the CPU in a very low power (low frequency) mode. Or it's the overhead of the desktop environment and compositor. I have no idea.
I just wanted to give an explanation of why I don't think the video scaling makes a huge difference.
And that makes perfect sense. At this point, though, these are all hypotheses to be ruled out. Indeed, your "schedule processes" one is essentially the other idea I threw out at the beginning, and my hunch is that it's the more likely explanation. I'd have to eliminate both through testing, not reasoning, though. Reasoning alone cannot and will not ever explain why something behaves against expectations.
I remember being amazed the first time you could have a Quicktime player window playing live video and also being slow genie-effected to the dock if you held down the shift key as you minimised the window.
"OS X v10.6 and later automatically optimize the performance of screen-sized windows, allowing your application to take complete advantage of the window server environment on OS X. For example, critical operating system dialogs may be displayed over your content when necessary."
My guess would be that macOS shuts down all the additional composition features when full screen is activated so no need for live previews (or even rendering at all) of other windows, all those pretty bokeh effects, etc. Plus it probably means they can switch to a more efficient hardware decoding configuration as they can dump directly to the screen rather than into an application window?
I am just guessing though. Interesting that the difference is over 2 hours; I wouldn't have guessed it would be that much.
Mac OS X reached an astounding level of technical goodness in Snow Leopard. Downhill since then. When my Macbook died I bought a refurbished, shitty Chromebook. Living off Azure Notebooks for any kind of computing...
OSX is still pretty great. After Snow Leopard, for a while it looked like OSX is gonna turn into "iOS Desktop Edition", starting with Lion. But that never really happened, Sierra is a very nice OS, and High Sierra is basically the same thing with the OS using Metal.
(Now if only Apple would sell a Macbook or iMac with 512 SSD & 16 GB RAM for a reasonable price)
App nap is really great, I agree. But you know you can still run Snow Leopard, right? You could get an older pre-retina MBP for like $400 to run it at about the same speed as a Chromebook will run ChromeOS.
I'm wondering whether this live blur effect might have played a big part here. Since it's full screen, macOS doesn't need to render all these fancy effects. I'd like to see similar tests with live blur turned off.
In System Preferences, under Accessibility -> Display, there are two settings of interest: “Reduce motion” and “Reduce transparency”. The latter is the more important.
Normally yes -- a bit similar to e.g. when a game is displaying a seemingly 'static' image (when you don't move your character), then the GPU is still rendering at X frame/second.
What?! That sounds like something I would write as an amateur. I thought that modern graphics systems would be smart enough to only render when the screen has actually changed. I.e. a tick still comes 60 times a second, and it checks whether a region is dirty, but the whole copying of textures and re-compositing (which runs on the GPU and takes practically fixed time, but costs power) is only invoked when something is dirty.
Actually, I remember reading something that Windows doesn't do any blitting / buffer swapping if you just leave it idle and no updates happen on screen. However, there are many ways to throw this off and force drawing every frame if you are not careful.
How many users want their laptop to be less portable? Every quarter pound does make a difference in real world convenience.
You're already giving up performance if you choose a laptop over a desktop computer as your "tool", and probably using some form of cluster if you're doing any heavy computation.
Sometimes reading pros talking about how they need to do pro computing on their pro computer, you wonder whether they would get even more satisfaction from a bag of external batteries and lots of serious-looking accessories.
If they're running complex models somewhere off the grid, a large solar array and some car batteries would look even cooler!
Aside from the tone, I think the thought is that the 'pro' series should follow the older body type and optimize for battery and performance. That's not the craziest thing. If you want a console to hit your cluster or work machines, get the non-pro line.
There are worlds where people need self-contained mobile compute.
How portable is your laptop if you can't use it for more than 30 minutes without an electrical socket? A quarter pound is an insignificant amount of weight. Most people would probably quickly get used to having to carry around 5 extra pounds of weight in their daily bag/backpack/whatever. Also, there is a massive range of computing power between "I can barely use gmail because my browser is laggy" and needing a compute cluster; and a massive difference between being able to perform your daily computing tasks smoothly vs. with noticeable lag that adds friction to everything you try to do. Your dichotomies are false and hyperbolic.
I think a significant proportion of people who are even moderate computer users would prefer increased battery life and snappy performance at the expense of a slightly bulkier laptop.
The FAA limits allow up to 160 Wh within a device.
A Thinkpad T430 with the 9-cell with a Slice battery pushes that to 188 Wh, and no one seems to question it. You can also bring spare batteries (not the same as an external battery) to plug into it.
The MacBook Pro is not pushing these limits. They have a glued-in, non-replaceable 49.2 Wh battery.
Your best bet at this point may be an external, but users would be better off if manufacturers offered machines with large, replaceable batteries.
Apple used to stick 99 Wh batteries in the 15" Pros, but with the last 2 generations they downgraded that to 76 Wh batteries (presumably since their battery life targets are based on Safari browsing and QuickTime playback, or at least that's all they advertise).
Yes, I'd be curious to know why they don't push the limit anymore.
But in any case, we're talking a maximum of 30% improved battery life that you can get by going bigger. Better efficiency when consuming power is likely to make a larger difference.
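The ~30% figure follows directly from the capacities mentioned in this thread (99 Wh vs. 76 Wh), assuming runtime scales linearly with capacity and everything else stays equal:

```python
# Capacities quoted in this thread (Wh); runtime assumed to scale
# linearly with capacity, all else being equal.
old_capacity_wh = 99.0    # earlier 15" MacBook Pros
new_capacity_wh = 76.0    # current 15" MacBook Pro
faa_limit_wh = 160.0      # FAA in-device limit

gain = old_capacity_wh / new_capacity_wh - 1      # ~0.30, the ~30% above
headroom = faa_limit_wh / new_capacity_wh - 1     # ~1.11: the limit allows
                                                  # over 2x the current pack
```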
Yeah, but not everyone flies, and not everyone flies into or out of the US. The current MacBook Pro's 76 Wh (IIRC) is below both the rumored 99 Wh and the actual 160 Wh limit.
And for the love of god I don't understand why a pro notebook does not have swappable batteries [1]. It would add a millimeter and a few dozen grams, but allow you to choose between slim and fat batteries, and replace them when they lose capacity. You can get rid of ports and tell people it is for their best, and they will buy it. Surely you can make it a bit thicker and use the same explanation.
And here is an idea for a business. You know how companies like Westfalia and Karmann take VW buses and modify them? Take a new MacBook, saw open the bottom of the case, add an aftermarket battery, and while you're at it more RAM. Impossible for an individual, but maybe not if you have a factory with all the necessary equipment. It's going to be expensive, but just sell those modded MacBooks to people who buy business class tickets and have enough money.
[1] OK, I know why - so people have to buy a new one when the batteries go bad.
It seems like the issue with manufacturers doing this is that optimizing for battery has diminishing returns (going from 11 hours from light load to heavy load) and most importantly people buy pretty.
My 2010 15 inch MBP is still getting OS updates. Installed 8GB of memory and an SSD a few years back. Other than the fact that 1.5 hours is about all I can get from the battery, I haven't been compelled to upgrade.
Sure, a 2-year-old MacBook isn't that old. I think they support them for 4 or 5 years, so you're about halfway through its lifetime. Good for now, but not really sustainable long term.
This. Four years ago, give or take, the MacBook Pro was my go-to laptop for serious work.
Note the "Pro" in the title (and, yes, I bang on about this a lot when provoked).
Apple, please: I work on batteries a lot. Start putting in BIGGER batteries, along with better connectivity (yes, I would like standard USB 2 and 3 connectivity without requiring dongles for, like, most things I want to plug in; yes, I would like an HDMI port or two built in so I don't need to carry dongles; yes, I would like a wired ethernet port because it's still a lot better than WiFi and, again, doesn't need yet another dongle; yes, I do require line in audio built in to the damn box, not just line out, and don't be forcing me to use a dongle to plug in a pair of headphones or I will become militantly angry).
Also, whilst I'm at it, more cores, please: 8, minimum. More memory as well. 32GB minimum, ideally 64GB. And 2 - 4TB storage.
I know, it'll weigh more: I don't care, I'm not puny. Even still, you used to be able to build decently powerful machines with good battery life without building the kind of ridiculous lumps that Dell, et al., shovel out the door as "desktop replacements". Please, remember how to do this, and stop building toy computers for undernourished hipsters with chronic muscle atrophy.
The thing is - they could do all this and ... about 250 people would buy it. Maybe 1000. To break even at that level of production, they'd probably have to charge 5 times what people would want to pay. Meanwhile they have tens of millions buying their "toy computers". Where would you focus your efforts?
Lolwut, companies would INSTANTLY go back to the older MBPs if they could. Do you know how much replacing lost/misplaced/stolen/broken adapters costs?!
The only thing companies can save on with the new USB-C MBP is power bricks, because once the cable inevitably fails you only have to get a new cable instead of a whole new brick.
Indeed. I'm not sure why my comment is getting so heavily downvoted to be honest.
The fact is that a new hire has specifically asked me to buy him an older Macbook Pro in preference to the new touch-bar variant primarily because of the lack of decent ports on the new model and the hassle this causes with dongles (especially when they go missing, which they invariably do).
It's cool, they don't have to: they can do whatever they want obviously, but I'd argue that whilst abandoning the professional market will undoubtedly bolster profit now, in the long term... well, those chickens are going to come home to roost.
It might be because, historically, claiming that Apple are doomed in the future because they're not servicing $SOMESMALLGROUP has been a losing strategy.
And that's a point of view with some merit. However, Apple's strategy at the moment (and it's obviously been going this way for a long time) appears to be somewhat predicated on being a fashion brand. And that's OK, but fashions change. They've been doing a pretty good job so far, but let's say they become unfashionable, whilst already having alienated a market sector that values a laptop as a tool first and an accessory second.
Owning Apple has become a social signal. The problem is these things lose their power when a large number of people are sending the same signal.
Don't get me wrong: I'm not above buying expensive fashion items, but it doesn't change the fact that to me a laptop is primarily a tool. The reason I switched to Apple 6 years ago is because it was a better tool for what I wanted to work with (audio), but I got suckered in by the power, stability and battery life, and found I could run Windows as much as I needed it using Parallels... so most things have moved to OSX.
The thing is, 6 years is a long time and that situation is changing. Apple are no longer releasing laptops that have scaled in capability to meet the demands of my workload because of their increasing focus on, well, being a fashion item I suppose.
So, anyway, in the next year or two I'll probably be looking for a high-specced Linux laptop to replace my Macbook Pro, which doesn't sound like a heck of a lot of fun to me.
> However, Apple's strategy at the moment [...] appears to be somewhat predicated on being a fashion brand.
See, I think if that's where you're starting from, you've already lost the point. Their strategy appears to me to be "stop making gadgets for technoweenies and make them for the other hundreds of millions of people". If 99% of your potential customer base are never going to plug a HDMI cable into a machine, why bother with the machining and cost of having a HDMI port?
> Apple are no longer releasing laptops that have scaled in capability to meet the demands of my workload
Because there's orders of magnitude more money in servicing every one else's needs, unfortunately (or fortunately, I suppose, if you have shares in Apple.)
As they note at the end, this only applies to OS X with QuickTime, but not to Windows 10 on the same hardware.
So while it may be an obvious optimization to some people it doesn't seem to be implemented everywhere it could be. Or the DWM is more efficient than OS X's window composition stuff and there's no difference between windowed and fullscreen modes. Either option probably will annoy some fanboys one way or the other.
One thing that they may have done is to simply change the refresh rate of the display to match the played video's (akin to GSync/Freesync) in fullscreen mode. You can't do so in windowed mode since it will affect other applications, but when a fullscreen video is shown you can safely do so and it may save a bit of power.
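As a rough toy estimate (hypothetical numbers; assumes one display wakeup per refresh): for a 24 fps film on a 60 Hz panel, matching the refresh rate to the content cuts scanout/composition wakeups by more than half:

```python
def wakeups_saved(video_fps, panel_hz):
    """Fraction of per-second scanout/composition wakeups avoided by
    dropping the panel refresh to the video's native rate (toy model;
    assumes one wakeup per refresh and panel_hz >= video_fps)."""
    return 1 - video_fps / panel_hz

# 24 fps film on a 60 Hz panel: ~60% fewer wakeups at 24 Hz.
saving = wakeups_saved(24, 60)
```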
> As they note at the end, this only applies to OS X with QuickTime, but not to Windows 10 on the same hardware.
> > Using the Windows Media Player in Bootcamp, power consumption was at a much higher 28.2 W, and we found absolutely no difference between regular and full screen playback.
Windows was less efficient overall, just with no difference in full-screen.
They should have tested with the 'movies and tv' app that is now the default video player in windows 10. From benchmarks I recall seeing, it's by far the most battery efficient video player on the windows platform.
Interesting, and I'm curious why it's like that. The only thing being tested is 4K video, though. More tests are needed beyond 4K.
Case in point: I don't own a mobile Windows device so it doesn't matter to me. On my MBP (2880 x 1800) 4k wouldn't apply either. I'd run either XviD or DVD.
Well, yes, it really is surprising that you can go from 8.5 hours of battery life to almost 11.5 hours just by going from windowed video to full screen.
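Back-of-the-envelope from those numbers: with a fixed battery, average power draw scales inversely with runtime, so fullscreen playback draws roughly a quarter less power (the 76 Wh pack capacity used below is the figure quoted elsewhere in the thread, purely for illustration):

```python
windowed_h, fullscreen_h = 8.5, 11.5   # runtimes from the article

# With a fixed battery, average power scales as 1/runtime.
power_ratio = windowed_h / fullscreen_h   # fullscreen vs. windowed draw
reduction = 1 - power_ratio               # ~26% less power in fullscreen

# For a 76 Wh pack (capacity quoted elsewhere in the thread), the
# implied average draws are:
windowed_w = 76.0 / windowed_h            # ~8.9 W
fullscreen_w = 76.0 / fullscreen_h        # ~6.6 W
```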
Apparently only you and Apple know the secret to this because similar battery improvements weren't seen when trying this trick in Windows on the same machine. Microsoft should hire you.
Microsoft crippled their DWM and cut out (as in, removed the code for) all the Aero effects (blur). Compare how Windows 7 looks with how Windows 8 and 10 look, and you will see. It was done, partly, for battery life reasons.
So… it seems, in addition to Apple and me, Microsoft has also figured out the secret to having a battery life–conscious window manager.
"Crippled" is a weird word to choose here. They made a design decision to prioritize battery life over fancy graphics. I don't particularly care about the fancy graphics, so that's a clear win for me. Is there something more I'm missing to justify the claim of "crippled" software?
Giving a choice is always best. A lot of people wanted this effect, including enterprising developers who ported portions of Win7’s DWM.EXE & DLLs to make it work.
Actually, on a technical level, at anything but the native scale (in display settings), Apple's approach is more computationally heavy, as they render a larger texture and then rescale it to the native pixel size. On Windows, in a perfect world, a non-multiple scale should still render at native resolution.
Maybe it is related to "unredirect fullscreen" optimizations many Linux desktops employ. The people saying they don't get similar improvements in Windows by going fullscreen on the same hardware should run the test again with all the Windows 10 graphical improvements disabled.
On Linux, if you use a desktop environment that has many GPU-heavy effects, you will see a worse drop off.
On Windows 10, it doesn’t compare, as the most heavy effect DWM has to deal with is alpha blending (status bar transparency). Microsoft dropped all of Aero to save battery life. We’ll see how it fares once the new design elements (with blurred backgrounds) start appearing in future betas.
The tips above (reduce motion and transparency) also work great on iOS devices. Disabling AppNap is also important on battery.
On iOS, the system does a lot when entering battery saving mode.
>All the window composition with frame-buffer effects takes a toll.
Which would be relevant if they were moving windows around, instead of just watching a movie in full screen mode vs typical window mode. So not much (if any) composition and no frame-buffer effects in place. At worst, the tiny rect with the clock on the menubar was changing.
This is a feature that has been called out by Apple in the past with regard to battery life. Even iPhones get better battery life when just doing video as opposed to other things. In part, it's because of specialized hardware for decoding. Even the headline here is misleading. It's not full screen mode. It's explicitly using features Apple has made available for some time now.
That this is surprising is really interesting. I honestly thought this was a known thing.
Yes, I know (and it's not just video fullscreen, it's video in QuickTime). And as I said, I thought this was known. Apple has called this out. I mean, did people really think that merely playing a video extended battery life? Or did they just forget about those features that Apple talked about.
This might sound like a plug, but I found it to be very useful/related: when searching for a utility to control App Nap (suggested here) and actually have more control over that, I found the App Tamer utility. It looks like I can now throttle the CPU usage of background apps like Chrome, Slack, etc.
I do a ton of road warrior stuff on my old MacBook, and I've been struggling with a battery that craps out in under 2 hours due to all the background junk in the Mac OS. This looks to be super useful to me, hopefully others will find the same.
I read a fair number of the comments. Most of the guesses/explanations on why there's a difference make sense. But over 2 hours? That's a significant dip in power consumption that doesn't seem to be explained by most of the comments. Of course, I could be wrong. I'm not an internals junkie. (Sorry?)
Should be noted that this testing appears to have been done playing video in full screen mode. macOS has a full screen mode that any app can take advantage of, it would be good if someone could test that as well.
Ok, right, I know. I’ve made films myself - shown them at festivals around the world inc Cannes - and I’m aware. It’s true; you’re correct on the definition.
The reason people are reacting negatively is not because of the truth value of your point: it’s because your point was pedantic and irrelevant.
Is that time the length of a feature film? Yes. Many of them. Does that mean all feature films have to be that long? No. Are most feature films the general public encounters ~90-120 mins? Yes.
So you’re not adding to the discussion. You’re being that “well, actually...” guy.
I’m not trying to rag on you to be mean, I’m trying to help you understand why it came across poorly.
I thought it was an interesting and relevant comment. I guess people disagree. However, all the people replying telling me such just makes this place feel hostile.
> I’m not trying to rag on you to be mean
Then downvote and move on, don't reply "Dude" :)
I'd be curious to see your work if you have an IMDb link though. If you're going to reply with anything other than a link to your work, please don't.
The reason why you are being downvoted is for focusing on the one thing that nobody could give a crap about in the article. We are talking about battery life, not the definition of what a "feature film" is or isn't.