
I’ve heard that macOS fractional scaling really is rendering at one size and then upscaling or downscaling it to the target size. I’m not certain this is true because I haven’t confirmed it myself, and it seems such an obviously stupid idea (certainly easier, but no one else does it that way because the results are so much worse), but I’ve heard people say this at least three times (twice on the internet, once in real life) as the explanation for text not being crisp at fractional scaling. I dunno.


It is always rendered at an integer scale and always downscaled to the target size (upscaling would result in blurriness; the only upscaled apps are those that support @1X only).

Just take a screenshot of your desktop and check its resolution, then compare it to the physical display resolution. Apple uses an output scaler on the final, composited image.
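
If you want to check this yourself without eyeballing the image, here is a rough sketch of that comparison in Python (assuming the stock macOS command-line tools screencapture, sips and system_profiler; the exact wording of their output can vary between macOS versions):

    import subprocess

    # Capture the main display silently to a temporary file.
    subprocess.run(["screencapture", "-x", "/tmp/shot.png"], check=True)

    # Ask sips for the pixel dimensions of the screenshot (the framebuffer size).
    shot = subprocess.run(
        ["sips", "-g", "pixelWidth", "-g", "pixelHeight", "/tmp/shot.png"],
        capture_output=True, text=True, check=True,
    ).stdout
    print("Screenshot (framebuffer) size:")
    print(shot)

    # Compare against the physical panel resolution reported by system_profiler.
    displays = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout
    print("Panel resolution (look for the 'Resolution:' lines):")
    print("\n".join(line for line in displays.splitlines() if "Resolution" in line))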

It's not stupid; it is a solution you can implement without any support on the application side, and it is relatively simple. Going the Android way means that all apps have to support arbitrary scales, which means they have to ship assets for that.


Both upscaling and downscaling result in blurriness, though upscaling will generally yield slightly worse results. But downscaling is still going to yield a result drastically inferior to rendering at the correct size. It totally butchers pixel-perfect lines, for example. It’s the sort of hack that would be awful on low-resolution displays, and only becomes even vaguely tolerable on high-resolution displays because the result is still somewhat better than a low-resolution display for a lot of what people use their computers for, even if for other uses it’s legitimately unusable.
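
To make the pixel-perfect-line point concrete, here’s a quick sketch (plain Python, with box-filter downscaling as a stand-in for whatever filter the scaler actually uses, and a 9:8 ratio corresponding to a 2880-pixel-wide framebuffer squeezed onto a 2560-pixel-wide panel):

    # A 2880-pixel scanline: white (1.0) everywhere, one black (0.0) column.
    src_w, dst_w = 2880, 2560          # 9:8 downscale
    line_x = 1000
    scanline = [1.0] * src_w
    scanline[line_x] = 0.0

    def box_downscale(row, dst_w):
        """Area-average each destination pixel over the source pixels it covers."""
        ratio = len(row) / dst_w       # 1.125 source pixels per destination pixel
        out = []
        for d in range(dst_w):
            lo, hi = d * ratio, (d + 1) * ratio
            acc, x = 0.0, lo
            while x < hi:
                nxt = min(hi, int(x) + 1)
                acc += row[int(x)] * (nxt - x)
                x = nxt
            out.append(acc / ratio)
        return out

    dst = box_downscale(scanline, dst_w)
    # Instead of one crisp black pixel, the line lands on two partially grey
    # pixels, and how grey they are depends on where it falls on the 9:8 grid.
    centre = round(line_x * dst_w / src_w)
    for d in range(centre - 2, centre + 3):
        print(d, round(dst[d], 3))

Shift line_x by one and the pair of grey values changes, which is exactly why hairlines and pixel grids look uneven under this scheme.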

If this really is true, I remain utterly baffled, and I maintain my position that it is an obviously stupid idea. Doing it that way just makes no sense to me. The visual result is way worse, it’s more resource-demanding and thus slows things down a little, and it doesn’t really simplify anything for app developers anyway: the only difference is that you have an integer scaling factor rather than a float one, but all code still has to perform scaling mappings, and using floats would change roughly nothing (though the change may need to propagate through a few levels of your APIs, and GUI libraries will have to decide how to handle subpixel alignment). Windows and Android have both done it properly, so that supporting fractional scaling is no burden whatsoever for developers. You talk of having to ship assets for arbitrary scales, but that’s not a reasonable argument: GUI libraries should always choose the most suitable version of a resource, and scale it to the desired size themselves if it doesn’t match exactly.
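
To illustrate how little difference the float makes, here’s a minimal sketch of the two things a toolkit has to do anyway (hypothetical names, not any real toolkit’s API): snap logical coordinates to the device pixel grid, and pick the closest shipped asset density and resample it itself.

    from dataclasses import dataclass

    @dataclass
    class LogicalRect:
        x: float
        y: float
        w: float
        h: float

    def to_device_pixels(r: LogicalRect, scale: float):
        """Snap a logical rect to the device pixel grid.

        Identical code whether scale is 2.0 or 1.5; the only policy decision
        is how to round (here: round the edges, so adjacent rects stay
        adjacent with no gaps or overlaps)."""
        left, top = round(r.x * scale), round(r.y * scale)
        right, bottom = round((r.x + r.w) * scale), round((r.y + r.h) * scale)
        return left, top, right - left, bottom - top

    def pick_asset(available_densities, scale: float) -> float:
        """Pick the smallest shipped density >= the requested scale (falling
        back to the largest available), and let the toolkit resample it."""
        at_least = [d for d in available_densities if d >= scale]
        return min(at_least) if at_least else max(available_densities)

    # A 10x10 logical icon at 1.5x lands on a 15x15 device-pixel rect, drawn
    # from the @2x asset, downscaled once by the toolkit to exactly that size.
    print(to_device_pixels(LogicalRect(20, 20, 10, 10), 1.5))  # (30, 30, 15, 15)
    print(pick_asset([1.0, 2.0, 3.0], 1.5))                    # 2.0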

The result of taking the proper approach is that users of fractional scaling may get icons being rendered poorly, but images, text, vector graphics, &c. will be rendered precisely and crisply. Meanwhile, this other behaviour people are saying Apple is doing just guarantees that everything is rendered poorly. Surely they’re not actually doing this? Is it perhaps a case of them having erred in making Retina support integer scaling only, then later making a better version that supports fractional scaling that each app can opt into, but not yet insisting that everyone fix their stuff? (And remember, Apple’s in an excellent position to do such insisting; they do it regularly.) But as you say, screenshots come out at the integer-scaled framebuffer resolution, which would suggest that yeah, they really are doing this mangling system-wide, and there’s no per-app recourse. Thanks for that explanation.

I just find it hard to believe that Apple would truly butcher things this badly. Even if they’ve been known to do weird things a bit like this before, like killing off their text subpixel rendering with no stated justification, to the clear detriment of low-DPI users (and it may still be worthwhile even for high-DPI users).

I can’t check any of this because I don’t use a Mac. There may even not be a macOS device within a kilometre or two of me.


I understand your position; however, practice has shown that this approach is good enough quality-wise. Most users don't notice. In some respects it is better than the approach you suggest, because it operates on the entire framebuffer at once: it has fewer problems with pixel-perfect lines, subpixel mouse cursor positioning and so on than a purely software solution, which struggles with those more.

Also, it is not more resource-demanding; the only cost is the bigger framebuffer. The scaling itself is free: it is done by the output encoder (the output hardware that does the encoding for eDP/DP/HDMI; it doesn't use the GPU at all for that [1]). Apple has one more trick: it doesn't offer the user zoom percentages the way Windows or GNOME on Linux do. You cannot do 125% on Apple hardware (that's a bad corner case: you would have to display 8 framebuffer pixels using 5 physical pixels AND pay the price of the @2X framebuffer). The default mentioned above (1440x900@2X) means displaying 9 framebuffer pixels using 8 physical pixels, which doesn't compound the error at all.
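
To spell out where those ratios come from, here is a small sketch (the 2560-pixel-wide panel is my assumption; it's the panel width implied by the 9:8 figure, since "looks like 1440x900" means a 2880-pixel-wide @2X framebuffer):

    from fractions import Fraction

    PANEL_W = 2560   # assumed physical panel width

    def scaler_ratio(looks_like_w, panel_w=PANEL_W):
        """Framebuffer-to-panel ratio when rendering at @2X and letting the
        output scaler squeeze the composited image onto the physical panel."""
        framebuffer_w = looks_like_w * 2            # always an integer @2X render
        return Fraction(framebuffer_w, panel_w)     # > 1 means downscaling

    # "Looks like 1440x900": 2880 framebuffer pixels onto 2560 physical pixels.
    print(scaler_ratio(1440))   # 9/8 -> 9 framebuffer pixels per 8 physical pixels

    # A Windows-style 125% zoom would mean "looks like 2048x1280" on this panel:
    # 4096 framebuffer pixels onto 2560 physical pixels, the bad 8:5 corner case.
    print(scaler_ratio(2048))   # 8/5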

We can argue about whether Windows did it correctly and the Mac didn't, but the fact on the ground is that all Mac apps run correctly on fractionally scaled displays, while Windows apps are a mixed bag. Even the apps that do support HiDPI on Windows have weird bugs (I'm not going to name and shame).

Which brings us to another one of your points: that apps can scale their assets. Sure, they can. And as we can see, every app will do it incorrectly in its own unique way. So when every app has to do it anyway, why shouldn't the system library do it for them? Any bugs get fixed in one place, and you might find a way to hardware-accelerate it that the individual apps couldn't (see above about the scaler in the output encoder).

As I wrote, this solution is good enough quality-wise, is simple to implement, and brings results quickly. In fact, it is good enough that Apple uses it on iPhones too: on some models the physical and logical resolutions do not match, and rendering happens at an integer multiple of the logical resolution before being scaled to the panel.

[1] For Intel hardware, you can find more info in the Programmer's Reference Manual, Volume 12: Display Engine.



