The problem with all these font rendering discussions is that we are missing some ultra-high-quality (offline) reference rendering system to compare against. Not only does that make these discussions pretty unproductive (just subjective back and forth), but in practice it also drives font designers to build fonts that look good on their preferred platform(s), rather than something built to look good on spec. That then creates a feedback loop where other platforms need to start emulating the major popular platforms, flaws and all, instead of aiming for the highest quality; for example, this stem-darkening is almost certainly just inspired by macOS and doesn't really have justification outside that.
On a related note, I thought that by 2024 we would have displays with 250+ dpi resolution, but to my disappointment I'm still waiting for this to be a reality on anything but small laptop screens. A lot of the rendering tricks that corrupt the appearance of fonts have to do with how few pixels are available for rasterisation. We should have been getting print-quality text by now.
Right, I have seen the Dell screen before. But it is currently an extremely niche product which means it is wildly outside my price range. And the VG3281 is still not available in my country even though it was released in 2023.
Click the picture to get a full resolution version. On my phone it's quite obvious. But to my eyes it just looks like someone made it bolder, not less blurry.
Personally, I never had much of an issue with fonts on any OS, except when I connected my M1 MacBook to a 1080p monitor; then it felt like the fonts had no anti-aliasing at all.
That’s because MacOS got rid of subpixel antialiasing sometime after launching Retina screens, which leaves non-hiDPI screens with quite awful font rendering.
I sometimes switch to a bitmap font like Fixedsys Excelsior or GNU Unifont when using MacOS with a low-resolution monitor to compensate (with antialiasing off so the bitmap font looks crisp).
Also, JetBrains Mono somehow looks good on lowres screens even though it’s not a bitmap font, it seems to not blur as much as other fonts when it gets antialiased.
Subpixel antialiasing is going to be problematic anyway on newer displays that don't necessarily feature a plain RGBRGB (or similar) subpixel arrangement. For example, many OLED screens use RGBG/BGRG or even more complex "PenTile" subpixels.
> Subpixel antialiasing is going to be problematic anyway on newer displays that don't necessarily feature a plain RGBRGB (or similar) subpixel arrangement.
This will then mean making the subpixel anti-aliasing algorithm aware of different subpixel layouts. And this ought to be done anyway, because most anti-aliasing is usually at least somewhat hardware-aware. In my opinion, regardless of how subpixels are laid out, more resolution is always better.
But this is an issue that applies to VA panels as well (cheaper than IPS, worse viewing angles, but better contrast ratio), and I have a 27" 4k VA screen that works just fine with subpixel rendering turned on in Linux — text is so much clearer with it on than off. Attaching a MacBook to a 27" or 32" 4k IPS screen makes me hate MacOS for killing subpixel rendering off.
As for "retina" resolutions, I've tried 24" at 4K as soon as it came out (with that Dell monitor that required 2 DP 1.1 connections for 60Hz IIRC), and turning subpixel rendering off made text and lines jagged — that was ~190 ppi at a normal viewing distance with vision corrected to better than 20/20 (which is what I usually have — can't really work without glasses anyway, and worse correction leaves me with headaches). For the record, 5k at 27" and 6k at 32" is roughly ~216 ppi, so not much better than ~190 ppi: subpixel rendering probably achieves 2x the increase in text clarity for those not sensitive to colour fringing (I am not).
So, subpixel rendering is really not an issue on any displays, but Apple will happily tell you what the limit of your vision is and upsell you on their monitors.
Fontconfig on Linux has an option to set the subpixel layout, though currently only rgb, bgr, vrgb and vbgr are supported. Maybe this could be extended for OLED monitors.
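For anyone looking for it, the relevant fontconfig property is "rgba"; a minimal per-user snippet would look roughly like this (the filename is hypothetical, and you would swap rgb for bgr/vrgb/vbgr to match your panel, or none to disable it):

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <!-- hypothetical path: ~/.config/fontconfig/conf.d/10-subpixel.conf -->
    <fontconfig>
      <match target="font">
        <!-- one of: rgb, bgr, vrgb, vbgr, none -->
        <edit name="rgba" mode="assign"><const>rgb</const></edit>
      </match>
    </fontconfig>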
The “sometime” happened in macOS Big Sur. Prior to that, in Mojave and Catalina, you could turn it back on by twiddling a hidden preference with the word “legacy” in it. It was somehow worse than what you got in High Sierra and prior anyway.
Subpixel antialiasing is kind of overrated anyway. On screens that are low enough DPI to benefit from it, it can cause color fringing (especially for people with astigmatism) that is worse than the blur from grayscale antialiasing.
I disagree: I am not susceptible to colour fringing, and I can tell if subpixel rendering is on or off on 24" 4k screens (~190 ppi) at a regular or even further viewing distance (~70cm/27") — I specifically got that display hoping for subpixel rendering to be turned off.
Haven't tried Apple's big "retina" screens, but considering they are ~215 ppi, I'm pretty confident a ~10% increase in PPI wouldn't make the difference that subpixel rendering does. Laptop screens have higher resolution, but I haven't really paid attention to whether the M1 Air 13" or the 4K 14" X1 Carbon works for me without subpixel rendering (I prefer to be docked).
Before anyone jumps on "you've got incredible vision": I wear either glasses or contacts, and with that my vision corrects to better than 20/20 — slightly lower correction induces headaches for me. Without glasses, I'd probably be happy with 640x480 on 32" so they are kind of a must. :)
On medium-DPI screens, I find that subpixel antialiasing makes fonts significantly less blurry than grayscale antialiasing without causing obvious color fringing. On actual low-DPI screens, bitmap fonts are IMO the only really usable option. (YMMV, but I have mild astigmatism and use glasses.)
They haven't sold anything without hiDPI for quite some time now (a decade?). Making their software look good on obsolete screens is understandably not a priority for them. And if you are happy to plug in something that old, you are kind of signaling that you don't really care about what things look like anyway. So why bother to make that look good?
> They haven't sold anything without hiDPI for quite some time now (a decade?). Making their software look good on obsolete screens is understandably not a priority for them. And if you are happy to plug in something that old, you are kind of signaling that you don't really care about what things look like anyway.
My apologies for buying 1080p monitors that had no issues with either my Linux or my Windows computers, I guess. I can understand that they might not care about what I care about (supporting the hardware I have, rather than me going out of my way to buy a new monitor just because a new computer decided not to work well with it), and I'd argue that maybe that's even fine because it's their device and ecosystem, but jeez, that tone is super uncalled for.
As an aside, I use the M1 MacBook at a scaled resolution of 1440x900 because anything finer is hard for me to see. That's a visible PPI of around 130 given the 13.3 inch screen. A 1080p monitor with a 21.5 inch diagonal would have a physical PPI of around 100, so that's around 80% of the pixel density. That's not to say that the panel on the MacBook is not much nicer, but rather that with software anti-aliasing the external monitor could definitely be okay. Somehow I don't want to buy a new monitor just for the weekends when I visit the countryside.
I have a perfectly good normie-DPI 25x16 display which is extra crisp on Windows. On macOS I had to install BetterDisplay just to make it not miserably bad; it's still just plain bad now. As far as I can tell Apple removed the feature because of greed and laziness.
There are plenty of non-hiDPI screens from other vendors on the market, especially “large” screens that are “medium” in price. In an office you’re not always free to order a screen from any vendor you want (due to their framework agreements), unless of course you’re paying for that hardware privately.
I care about how things look, and have spent more time than I want to admit configuring MacOS apps to look good on the screens available to me. I just don’t care enough to buy an expensive office screen with my own cash if my employer can’t provide one.
Apple specifically wants you not to use non-Apple displays, and does so by artificially worsening the experience for the user while strengthening the illusion that Apple's hardware looks better, even though the only reason it does is that Apple themselves made sure other displays look unnecessarily bad.
It's hilarious there are people that actually think this is totally okay and not just plain anti-competitive, with just enough plausible deniability to get away with it.
In a few more words: not at all, not even slightly.
To explain briefly:
> Apple specifically wants you not to use non-Apple displays
No. Apple does not make, sell, or offer non-hiDPI displays and has not done so for over a decade. Apple mainly sells phones and laptops with built-in hiDPI screens. Desktop computers that use external screens are a small part of its range, and it sells its own very high-quality screens for those.
Because subpixel antialiasing is pointless on a hiDPI screen, and it only offers hiDPI screens, it removed subpixel antialiasing from its OSes.
However, the kit does still support old screens and you are free to use them. The subpixel antialiasing feature is gone, but to my (not very strong) eyesight it doesn't matter and stuff looks fine.
> artificially worsening the experience for the user
No. This is paranoia.
> It's hilarious there are people that actually think this is totally okay
People think it's okay because your interpretation is paranoid.
> not just plain anti-competitive
How is REMOVING features anti-competitive? In what universe does taking something out of your products hurt your competition? That is absurd.
> How is REMOVING features anti-competitive? In what universe does taking something out of your products hurt your competition? That is absurd.
You're unironically arguing that EEE isn't anti-competitive?
The whole strategy is about removing support/features at the right time, when users cannot realistically leave, putting the nail in the competitor's coffin.
Simply put:
1. The initial product supports both equally.
2. People start using your product.
3. The competitor's product works less well.
4. People use the better-working product, despite the fact that the downgrade in quality is artificial.
Or is it only anti-competitive if Microsoft does it, Apple being the last bastion of healthy competition on the market, with groundbreaking examples like the AppStore and the green/blue bubbles in their chat app?
MacOS has always had the best font rendering on HiDPI screens and also the worst on low-DPI.
Windows is historically very good on Low-DPI but they also managed to be great on HiDPI.
Linux, well, it depends on so many things … you can achieve good results on both, but you'd better be okay with integer scaling and not have multiple displays with different DPIs.
It's funny to read that thread (from 6 years ago, wow time flies) and see complaints that a lot of people have low-dpi external displays. But I think some of the rebuttal comments in that article rang true even then, and certainly do now: if you are spending money on an Apple laptop or workstation, Apple is going to expect you to spend money on hi-dpi external monitors.
I'll reiterate a comment I made elsewhere in this thread. With my vision corrected, 4k at 24" (~190ppi) needs subpixel rendering for crisp fonts. I would expect 5k at 27" or 6k at 32" (both around ~215ppi) would be the same, so my only option for comfortable external use with a Mac is really 8k at 32". I know that I am an outlier as I get my vision corrected with glasses/contacts to better than 20/20 (outside that 5%-95% group I guess), but I was perfectly well served by subpixel rendering (and continue to be).
Luckily, Mac is only for work, and it is passable with 32" at 4k, but I can use Linux everywhere else for much nicer fonts. Unluckily, work is 8+h :)
Yeah, the issue is that 6 years ago your only option for a highdpi monitor with the correct scale (e.g. for 27'' it needs to be 5k, not 4k) would be the iMac or the XDR display that costs over $5k...
Now that Apple sells their own (very decent) monitor at somewhat more affordable price it makes sense to use it as an external display, I agree.
> MacOS has always had the best font rendering on HiDPI screens and also the worst on low-DPI.
No it hasn't.
Maybe to you "always" means "since 2014" but if so that means you are very young and you should not generalise from that.
I've been using Macs since 1988 and Mac OS X since 2001 and it used to be great on SD screens. I used to use Safari on Windows XP because its font rendering was so much better than the built-in Windows Truetype renderer.
You can have displays with different DPI on Linux and achieve good font rendering. But it requires the latest versions of your favourite DE (like GNOME 45+ and KDE 6.x) and you'd need to give up X11 (which does not support mixed DPI very well).
I'm literally sitting at a station with 3 monitors, 2 high-DPI and 1 low-DPI, wherein UI elements of a window moved from one monitor to the other are identically sized, and this was the case when all 3 monitors were different sizes and DPIs as well.
In this case the UI is scaled up by an integer factor on all screens so as to look nice on the highest-DPI screen. It is then scaled down from the higher resolution by a decimal factor that could be, but needn't be, an integer. If the factor is proportional to the difference in DPI, the result is UI elements that are precisely the same size across monitors of different sizes and DPIs.
All monitors share a single scaling factor and DPI. Apps thus need to support high DPI but needn't do anything smart to support scaling, because it happens outside of the app's remit.
This can again be achieved with xrandr --scale, OR in the nvidia-settings GUI by setting ViewPortIn to a higher resolution than ViewPortOut. No, the result isn't blurry.
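Roughly, the xrandr version looks like this (a sketch only: the output names DP-0/HDMI-0 are hypothetical, check xrandr --query for yours, and the right factors depend on your panels):

    # 4K panel at native resolution on the left; the 1080p panel on the right
    # gets a 3840x2160 virtual area that X scales down to 1920x1080, so UI
    # rendered at 2x (e.g. GDK_SCALE=2) comes out at normal physical size there.
    # The factor needn't be an integer; 1.5x1.5 and the like also work.
    xrandr --fb 7680x2160 \
           --output DP-0   --mode 3840x2160 --pos 0x0 \
           --output HDMI-0 --mode 1920x1080 --pos 3840x0 --scale 2x2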
First things first: the X11 protocol is not aware of DPI. Period. And that has implications. You can apply all sorts of hackish solutions on top of it, but (1) the solutions will most likely be out-of-band (not pure X11) and involve DBus/envvars/Xft.dpi, and (2) per-output.
Out-of-band solutions are effectively bandaids and they unnecessarily increase development difficulty of GUI programs on Linux, since developers now have to be aware of the various side channels (fragmentation!) that communicate the DPI scale.
This is why SDL2 for the longest time did not support HiDPI on X11, but does on Wayland, MacOS, and Windows. A COSMIC dev just recently made a complaint about XSettings too! [0] You can't just ignore those problems; I'm sure you have heard the "Linux is hard to develop for, blah blah fragmentation blah blah" refrain.
Another thing. Per-output HiDPI is fine when all your programs support high DPI, but it's unworkable if you want to mix LoDPI and HiDPI applications in a single screen, i.e. if an application has better user experience if it is upscaled and blurry (!), you are SOL unless you want to apply scaling to your entire desktop.
You also lose the opportunity to implement some neat features like temporarily scaling up a window if it is being magnified or screenshot. (The idea's been floating about in the KDE community)
Finally, we can argue for days, but the HiDPI page on the Arch wiki already says a lot when a good 90% of the article is about achieving good DPI scaling on X11 [1]. Even the Wayland sections have an XWayland subsection in them...
Scaling has been a built-in feature of X for over 20 years, intended for this use case: no bandaids of any sort, and no awareness by the applications of differing DPI.
From the perspective of apps there is only one DPI, with all scaling down handled by X at a layer below the app's level. There really aren't any LoDPI apps on a modern X system; only Wayland has issues scaling X apps correctly. GTK apps not updated this decade can handle scaling up by integer factors, and because X scales the lower-DPI monitor down to the appropriate size, the app doesn't need to handle anything else.
It's very odd for folks to argue that my mixed-DPI system using only basic boring old xorg.conf somehow doesn't exist. I mean, would you like to come to my apartment and open a few different apps?
It's in a way ironic that between pure X, pure Wayland, and Wayland + XWayland, only the last can't handle mixed DPI.
> There really aren't any LoDPI apps on a modern X system; only Wayland has issues scaling X apps correctly.
The problem starts at XWayland, not Wayland. And XWayland is just your typical X server with all its X intricacies.
Only if we can somehow communicate our intended scale to the X clients through XWayland... oh, we can't!
> It's very odd for folks to argue that my mixed-DPI system using only basic boring old xorg.conf somehow doesn't exist.
You keep on confusing per-window and per-output DPI. I can only assume you are feigning ignorance at this point. Good luck with that attitude.
I still have a couple of GTK2 apps installed that don't support HiDPI; those are best used upscaled, but... they can't be. Those apps are unable to communicate to the server that they are rendered at 1x (rather, the server doesn't have the notion of per-window scales at all).
The desirable quantity is the result, not the mechanics: specifically, that one can simply open windows on any display and move them between monitors without worrying about scaling. Ideally an element ought to measure the same number of mm on each display, should you pull out a ruler.
This is easily achievable on X11, and it's easily done with Wayland.
You mentioned not being able to do scaling per window. Well, only with XWayland do you ever need to do so. You need it to be smarter than it is, and so scaling on XWayland sucks in a way that isn't a factor in X or Wayland.
Since Wayland is dependent on XWayland for unsupported apps, and mixed DPI + XWayland sucks, effectively only Wayland sucks at mixed DPI.
> The desirable quantity is the result, not the mechanics: specifically, that one can simply open windows on any display and move them between monitors without worrying about scaling.
Then next time don't bring up compatibility when you talk about the benefits of staying on X11. :) You are breaking the legacy programs anyway by forcing LoDPI programs to remain un-upscaled and therefore unusable on HiDPI displays.
> Well, only with XWayland do you ever need to do so.
I am sorry, but the problem I mentioned exists as well on a pure X11 setup.
And the "solution" for this problem, on this setup, is to scale up the entire desktop (blurriness everywhere) which is decidedly worse than XWayland where only a single window is affected, assuming your other HiDPI-aware programs run natively on Wayland (which, from my experience, is pretty much true -- after all, HiDPI programs most likely are still actively developed).
The only other viable choice on a pure-X11 setup is a non-solution: just live with the tiny UI whenever you are using HiDPI-unaware programs.
Either you bring up documentation showing people that X11 has something akin to Wayland's buffer_scale event, or consider this discussion finished. It is tiring to talk to a brick wall.
Please try to read this honestly; I promise I'm not screwing with you or misunderstanding.
Outside of this conversation, mixed DPI has for about 20 years meant having more than one output of substantially differing DPI, most commonly 4K + 1080p.
X11
Under X11 the idea of low-DPI applications doesn't exist, because broadly speaking every toolkit under the sun started supporting integer scaling to deal with this common case about 15 years ago. Individual apps also cannot fail to support viewport scaling, because it happens at a layer out of scope of the application.
To test, I did a search for GTK2 and loaded up fsv, a GTK2 port of a file system visualizer copyrighted 1999. It and the Qt6 apps released last month that I have all appear correctly scaled across my high- and low-DPI monitors. The bugaboo of "low-DPI apps" just flat out doesn't exist. Read that again if it doesn't make sense.
Wayland
Now, leaving the Kingdom of X, the specter of low-DPI apps appears. Suddenly, instead of a unified, identically scaled virtual space, we have little isolated X kingdoms sprinkled inside Wayland space like pimples on a dog's ass. Because we are no longer scaling them down outside the app's purview, the app must now explicitly support scaling, and behold, the only low-DPI apps in the Linux ecosystem hath appeared!
ONLY WAYLAND HAS LOW DPI APPS
Worse, its users (mostly GNOME users) smugly assume that X users suffer from the problems only they have, but worse.
Remember: GTK2 app from 1999 on the 4K monitor, Qt6 app from 2024 on the 1080p monitor, no manual adjustment, perfect scaling. Put them side by side on the 4K: perfect scaling. No farking about with anything, ever.
Does it have anything to do with DPI? I thought the state is basically: Windows rules, while MacOS and Linux suck at non-integer scaling ratios. For integer scaling nobody has problems AFAIK.
Am I the last person on the Earth to turn off font smoothing completely and use fonts perfectly hinted to the pixel grid? Can't get any sharper than that, and really helps with eye strain.
No, at least that's what I'm going for as well, at least for monospace fonts in editors and such. I can't stand the slightly blurry look of antialiased fonts and absolutely prefer the crisp edges of traditional bitmap fonts.
Apple have always preferred preserving letter forms over hinting, so Macs are also "blurry" on lower-DPI displays. The reason they aren't these days is that the DPI is higher.
Usually when people complain about this, the comparison is to Windows, which prefers strong hinting, i.e. snapping the font shapes to within (sub)pixel boundaries.
There also used to be patent encumbrance issues around subpixel rendering, making Linux font rendering on TFTs overall worse by default, but as of 2019 those have expired. Some distributions had already enabled those features anyway.
With any reasonable resolution screen (1080p or more), you can have both good preservation of letter forms and a non-blurry image, simply by applying a sharpening filter after antialiasing. Some image downsampling methods, such as Lanczos filtering, effectively do this for you. The tradeoff is one can detect some very fine 'ringing' artifacts around the sharpened transitions, but these don't impact readability in any way.
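If you have ImageMagick around, here is a rough sketch of the effect (the text, sizes and filenames are just examples):

    # Render some text at 2x, then downsample with a Lanczos filter; the
    # filter's negative lobes add the mild edge sharpening described above.
    convert -background white -fill black -pointsize 64 \
            label:'The quick brown fox' text-2x.png
    convert text-2x.png -filter Lanczos -resize 50% text-1x.png
    # Compare against a plain tent-filter downsample to see the difference:
    convert text-2x.png -filter Triangle -resize 50% text-soft.png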
Mac font rendering on non-retina displays is pretty awful. Mac 'cheats' somewhat with hiDPI screens + aggressive supersampling, as MacOS doesn't do fractional scaling and instead rounds up to the nearest integer multiple. At my current display scaling settings my MBP is using a 5120x2880 buffer for my 4k (3840x2160) display, as it's set to scale to a logical size of 2560x1440.
Under a fair comparison on a 1080p display with no scaling even Windows demolishes MacOS these days. Apple dropped support for subpixel AA years ago which really hurts on standard DPI rendering.
This idea of scaling graphics up and down, instead of rendering at the display's native resolution with size measurements in DPI-independent units, was so bad. I understood it to be a neat trick when the first retina displays were released. But everything afterwards is just awful.
With fractional scaling you can otherwise get misaligned objects, and I personally found it very annoying, e.g. in VSCode with 1.25x zoom, where objects would move slightly when you interacted with them, due to imperfect and inconsistent size calculations.
IMO the way Apple does this is quite brute force and unconventional, but at least the picture doesn't drift with different scale.
When I look at the first picture, the Mac is blurrier, but at the same time Windows doesn't keep the balance of the font. Look at all the horizontal vs. vertical strokes in ZLF: they are surely drawn to have the same optical thickness, but on Windows they are very wrong.
I don't think Windows is such a clear winner here. It seems like different philosophies.
Forget the font rendering, it looks like a poorly tuned scaling filter with excessive ringing is being applied in the macOS case. That halo around the letters is the result of a sharpening or bicubic+ resampling filter that's turned up too high.
The thing about subpixel rendering (which both use) is that it is specific to the display where it is shown. You need the same subpixel ordering and you can't really scale it. Your preference may be correct on your display, while mine is correct here: I strongly prefer the second image.
The first image has less hinting, causing the top of the x-height to be rendered at a half pixel. This makes it seem less intense than the rest of the letters. The second image aligns much better and has a more consistent intensity and sharp edges, but it gives an overall slightly bolder appearance and also compromises a tiny bit on the actual shapes of the letters.
HiDPI displays combined with subpixel rendering for the total win. Unfortunately, that's dead on MacOS today, and was for a time unachievable with Wayland on Linux.
I have both side by side and I think this is more of a myth. They simply render the same on HiDPI except at very small sizes, where the engines differ.
On Linux, font selection is terrible, and people have problems with Wayland/X11 rendering and other settings (often opinionated defaults from distros).
But when you are lucky :)) you can get pretty much the same HiDPI font rendering.
Freetype at least has knobs that are both configurable and understandable. On Windows, you only have Cleartype Tuner, and it can be hard to get what you want.
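For example, FreeType's driver knobs can be flipped per process through an environment variable; a sketch, assuming a reasonably recent FreeType build (property availability varies by version):

    # v40 subpixel-hinting interpreter, and no CFF stem darkening
    # (the stem-darkening mentioned at the top of the thread):
    export FREETYPE_PROPERTIES="truetype:interpreter-version=40 cff:no-stem-darkening=1"
    your-editor &   # hypothetical app; anything started from this shell picks it up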
I used to prefer font rendering on Gnome to that of MacOS and Windows, but as of Gnome 46 they changed it so that if you set a fractional scale factor, everything gets blurry (a different kind of blurry than the OP means), just like it does on MacOS. That makes Windows font rendering the desktop font rendering I like the best, as long as I stick to relatively modern apps like browsers, VSCode and the newer apps that come with Windows, like Settings. (Legacy apps on Windows are super super blurry at fractional scale factors.)
I use non-HiDPI 1920-pixel-wide displays at scale factors between 1.25 and 2.0. (Yes, I like big type and big UI elements: my eyesight is bad.)
No, Linux (Gnome/Wayland) uses a method similar to macOS where everything is rendered at a higher resolution and then scaled down. If you have a very high DPI display it looks nice, but if the DPI is not that high to begin with you will notice that the result is a bit blurry. Only Windows has truly good scaling.
I agree. The blurriness at fractional scaling factors on Mac and in very recent versions of Gnome is obvious on a non-HiDPI display (at least a 24-inch one like the ones I use).
Until a year ago, Gnome/Wayland did it the way Windows does it! I.e. the method the OS used to make the text and the other UI elements the size the user specified refrained from applying any blurriness-causing resolution-scaling algorithm as long as you avoided the XWayland compatibility layer (and the compatibility layer that lets modern Windows support legacy apps makes the legacy app blurry, too).
Chrome's "zoom" feature (activated by the keyboard shortcuts Ctrl+plus and Ctrl+minus for example) does fractional scaling (i.e., allows the user to adjust the size of the text and the other UI elements) without introducing any blurriness. Installing Chrome and experimenting with that feature is probably the easiest way for the average reader to see what we are talking about.
One parenthetical detail is that (unlike on Mac or Windows), in order to get fractional scaling at all on Gnome/Wayland, you have had to use a command line to add a particular key to a list named 'experimental-features', but that will change next month when Gnome version 47 is released, at which point the user will be able to change the scaling factor in the Settings app just like one can on Mac or Windows, without first having to configure anything or opt in to anything.
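(For concreteness, this is the usual mutter toggle; the key below is the one I believe is meant here, though it may differ between Gnome versions:)

    gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"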
I would love to know the reasoning behind the Gnome team's decision here, because even if you or I might not be able to notice the blurriness on a HiDPI display, or to say with confidence that the blurriness is there, that doesn't mean the image is as sharp as it could be: the scaling algorithm used on Mac and on Gnome versions 46 and 47 is clearly throwing some "visual information" away regardless of the resolution of the display.
Isn't "what MacOS does" just shipping extremely high DPI screens on everything + using regular anti-aliasing on fonts with no hinting or subpixel rendering?
Probably because MacOS doesn't do fractional scaling. If you're not using a scaling mode that's an integer multiple of your monitor resolution, MacOS oversamples the whole display. Kinoite is KDE IIRC, which can do fractional scaling (quite well, too) on Plasma 6. So if you're comparing 1.5x scaling on KDE to MacOS, you're actually comparing a 3840x2160 render to a 5120x2880 one.
MacOS always renders at an integer multiple of their internal "logical" resolution, and then just shrinks the framebuffer down at the end to whatever the native resolution of the target monitor is.
Fractional scaling is rendering directly to a framebuffer at the target monitor's native resolution, and scaling your UI fractionally to look right on the target screen size.
Apple's approach is more resource-intensive, but it probably makes UI layout implementation a lot simpler and more consistent, and it potentially looks better than rendering at "real" native res for a lot of content.
I think you're right, I might've been confusing it with proper support for mixed DPI desktops (i.e. one display at 1.5, and another at 1x). I think Plasma 5 could do it but I had a lot of problems until Plasma 6.
Given a 4K monitor you have 3 options:
- No scaling, produces tiny text and UI elements
- 2x scaling, giving a logically 1080p display and leading to very large text and UI elements
- 1.5x scaling, logically a 1440p display
4k at 1x produces UI that's too small, 2x scaling is too large for a 27 inch monitor. 1.5x sizes everything like a 1440p display but you still get the higher resolution rendering at 4k.
Fractional scaling _is_ rendering at the 'proper' resolution of the display. It can be challenging to work around some issues, like how to deal with window sizes that scale to a fractional native buffer (i.e. does a 501-logical-pixel-wide image become 751.5 physical pixels?). Apple decided 'no' to native fractional scaling; so does GNOME, unless that's changed recently.
Which imo was a bad choice. It works pretty well on MacOS because Apple only ships machines with a fairly constrained, known set of resolution/physical dimension combos. And they sell their own monitors for stuff like the mac mini where that isn't the case. This means they can design all their UI elements to be the "right" size without needing fractional scaling for a very large percentage of their users.
Gnome is running on all sorts of screens, with all sorts of resolution/dimension combos and on a lot of them you have to choose between "way too big" and "way too small".
I believe GNOME just hides the option for fractional values in their native settings menu, but it can be easily accessed through something like GNOME tweaks and similar.
What I generally do on my Gnome systems running 4k screens is to use 125% or 150% scaling, and then set larger default fonts for the system. Fractional scaling helps keep old apps legible even if imperfect, but updated apps which respect system font size render great (Gtk+ widgets auto-scale to fit the text).
Unfortunately, this approach ain't gonna work as well with Wayland.
Gnome has another set of hidden settings to set default system font sizes: for UI elements and the "document" font. Some non-GNOME apps used to respect that too (like Firefox and LibreOffice), but others were not scaled.
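Concretely, something along these lines (the keys live under org.gnome.desktop.interface; the font names and sizes are just examples):

    gsettings set org.gnome.desktop.interface font-name 'Cantarell 13'
    gsettings set org.gnome.desktop.interface document-font-name 'Cantarell 13'
    gsettings set org.gnome.desktop.interface monospace-font-name 'Source Code Pro 12'
    # or scale every font uniformly instead of picking sizes per font:
    gsettings set org.gnome.desktop.interface text-scaling-factor 1.25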
Still, if you mostly relied on Gtk+ apps, where Gtk+ widgets scale to fit the text they are showing, this produced pretty good results even if some spacing was a bit inconsistent (i.e. spacing was too small compared to widget and text sizes).
Unfortunately, this approach seems to work less and less well as Gnome devs are following Apple's lead.
It’s probably what macOS does but they manage to do it cleanly whatever the display scaling you choose, including arbitrary values of scaling that are unsupported by the OS (but that you can force with some software).
On Linux, good luck using anything else than integer scaling. And it’s a shame because with a 4K screen and fractional scaling, you can get both more definition AND more real estate.
In the replies people are talking about current issues regarding scaling and high dpi displays.
That has little to do with it. Apple has had vastly superior font rendering since the day OSX launched, and have been in first place ever since. There's no point in my memory of the past 20+ years this was not the case regardless of display technology.
Even though other systems implement a lot of the same techniques of sub-pixel rendering, fractional scaling, etc., the results speak for themselves. With the same monitors, the same display cables, the same fonts, and the same applications, text rendered by Apple operating systems is more crisp, clean, readable, and pleasing to the eye. Configuring settings on Linux improves the situation, but no configuration gets close to the quality Apple delivers by default. Apple is also superior to Windows in this regard, but the margin is much smaller.
This is coming from a person who maintains a long list of MacOS gripes because of all the things they do wrong. Font rendering is one of the few things they have consistently done better than everyone else.
I might be blind but I'm using 2x4K 32" displays on macOS and the fonts still look like paper to me. It was funny because every Apple commentator I had listened to made it seem like my retinas would melt out of sheer disgust if I didn't use a 5K display.
It looks night and day better than my 2x4K 27" I use with my Windows 11 work laptop, even with ClearType (That was a let down. I replaced 2x1080p 21" displays and expected that to fix the woeful fonts on Windows).
Font rendering is absolute garbage on macOS since they removed subpixel-antialiasing. Everything is a blurry mess unless you have a high-dpi display (and even then it's pointlessly worse).
Mac looks shit. Always has done. Turns out you can get used to anything. My standard is paper. The only way to get close to that is high DPI screens. Guess what Macs come with now? My Linux PC also has a high DPI display. Only difference is you have the choice to spend less and still get decent looking fonts if you don't use Mac.
Yeah, no. Mac font rendering is blurry, Windows font rendering is pixelated or okay with the right ClearType settings, FreeType with slight hinting and subpixel rendering (lcdfilter) is between these two and really fine IMO.
macOS doesn't do anything with fonts. They removed all the anti-aliasing features. The Apple solution is to use retina displays for everything. That works great with Linux too.
macOS absolutely does antialiasing, it's just grayscale antialiasing instead of sub-pixel antialiasing. This is obvious if you mess with CSS "font-smoothing: none"
I am not sure if you are talking about fonts included with the OS or about rendering.
But on high-DPI screens all OSes render fonts well (and pretty much the same). It's how the OS renders small type on low-DPI screens, where lots of guessing by the rendering engine has to happen, that differs.
As for the font choices, for sure Mac has a better selection than open-source typefaces… but you know you can take your San Francisco and install it on Linux if you want.