This guy is complaining about fringing...on 9- and 10-pixel high fonts. That works out to 1.6mm or 1.8mm high characters on a 140 dpi screen, or about 1/16 of an inch.
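For anyone who wants to check the arithmetic, here's a quick sketch (assuming a nominal 140 ppi panel; actual density varies by monitor):

```python
def glyph_height_mm(pixels: int, ppi: float) -> float:
    """Physical height of an n-pixel-tall glyph at a given pixel density."""
    inches = pixels / ppi   # pixels divided by pixels-per-inch
    return inches * 25.4    # 25.4 mm per inch

print(round(glyph_height_mm(9, 140), 2))   # ~1.63 mm
print(round(glyph_height_mm(10, 140), 2))  # ~1.81 mm
```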
He's also got Cleartype on and set to RGB stripe even though the OLED is not RGB stripe (though to be fair, Windows doesn't really make it clear what each page of the ClearType tuner does).
But yeah, if you use a _tiny_ font and sit _really_ close to the screen, you see fringing. In practice for me, it's been unnoticeable.
A friend of mine, running his phone screen at 30Hz because of battery-saving mode, said the difference between 30Hz and 60Hz is so minor that "in practice, it's been unnoticeable."
At one point 95%+ of HN comments were cheering on Atom the text editor, and later VSCode, as being fast enough, their latency unnoticeable, while Sublime users were baffled as to why. And Sublime wasn't even the fastest text editor, even before Zed came out.
Yes, 10 times out of 10 I could tell OLED font rendering from LCD. I wish I couldn't. Some people call it taste; some call it an absurd requirement.
I could go on and on. The point is most people aren't very picky, and "picky" is defined relative to the average. But there are those of us with, let's say, very high standards, who care about PPI, refresh rate, colour accuracy, keyboard key travel, trackpad responsiveness: all the tiny details I wish I could unsee and un-feel.
As the article states, RGB tandem OLED is coming out, and I can't wait to see it in person. I have been pro-LCD on laptops for so long that when I learned Apple will soon ditch LCD for OLED, I was worried. Hopefully the new subpixel layout will fix it.
Yes, this. I don't use screens below 200dpi anymore (right now I have 4K on 24", at 200% scaling), and many people say it's a waste because it's way too high. But I can still see pixels that are off. I just love sharp text, which everyone is used to on their phones. I don't understand why people don't want the same on their computer, which they probably use for many more hours per day (I sure do).
I'm sure I would notice and be annoyed by this fringing too, unless the pixels were so small I really couldn't see them. That probably needs to be slightly higher than 200, then. But I haven't seen OLED monitors with such high DPIs. The highest I've seen is 4K on 27", which wouldn't even do for me on an LCD.
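As a rough sanity check, pixel density follows directly from resolution and diagonal size (a sketch; nominal diagonals only, real panels deviate slightly):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 24), 1))  # ~183.6 -- 4K at 24"
print(round(ppi(3840, 2160, 27), 1))  # ~163.2 -- 4K at 27"
```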
I have a Dell P2415Q, from 2015. There are, like, 4 other (legacy) models of 24" 4K out there, and that's it. I've no idea why they don't manufacture them anymore.
I also have one bought used. It’s the only way to have 4k and 200% scaling on Linux without everything being too big or too small. Size and ppi are perfect but sadly other aspects are becoming really dated (bad colors and contrast, high latency, low refresh rate etc).
I also have one, and it's holding up pretty well. A month or so ago I broke out my colorimeter and it had almost 100% sRGB at around 120 cd/m2. I don't recall the delta E, but it was very low.
While I didn't measure the backlight, it does seem to not go as bright as before, judging by the levels I set in the OSD. I never went above 70% or so when the sun was shining in the room (not directly on the screen, though), so it didn't have any effect on me.
I understand there are two versions; I have the second one. But I don't think there's a difference in the panel itself; I think the change was related to HDMI support.
I can't comment on the latency; the only games I played on it were Civilization and Anno Something. Never had a problem there.
I have been using P2415Qs for over 10 years by now. Replaced some, bought second hand, had to ship to Dell at one point because of the wake issues (pointless: they never really fixed them), so I know the drill. There are actually 4 versions.
The last new one I bought, in 2018, actually cost the same as my first one in 2015, so it is one of those few computer accessories whose price held steady or even increased over its lifetime rather than decreasing.
If you cannot see the P2415Q degrading and/or being generally crap in any metric (EXCEPT DPI) when compared to even the non-IPS black Dell monitors from this decade, you are simply blind. They are early HiDPI-revival-era panels, and it shows.
Some of the newer IPS black panels are so good that it is tempting to just take the DPI hit and go 27''... albeit with care as it seems Dell has decided this last year to put some filter that further increases blurriness.
I don't pretend that I have the best vision out there, though I don't think I'm completely blind since I don't run into things. But I actually measured this display, and it's within specs. So maybe both I and my colorimeter are blind? Sure, it's not absolutely impossible, but how likely is it?
I actually have a newer "ips black" dell, an ultrasharp 3223qe and yes, it's much better.
But what I'm saying is that the old one is still good. I never claimed it was as good as current models; that's moving the goalposts. The initial comment was about the display degrading, so I was comparing it to itself when new (not even to other similar models from that era!). Mine only seems to have become somewhat dimmer, but not enough to matter in my day-to-day use, since it's still brighter than I need.
I have the same Dell (since 2016) and love it. But eventually I transitioned last year to a 27" 4K monitor. Still almost as sharp (KDE at 175% works fine for me).
yeah, I've tried their 24" 4K monitor. It was okay, but not great, so I returned it.
24" is the max size I can tolerate with my short-sightedness, since I avoid using glasses for the monitor.
Just a nitpick… it's ppi (pixels per inch). dpi (dots per inch) is the unit used in printing.
I have the same monitor, and I believe over 200ppi is pointless for a desk monitor unless you sit very close to it. It makes sense for laptops, which you view from much closer, but I think most people keep desk monitors much further from their eyes.
I have a dual-mode 4K/1080p (480Hz) OLED monitor at home that I mostly run at 1080p, and 4K LCD monitors at work. I bounce between both and really don't notice much difference. I need the text zoomed on 4K anyway, so it is effectively 1080p screen area, but sharper. Growing up in the Atari days, I don't mind pixels and actually like them. Latency and the 480Hz are more important to me than 4K pixels.
I grew up in the Atari days too but I really need the HiDPI. I run at 200% scaling too, so effectively also 1080p. But I really love not or almost not being able to see pixels.
When people ask what computer they should buy I always tell them to get any old office computer from ebay and use the rest of the money to buy a really nice monitor, and a really nice keyboard and mouse, as these are the bits you use! For most tasks that are undertaken on a computer any processor from the last 10 years coupled with 16GB of ram is more than sufficient.
If you buy a really nice monitor, which for me starts at 6K 32” - the eBay computer will no longer drive it.
What I find insane is how long companies issued 19" 1080p screens to their employees. I don't think that was a well-calculated choice, given that a couple of hundred more over 5 years would have surely improved productivity by a little bit for their $50k/year employees. It felt almost like it was done out of spite, to keep people in their place.
I wouldn't call a 6K monitor merely "nice"; that's way above nice. I would love one of those, since I can't stand blurry text, but even for me that's way too expensive to justify. So, as the sibling says, if people are looking at old used PCs on eBay, they're unlikely to drop more than a grand on the screen.
A 32" 4K screen is nice enough, and a reasonable one [0] can be had for a third of that. My don't-know-how-old desktop that I saved from the bin at work, sporting an i5-6500, could drive that with no issues.
---
[0] Around 2020 I bought an LG something-or-other for 350 Euros for work, 32", 4k, some form of VA panel. It had pretty good colors and better contrast than the IPS monitor I use as a Sunday-photographer.
that's a weird start. for me the start is 4k with proper blacks & proper color calibration
> given that a couple of hundred more over 5 years would have surely improved productivity by a little bit
no company wants the bulk of its people to improve their productivity by even a little bit. you should be productive enough, that's it.
> It felt almost done out of spite to keep people in their place
otherwise amazon and the likes would have competitors in every country. but I don't think it's out of spite.
it's the 'established' interpersonal culture between employers and employees, like in packs without natural alphas: if one beta steals the show from the beta-alpha a few times too many, he's a goner. in packs with alphas, the performer gets commended and a chance to compete for the top, because you want your team to be led by whoever is currently best. that hasn't been the case in our species for a long while now.
companies don't treat their employees badly out of spite; it's so they can stick to low, moderate(d) standards and cultures... and have an easy work life
Many employees are doing text work and, until recently, operating systems and apps did a really bad job of working with HiDPI displays. Your best bet was to target around 115 DPI on a monitor for decent text rendering without having to deal with font scaling. 19" 1080p is perfect for that. You just gave them multiple monitors if you wanted more real estate.
I have a ThinkPad T420 that is sufficient for most tasks that don't involve HEVC acceleration. It's got a mobile Sandy Bridge i7, booting off of a SATA SSD.
The only thing that really needed an upgrade was the display. I ditched the crappy 1366x768 TN for a 1440p IPS and an LVDS-eDP conversion board. Looks fantastic. Runs great.
I can see the difference between 60Hz and 120Hz on phone screens, and I think it's worth the impact on battery life.
I can see the difference in speed between VSCode and editors like Sublime or Zed, however in this case I prefer the additional features at the cost of speed/smoothness.
At some point there were LCD monitors with a chequerboard pattern that was very noticeable to me: like with analogue TV, only half of the screen got lit up/refreshed at a time, but in an alternating pattern rather than scan lines.
After asking the owner of said screen how he could stand that... "stand what?"
It’s not necessarily picky - it’s sometimes about physically different perception.
When DLP projectors first came out, I couldn’t watch them. I would see colors breaking in fast motion scenes and whenever I would move my head even slightly (and … we all move our head slightly often when watching a movie).
When I told other people, some of them nodded in understanding, but the vast majority thought I was making things up - for them, it was a rock solid picture.
One of my friends replied: "I can see about 300Hz. Not all the time, only during saccadic movements; but that means many fluorescents, DLPs and other light sources drive me crazy. I guess you're also a member of the crazy club."
Some people can hear 26kHz. Some people can see DLPs. Some people can see the alternating pattern…
I had a (regrettable) 19" 4:3 ViewSonic display that made that problem obvious -- way back in 2008 or so.
My homework at that time revealed a couple of things:
1. Liquid crystals are individually driven by AC waveforms, not DC as one might assume. This is the nature of the beast. The frequency at which the signal alternates is not necessarily very high. Thus, sometimes, this alternating nature is visible.
2. Some displays use dithering. A given display might support just -- say -- 6 bits per subpixel. To get the full 8 or 10 or whatever number of bits that are expected as a final output, the in-between steps are approximated by switching between two values -- sometimes (again) at a fairly low frequency that is visible.
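A toy model of that second point, assuming a 6-bit panel approximating an 8-bit target by alternating between adjacent levels (frame-rate-control dithering; the exact scheme varies by panel):

```python
def frc_frames(target_8bit: int, n_frames: int = 4):
    """Approximate an 8-bit level on a 6-bit panel by time-dithering
    between the two nearest 6-bit levels."""
    lo = target_8bit // 4            # nearest 6-bit level below (8-bit / 4)
    frac = target_8bit % 4           # remainder sets the duty cycle
    # Show the higher level on `frac` out of every 4 frames.
    return [min(lo + (1 if i < frac else 0), 63) for i in range(n_frames)]

print(frc_frames(130))  # [33, 33, 32, 32]
# Time-averaged: (33+33+32+32)/4 * 4 = 130 in 8-bit terms.
# The eye averages the flicker -- unless the alternation is slow
# enough to be visible, which is the complaint above.
```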
...
But anyway, that ViewSonic monitor: Most people thought it looked fine, but it drove me nuts.
There was a point in time (maybe 15+ years ago) when I said MacBooks had the best laptop speakers, and that was one of the reasons I bought them. They weren't perfect, as anyone who cares about audio will tell you, but considering the size of a laptop they were the best I could get, and decent enough.
Most of my friends, and nearly everyone on the internet, were like: who buys laptops for the speakers? They all sounded the same. Get a Dell. I think it was on either AnandTech or Tom's Hardware; it was certainly before the Reddit era.
Somewhere along the line, maybe 2015 to 2020, YouTube reviewers started bashing Dell and other laptop makers for their crappy cost-cutting speakers (thank you, Dave2D), and managed to actually show in video how awful they were. All of a sudden, "consumers" took notice and have since demanded better speakers. Laptop speakers have improved tremendously in the past 5 years. As it turns out, people need to learn how to compare, and once they do, they can't unsee it.
But in all honesty, in the past few years I have really wished I didn't have the ability to tell the difference. To not have the mentality of how something could be "better". To stop thinking about how everything, from food, furniture, tech stacks, UI, and buildings, could be better.
Some say it is a gift; I think it is more of a curse. It is a struggle, and tiring. I eventually discovered my retreat for peace was to go out into nature and enjoy the creation of God.
The curse of knowledge really. Or perhaps more accurately “awareness”.
You know about things that are unknown unknowns to others. Since ignorance is bliss, it definitely feels like a curse to you; and since what one doesn't know can hurt them, others would see it as a blessing.
I've learned to come to terms with this mentality, mainly due to time constraints. Before, I would always do an insane amount of research and benchmarking to find the absolute best in its category, even for mundane things like a coffee grinder. I would aggregate thousands of reviews and turn them into sentiment analysis, cross-referencing reviews, and so on.
Now, I take a more 80/20 approach: I clearly define my needs and shut down any thoughts about features and capabilities that I don't need right now. Frankly, after years of thinking that I might use a feature later, I realise that I never do and never recover my investment in these kinds of gadgets.
Finding a trustworthy review source is key — by trustworthy, I mean mostly in line with your own standards. However, if you can try it yourself, that's always better.
For sound on small devices with clear voice and a good dynamic range, Samsung is quite good with its high-end Galaxy Tab line.
I think it's still worth going down the rabbit hole but only when 1) You know you're gonna use the thing a lot and for a long time and 2) You have some real, objective ways (i.e. beyond the very influenceable MOS) of measuring a real, humanly perceivable difference.
This is how I became quite learned in sound reproduction (incl. acoustics and psychoacoustics) then bought Genelec loudspeakers, for example. But I don't care about finding Samsung B-dies (I think?) for my RAM.
Genelecs are a no-brainer if you have the money (the small ones are not too expensive, but if you want power, your wallet will hurt a lot).
It's crazy how good those speakers sound for how small they are. And extremely well built of course, the aluminium casting was a very good decision, albeit an expensive one of course.
For nearfield, I think so; the 8341A is basically unbeatable there. But it still suffers from three problems (even more in midfield): the ridiculous price of their subwoofers, the way too primitive handling of multi-sub by GLM and the eye-watering price of the W371A compared to other fully cardioid solutions (D&D 8C, AsciLab BX8C); I'm also skeptical of the (unpublished) directivity matching of the combination.
Still, Genelec's legendary reliability and GLM are worth it.
Yeah I think the 40s series is the sweet spot for nearfield, definitely.
I am not a subwoofer guy; in fact, I am pretty much against subs. In my experience, the results never really match the theory. Bass frequencies are not directional, but if the speaker producing them isn't physically set up at the same location as the other speakers, you get phase-coherence and power-mismatch problems.
On top of that, most people do not have access to speakers that can reproduce such low frequencies, so it is not really worthwhile to use them in production. You end up with results that don't sound good on most systems, because those systems miss part of the work.
Sub frequencies are also felt physically and are quite tiring for long-term listening. It always feels nice at first, because it's surprising and powerful, but really it is not where the focus in music production should be.
I have had access to some recording studios (I even built a small one with friends), and my experience is they don't really use subs. If they need more power, they just get bigger speakers that will reproduce the relevant frequencies coherently.
There is basically no instrument that goes below 20Hz; even a pipe organ, a massive object that basically requires a church, does not get there. The energy required to produce sub-20Hz sound just doesn't make sense, and it doesn't add much to the music: you just feel a rumble, and most people cannot even hear that low.
In fact, even the 20 to 30Hz range is not that important. It still requires a lot of power, and either it gets dominated by the rest, or you put in the power and it drowns out everything else.
I played French horn in various wind bands, with the tubas right behind me. The tuba is one of the few instruments that can technically hit sub-30Hz while still being of manageable size. Whenever they had a low part, their sound was much quieter; the air volume required to sustain those notes is tremendous, and I can tell you not many compositions went that low, because composers know that.
Even if you want to focus solely on modern music, sub notes are just a gimmick, because, like I said, they become tiring and overpowering really fast. I had a friend who listened to dub constantly, and over time it became boring, monotonous and tiring because of the omnipresent bass part. This genre is linked to the creation of massive sound systems that try to produce bass as low as possible for the physical effect, but past the wow factor it's not really pleasant, and it's very likely to damage your ears if they play above 90dB.
So yeah, Genelec subs are overpriced, but this is not a problem confined to Genelec. Every sub is overpriced for the benefit it brings. It is mostly a dick-measuring contest, or being able to say you have the best and can produce frequencies most people don't care about.
Unless you truly have an endgame monitor, what you really want is just a better monitor that can reproduce the really useful frequency range with more power, without distorting, while keeping everything in phase.
The biggest main monitor from Genelec (the 1236A) technically goes as low as 17.5Hz, and that is what you really want; but in practice it's not even really needed.
There is a recording studio in my town, built by some friends, that produces music for Netflix. It's not a small studio size-wise: the main room is big enough to handle a mid-sized orchestra (it was a disused cinema, and they built around that to be able to work with projected movies), a 1200m² main room with a 50m² control room.
But they don't have subs. They have multiple mains; some are Neumann, some are older Genelec models, and some are ADAMs for 5.1 handling. (I think they technically have Atmos capability, since they have overhead speakers, but they don't use it; it was too much trouble for little commercial benefit.)
You can check it out there: https://www.instagram.com/studiospalace/
If you think you need a sub but don't have the biggest speakers your room can handle, you are misallocating money. Subs are just the cherry on top when everything else is near perfect. Like the cherry, they can add a subtle touch, but they're not really necessary for good music production or even good listening. The quantity of music that makes meaningful use of them is so vanishingly small it's not very relevant. It's like choosing a car solely on its top speed: you are unlikely to need it, and even if you want to use the capacity, the risks are not worth the reward.
This is why GLM doesn't handle multi-sub well; it's just not a feature their high-end customers would use much, since they can already pay for the high-end main monitors. And generally GLM isn't the best for room correction, but it is a convenient solution that works well with no hassle if you only buy hardware from Genelec.
I see Genelec as the Apple of monitors. If you have the money, it is hard to find a setup that will get results as good without spending a lot of time building a custom solution that can use speakers from mismatched brands. Just like you can buy hardware cheaper than Apple but it will require a lot of work to end up with a solution that is transparently integrated as well.
What is legendary about Genelec is not just their reliability (most monitors are perfectly reliable if you treat them well, unless you go for the bottom-of-the-barrel made-in-China stuff) but the quality of their frequency reproduction (especially considering the size of the nearfields).
There are a few competitors that make comparable speakers (like Neumann, ATC or the JBL pro line), but they are not necessarily cheaper and don't really have an integrated solution like GLM.
In the nearfield range, Neumann does quite well; they are not as good as the coaxial Genelecs, but they are cheaper.
I don't want to brag too much, but when I studied music theory as a kid, I always got very close to a perfect score in music dictation. I have listened to quite a lot of monitors, from the most reputable brands (Genelec, Neumann, Dynaudio, Focal, JBL, ADAM, KRK) to less reputable ones (PreSonus, Yamaha, Tannoy, Monkey Banana, Mackie, Behringer, Prodipe, etc.), and my experience is that Genelec is up there in quality of reproduction, which is why they can command such a high price.
I actually wish someone could come up with monitors that would match Genelecs but at a much lower price. So far, every niche maker that I have heard about only came close and weren't any cheaper (often more expensive actually).
In that sense, it's a lot like Apple laptops, the brands who come close to the quality are not doing it for much cheaper if at all...
A few years ago, not wanting to get a Mac with the crappy keyboard, I bought a ThinkPad, I think a T480s or something of that era. The speakers were so bad it was impossible to understand the dialogue while watching a movie in a decently quiet room with some city noise in the background.
I have a newer ThinkPad (T14, Ryzen 6850), and while the sound is clear, it's shameful how much better my wife's MacBook Air is. I only use headphones now. I can't unhear it.
30Hz is so painfully slow that I can always tell when I'm in power saving mode on the iPhone by that alone. On desktop I can't even do 60Hz anymore after switching to a 144Hz monitor.
> 10 times out of 10 I could tell an OLED font rendering to LCD. I wish I couldn't.
I second this. I can tell, and I would never wish that ability on my worst enemy. Very glad there's a (slim, but exists) market catering to that — and that I no longer have to buy a monitor that costs as much as a small motorcycle to not be constantly infuriated at everything in my field of vision when working.
Out of curiosity can you tell the difference in font rendering on a 14 inch laptop between 1920x1200 LCD screen and a higher resolution one?
I mean during normal work in normal position not trying to look really close.
> He's also got Cleartype on and set to RGB stripe even though the OLED is not RGB stripe (though to be fair, Windows doesn't really make it clear what each page of the ClearType tuner does).
I doubt it's ClearType; the close-up photo of the U3223QE shows all subpixels uniformly dimmed on the fringes. The post also says the monitor is attached to a Mac mini, and a previous post about OpenSCAD has a screenshot with macOS window decorations.
You act like being able to see this is a minority opinion; it's not, and it's a known issue. And you don't need to be sitting close, or using a tiny font, to notice it.
My 4K OLED is noticeably less clear compared to an IPS display and I’d never use it for productivity as a result, because why would I willingly subject myself to an objectively worse experience?
It's a tradeoff - if you want a fast clear gaming monitor with pure blacks, you get an OLED. If you want the clearest text and are OK with middling response time, you get an IPS.
There is also VA which provides deep (though not pure) blacks, clear text and decent response time though you get some slight smearing on high contrast edges.
Personally i prefer VA to IPS by far because IPS looks washed out to me.
Whether it's a minority opinion or not, I really can't see the difference. Even when he posted highly zoomed images of VS Code ("Visual Studio Code does a wonderful job demonstrating this problem"), the only thing I noticed is that the image on the right looks slightly brighter than the image on the left.
Then, as I went back to where he was describing the problem ("fringing") and scrolled back to the images, I kept forgetting which was which (and which image was supposed to be "worse").
I'm on a 2025 Macbook, so maybe the laptop's monitor masks the issue?
That's an interesting point you mention about not seeing it, because prior to buying an OLED I'd read a bunch about fringing and in many articles I just... couldn't see it. I couldn't tell what was being illustrated in the images.
It wasn't until I sat in front of one for a few hours, in my room and lighting and with my apps and had funny-feeling eyes and a this-seems-off feeling that I decided to investigate. And yes, those macro photos show fringing, but it /is/ hard to understand how the subpixel pattern translates to on-screen weirdness until you've seen it for yourself.
I'm on a M4 Macbook, and I can see it. I'm inclined to totally accept the blog author's experience as true for them, I'd probably experience the same thing.
It's adequate, as I can't really tell the difference, but with OLED you get perfect blacks, which make dark mode and white text on a black background look much better.
There's no Cleartype in use here, as the first paragraph says, it's macOS.
And I'm using the default font sizes because they work well for me on an LCD. The point of this post is to document my experience with trying a current-gen generally-available OLED and how it did not work out well because of the subpixel arrangement.
It's also not just an issue on text, it affects any high contrast edges, especially perfectly vertical or horizontal ones. This meant that CAD stuff, spreadsheets (the grid), and large colored sections in graphic design software looked off as well.
There is no Cleartype; this is macOS. And as mentioned I couldn't see the fringing from normal use, that only became evident with macro photos. During normal use it just looked sparkly or weird or artifacted.
And yes, the fonts are small, but those are the default sizes in VS Code and Numbers.app (the example photos), and they work well for me. And they look fine on an LCD.
I bought a W-OLED monitor for office work and gaming, being very happy with my OLED TV. I returned it after a couple of days.
I got unbearable eye strain from it, even though I use rather large fonts, and the ppd was the same as with my previous IPS. Yes, the “more fuzzy” text was very much noticeable too.
Maybe it varies by person, maybe it’s influenced by things like astigmatism, but I totally see where the author is coming from, and I too am waiting for the new OLED panels to see if there’s an improvement.
In my experience, it seems to. My astigmatism (or other eye stuff) seems to move different colours different amounts, leading to wider RGB pixels and making things like Cleartype so much worse. So people were enjoying Cleartype and I was hating the obvious colour-changes and fringes that somehow they weren't seeing. I assume some people are lucky enough to have aberrations that actually make cleartype more pleasant.
I do too. Combined with progressive lenses and I have significant chromatic aberration issues. Blue and red pixels require different focus, which is sometimes an issue when solid blues and reds are on screen in close proximity. I turn off pure blue colors in my terminal emulator, for example.
That sounds familiar. I also have ever so slight green-brown color blindness. It's only really noticeable in low light (like in the woods in evenings), but that could well all stack up to be a problem.
I also have significant problems with blue LEDs around the house, to the point where I've removed, replaced, or covered almost all of them. They really, really bother me because it feels like my eyes never focus on them and they leave me feeling slightly disoriented.
He's using a Mac, and Apple removed support for subpixel rendering many years ago, so it's not ClearType. The display shows color fringing even with grayscale antialiasing; that's what he's complaining about.
It's not like the Cleartype tuner actually does what the pages claim - you can go through and use the magnifier to choose the grayscale-only outcomes and still see Windows doing RGB stripe cleartype throughout. People literally have to install third-party tools like MacType or GDI-PlusPlus to get solid font rendering. So blaming users for using it wrong (especially when they're not even on Windows) is odd.
Also, many people can see and are bothered by particular non-rectangular pixel layouts - it doesn't require doing odd things.
I have a 49" QD-OLED panel. I have never been one to find visual artifacts distracting, but fonts were so awfully jaggy in Linux that I spent a week tinkering with fontconfig and almost switched to a larger mini-LED panel, since code looked horrible. And I'm someone who was fine with horrible low-res, low-quality VA screens back in the day.
The subpixel geometry on Samsung's QD-OLED needs very specific font configuration to display correctly, and even then it just stops looking bad.
The issue with his monitor is that regular subpixel hinting of cleartype doesn't work because of different subpixel layout, and DPI is not high enough to not need it. It is a real problem.
I use smaller fonts as well and when I first got an older OLED display with a pixel layout not supported by Windows ClearType, I used BetterClearTypeTuner and later MacType to adjust it. It was leagues better after tweaking a few settings and I'm very happy with text now, even on my AW3425DW, which has an older layout they moved on from in recent generations.
Those do show the best example, yes, and are the easiest to photograph. But it's also noticeable on any high-contrast edge or fine line, like a drawing in Autodesk Fusion (CAD software) or just the gridlines in a spreadsheet.
And no, no Cleartype here because (as mentioned in the first paragraph) it's a Mac running macOS.
One of the benefits of having somewhat terrible eyesight :). I can see the fine details if I squint and get closer, but for me everything fringes because that's just what astigmatism or a smudge on my glasses does.
I used to work on custom embedded UIs that supported fractional scaling. This shit is hard.
You have to understand there's enormous effort that needs to go into this to make things look good. It's absolutely no surprise to me that Jobs era Apple used to stick to integer scaling ratios with relatively low-res phones while the competition battled with paper specs.
The trick to make things look good is to be mindful of the pixel grid, (and the subpixel layout). You have to choose font spacing and fonts so that major font features line up with the pixel grid. You sometimes have to slide a letter a bit to the left or right, which might result in inconsistent spacing, you might even want to have multiple versions of characters to hide these issues.
This applies to borders as well (both spacing and thickness).
Sometimes you can't make it look good no matter what you try - and the designer has to change it.
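A minimal sketch of the snapping problem under fractional scaling (hypothetical numbers; real layout engines do this per glyph and per edge):

```python
def device_px(logical_px: float, scale: float) -> float:
    """Where a logical coordinate lands on the device pixel grid."""
    return logical_px * scale

def snapped(logical_px: float, scale: float) -> int:
    """Snap to the nearest device pixel to keep edges sharp."""
    return round(logical_px * scale)

scale = 1.25  # 125% fractional scaling
for x in [10, 11, 12, 13]:
    print(x, device_px(x, scale), snapped(x, scale))
# 10 logical px -> 12.5 device px: the edge lands between pixels,
# so a 1px border either blurs across two rows or snaps and shifts.
# Snapped positions come out as 12, 14, 15, 16 -- spacing that was
# uniform in logical units becomes uneven on the device grid.
```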
While flexbox and other super-duper layout algorithms might be very clever, if you use them, there's no way to line things up perfectly, and if you expect to, you might be in a world of hurt.
Adding PPI is a poor way to fix this. Even if you 2x the resolution, going from 1080p, to 4K, these issues still persist, and you quickly run out of hardware beyond that.
It's no wonder that modern 'flat' UIs usually have 1-2 pixel-ish gradients on the edges of features, or use smooth transitions. It's a cop-out, but it's the only thing that works with flexible layouts and varying pixel densities. Subtly, though, you notice things looking a bit blurry, as if the images were low-res, even though the pixel density is insane.
ClearType is brilliant, but it only works with a certain set of assumptions, like a bit of light bleed (which exists on LCD but not on OLED afaik), and needs to know the subpixel layout, and can result in absolutely gorgeous looking fonts with relatively low res displays.
There's a reason why Windows 95-era UIs have a cult following - it's insane how sharp they looked even on hardware that on paper is much worse than modern stuff.
I don't generally use tiny fonts and I hopefully don't sit unreasonably close to the screen yet fringing is very apparent to me (even on window borders, i.e. it's not a font rendering quirk in my case). Just another anecdotal data point.