What does one have to do to get as clear and pretty a UI as that on the modern desktop OSes? Everyone switched from bitmaps to vectors over a decade ago, and now everything is fuzzy, antialiased, scalable, flat, and hard to make out.
Of course you can always run dwm on linux with a hodgepodge of makeshift utility apps, write your own keybinds to set your brightness and volume and other such inanities, but in order to get a 'complete' desktop experience you kinda have to opt in to GNOME/KDE, which are trying to do the same things as Apple or Microsoft, aesthetically speaking. And tough luck if you actually use Apple or Microsoft to begin with. I tried running bug.n and it seems not to work in win10, let alone win11.
Vectors will scale to any pixel density, but they look irritatingly fuzzy at the lower pixel densities you commonly see in large widescreen monitors.
Obviously it's not an apples-to-apples comparison, but decades ago bitmapped fonts could give you crisp text on any monitor, and in 2022 I'm using a popular, >$1300, very favorably reviewed monitor, and the text is ever-so-slightly fuzzy. It's plugged into my Macbook Pro right now, and the difference in how well the text renders on the two displays is plain.
I guess as soon as Apple started making laptops with 300dpi screens, it became okay to render fuzzy text on any screen that wasn't 300dpi.
The problem is you have it plugged into a macbook. Apple doesn't support subpixel anti-aliasing in any of its latest releases. Your monitor looks so fuzzy because Apple has removed the capability to make it not look so fuzzy.
This is only part of the reason. The other part is that Apple doesn't actually do fractional UI scaling: afaik they only ever render into a framebuffer at whole-integer multiples of an internal "native" resolution and then scale that framebuffer down to the actual monitor resolution (they still render natively at this multiplied resolution - no doubled pixels or anything like that).
This results in less crisp output than you would get if you actually rendered vector based UI at the exact monitor resolution or if you natively rendered bitmap art 1:1 with the monitor resolution.
This approach makes sense when you're only using Apple's monitors, because they can optimize their internal "native" resolutions for the handful of actual native resolutions they offer, but it falls apart when you start plugging in external third-party lower-DPI monitors.
(I believe this is also the approach that Gnome 3 uses for its experimental, toggle-gated non-integer UI scaling.)
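To make the arithmetic concrete, here is a rough sketch of that pipeline with purely illustrative numbers (a 27" 4K panel and a couple of common "looks like" settings; the function and figures are mine, not anything official from Apple):

```python
# Rough sketch of the "render at 2x, then downscale" pipeline described above.
# Panel resolution and "looks like" settings are illustrative numbers only.

def scaled_mode(panel_w, panel_h, looks_like_w, looks_like_h):
    # Render the UI at 2x the "looks like" resolution...
    fb_w, fb_h = looks_like_w * 2, looks_like_h * 2
    # ...then resample that framebuffer down to the physical panel.
    factor = panel_w / fb_w
    return fb_w, fb_h, factor

# 27" 4K panel set to "looks like 2560x1440":
print(scaled_mode(3840, 2160, 2560, 1440))  # (5120, 2880, 0.75) -> non-integer resample, some blur

# Same panel set to "looks like 1920x1080":
print(scaled_mode(3840, 2160, 1920, 1080))  # (3840, 2160, 1.0) -> pixel-exact
```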
To mitigate the issues you mention, I've wanted higher-res screens since forever (I had one of those early Vaio VGN-Z machines, which were the only 13-14" laptops with 1600x900 displays when 1366x768 dominated; they were also the first to move to 1920x1080 with their next model).
And I had Dell's first 24" 4K screen, the one that required DP MST to go past 30Hz. I tried disabling subpixel rendering to see if that resolution (185dpi) was really sufficient, but text then regained its jagged edges. Thus I reasoned that, with my glasses correcting my myopia to better than 20/20, I needed to wait a while until I can fully enjoy nice, laser-printout-like text.
As a reminder, even laser printers printing at 600dpi use techniques similar to subpixel rendering to render that smooth text.
I am hoping for some 8K screens at 32" for productivity work (though that's still probably too low at 275 dpi), and I never get why people keep talking about there not being any 8K content. My desktop is my 8K content, though I also lack the GPU to drive it at more than 30Hz :)
Yeah, thus my mention of lacking a modern-enough GPU to drive them.
Thanks for a very clear and direct source for the bandwidth, I was under the impression HDMI 2.1 would not require DSC for 8k at 60Hz but never paid attention to the actual numbers.
DP 2.0 is what I need to wait for then :) What was the bandwidth it promises? 77Gbps, should work.
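Back-of-envelope, ignoring blanking overhead (which adds several percent on top of these figures), it works out roughly like this - the link rates below are the commonly quoted effective payload rates, nothing exact:

```python
# Approximate check for uncompressed 8K at 60 Hz (blanking overhead ignored).

pixels = 7680 * 4320          # 8K UHD
refresh = 60                  # Hz
bpp = 30                      # 10-bit colour, 3 channels

data_rate_gbps = pixels * refresh * bpp / 1e9
print(round(data_rate_gbps, 1))   # ~59.7 Gbit/s before blanking

hdmi_2_1 = 42.6       # Gbit/s effective (48 raw) -> needs DSC for this signal
dp_2_0_uhbr20 = 77.4  # Gbit/s effective (80 raw) -> fits uncompressed
print(data_rate_gbps < hdmi_2_1, data_rate_gbps < dp_2_0_uhbr20)  # False True
```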
Then again, I feel like eGPUs are the way forward for this, as you can keep the GPU close to the screen, and your computer would talk a "semantically compressed" (or rather terser) language to the display output (e.g. talk OpenGL to your display).
I think the secret is to go with much larger monitors and blow up the text size. Right now I'm typing this on a M1 MBP plugged into a curved 48" 4K TV that I use as a monitor. I can have the text scaled about 3-4x as large (physically) as on my Macbook and it looks great. Slight bonus, avoiding reading small text is healthier for your long-term vision.
I've recently gotten a MacBook Pro 14 for work, and tried out Ubuntu 22.04 RC on a laptop, and I was confounded by how fuzzy both of them look today (Ubuntu with fractional scaling, but the Mac even with 2x scaling on a 4K screen).
Whatever setup there was on Ubuntu 20.04 that made text render clearly needs to come back (I used to use the slight hinting setting, but it seems subpixel rendering is going away on Linux too). Please!
That's probably because Ubuntu 21.04 used Wayland by default for all but Nvidia [0] and 22.04 enabled it for Nvidia [1], and Wayland's fractional scaling is an ugly hack (both in how it's implemented and how it looks) [2][3]. Progress is being made [4], but it's quite slow.
Switch to the X session on the login screen and you should have good scaling again (also, lots of applications that do screen sharing, like Zoom, do not work on Wayland [5]). Or just be like me and use Mint, which has no plans for Wayland support and you don't have to think about any of this.
Thanks, good to know it's only due to Wayland, that it's being worked on there, and that there's a quick fix otherwise when I upgrade to 22.04.
Text fuzziness has had nothing to do with your monitor ever since we went digital with HDMI and DVI.
Linux and Windows generally use implementations of antialiasing that aren't so "fuzzy," so you could always ditch that Mac for something that won't forcibly abstract away the details for you.
It can still happen when different display panel pixel colour arrangements don't match subpixel rendering settings in software, though that'll usually happen with non-monitor displays like TVs.
I'm using the U3219Q at 1:1 resolution (so 3840px wide) on Big Sur and Monterey, and some display text is sharp at the beginning and end of the text block but fuzzy in the middle. It's like there's aliasing going on, where a text block that would natively be 100px wide is shown at 99px or 100px.
It was better on previous versions, I think Mojave was the last good one.
Yeah, I was completely surprised when I moved from Ubuntu 20.04 to MacBook Pro 14 on the very same Dell U3219Q: MacOS made text editing completely unpleasant.
Then I upgraded my son's laptop to Ubuntu 22.04 RC, and I was greeted with the same fuzziness. I hope that's only due to Wayland, I haven't tried out the X desktop when docked.
This has nothing to do with the display, this is a MacOS scaling issue. Plug in a Windows device, put it at 100% scaling and it will be as sharp as it can get.
macOS is really limited when it comes to displays, because the display must have a PPI of around 110, or a multiple of that. It sucks how bad macOS is at scaling, especially with the terrible font rendering.
Back in the day, the fixed pixel density of the bitmaps was 1-to-1 aligned with the pixel density of the monitors. Every line was crisp and clean because there was no antialiasing across partial pixels.
The world in which we now live allows for a lot more crispness when it matters (because pixel density approaches indistinguishable-by-the-human-eye-at-viewing-distance), but with the tradeoff that UIs designed to work at whatever scale, on whatever monitor, with whatever DPI do a lot of anti-aliasing, and things tend to look, sometimes, just a tad muddier. Especially when vectorized icons get drawn in a rendering pipe that passes through the graphics accelerator.
If you have e.g. a diagonal line from a vector graphic then it will almost always be antialiased (i.e. light grey pixels along the edge of the black pixels - or maybe there aren't even any black pixels and they're just dark grey, which might look fuzzy to some).
The alternative is that you convert the vector line with a "nearest neighbour" pixel algorithm (or cleverer but fundamentally similar algorithms e.g. [1]) but then you get the aliasing artefacts that anti-aliasing is meant to avoid. For example, two parallel lines that are rendered 2.5 pixels apart might end up having a gap between them that alternates between 1 and 2 pixels along its length, which looks even worse.
With hand-designed pixel graphics, this isn't a problem because you can manually choose to draw everything an integer number of pixels apart or avoid certain constructions altogether.
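Here's a toy sketch of that parallel-lines artefact, with made-up coordinates, just to show the rounding behaviour:

```python
# Toy illustration: two parallel lines that are "really" 2.5 px apart,
# snapped to the pixel grid with nearest-neighbour rounding instead of
# being anti-aliased.

for x in range(8):
    top = 10.0 + 0.3 * x            # the upper line drifts slowly downwards
    bottom = top + 2.5              # the lower line stays exactly 2.5 px below it
    empty = round(bottom) - round(top) - 1   # blank rows left between the two lines
    print(f"column {x}: rows {round(top)} and {round(bottom)}, {empty}px gap")

# The printed gap alternates between 1 and 2 pixels along the length, even
# though the true distance never changes - the stair-stepping that
# anti-aliasing trades away in exchange for grey edge pixels.
```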
Not necessarily; I fired up some old devices last month, and the way their bitmap icons and apps and screens appeared can only be described as "crisp" and "sharp". I'm not saying prettily or beneficially so! But they were definitely very stark and contrasty (let alone colourful and "stand-out-ish"!) holding them next to my modern phone or PC.
I don't think "Vector vs bitmap" is necessarily the cause - more of design sensibility. When I look at the lineup of google apps on my phone which all look EXACTLY the same (3 primary colours in random rotation; I can only effectively use / distinguish them if I memorize location of icon), and then the same as slack and my son's daycare app which also use swirls of same 3 primary colours; and when I compare my Windows desktop and icons and browsers to old apps - there's a definite brutal sharpness to old stuff.
> What does one have to do to get as clear and pretty a UI as that on the modern desktop OSes?
Buy a retina mac ;)
I kid but only partly. Running at integer (2x) scaling on a 200 dpi screen makes everything perfectly crisp. Anti-aliased text no longer appears blurry and lines are rendered at exact pixel boundaries.
Those old bitmap art displays didn’t do any scaling. Every pixel appeared exactly as someone drew it, with crisp boundaries. It’s the scalable part of vector graphics that fuzzes things up.
XFCE is pretty complete and can be tweaked into just about any retro desktop experience you want - e.g. you can make a panel feel like a start menu and taskbar, or sit in the middle like a dock, or have both a dock and a top bar panel, etc.
If you really want you can also tweak some settings to totally disable antialiasing and switch to bitmap font rendering. On modern semi-high DPI displays (like a 14" 1920x1080 panel) it isn't that great though as most bitmap fonts are far too small to be readable (~8-12 pixels tall). For the most crisp and clear text, even crisper than old bitmap stuff, you really want a high DPI display in the 4k+ range. Try using a modern mac and you will be blown away at how clear and sharp the text renders.
With respect to the anti-aliasing on Linux, I run Xfce, which easily allows one to control the font anti-aliasing in the Appearance settings under the Fonts tab. I assume that GNOME and KDE have similar configuration settings. You can also get the same sort of controls in fonts.conf [0] for applications that don't respect Xfce's settings, like Firefox, as I recall. I could make everything as sharp as I'd like with the right settings.
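For reference, the sort of fonts.conf override I'm thinking of looks roughly like this (a minimal sketch - the file path and the specific values are only illustrative, tune them to taste):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<!-- Minimal ~/.config/fontconfig/fonts.conf sketch: turn antialiasing off
     globally and force full hinting. Values are illustrative, not advice. -->
<fontconfig>
  <match target="font">
    <edit name="antialias" mode="assign"><bool>false</bool></edit>
    <edit name="hinting"   mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
  </match>
</fontconfig>
```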
In my experience, basically every window manager has some 90s-esque style as well.
Fuzz, aliasing, and anti-aliasing issues can sometimes be made worse by a HiDPI display.
All of these issues really boil down to the render resolution not integer-scaling to the display resolution.
If the render resolution is an integer multiple of the display resolution, or vice versa, you will generally always get beautiful, crisp rendering - this is exactly the approach Apple adopted, and it is why they had to use some slightly less common resolutions like 5K on some devices - 2560x1440/"QHD" has an integer scaling factor of exactly 2 for a 5K display, etc.
The problem, though, is that outside of Apple devices almost no HiDPI display will neatly integer-scale. The vast majority of HiDPI monitors on sale today are 4K, and only 1080p really has a usable integer scaling factor there. 1080p of "usable" screen real estate on a 4K monitor is going to make all UI elements too big usually, though... So you are forced into non-integer scaling and images that will not cleanly map to the grid of pixels in the monitor, which is where the fuzz and anti-aliasing etc. start... 1440p of "usable" space does not cleanly map to a 4K monitor, but many people run them this way.
5K is frankly a brilliant resolution for high-quality ~200ppi-style HiDPI rendering on 27 inch displays with 2x integer scaling, for macOS and Windows especially; it's tragic the resolution hasn't become more mainstream.
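A quick way to see which "looks like" resolutions divide evenly (the panel and resolution lists are just illustrative):

```python
# Rough check of which "looks like" resolutions divide evenly into a panel's
# native resolution. Panel and resolution lists are illustrative only.

panels = {"4K UHD": (3840, 2160), "5K": (5120, 2880)}
looks_like = [(1920, 1080), (2560, 1440), (3200, 1800)]

for name, (pw, ph) in panels.items():
    for lw, lh in looks_like:
        if pw % lw == 0 and ph % lh == 0 and pw // lw == ph // lh:
            result = f"integer {pw // lw}x"
        else:
            result = "non-integer (fuzz)"
        print(f"{name} at looks-like {lw}x{lh}: {result}")

# Only 1920x1080 integer-scales on a 4K panel; 2560x1440 integer-scales on 5K.
```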
Yes, it’s a shame that there are only two 5K 27” displays to choose from, and both are expensive. The industry has dropped the ball on this for years, all those 32” 4K screens may be cheap but they’re no use to me…
Step two: set the scaling factor for GTK/Java apps to 2. KDE/Qt apps can figure out the correct DPI without hassle.
Step three: set fonts smaller or larger if desired.
On Windows, plug in a 4K 27" monitor. Text doesn't look, in my opinion, quite as nice as on Linux, but it isn't fuzzy, small, or giant. Possibly font rendering could look nicer if I bothered to tweak it, but since it's basically boot-to-Steam I see little reason to bother.
I keep hearing this argument that 4K somehow doesn't work or looks shitty and only 5K Mac displays provide an acceptable high dpi experience and I feel like I'm getting transmissions from an alternate universe where nobody had to scale UIs from screens that varied in DPI by a factor of 3 for almost 20 years. Long before 4K screens.
It's a shame Apple turned that into such a problem for their users, when Linux and Windows didn't. I guess they gambled wrong on how the market for monitors would evolve.
This is an interesting analytical failure. To explicate: you go through your day wearing only boots. You see people wearing sneakers, so one day you buy a pair in the wrong size and put them on with the tag still inside and walk around like that for a day. You think to yourself, how do all these people do this every day, did nobody ever show them a good pair of boots! Sir Boots4Life's analysis is faulty: the sneaker-wearers aren't all walking around uncomfortable, because they don't buy the wrong size or keep a small object between footwear and foot for extra penance - and likewise, not everyone is walking around with a fuzzy screen.
There are, as it turns out, other ways to scale a UI than integer scaling factors. Even if you don't use SVG, you can ship different image sizes and scale fonts by finer increments still.
You describe there being 1080p of usable real estate, which is a complete failure to use a meaningful unit of measure. Your 5K display, future 8K displays, etc. use more pixels to draw a 3cm x 1cm button. They aren't wasting an increasing number of pixels; they are drawing the element with increasing fidelity. If we measured screen real estate in pixels, one would conclude that a 5" screen and a 32" screen both driven at 1080p have equal screen real estate. This is a clearly incorrect conclusion. Clearly screen real estate is measured in area, and it's fidelity - not real estate - that is measured in DPI.
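A rough illustration of that point with a hypothetical 3cm x 1cm button on a 27" 16:9 panel (numbers are approximate):

```python
# Same physical 3cm x 1cm button on the same 27" 16:9 panel, at 1080p vs 4K.
# The real estate (cm) is unchanged; only the pixels spent drawing it change.

import math

def ppi(h_px, v_px, diagonal_in):
    return math.hypot(h_px, v_px) / diagonal_in

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    dpi = ppi(w, h, 27)
    px_per_cm = dpi / 2.54
    print(f"{name} 27in: {dpi:.0f} ppi, 3x1cm button = "
          f"{3 * px_per_cm:.0f} x {1 * px_per_cm:.0f} px")

# ~82 ppi vs ~163 ppi: the button covers the same area on the desk, but the
# 4K panel draws it with four times as many pixels.
```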
Increasing fidelity might bear on the smallest element one can possibly usefully use, but it's not going to bear as much on what size element people desire to use, which has much more to do with how far the object is from the user's face.
You are saying that 4K monitors require one to choose between giant elements that somehow waste all the pixels or exceptionally tiny ones. This is silly. First off, 1080p displays ranging from 12" to 32" already had to adjust elements to be usable even before 4K became a thing, and there were already more knobs than the scaling factor to achieve the same end result. 4K changed that equation, which had existed for 20 years, by an exact factor of 2. 8K changes it by an exact factor of 4.
TLDR: set an integer scaling factor, then tweak your fonts a little bigger or a little smaller until it looks nice according to your taste. Nobody on earth expected a 24" 4K monitor to display twice as much content as a 1080p 24" monitor; they want things to be the same size on the screen, only prettier.
Actually, I want stuff smaller on 1080p displays so I could display more stuff, but the low pixel density set a "seemingly unreasonably high" lower bound on how small stuff could be and still be usable or legible. (-:
It's pretty poor to regress on the vast majority of currently available displays just because there are a handful of multi-thousand-dollar monitors available. Apple can't even sell a 27" monitor for less than $1500 because the experience wouldn't be on-brand. If you look at the big-ticket items that Apple sells, they have cheaper versions of everything except displays. They only put their brand on the most expensive displays because they know what their software looks like on anything less.
MacOS looks great on my 250€ LG 24UD58 (24" 4K). Unfortunately they stopped making it and all 4K screens made now are 27" and up which is too big for 200% scaling :(
I switched to a Mac recently, prior to that I'd been using Elementary OS. I'm still using the same monitor though (a large 4k display), so any visual differences are quite obvious to me.
The Mac is definitely fuzzier than Elementary; that clarity is the main thing I miss. It seems pretty obvious that modern MacOS is designed for high DPI displays.
You might be overestimating the complexity of creating such a hodgepodge.
i3wm = tiling window manager
lxappearance = set gtk theme and settings
qt5ct = set qt5 theme and settings
kvantum = more settings for qt
ponymix = cli for pulseaudio that sucks less than pactl
pavucontrol = gui mixer
nm-applet = tray applet that lets you select a different network
i3status-rust = nicer and more powerful status line for i3bar
rofi = a launcher that looks nice and is more powerful
To this add all the other apps you already use.
The thing about this hodgepodge is that there really isn't much to be said about integration, as the different components don't require or benefit from it. The connections between them are so bare, direct, and obvious that there isn't much of a question of how to put it together. You change a line in your i3 config to start a different app at startup or trigger a different one with a keybinding.
If you had done so 5 years ago you could be using much the same configuration now and much the same configuration 5 years hence. I guarantee you that whether you use KDE or Gnome or Windows you are liable to spend a day here or there tweaking your environment even if Bob in accounting doesn't.
Most of the time when I notice poor font antialiasing it's due to the OS resolution or subpixel antialiasing not precisely matching the panel. An old OS will not have subpixel antialiasing, so if it looks correct you could try disabling subpixel/ClearType, or testing a bitmap font. If the old OS still looks wrong, your panel may be pretending to be 4k and actually stretching things a little. Also, any OS level scaling is likely to mess with things, so check that it's either 100% or 200%.
I think this is a rather relative statement. Modern UIs would have been perceived as toy-like and not fit for business back then. It all comes down to what we are used to. (That said, personally, I always favoured the System 7 UI over OS 8.)
I use Terminus 9pt whenever/wherever I can easily get it going for this reason. It's funny seeing all the replies from people who don't get it. Other fonts really do all look blurry next to bitmaps. I've switched terminal emulators at least once due to Pango dropping bitmap support. Still works in foot and alacritty. Probably others as well, but I also need Wayland support.
Ha! I remember feeling the same way when Apple started replacing the classic Mac system fonts (Chicago, etc.) with newer, anti-aliased alternatives ca. Mac OS 8. A couple dozen years later, I guess I'm used to it.
I don't get it. Windows 10 is all crisp 1-2 pixel lines and rectangles for me with sparse use of gradients, and the windows theming engine (used in XP, 7, etc) was all based on bitmaps.
> but in order to get a 'complete' desktop experience you kinda have to opt in to GNOME/KDE, which are trying to do the same things as Apple or Microsoft, aesthetically speaking.
This is so true, and that's why, deep inside, I mourn the end of this UI era.