
As for fuzziness and anti-aliasing issues, a HiDPI display can sometimes make them worse.

All of these issues really boil down to the render resolution not integer-scaling to the display resolution.

If the display resolution is an integer multiple of the render resolution (or vice versa), you will generally get beautiful, crisp rendering - this is exactly the approach Apple adopted, and it is why they had to use some slightly less common resolutions like 5K on some devices: 2560x1440 ("QHD") scales to a 5K panel (5120x2880) by an integer factor of exactly 2, etc.

The problem, though, is that outside of Apple devices almost no HiDPI display will neatly integer-scale. The vast majority of HiDPI monitors on sale today are 4K, and only 1080p really has a usable integer scaling factor there. 1080p of "usable" screen real estate on a 4K monitor usually makes all UI elements too big, though... so you are forced into non-integer scaling and images that will not cleanly map to the monitor's pixel grid, which is where the fuzz and anti-aliasing problems start... 1440p of "usable" space does not cleanly map to a 4K monitor, but many people run them this way.
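To make the arithmetic concrete, here is a minimal Python sketch (my own illustration, not part of the comment) that checks which render resolutions map onto a panel by an integer factor:

    # Check whether a render resolution maps onto a physical panel
    # by an integer scale factor (crisp) or a fractional one (fuzzy).
    PANELS = {
        "4K (3840x2160)": (3840, 2160),
        "5K (5120x2880)": (5120, 2880),
    }
    RENDER = {
        "1080p (1920x1080)": (1920, 1080),
        "1440p (2560x1440)": (2560, 1440),
    }

    for pname, (pw, ph) in PANELS.items():
        for rname, (rw, rh) in RENDER.items():
            sx, sy = pw / rw, ph / rh
            crisp = sx == sy and sx.is_integer()
            label = "crisp integer scale" if crisp else "fractional (fuzzy)"
            print(f"{rname} rendered on {pname}: {sx:g}x -> {label}")

Running it shows 1080p on 4K and 1440p on 5K landing on a clean 2x, while 1440p on 4K comes out at 1.5x - exactly the fractional case described above.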

5K is frankly a brilliant resolution for high-quality ~200 PPI HiDPI rendering on 27 inch displays, with 2x integer scaling for macOS and Windows especially; it's tragic the resolution hasn't become more mainstream.
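For reference, the pixel densities work out roughly like this (a back-of-the-envelope calculation of my own, not from the comment):

    import math

    # Diagonal pixel density (PPI) for a given resolution and panel size.
    def ppi(width_px, height_px, diagonal_inches):
        return math.hypot(width_px, height_px) / diagonal_inches

    print(f'5K  27": {ppi(5120, 2880, 27):.0f} PPI')  # ~218 PPI
    print(f'4K  27": {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
    print(f'QHD 27": {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI

So a 5K 27" panel sits right around the ~200 PPI class mentioned above, and is almost exactly double the density of a QHD 27" panel.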



Yes, it’s a shame that there are only two 5K 27” displays to choose from, and both are expensive. The industry has dropped the ball on this for years, all those 32” 4K screens may be cheap but they’re no use to me…


On Linux

Step one: plug in a 4K 27" monitor.

Step two: set the scaling factor for GTK/Java apps to 2 (a sketch of one way to do this follows these steps). KDE/Qt apps can figure out the correct DPI without hassle.

Step three: set fonts smaller or larger if desired.
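As a rough illustration of step two (my own sketch; the exact environment variables and their effect vary with toolkit version and display server), GTK reads GDK_SCALE and Qt reads QT_SCALE_FACTOR, so a tiny hypothetical launcher could look like this:

    import os
    import subprocess
    import sys

    # Hypothetical launcher (scale2x.py): run a program with a 2x UI scale hint.
    # GDK_SCALE is read by GTK; QT_SCALE_FACTOR by Qt. Whether these take
    # effect depends on the toolkit version and display server in use.
    env = dict(os.environ, GDK_SCALE="2", QT_SCALE_FACTOR="2")
    subprocess.run(sys.argv[1:], env=env)

Invoked as, e.g., python scale2x.py some-app; in practice most desktop environments expose the same setting through their display-settings dialog, which is the usual route.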

On Windows: plug in a 4K 27" monitor. Text doesn't look, in my opinion, quite as nice as on Linux, but it isn't fuzzy, small, or giant. Possibly font rendering could look nicer if I bothered to tweak it, but since the machine basically boots straight into Steam I see little reason to bother.

I keep hearing this argument that 4K somehow doesn't work or looks shitty and that only 5K Mac displays provide an acceptable high-DPI experience, and I feel like I'm getting transmissions from an alternate universe, one where nobody had to scale UIs across screens that varied in DPI by a factor of 3 for almost 20 years, long before 4K screens.


It's a shame Apple turned that into such a problem for their users, when Linux and Windows didn't. I guess they gambled wrong on how the market for monitors would evolve.


This is an interesting analytical failure. To explicate: you go through your day wearing only boots. You see people wearing sneakers, so one day you buy a pair in the wrong size, put them on with the tag still inside, and walk around like that for a day. You think to yourself: how do all these people do this every day, did nobody ever show them a good pair of boots? Sir Boots4Life's analysis is faulty: sneaker wearers aren't all walking around uncomfortable, because they don't buy the wrong size, nor do they wedge a small object between footwear and foot for extra penance. Likewise, everyone else isn't walking around with a fuzzy screen.

There are, as it turns out, other ways to scale a UI than integer scaling factors. Even if you don't use SVG, you can ship images at different sizes and scale fonts by smaller increments still.

You describe there being "1080p of usable real estate", which is a failure to use a meaningful unit of measure. Your 5K display, future 8K displays, etc. simply use more pixels to draw a 3cm x 1cm button. They aren't wasting an increasing number of pixels; they are drawing the element with increasing fidelity. If we measured screen real estate in pixels, we would conclude that a 5" screen and a 32" screen both driven at 1080p have equal screen real estate, which is clearly incorrect. Screen real estate is physical area, rendered at some fidelity; it isn't measured in pixels or DPI.
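To put numbers on that (my own back-of-the-envelope sketch, not part of the comment), the same 3cm x 1cm button simply gets more pixels as density goes up:

    # Pixels spanned by a 3cm x 1cm button at various pixel densities:
    # the same physical real estate, drawn with increasing fidelity.
    CM_PER_INCH = 2.54

    def button_pixels(width_cm, height_cm, ppi):
        to_px = lambda cm: round(cm / CM_PER_INCH * ppi)
        return to_px(width_cm), to_px(height_cm)

    for name, ppi in [('QHD 27" (~109 PPI)', 109),
                      ('4K 27"  (~163 PPI)', 163),
                      ('5K 27"  (~218 PPI)', 218)]:
        w, h = button_pixels(3, 1, ppi)
        print(f"{name}: {w} x {h} px for the same 3cm x 1cm button")

The button occupies the same physical area on every panel; only the number of pixels spent drawing it changes.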

Increasing fidelity might bear on the smallest element one can usefully use, but it's not going to bear as much on what size element people desire to use, which has much more to do with how far the screen is from the user's face.

You are saying that 4K monitors force a choice between giant elements that somehow waste all the pixels and exceptionally tiny ones. This is silly. First off, 1080p displays ranging from 12" to 32" already had to adjust element sizes to be usable before 4K became a thing, and there were already more knobs than the scaling factor to achieve that end. 4K changed the equation that has existed for 20 years by an exact factor of 2; 8K changes it by an exact factor of 4.

TLDR: Set an integer scaling factor, then tweak your fonts a little bigger or smaller until it looks nice to your taste. Nobody on earth expected a 24" 4K monitor to display twice as much content as a 1080p 24" monitor; people want things to be the same size on screen, only prettier.


Actually, I want stuff smaller on 1080p displays so I can display more of it, but the low pixel density sets a "seemingly unreasonably high" lower bound on how small things can be while remaining usable and legible. (-:



