There's no need to bicker about this. A 13 inch display at 4K has about 340 pixels per inch along the diagonal. At 1080p it would have about 170 pixels per inch.
Perfect (20/20) vision is usually understood to be the ability to discern details of about 1/60 of a degree, or 0.29 milliradians. By the arc length formula (s = rθ), the viewing distance at which that angle matches the pixel pitch of a 13 inch 1080p screen works out to about 20 inches.
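If you want to check the arithmetic, here's a quick back-of-envelope sketch in Python (assuming 16:9 panels and the 13 inch diagonal from above):

    import math

    def ppi(w_px, h_px, diag_in):
        # pixels along the diagonal divided by the diagonal length in inches
        return math.hypot(w_px, h_px) / diag_in

    def min_viewing_distance_in(ppi_val, acuity_rad=0.000291):
        # s = r * theta  =>  r = s / theta, with s = one pixel pitch
        # and theta = 1 arcmin (~0.29 mrad)
        return (1.0 / ppi_val) / acuity_rad

    print(ppi(3840, 2160, 13))                           # ~339 ppi (4K)
    print(ppi(1920, 1080, 13))                           # ~169 ppi (1080p)
    print(min_viewing_distance_in(ppi(1920, 1080, 13)))  # ~20.3 inches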
In other words, if you sit closer than 20 inches from your screen (perhaps not that unlikely for a laptop), you might be able to discern detail beyond the 1080p level. Even then it would only show up in small, high-contrast details, like text at a small font size rendered 1:1 (no DPI scaling).
So... it's a bit complicated, but I suspect 1080p is good enough for nearly everyone at 13 inches; move up to 15 inches and I could see many people preferring 1440p.
High DPI screens are able to render text without aliasing, and that makes a significant difference, especially with less-than-perfect eyesight.
What you're saying is true insofar as I had to squint at a 13" 1080p screen up close to actually see the pixels. But everything was slightly blurry, and switching to a "Retina" (1600p IIRC) screen fixed that.
What fixed it was the lack of subpixel antialiasing: you can make those screens look blurry as well by turning it back on. But a 1080p screen without it just looks like it has janky text rendering.
An actual 4K screen is unnecessary at 13", IMHO, but the extra pixels are harmless except for battery life. I do detect a real difference in text rendering between 170 ppi and 227 ppi, because it's the difference between leaning on subpixel antialiasing and turning it off while still getting good results.
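For anyone unfamiliar with what subpixel antialiasing actually does, here's a rough sketch of the idea in Python (simplified; real implementations like ClearType or FreeType's subpixel mode also filter across subpixels to tame the colour fringing):

    # Rasterize glyph coverage at 3x the horizontal pixel resolution, then drive
    # each pixel's R, G and B stripes from three adjacent coverage samples
    # (assumes an RGB stripe panel and black text on a white background).
    def subpixel_row(coverage_3x):
        """coverage_3x: ink coverage values in [0, 1], three samples per output pixel."""
        pixels = []
        for i in range(0, len(coverage_3x) - 2, 3):
            r, g, b = coverage_3x[i], coverage_3x[i + 1], coverage_3x[i + 2]
            # higher coverage -> darker subpixel
            pixels.append(tuple(round(255 * (1 - c)) for c in (r, g, b)))
        return pixels

    # A stem edge that falls between two pixels: greyscale AA can only blur it,
    # while subpixel AA positions the edge with ~1/3-pixel precision, at the cost
    # of coloured fringes -- which is exactly what stops being needed at high DPI.
    print(subpixel_row([0, 0, 0.5, 1, 1, 1, 0.5, 0, 0]))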
I never understood the "eye resolution = screen resolution, therefore good enough" argument.
First of all, the cone spacing in the fovea is around 31 arcsec, or about half the arcminute you assume.
IMO that is more relevant than the 20/20 vision number, because that number is not based on any intrinsic quantisation of the visual system; it is mostly limited by blur, which tends to be very much non-Gaussian and therefore not an ideal low-pass filter for most eyes.
Now consider the Nyquist-Shannon sampling theorem, which tells us that any signal we want to fully capture needs to be sampled at at least twice the highest frequency of interest.
So if we want to be able to fully represent any state of our visual system on a display, we need at least twice the resolution of the visual system (ignoring for a minute that this assumes an ideal low-pass filter, which, as stated above, the eye is not). That's already 4x your 1 arcmin resolution number.
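To put numbers on that argument (my arithmetic, using the 31 arcsec cone spacing and the 2x Nyquist factor from above, at the ~20 inch viewing distance mentioned upthread):

    import math

    ARCSEC = math.pi / (180 * 3600)   # radians per arcsecond
    distance_in = 20                  # assumed viewing distance

    def required_ppi(angular_res_arcsec, nyquist=1.0):
        # pixel pitch must match the angular resolution divided by the Nyquist factor
        theta = (angular_res_arcsec / nyquist) * ARCSEC
        return 1.0 / (distance_in * theta)

    print(required_ppi(60))              # ~172 ppi: the "can't see pixels" criterion
    print(required_ppi(31, nyquist=2))   # ~665 ppi: cone spacing plus Nyquist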
But all of that quickly becomes rather theoretical when you look at the jagged elephant in the room: aliasing and scaling!
A lot of what we look at is either rendered with pixel precision and very prone to aliasing at scales that are much, much larger than your pixel pitch (see this worst-case demonstration: https://www.testufo.com/aliasing-visibility), or it is image or video content displayed at a size that isn't an integer multiple or fraction of its native resolution. Scaling, just like aliasing, causes artifacts that go way beyond the scale of your pixel pitch, and one way to mitigate the issue is simply to have a very high target resolution to scale to.
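Here's a tiny illustration of the non-integer scaling point, using nearest-neighbour just to keep the effect obvious (real scalers filter instead, which trades the unevenness for blur):

    # Map a 1-px-on / 1-px-off vertical line pattern from a 1920-wide source
    # onto a 2560-wide panel (a 1.33x, non-integer scale) with nearest neighbour.
    src_w, dst_w = 1920, 2560
    src = [x % 2 for x in range(src_w)]                  # alternating dark/light columns
    dst = [src[x * src_w // dst_w] for x in range(dst_w)]

    # Measure how wide each "line" ends up on the panel. With an integer scale
    # every run would be the same width; here the widths come out uneven.
    runs, cur = [], 1
    for a, b in zip(dst, dst[1:]):
        if a == b:
            cur += 1
        else:
            runs.append(cur)
            cur = 1
    runs.append(cur)
    print(sorted(set(runs)))   # [1, 2]: some lines render twice as wide as others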
So yeah, I don't think "can't distinguish individual pixels" is a meaningful threshold, and even way beyond it there is still benefit to be had, even for those with less-than-perfect eyesight.
I use small text. Readability absolutely improves in the 8-10 px range.
Especially with Chinese/Japanese characters, which can become unreadable at smaller font sizes on 1080p.
I have two 14” Thinkpads I switch between which have almost identical setups, but one is 1080p and one is 1440p. I’m not buying another 1080p if I can avoid it. And yes, I have been tweaking fonts and X11 quite a bit already.
And people can’t really tell 30fps from 60fps from 120fps either, right?
Once you see and appreciate the difference, you can’t unsee it. Anything less is jarring.
Print media figured out three to four decades ago that print buyers prefer at least 144 lpi for black shapes, at least 300-600 dpi for grey images, and minimally 300 dpi but ideally 1200 dpi for colour photos people will consider high quality. (Note: for reader convenience I’m mixing lines per inch (lpi) with dots per inch (dpi) here; lines are per halftone screen, and there may be several screens per image, so dpi describes the source material you’re trying to reproduce with those screens.)
We’ve been settling for too little in computer screen quality for too long. If all we did was video, fine, but we spend most of our time simulating print.
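As a rough cross-check of those numbers (my arithmetic, using the common prepress rule of thumb that source images should be about 1.5-2x the halftone screen ruling):

    # Common prepress rule of thumb (an assumption on my part, not a figure from
    # the comment above): source resolution ~= 1.5-2x the screen ruling in lpi.
    for lpi in (144, 150, 175):
        print(f"{lpi} lpi screen -> roughly {int(1.5 * lpi)}-{2 * lpi} ppi source image")

which lands in the same ballpark as the 300-600 dpi figures above, with some headroom.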
Can't find any results for 13" displays, but there have already been several blind tests carried out for TVs showing that people can tell the difference.
I went from a 27" 4k display to a 27" 2k display on my desk and I notice the difference every day. I only switched for higher refresh rate.
The biggest thing keeping me on my MacBook is that it's hard to find a 13 inch, high-DPI laptop with a keyboard as close as possible to the Mac's. The Razer 13 inch was sooo close, but they made the tilde and backslash half-size keys for some reason....
I'm pretty sure most people see a huge difference between an old-style MacBook Air screen and a MacBook Pro screen. The Air is a 1080p 13" display. The Pro isn't even 4K, but the difference is extremely noticeable, especially if you are looking at a document or code (even from 24-36 inches away). Seeing the difference between Apple's Retina display and 4K? Well, that is difficult.
Would love to take the test. I'd bet all my life savings that I could tell the difference within the first second. The difference between those resolutions is HUGE, and I see it immediately.
It is
1080p to 4k is such a large difference