I see lots of progress and resources put towards monitor refresh rates, but very little progress on making 5K displays cheaper. Any reason why? Is this due to gaming?
Mostly because 4K is the standard for video content and game consoles, and on desktops it's fine for Windows users, who make up the majority of the market. The only people really clamoring for >4K monitors are Mac users, due to Apple's decision to standardize on 220 DPI in macOS. That is a potential market, but it's a price-insensitive one, so the current gouging for monitors in that category probably isn't going to stop any time soon.
I don't think so. I'd love to have an 8K or higher panel at 27-32 inches. The reason is that I can easily see pixels from a metre away, and it's annoying. I'd love to have more detail, etc., for instance being able to display a nice texture underneath my text editor so it's more pleasing and easier to read.
The cables and ports for 5K at 120 Hz barely exist at this point (it needs DisplayPort 2.1 UHBR13.5 or better, which at this point only exists on AMD's 7000-series GPUs). Dual-cable solutions always suck, so everyone tends to avoid them.
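Rough back-of-the-envelope on why (my own numbers, ignoring blanking intervals and DSC, so treat it as a sketch rather than a spec calculation):

    # Raw pixel data rate for 5K @ 120 Hz vs. effective DisplayPort throughput.
    # All figures approximate; real links also carry blanking and can use DSC.

    def pixel_rate_gbps(width, height, hz, bits_per_pixel):
        """Raw active-pixel data rate in Gbit/s."""
        return width * height * hz * bits_per_pixel / 1e9

    # Effective throughput after line encoding:
    #   HBR3 (DP 1.4):     4 lanes x 8.1 Gbit/s,  8b/10b    -> ~25.9 Gbit/s
    #   UHBR13.5 (DP 2.1): 4 lanes x 13.5 Gbit/s, 128b/132b -> ~52.4 Gbit/s
    hbr3 = 4 * 8.1 * (8 / 10)
    uhbr13_5 = 4 * 13.5 * (128 / 132)

    need_8bpc = pixel_rate_gbps(5120, 2880, 120, 24)   # ~42.5 Gbit/s
    need_10bpc = pixel_rate_gbps(5120, 2880, 120, 30)  # ~53.1 Gbit/s

    print(f"5K120 @ 8 bpc : {need_8bpc:.1f} Gbit/s")
    print(f"5K120 @ 10 bpc: {need_10bpc:.1f} Gbit/s")
    print(f"HBR3 ~{hbr3:.1f} Gbit/s, UHBR13.5 ~{uhbr13_5:.1f} Gbit/s")

Even uncompressed 8-bit 5K120 blows well past HBR3, which is why it takes DP 2.1-class links (or dual cables, or DSC) to drive these panels.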
32" is large enough that you can get away with 100% scaling and have much more actual workspace instead of increased clarity. It's not really until you get to the 15" or 17" laptops 4k starts being about high DPI and in that space the MacBook Pro screens actually overshoot the target a bit.
Because most people pick monitors by size, not resolution and PPI.
4K is also popular because it fits within the capabilities of the common HDMI and DisplayPort ports on people's computers and cables.
If you try to sell a monitor that doesn’t work well out of the box with the average laptop, it’s going to have an extremely high return rate. That’s why display resolution will always lag behind the common capabilities of your average HDMI port on a cheap laptop.
I had one, Dell's UP2414Q. It was a piece of shit, mostly due to requiring multi-stream transport to run in 60 Hz mode. So you'd get a GPU driver or OS update and the screen would stop working. Sometimes the panel would split in half, and one half would black out or shift its content sideways, so there was a seam in the middle and a piece of the edge wrapped around to the center. Would not recommend.
The ones afterward that got rid of the MST requirement might've been better. Pixel density was great.
The gaming market is huge, and gamers are typically willing to spend top money on hardware (thanks to Nvidia raising the bar every time). That's why the market is flooded with expensive 1440p models featuring ridiculous refresh rates and RGB lighting.
This is the same reason why you can get a 65" 8K OLED TV from Samsung for "just" €2500, but you still pay €4000 for a Dell 32" 8k monitor. The market for 8k TVs is just so much bigger than 8k computer monitors.
> This is the same reason why you can get a 65" 8K OLED TV from Samsung for "just" €2500, but you still pay €4000 for a Dell 32" 8k monitor. The market for 8k TVs is just so much bigger than 8k computer monitors.
I was with you until this point. I'm fairly certain this is because shrinking pixels to cram 8K into 32" is much harder than doing it at 65"; that's why the TV is cheaper. Many of the super-high-resolution TVs that recently debuted at CES (or whatever has replaced CES) start at larger sizes, and it's an accomplishment when they start to come /down/ in size.
> This is the same reason why you can get a 65" 8K OLED TV from Samsung for "just" €2500, but you still pay €4000 for a Dell 32" 8k monitor. The market for 8k TVs is just so much bigger than 8k computer monitors.
Speaking of which... the market has this really obnoxious hole in it.
There are 32" 8k monitors and 65" 8k TVs (I think there are a few older 55" models), but there's nothing really in between.
I really hope someone either makes smaller 8k TVs or larger 8k monitors, because the situation is a little absurd. From what I can tell, 8k monitors are one of the few areas where people could be expected to notice the difference in resolution from 4k at normal viewing distances.
I was going off of this chart[1]. It's true that some people sit relatively close to their TV, but in most of the living rooms that I've been in, the TV has been more than 6 feet away.
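For what it's worth, you can get similar numbers from the usual one-arcminute-per-pixel rule of thumb for 20/20 vision (my assumption, not necessarily the chart's exact model):

    import math

    ARCMIN_RAD = math.radians(1 / 60)  # ~20/20 acuity: about one arcminute per pixel

    def benefit_distance_ft(diagonal_in, base_h_px=3840, aspect=16 / 9):
        """Farthest viewing distance (feet) at which a 4K pixel still subtends
        one arcminute; closer than this, >4K detail can actually be seen."""
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)
        pixel_pitch_in = width_in / base_h_px
        return pixel_pitch_in / ARCMIN_RAD / 12

    print(f'65" TV:      ~{benefit_distance_ft(65):.1f} ft')  # ~4.2 ft
    print(f'32" monitor: ~{benefit_distance_ft(32):.1f} ft')  # ~2.1 ft

By that estimate, 8K on a 65" TV only pays off if you sit within roughly 4 ft, while ~2 ft is a perfectly normal distance to sit from a 32" monitor.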
This is purely subjective. I could definitely tell the difference from more than 6 ft. The picture just looks more lifelike.
There are people for whom it makes no difference, or who don't pay attention to details. I have a friend who couldn't tell 720p from 4K unless you told them what to look for, and even then they wouldn't understand what the fuss is about.
I've also heard some handwavey explanation about how certain parts of the panel manufacturing process are essentially separated by DPI: you can reuse certain aspects of a ≈100 DPI manufacturing line across varying refresh rates (and physical sizes).
Those lines can then probably serve a large part of the gaming/normie market, while the ≈220 DPI panels (e.g. 5K @ 27") have a very small market.
Yes. The two mass markets for good screens are business and gaming. Business pushes for color accuracy and good stands at reasonable prices, gaming for high refresh rates.
Of course gamers also like 4K, but given a limited budget for a monitor and GPU, a high refresh rate has the better payoff.
This site is an absolute gem besides the news articles!
Grey-to-grey response times of LCDs offset basically any benefit received from higher refresh rates though, as they mention. OLED definitely seems like the future here.
> This site is an absolute gem besides the news articles!
They’ve done some great things around information sharing over the years.
Interestingly, this site is also where the gaming equivalent of audiophile quackery congregates. Part of their forum (including an admin, IIRC) is convinced that EMI causes stuttering in games. They do things like upgrade the electrical outlets, cables, and other things around their computer and convince themselves that it's improving their gameplay due to EMI/EMF.
> Unlike mostly human-invisible refresh rate incrementalism (e.g. 240 vs 360), made worse by slow LCD pixel response, geometrics on near-0ms-GtG displays are where mainstream human-visible benefits are. 2x-4x increases in Hz and framerate, such as 120 vs 480, are more human visible.
It's only about 4 times the refresh rate of the 240 Hz gaming monitors sold by basically every brand.
You are getting into diminishing returns. 30->60 and 60->120 are big steps up, both in terms of smoother motion and in enabling better human performance. For example, various experiments seem to show that tracking enemies in first-person shooters is easier at 120 Hz. 240 Hz is already in the realm where it's better, but not much better. 1000 Hz is probably a pretty marginal improvement. But people pay big bucks for marginal improvements all the time; the audio space does much crazier things for much less tangible improvements.
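One way to see the diminishing returns is the absolute frame-time saving each doubling buys:

    # Frame interval halves with each doubling, so the absolute gain shrinks too.
    rates = [30, 60, 120, 240, 480, 1000]
    for r0, r1 in zip(rates, rates[1:]):
        t0, t1 = 1000 / r0, 1000 / r1
        print(f"{r0:4d} -> {r1:4d} Hz: {t0:5.2f} ms -> {t1:5.2f} ms "
              f"(saves {t0 - t1:5.2f} ms per frame)")
    # 30->60 saves ~16.7 ms per frame, 60->120 ~8.3 ms, 480->1000 only ~1.1 ms.

Each step halves the frame time, but the milliseconds you actually gain keep shrinking.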
As someone with a 240Hz, 1920x1080 IPS monitor from Dell, I definitely agree 120Hz -> 240Hz is nowhere as big an improvement as 60Hz -> 120Hz is.
That being said, I would still buy a 240 Hz instead of a 120 Hz one if I had the choice and the price difference wasn't that much. There is still a perceivable improvement as far as I'm concerned.
On a tangential note, I frankly think 4K (let alone 8K) is a waste of money if you're still rocking 60 Hz. The smoothness of 120 Hz or above makes a far bigger impact than the higher DPI from rendering at 4K. Everything from reading text to watching videos to playing games benefits from higher refresh rates, and to a greater degree than from higher resolutions.
Reading text definitely benefits from the resolution more than refresh rate. Once you've used HiDPI displays, it's hard to go back to noticeably pixelated fonts.
I'd rather have 1080p at 120 Hz than 4K at 60 Hz. The most legible font is the one you read the most often (people had no trouble reading cursive or blackletter scripts when they were standard), and pixelation in no way compromises legibility. Blur is a problem, but that's easily avoided by disabling antialiasing and enabling full hinting.
Video needs to catch up to higher refresh rates, but there are also those of us who regularly watch videos at 2x or higher speeds, so there's that edge case where a 60 fps video can benefit from a 120 Hz monitor.
Video is typically way less than 120 Hz, for sure. There are ways to get your PC video player to increase/motion-interpolate your video (similar to how some TVs do it, where everything somehow ends up looking like an old soap opera). I've found it to work via PotPlayer with "avisynth processing" enabled, plus various tweaks you have to look up (there's a Reddit thread somewhere) to make it work; presumably it would crank up to 1000 Hz, but I never tried.
And even if we assume that there is no further improvement in perceived smoothness or your ability to predict motion, a monitor that updates every millisecond instead of every four milliseconds will show you any new information earlier, like, say, somebody appearing around a corner. It's basically a 1-2% boost to your reaction time just because you get to react earlier. At a competitive level that's nothing to sneeze at.
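Rough sanity check on that figure, assuming a ~200 ms human reaction time (my number, just for scale):

    # How much earlier new information can appear on a 1000 Hz panel vs 240 Hz.
    REACTION_MS = 200.0            # assumed typical human reaction time
    slow_hz, fast_hz = 240, 1000

    def frame_ms(hz):
        return 1000.0 / hz

    avg_saving = (frame_ms(slow_hz) - frame_ms(fast_hz)) / 2   # on average, half a frame
    worst_saving = frame_ms(slow_hz) - frame_ms(fast_hz)       # worst case, a full frame

    print(f"average: {avg_saving:.2f} ms ({avg_saving / REACTION_MS:.1%})")
    print(f"worst:   {worst_saving:.2f} ms ({worst_saving / REACTION_MS:.1%})")
    # -> roughly 0.8% on average, ~1.6% in the worst case

So the 1-2% figure is about right for the worst case, less on average.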
I'm interested: how is reading (static?) text better with a higher refresh rate? Or do you mean that scrolling the text while reading it is smoother and more appealing?
Not the GP, but yes, scrolling is smoother and more appealing. That can make the difference between choosing to scroll by wheel or by page down key.
In principle I would prefer the latter, but in practice web pages with headers that aren't accounted for while scrolling make the key quite inconvenient.
Mostly it’s a matter of fatigue, however: low frequency monitors make my eyes get tired faster.
The ability to use touch and drag gestures without any visible delay [1]. This dramatically improves usability as a drawing and painting tool for artists.
Great video! This is more a problem of the various other hardware interfaces though, considering even a 120 Hz panel has a maximum of about 8 ms of latency between its refreshes.
As a person who does some digital art from time to time though, I'm depressed we still don't have that tech 12 years later...
Well, even with an ~8 ms refresh interval, you still have the inherent LCD grey-to-grey latency issue.
Latency is a pipeline issue though. Every stage can add a bit of latency, and the total is the sum of all the stages. So with a 120 Hz panel you really have up to ~8 ms of latency from the refresh interval alone, with more added by software and other hardware in the pipeline.
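To make that concrete, here's a toy latency budget; every stage number below is an illustrative guess, not a measurement:

    # Toy end-to-end latency budget for a 120 Hz setup (all figures assumed).
    stages_ms = {
        "USB input polling (1000 Hz, avg)":   0.5,
        "game/application update":            4.0,
        "GPU render + present queue":         5.0,
        "wait for next 120 Hz refresh (avg)": 1000 / 120 / 2,  # ~4.2 ms
        "panel scanout + pixel response":     6.0,
    }

    for name, ms in stages_ms.items():
        print(f"{name:36s} {ms:5.1f} ms")
    print(f"{'total':36s} {sum(stages_ms.values()):5.1f} ms")

The panel's refresh interval is just one line item; the rest of the pipeline can easily dwarf it.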
This is what I wonder every time I upgrade my TV (to a larger model, better HDR, etc.). I almost exclusively use the TV to watch movies, and movies are shot at 24, 29.97, or 48 fps. So, this being my use case, all I need is for the player chain and the TV to be able to switch the refresh rate to the actual frame rate of the movie.
For my use case, is there anything I'm missing here? Or have I been wrong all these years to brush off the TV sales guys at the store trying to push the latest 240 Hz fads on me?
Okay, you. It's common for people to call the big LED/OLED panel you put in your living room a TV. And as far as I know there's no difference between them tech-wise. My TV is a 65-inch 4K 120 Hz OLED panel. Everyone calls it a TV. They are also still sold as "TVs" online. Smart TV, etc.
Side note: the word "monitor" is also a pretty clear sign of age. If I hear somebody call my screen a monitor, it's like when my grandma calls the toilet the commode.
There’s a pretty big difference between them technically, mostly the latency of the display chain. TVs can have signal input to picture on screen latencies of dozens of milliseconds, whereas a monitor would be considered defective if it was more than one frame.
This only matters for interactive content, but it’s where a lot of the price difference comes from.
Most TVs have a "game mode" that reduces latency to a single frame time, give or take a few milliseconds[1]. Not as good as the best monitors, but close, and certainly not dozens of milliseconds.
Only recently have low-latency screens become the default expectation for flat-panel computer displays. (I know CRTs were really fast.) For most of my life, most flat-panel computer displays were high latency. My gamer buds and I always had to check that we were buying a screen with sub-whatever-ms latency. Nowadays, though, they are pretty much all fast like that.
This applies to TVs too. Most of them have a low-latency mode. Mine has it; I use it all the time.
Anyways, the point is that the uneducated consumer can't tell the difference between them, and I, the well-educated consumer, also can't tell the difference, and they come with all the same features, so people just call the big ones TVs and the small ones screens. (Except for the smart TV stuff.)
Obviously, if you exclusively watch 24/29.97/48 fps movies, a display that updates faster than that has no benefit to you. I suppose it does get the next frame faster.
But have you watched 60/120/240 fps content? How often do you update your TV? If you're going to keep it for more than a few years, definitely get 240 Hz.
I was similarly dismissive of high-frame-rate displays, until I got a new monitor (4k, 120fps, replacing an ancient 1080p display with screwed up vsync settings that capped it at 30 fps because I didn't know any better) and watched some content I'd shot on my own Insta360. Oh my god was it incredible. Buttery smooth and sharp and fast, like real life instead of a movie. I booted up Rocket League just to see the difference. It's like walking around while looking through a smartphone camera preview versus looking at the world with your own eyes. 30 fps is now hard to watch and totally unplayable for me. (Yes, unplayable, not unwatchable; if you ever hook up a PC gaming system to the TV there's zero dispute in my mind that 240 Hz is worth it, even if it's less important for movies).
I understand that 24/29.97/48 fps have incredible inertia because of limitations in distribution: if it can't be played on Blu-ray at high frame rates, and can't be played at the box office at high frame rates, there's a chicken-and-egg problem where studios don't produce HFR content, so distributors don't support HFR content.
But I cannot imagine that this will hold off high-frame-rate content forever. More and more distribution goes over vertically integrated streaming providers. Because consecutive frames are similar, the bitrate of compressed 120 fps content isn't 2x that of 60 fps, which isn't 2x that of 30 fps (it's more like 1.5x, with diminishing returns), so it won't cost 2x as much to distribute high-frame-rate video. It takes more compute from the encoders, but the AI boom is fueling GPU improvement at incredible rates.
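Taking that ~1.5x-per-doubling figure at face value (it's a ballpark, and the base bitrate below is my own assumption):

    # If each doubling of frame rate costs ~1.5x bitrate, the compounding is mild.
    FACTOR = 1.5
    base_fps, base_mbps = 30, 15.0   # assume ~15 Mbit/s for a 4K30 stream

    fps, mbps = base_fps, base_mbps
    while fps <= 240:
        print(f"{fps:4d} fps: ~{mbps:5.1f} Mbit/s ({mbps / base_mbps:.2f}x)")
        fps *= 2
        mbps *= FACTOR

Under that assumption, 120 fps costs about 2.25x the 30 fps bitrate rather than 4x, and 240 fps about 3.4x rather than 8x.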
I think that the first sports league (XFL? UFC?) to offer 120 fps or 240 fps content will put a crack in the dam that will quickly erode. Some AAA blockbuster action movie (by Ang Lee?) will premiere exclusively on some streaming service, with special treatment to be distributed at 120 or 240 fps. A few top-tier cinemas will update some of their projectors to run at high frame rates. Eventually, mainstream sports, mainstream movies, and optical media standards will be at high frame rates.
Whether that's before or after your next TV upgrade or not, I don't know.
Modern TVs (or at least those I've been interested in) have upscaling functions for both resolution and motion, so they can take your 24 fps movie up to some more manageable frame rate.
I personally cannot watch anything lower than 60 fps (and even 60 fps is not great, but passable). Anything lower looks to me like a really fast slide show, which tires my eyes quickly.
Also, you can watch clips recorded at higher fps natively, like nature videos on YT, etc. (I think these still max out at 60 fps, but with motion upscaling it's quite decent).
Combined with a good sound system, these can be quite relaxing and if you have an OLED screen it can even be mesmerising to watch.
The simple fact that it is announced by a company selling refresh-rate testing hardware shows that the improvements in terms of ergonomics/real-world usage have plateaued past a certain refresh rate. The only remaining practical use case for further increases is a metaphorical penis-size contest.
The site has been around since anything over 60 FPS was blazing and has been saying 1000 Hz is the ideal target for nearly as long, some articles on why are linked throughout.
There are various visual effects that can be done with high frame rates. For example, mount the screen on the wheel of a moving car and show an image spinning in the opposite direction at the same speed, and it appears stationary.
But if you have a low refresh rate, it just appears blurred.
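To put numbers on it (tire size and speed are my own illustrative assumptions):

    import math

    # A ~0.65 m tire at 60 km/h: how far does the wheel turn between refreshes?
    TIRE_DIAMETER_M = 0.65
    SPEED_MPS = 60 / 3.6

    deg_per_s = SPEED_MPS / (math.pi * TIRE_DIAMETER_M) * 360   # ~2940 deg/s

    for hz in (60, 120, 240, 1000):
        print(f"{hz:4d} Hz: ~{deg_per_s / hz:5.1f} degrees of rotation per frame")
    # ~49 degrees per frame at 60 Hz vs ~3 degrees at 1000 Hz

Only at very high refresh rates does the counter-rotating image update often enough to stay visually locked to the wheel.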
Higher refresh rates in general produce smoother scrolling, lower latency screen updates, more fluid motion, and a generally more immersive computing experience.
Going from 60Hz to 120Hz is a noticeable upgrade for most people, though others don’t notice much.
Higher rates offer further improvements in smoothness and latency, but they're really marginal. At 120 Hz you're getting a frame every ~8 ms (if the GPU can provide it), while 240 Hz drops that to ~4 ms and 1000 Hz to 1 ms.
Competitive FPS gamers will already push hardware and settings to 200+ fps for optimal smoothness. Add AI frame generation and you could get even higher rates, though frame generation introduces its own lag by necessity of having to interpolate between two frames.
This would be fantastic for people like me who are extremely sensitive to flicker. LED lights and alarm clocks drive me insane at lower refresh rates because they leave dotted trails at night.
I would kill for a computer that didn't show mouse trails on bright screens. If I move my MBP's cursor around on this HN page, I can count approximately how many frames of cursor I can see.
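That "countable cursors" effect falls straight out of the arithmetic; assuming a quick ~3000 px/s flick (my number, for illustration):

    # Gap between successive cursor images during a fast mouse flick.
    FLICK_SPEED_PX_PER_S = 3000   # assumed speed of a fast swipe

    for hz in (60, 120, 240, 1000):
        print(f"{hz:4d} Hz: cursor images ~{FLICK_SPEED_PX_PER_S / hz:5.1f} px apart")
    # ~50 px gaps at 60 Hz are easy to count; at 1000 Hz they approach a solid streak.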
If I had any legislative power, I'd outlaw LED brake lights or headlights with refresh rates lower than 1kHz as well. :)
My car actually has rapidly blinking brake lights as a feature :) This happens only under hard braking, but it supposedly grabs attention more quickly.
I personally have some dimmable AC LEDs in my home that flicker at 50 Hz (EU power); that seems to bother only me.
This is cool and all, but it's been 10 years since 4K monitors arrived. Where are the (affordable) DUHD ultrawides, and why is there not a single 42-inch 6K or 8K panel in the entire world?
There's no mainstream use case for 4K content. Show watchers don't care enough about the quality improvement to pay their streaming provider $5 more for the 4K stream.
Even the gaming enthusiast market has rejected 4k, opting to spend their precious gpu cycles on refresh rate improvements instead.
Panel marketers are going to figure out what number consumers comprehend and make it bigger regardless. Hence their focus on 4k, 8k, and now these silly refresh rates.
But their big problem is that consumer behavior is currently bottlenecked by streaming bandwidth and GPU cycles. Two things panel manufacturers have no control over.
Buy a nice 1440p @ 120 Hz panel and ride out the next decade in comfort.
Thanks for those data points. I checked out of the screen race over a decade ago (around the time 3D glasses were a thing) and haven't really kept up.
The displays look incredible, but I was always wondering if that was just me or if other people really bothered to keep up with the latest tech to have a slightly better viewing experience.
But let me guess: when you open the source menu on the monitor and select a source, you still have to wait 5-10 seconds before it shows anything :/