
This is what I wonder every time I upgrade my TV (to a larger model, better HDR, etc.). I almost exclusively use the TV to watch movies — movies shot at 24, 29.97, or 48 fps. So for this use case, all I need is for the player chain and the TV to be able to switch the refresh rate to match the movie's actual frame rate.

For my use case, is there anything I am missing here? Or have I been wrong all these years to brush off the TV sales guys at the store pushing 240Hz and the other latest fads?
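To make the refresh-rate-matching point concrete, here's a rough sketch (Python, purely illustrative — the helper is mine, not from any library) of how many panel refreshes each movie frame gets shown for. Uneven repeat counts are the 3:2 pulldown judder that rate matching avoids:

    # How long each movie frame stays on screen when a fixed-refresh
    # panel shows fixed-fps content. Uneven repeats (2,3,2,3,...) show
    # up as judder; even repeats look smooth.
    def repeat_pattern(movie_fps, panel_hz, frames=8):
        pattern, shown = [], 0
        for i in range(1, frames + 1):
            total = int(i * panel_hz / movie_fps)  # refreshes elapsed by end of frame i
            pattern.append(total - shown)
            shown = total
        return pattern

    print(repeat_pattern(24, 60))   # [2, 3, 2, 3, ...] -> uneven, judders
    print(repeat_pattern(24, 120))  # [5, 5, 5, 5, ...] -> even, smooth
    print(repeat_pattern(24, 24))   # [1, 1, 1, 1, ...] -> native rate

So a 120 Hz panel, or one that can drop to 24 Hz, handles film-rate content cleanly; a fixed 60 Hz panel can't.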




If I recall correctly what a TV was, it’s not clear to me why you’d ever want to have one. This article is about monitors.


Okay. It's common for people to call the big LED/OLED monitor you put in your living room a TV. And as far as I know there's no difference between them tech-wise. My TV is a 65-inch 4K 120Hz OLED panel. Everyone calls it a TV. They are also still sold as "TVs" online. Smart TV, etc.

Sidenote: the word "monitor" is also a pretty clear sign of age. If I hear somebody call my screen a monitor, it's like when my grandma calls the toilet the commode.


There’s a pretty big difference between them technically, mostly the latency of the display chain. TVs can have signal-input-to-picture-on-screen latencies of dozens of milliseconds, whereas a monitor would be considered defective if it were more than one frame.

This only matters for interactive content, but it’s where a lot of the price difference comes from.


Most TVs have a "game mode" that reduces latency to a single frame time, give or take a few milliseconds[1]. Not as good as the best monitors, but close, and certainly not dozens of milliseconds.

[1] https://www.rtings.com/tv/tests/inputs/input-lag
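For a sense of scale, "a single frame time" depends on the refresh rate; a quick sketch (Python, illustrative only):

    # Frame time = 1000 ms / refresh rate; roughly the floor a display
    # can hit if it only buffers one frame before showing it.
    for hz in (24, 30, 60, 120, 240):
        print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
    # 60 Hz is ~16.7 ms, 120 Hz is ~8.3 ms, 240 Hz is ~4.2 ms.

So "one frame, give or take" on a 60 Hz or 120 Hz TV in game mode lands in the 8-20 ms range, which is why it is hard to distinguish from a decent monitor.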


Only recently have low-latency screens been the default expectation for flat-panel computer displays. (I know CRTs were really fast.) For most of my life, most flat-panel computer displays were high latency. My gamer buds and I always had to check that we were buying a screen with sub-whatever-ms latency. Nowadays, though, they are pretty much all fast like that.

This applies to TVs too. Most of them have a low-latency mode. Mine has it. I use it all the time.

Anyway, the point is that the uneducated consumer can't tell the difference between them, and I, the well-educated consumer, also can't tell the difference. They come with all the same features, so people just call the big ones TVs and the small ones screens (except for the smart TV stuff).


Obviously, if you exclusively watch 24/29.97/48 fps movies, a display that updates faster than that has no benefit to you. I suppose it does get each new frame onto the screen a bit sooner.

But have you watched 60/120/240 fps content? How often do you upgrade your TV? If you're going to keep it for more than a few years, definitely get 240 Hz.

I was similarly dismissive of high-frame-rate displays, until I got a new monitor (4K, 120 fps, replacing an ancient 1080p display with screwed-up vsync settings that capped it at 30 fps because I didn't know any better) and watched some content I'd shot on my own Insta360. Oh my god was it incredible. Buttery smooth and sharp and fast, like real life instead of a movie. I booted up Rocket League just to see the difference. It's like walking around while looking through a smartphone camera preview versus looking at the world with your own eyes. 30 fps is now hard to watch and totally unplayable for me. (Yes, unplayable, not just unwatchable; if you ever hook up a PC gaming system to the TV, there's zero dispute in my mind that 240 Hz is worth it, even if it's less important for movies.)

I understand that 24/29.97/48 fps have incredible inertia because of limitations in distribution - if it can't be played on Blu-ray at high frame rates, and can't be played at the box office at high frame rates, there's a chicken-and-egg problem where studios don't produce HFR content so distributors don't support HFR content.

But I cannot imagine that this will hold off high-frame-rate content forever. More and more distribution is over vertically-integrated streaming providers. Because consecutive frames are similar, the bitrate of compressed 120 fps content isn't 2x that of 60 fps, which in turn isn't 2x that of 30 fps (it's more like 1.5x, with diminishing returns), so it won't cost twice as much to distribute high-frame-rate video. It takes more compute from the encoders, but the AI boom is fueling GPU improvement at incredible rates.
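Putting rough numbers on that (the ~1.5x-per-doubling factor is the assumption from the paragraph above, not a measurement, and the 10 Mbps base rate is made up purely for illustration):

    # Back-of-the-envelope: if each doubling of frame rate costs ~1.5x
    # the bitrate (an assumption, not a measurement), 120 fps is far
    # cheaper to distribute than "4x the bits of 30 fps".
    base_fps, base_mbps, factor = 30, 10.0, 1.5  # hypothetical 10 Mbps at 30 fps

    fps, mbps = base_fps, base_mbps
    while fps <= 240:
        print(f"{fps:>3} fps -> ~{mbps:.1f} Mbps")
        fps, mbps = fps * 2, mbps * factor
    # 30 -> 10.0, 60 -> 15.0, 120 -> 22.5, 240 -> ~33.8 Mbps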

I think that the first sports league (XFL? UFC?) to offer 120 fps or 240 fps content will put a crack in the dam, and it will quickly erode. Some AAA blockbuster action movie (by Ang Lee?) will premiere exclusively on some streaming service, with special treatment to be distributed at 120 or 240 fps. A few top-tier cinemas will update some of their projectors to run at high frame rates. Eventually mainstream sports, mainstream movies, and optical media standards will be at high frame rates.

Whether that's before or after your next TV upgrade or not, I don't know.


Modern TVs (or at least those I've been interested in) have upscaling functions for both resolution and motion, so they can take your 24 fps movie up to some more manageable frame rate.
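As a toy illustration of what that motion upscaling does conceptually (real TVs do motion-compensated interpolation; the naive cross-fade below and the function name are just mine, to show the idea):

    import numpy as np

    def naive_interpolate(frame_a, frame_b, n_new):
        """Blend n_new intermediate frames between two frames.
        Toy version: linear cross-fade. Real TV motion smoothing
        estimates motion vectors instead of blending pixels."""
        out = []
        for i in range(1, n_new + 1):
            t = i / (n_new + 1)
            out.append((1 - t) * frame_a + t * frame_b)
        return out

    # 24 fps -> 120 fps needs 4 new frames between each original pair.
    a = np.zeros((4, 4))        # stand-in for a decoded video frame
    b = np.ones((4, 4)) * 255
    mids = naive_interpolate(a, b, 4)
    print([m.mean() for m in mids])  # [51.0, 102.0, 153.0, 204.0]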

I personally cannot watch anything lower than 60 fps (and 60 fps is still inconvenient, but passable). I see it as a really fast slide show that tires my eyes quickly.

Also, you can watch clips recorded at higher fps natively, like nature videos on YouTube (I think these still max out at 60 fps, but with motion upscaling it is quite decent).

Combined with a good sound system, these can be quite relaxing, and if you have an OLED screen it can even be mesmerising to watch.



