HDMI 2.1 Announced: 10Kp120, Dynamic HDR, New Color Spaces, New 48G Cable (anandtech.com)
238 points by dmmalam on Jan 5, 2017 | 187 comments



Wow that is a lot of bits. Seriously, that is a LOT of bits. It's almost like they said, "We're sick and tired of people complaining the cable is holding them back, here, beat that, suckers." :-) I guess the margins on 4K televisions have done better than expected; certainly as a 'feature' driving upgrades they have outperformed '3D'.

For me it starts to push up against the question of whether monitor resolution has hit the top of the s-curve. A 9600 x 5400 (10K) screen at 200PPI is 48" x 27". That is a 55" High DPI screen. And as glorious and amazing as that would be, I'm wondering whether that would be "mainstream" any time soon.
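For anyone who wants to check that arithmetic, a quick back-of-the-envelope in Python (numbers straight from the paragraph above):

    import math

    h_px, v_px, ppi = 9600, 5400, 200
    width_in = h_px / ppi                      # 48.0 inches wide
    height_in = v_px / ppi                     # 27.0 inches tall
    diag_in = math.hypot(width_in, height_in)  # ~55.1 inch diagonal
    print(width_in, height_in, round(diag_in, 1))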


I think 4K has ended up being a bit of a whiff in most respects. I just bought a 60" TV for my living room, where we are about 7 feet from it, which is pretty close (it's the narrow dimension in my rectangular living room), and I can't tell much difference between putatively 4K content and clearly-1080P (from Bluray). It's hard to be sure, because all my putatively-4K content is streaming-based, and there aren't that many sources willing or able to ship enough bits to clearly distinguish between 1080p and 4K.

That is, in a 1080p v. 4K streaming contest, if you dedicate the same larger number of bits to the 1080p content as the 4K content, you'd get a better 1080p signal, too. With lossy compression, pixels aren't directly comparable, they have a quality, and low-quality 1080p pixels v. low-quality 4K pixels makes it hard to tell what differences are the format, and which are just the low quality in general.

(In other news, I consider the idea that streaming is obviously superior to disks to be a bit crazy. What disks have is higher quality pixels, and I expect that to continue to be true for a while. I've seen Bluray 1080p streams that are clearly superior to putatively 4K over-the-internet streams. And I've got DVDs that upscale to a higher quality than putatively 1080p streams of the same content.)

Far more important is the increased color gamut and HDR support, which does provide a noticeable increase in image quality. Even from those aforementioned dubious-quality streaming sources; the higher-gamut HDR images are a visible improvement over the NTSC-sunglasses we were all wearing without realizing it.


Amen. Like about 50% of the population, I wear glasses. I absolutely can't see any noticeable difference between 2K and 4K at normal viewing distance. Most cinemas nowadays are 2K only, and nobody cares. Because most people older than 40 simply can't see.

Heck, for 10 years I've seen people watching 4:3 TV on 16:9 screens, horribly stretched. People are almost blind, seriously. They'd tolerate about anything.

HFR and HDR on the other hand, are great. I've seen HDR/HFR demos in 2K at Amsterdam IBC and it looks much better (much more realistic) than standard 4K.

8K? The NHK makes 8K demos, and they can only show almost-still images, else you only have extremely detailed motion blur at ordinary framerates. They even give you a magnifying glass to look at their 8K screen. That's impressive, but I don't plan on watching movies through big lenses to catch the details anyway. What's the point, apart from planned obsolescence?


Oh man, the 4:3 stretching. Whenever I reach for the remote to correct it, people go "stop messing with the TV! It's fine!" and then go "it's stretched now, it was fine before!" when I fix the aspect ratio.


That and "smooth motion" frame interpolation. I don't know how people can stand it.


Oh, when I was visiting my father for Christmas, he had a new TV with Live Football Mode, which is just a fancy name for upping the contrast and saturation and making it look 'smoother', doing what you mentioned. During sports it's actually fine, but he turned it on during a different TV show or movie and it looked horrible, like it was turning the show into a soap opera. It has its place, but in general I trust the designers of these shows, the networks, the manufacturers, and everyone in between to know what they're doing more than me. Besides adjusting a dull-looking picture caused by something like differently colored lighting (contrast, warmth/coolness, saturation settings), I wouldn't mess with other settings.


That's a big peeve I have as well. The photographer/videographer/musician/artist has spent many hours getting things just right, and my equipment decides to pump everything up to 11? It's an insult to the creator.


I'm so glad to know I'm not the only person that is driven crazy by this. It's hard to put a finger on why it bothers me, but to me it makes everything look "amateurish" and I find that so distracting.


If the glasses are the correct ones for you, you should have normal eyesight (that's the point of glasses).


It's also almost never 100% true. Even if the glasses are perfect, they also have to be perfectly clean or you get a bit of a blur anyway. Small enough to not be noticeable, but strong enough to blur the benefits of such a higher resolution.


Actually, no. There is no available correction that would allow me to read the smallest letters on the optician's test board. Age, floaters, coming presbyopia combining with aggravating myopia, uncorrectable changing astigmatism, etc. You're probably too young to understand :)


TVs just keep getting bigger while viewing distances stay basically the same. With 8K it would be possible to have an IMAX home theater experience with the display completely filling your peripheral vision.


>Most cinemas nowadays are 2K only, and nobody cares

I care very much. The Force Awakens was in 2k and looked like a Game Boy game, and so does just about everything else unless I'm sitting right at the back.


Your 60" TV is not big enough for 4k at 7' away. You need > 75" of screen[0]

[0] http://www.rtings.com/tv/reviews/by-size/size-to-distance-re...


Video output and its perception are not at all like audio output and its perception.

We can make blunt statements about the limits of audio, because output devices are (largely) linear and band-limited, and because human perception can be modeled accurately and is also band-limited.

You can't say the same about video. The output is an array of points, not even perfectly square or smooth; it is picked up by another array of receptors; and the whole chain is non-linear. The limit being 1/60th of a degree is too blunt a rule of thumb. That's picking an arbitrary point on a curve showing how much better than guesswork a viewer is.

The screen isn't a perfect output source, and the viewer isn't a perfect receiver, and the combined effect is you can easily observe the difference of 2K vs 4K images at distances those graphs say you wouldn't. I can easily observe the difference where it says I shouldn't be able to, and I'm nowhere near perfect vision.

I would say pretty much everything in those graphs is off by a factor of 2.


Those size charts carry a lot of assumptions. I have a 65" and can tell the difference just fine at 8 feet away.

The real benefit from 4k is actually HDR and the MUCH higher bitrate for video. The majority of streaming 4k is really low quality when compared to a normal bluray. Just because the resolution is big doesn't mean the picture quality is any better. Also a lot of 4k content is actually upscaled from 2k sources. http://realorfake4k.com/


I agree, I have noticed some poor 4k quality, even on netflix, especially in dark scenes.

I am constantly consulting realorfake4k.com especially since the marketing for streaming (and even purchasing) is so confusing on Amazon Video.


The print industry will print in resolutions up to 2400dpi depending on the intended view distance. If 2400dpi is the ultimate premium for a handheld print then phones still have a lot of catching up to do.

That table is purely relative to the mass-market options and tries desperately to make a case for buying a TV in 2017 which only supports lower resolutions than were common a decade ago. Looking at you, "HD" in 720p.


That's probably a 2400dpi halftone (or similar) resolution. To compare to ppi (pixels per inch), you need to decide how many gray levels you want. Assuming the 2400dpi is a grid, and you want >= 256 shades, you need a grid square of 256 dots (16x16). This is all rather handwavey though, as two-tone b+w prints truly would be 2400dpi.

So 2400dpi might be comparable to 150ppi for photographic images.
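A rough sketch of that dpi-to-ppi conversion, assuming a square halftone cell and ~256 gray levels per cell (so, as noted above, only a ballpark figure):

    dpi = 2400                         # printer dot resolution
    gray_levels = 256                  # shades wanted per halftone cell
    cell = int(gray_levels ** 0.5)     # 16x16 dots per cell
    print(dpi / cell)                  # 150.0 effective ppi for photographic content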


> as two-tone b+w prints truly would be 2400dpi.

And by using colored ink (and/or colored paper) you can get some spectacular results on a fairly reasonable budget that way (worked at Kinko's a lifetime ago).


Previously worked in a photographic print lab; digital wet labs (that use R, G, and B lasers to expose traditional emulsion-based photosensitive paper) operate at 300dpi. Photographers don't target higher than 300dpi for any prints, and for large prints, it's perfectly acceptable to go down to 100dpi.


I guess this would explain why my photo prints often feel less sharp than the on-screen versions these days. :)


My TV is quite a bit further away so that's why I'm still quite happy with 1080p and lower content.

The only reason for me to go to a newer TV at this point is for HDR.


I recently obtained a PS4 Pro which outputs up to 4K on supported games (of which there are very few currently). Even on the games that support >1080p but less than 4K, I notice an incredible difference in quality despite the 50" TV screen being 5 feet away. (Final Fantasy XV is very pretty on a Pro)


I do wonder if that could be mimicked by rendering at 4K and antialiasing down to 1080. After all, if we don't notice in live video which is naturally antialiased, it would follow that we wouldn't notice in antialiased games.


I believe that's usually called MSAA, Multi-Sampled AntiAliasing. It generally does work very well for making a pretty image but it's also a lot of samples that'll be redundant. That's why there's some things like CSAA, FXAA that try to only do it on places where there's higher contrast or on edges where it'll be most notable.


SuperSampling or downsampling (there might be some nuanced differences between the two). AFAIK, MSAA works at the render resolution and you can have it in addition to downsampling, although at a very poor ratio of computation to reward.
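The downsampling idea itself is simple enough to sketch. Here is a minimal, hypothetical box-filter example in Python/NumPy that averages a 4K-rendered frame down to 1080p; real GPUs and drivers use fancier filters, so treat it purely as an illustration of the idea:

    import numpy as np

    def downsample_2x(frame):
        """Average each 2x2 block: 3840x2160 -> 1920x1080 (crude supersampling)."""
        h, w, c = frame.shape
        return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

    frame_4k = np.random.rand(2160, 3840, 3)   # stand-in for a rendered 4K frame
    print(downsample_2x(frame_4k).shape)       # (1080, 1920, 3)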


PS4 Pro already does this for some games like The Last Guardian.


On a 1080p TV, the PS4 Pro will downsample supported games, reducing jagginess.


If you want a good test try doing text editing on 1080p vs. 4k at whatever distance on whatever TV. For me, text is always easily visibly nicer on the 4K. Your PC is not doing any video compression, so you can be sure that everything is pixel perfect in both resolutions, and truly a fair comparison.


I recently bought a gaming laptop and decided to go with a 1080 screen because it had gsync for smoother graphics versus a 3k screen. My main monitor and work machine are retina displays.

Definitely had a bit of shock reading text on the 1080 screen. My eyes had forgotten how painful it was by comparison.


Mirroring what the other person said, it's more useful for anything which isn't video.

A good example to consider is mobile phones and how sharp and precise the UI can be, with lines almost at hair thickness. Anti-aliasing is almost useless on phones already, and it's all due to their pixel densities.

(of course, not so useful for 60" screens!)


>> I think 4K has ended up being a bit of a whiff in most respects.

And some unintended consequences as well. Buddy on my hockey team got a notice from Comcast that since he continually went over his 100GB data limit every month, they were raising his rates.

Incredulous, I asked him, "What on earth are you doing where you're blowing through 100GB of data every month?!"

His response? "I have to stream all my Netflix movies in 4K."


100GB?? That's barely enough to buy a couple of video games. Where do they have that limit?


15 years ago in The Netherlands.


Try a video game, you will notice the difference instantly.


Sure, but that doesn't have much to do with the display resolution; it's more about the rendering resolution. It is the case that usually these are the same, but it is not a requirement.


"I can't tell much difference between putatively 4K content and clearly-1080P"

I presume the TV does upscaling? Not an expert, but theoretically a kick-ass algorithm should be able to glean more resolution out of a temporally changing signal than the native resolution - thus actually providing more than 1080 pixels' worth of resolution out of the 1080p stream.

When I bought an HD TV, either the Blu-ray player or my TV (can't recall anymore) upscaled my old DVDs with surprising quality - I can only presume the same applies for the current generation (haven't had the chance to do detailed analysis).

Experts please revise.


How can an algorithm create more resolution (i.e. information) from less information?

Do you mean a wavering 4k stream output as 1080p?


Video Upscaling via Spatio-Temporal Self-Similarity: http://vision.ucla.edu/papers/ayvaciJLCS12.pdf

The Freedman and Fattal paper they mention can be found here: https://pdfs.semanticscholar.org/7df0/39049948d54fd1f4d75526...


We all laughed at the ridiculous "Zoom and Enhance" bits on TV crime shows, but it's become much more plausible in the past couple of years.

It's called super-resolution, or upsampling. Here is a good overview of techniques: http://www.robots.ox.ac.uk/~vgg/publications/papers/pickup08...

More recently, Google's RAISR: https://research.googleblog.com/2016/11/enhance-raisr-sharp-...

This repo pulls together techniques from several papers with impressive results: https://github.com/alexjc/neural-enhance

Anyway, it's an area of active research, there are already four dozen relevant papers in 2017 alone: https://scholar.google.com/scholar?q="machine+learning"+"sup...


It's using information from adjacent frames to add extra detail.
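The classical multi-frame trick is easy to illustrate: if consecutive frames are offset by sub-pixel amounts (panning, hand shake), their samples can be interleaved onto a finer grid. A toy 1-D sketch in Python/NumPy, not any particular product's algorithm:

    import numpy as np

    def scene(x):
        # A detailed "scene": the 23-cycle term aliases at 32 samples (Nyquist = 16).
        return np.sin(2 * np.pi * 3 * x) + 0.5 * np.sin(2 * np.pi * 23 * x)

    n = 32
    grid = np.arange(n) / n
    frame_a = scene(grid)              # frame 1
    frame_b = scene(grid + 0.5 / n)    # frame 2, shifted by half a pixel

    hi = np.empty(2 * n)               # interleave onto a 2x-resolution grid
    hi[0::2], hi[1::2] = frame_a, frame_b

    true_hi = scene(np.arange(2 * n) / (2 * n))
    print(np.allclose(hi, true_hi))    # True: detail neither frame resolves alone

Real video super-resolution adds motion estimation and deblurring on top of this, but that is the core of the "information from adjacent frames" point.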


The general term is "video super-resolution". There's software available off the shelf to do it, IIRC.



The funny thing about 4K is that I do home AV installs, and it's almost impossible not to have compatibility issues still.

To this day, I don't think we've had a single installation with 4K projector / Receiver / player working together of different brands without some kind of serious issue.

ARC is another disaster that we STILL are having trouble with, and that's been out for years. I really hope that more has been done in the way of standardisation to help iron out these issues.


I had to use an optical cable between my TV and AVR to get audio. The only thing ARC did was allow me to control the AVR volume using the TV's remote. Not very useful when no audio is coming back.


I've got a top-end Sony 4k TV (x940d with full-array local dimming) and some of the first content I watched on it was the Rio Olympics in over-the-air 1080i. The quality was simply stunning. As you can imagine, the Olympics broadcasts would have used some of the best cameras available. Upscaled 1080 content is absolutely superb, and I often have a difficult time distinguishing 4k from 1080p at ANY distance.

That said, I'm not sad I went 4k because the set really IS that much better than a 1080 set, especially a low-end one. Sony upscaling and motion processing really is where it's at.


Yeah, that's the thing too...high-end sets are still better than cheap sets. TL;DR you get what you pay for. If you get a 50" 4K TV for $500, I wouldn't expect it to be very good. Vizio's 2016 P series is decent, but they more or less lie about the ability to render proper 4:4:4 chroma at 4K resolution. Not great for a $1200 TV (which I use as a monitor, upon which a single-pixel-wide red line is very blurry). The equivalent Samsung which can do real 4:4:4 at 4k@60hz is close to $2k, ditto the Sony, and the Sony has input lag issues while the Vizio will do 120hz@1080p with ~10ms input delay, which is great for gaming. The Samsung is OK for input lag (~30ms IIRC) but more expensive and I just don't like giving Samsung money.


I used to say I couldn't tell 4K from 1080p but after a year I can tell the difference.

I went from a 2001 Toshiba 720p flat CRT to 2016 55 inch 4K SUHD HDR. I guess my brain had to get over the shock.


I don't think it ever made sense to say it's generally hard to tell 4K from 1080p. It just really depends on the viewing distance and size of the screen.

My tiny laptop screen is almost 4K, and the difference is massive from a 1080p screen because I'm so close.

Likewise, your 55 inch screen wouldn't need to be 4K if you sit 10 meters away from it.


Well, as I think I may have said poorly, I don't really know if I can tell 4K from 1080p reliably, because a 4K stream that contains such high quality pixels that it is qualitatively different from an upscaled 1080p stream is very hard to come by right now.

According to the charts, I'm right on the outer edge of where it may make a difference, so it's at least physically possible I could tell, if I got a truly 4K stream.


Not only that, but the quality of the video feed. For example, highly compressed 4K is indistinguishable from much less compressed 1080p.


The upscaling process on some televisions is astounding. Upscaling a 1080p@24 source to 4K@120 display brings out so much detail and gives the content an entirely different feel.


What are you using to get 4K at real 120hz? HDMI 2.0 maxes out at 60hz for 4k: http://www.hdmi.org/press/press_release.aspx?prid=133

http://www.rtings.com/tv/reviews/vizio/p-series-2016 has a 120hz panel but at 4K the "120hz" is interpolated; it's not actually getting 120 updates per second from the source (I wish). I have this exact TV as a monitor and 1080p gaming @120hz is glorious. 60hz is fine for desktop use but now that I've tasted it I can't go back.
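For the curious, the bandwidth arithmetic behind that 60Hz cap (HDMI 2.0 is 18 Gbps on the wire, roughly 14.4 Gbps of data after 8b/10b encoding; the figures below ignore blanking, which only makes things worse):

    def payload_gbps(w, h, hz, bpp=24):
        """Uncompressed pixel data only, no blanking or audio."""
        return w * h * hz * bpp / 1e9

    print(payload_gbps(3840, 2160, 60))    # ~11.9 Gbps: fits under ~14.4 Gbps
    print(payload_gbps(3840, 2160, 120))   # ~23.9 Gbps: doesn't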


How can it bring out more detail, surely it can only smooth a gradation from one colour to another?


It's not just linear interpolation, there's all sorts of clever edge detection going on as well. It still cannot invent detail out of thin air of course, but based on the type of contents it can infer the detail that "could have been there". YMMV though, I hate it just as much as motion compensation.


Suppose your desk is 36" deep. A 48" wide screen at the back of it occupies 41 degrees of your world. The THX recommendation for cinema is 40 degrees. This is as immersive as you're going to get without strapping a screen to your head.

20/20 vision is about 60 pixels per degree, depending on which study you want to believe. 41x60 is 2460 pixels across -- you've already reached the max required pixels with a 3840x2160 display.
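Sketching that out: the 41-degree figure works out if the eye-to-screen distance is roughly 64 inches (you presumably sit back a bit from the front edge of the desk), so treat that distance as an assumption:

    import math

    def pixels_needed(screen_width_in, eye_distance_in, ppd=60):
        """Horizontal pixels so pixel pitch matches ~60 px/degree (20/20 acuity)."""
        fov_deg = 2 * math.degrees(math.atan(screen_width_in / 2 / eye_distance_in))
        return fov_deg * ppd

    print(pixels_needed(48, 64))   # ~41 degrees -> ~2470 px, comfortably under 3840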


I run a 40" Philips 4k monitor as my primary desktop which sits 3 feet away from me, and I can guarantee you that I can see every single pixel still. It's nothing like a 5k iMac, for example, which at 40 inches, would basically be 8k. 4k is nowhere near the end of the game yet. Even for a TV 6 feet away. Until I can't distinguish my TV from my window, there is room for improvement.


I'm in a similar boat, although my sizes and distances are a little different from yours.

I couldn't wait for the 40" Philips to arrive in Canada, so I ended up getting a Dell 27" 4K monitor (that turned into two once I got addicted to all the real estate).

I have my monitors side by side on a standing desk where the range of closest and furthest distance of the screen to my eyes is between 21 and 27 inches. I have the monitors running at native text scaling in Windows (my wife and friends look at my screens and think I'm nuts), so I am pretty close to the threshold of seeing every pixel. Granted, I have single vision glasses that I wear for computer use (combination of distance + presbyopia correction), and those glasses coincidentally magnify what I see by a hair.

I'd love to get a 55" 8K curved display as a monitor. Oh, the added real estate...


How is web browsing at that density? I'm nervous to pick up a 4K display even at 30-32 inches, as I don't think scaling is solved in Windows or on the web (though that's just speculation).


It's fine for me, but I'm used to high density screens. For the longest time, I was using 1680x1050 15" laptop screens at 1:1 (before 1080p became common).

Looking at the HN site, which has pretty small text to start with, I'm perfectly fine reading the text on the screen. If the text is too small, zooming once or twice usually fixes it.

I originally thought 40 or 32 would be the ideal size, but got tired of waiting for the models to come out.

I went into a store and checked out the 28" 4K monitors (which were pretty common), but I wanted IPS, so I ended up with the 27" Dell IPS, which got good reviews and not that much denser than the 28s (for me).

I actually am glad that I did not go 32 or larger because I got acclimated to the added real estate very quickly and wanted more. My desk handles two side by side 27s in landscape. I guess I could have gone with two portrait 32s but I definitely could not fit in dual 40s.

I don't know if I can handle four monitors (more head turning than I have now), but I've got some mini monitors in addition to my two main ones, and I still feel at times that I don't have enough screen real estate when I'm multitasking.


You're not facing a resolution problem so much as a brightness and contrast problem.

Bright sunlight is about 120000 lux. That's about 38000 nits -- and the brightest specialty monitors I know of go up to about 10000 nits. A typical "wow, look at that" display is 2-3000 nits.
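That lux-to-nits conversion assumes an ideal diffuse (Lambertian) white surface, where luminance is illuminance divided by pi:

    import math
    print(120000 / math.pi)   # ~38,200 nits for a perfectly white diffuse reflector in full sun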


I do not want to downplay that we have a lot of improvement to do on contrast and brightness before we can match an actual window to the outdoors, but vegabook specifically called out his ability to resolve pixels on his 40-inch 4K. Resolution is still a problem. In fact, I feel we have a lot of improvement still to do in desktop displays and that resolution and size remain the most significant hurdles.

I run two of the Philips 40-inch 4Ks on my desk and although these are fantastic displays for the price, there are many areas for improvement.

Most importantly, and easiest to fix: they should be fully matte rather than semi-gloss. My contention is any display used for work should be matte unless you work in an environment with absolutely no lighting. Lighting glare is the most frustrating issue with computing, in large part because it's so easy to fix. In the early 2000s and before, monitors, including the highest-resolution monitors of the time, were almost exclusively matte. During the malignant era when living room displays converged into desktop workspaces and "HD" brought a reign of terror to computing, we also accepted glossy/mirror displays. Although we have recovered from that period of resolution regression, matte remains marginalized.

A 40-inch 4K is too small to use on its own and too low-resolution. I feel ideal is ~55 inches, concave, 10 to 15K horizontal resolution.

For what it's worth, here is a blog/rant I wrote in 2014 about ideal desktop displays: http://tiamat.tsotech.com/ideal-desktop-displays


I think what's holding us back as much as anything is the lack of video cards that can handle those displays. I have a 40" 4K as my primary desktop display, and even run at native 1:1 it's still a little small for me. I often zoom websites to 125%. On that, I bought a GTX 1080, and it runs it well enough... native gaming really only works on older games and doesn't hit even 40fps consistently.

I have an i3-5010u as a backup, which has a 28" 4K attached... it feels sluggish just on desktop use (150% zoom) in windows.

If you want to push much more than that, you're going to need some really beefy video cards in tandem.


I'm a bit of a monitor nut myself and I too am running 2xBDM4065UC. Actually I run the second one above the first one, mounted at an angle towards me, and with a lean-back, headrested, programmers chair. It's awesome. If you can get used to the i3 tiling window manager, it's even better IMO, but that's a taste that not everybody acquires.

Maybe the new 43 inch version is more matte but I have not seen it yet. Could not agree more that a 55-incher with 10k resolution would be my absolutely perfect display, with a low-radius curve, and possibly curved vertically too (i.e. a rectangular sector of a sphere), though I am not holding my breath for that.


>> I feel ideal is ~55 inches, concave, 10 to 15K horizontal resolution.

I'm with you on that. I run a pair of 27" 4Ks at native scaling, and I feel like I'm looking at a 10 year old monitor.

27" was about the smallest screen my eyes could tolerate at 1:1 scaling - I was driven by real estate more than sharpness. All other things being equal, I'd prefer a pair of 27" 8K monitors so that I can get my 4K equivalent real estate with crisper text.


I was all in on matte. Glossy screens were absolutely terrible in any kind of lighting... until the 2012 Retina MacBook Pro. Whatever reduced layer-gluing-this-to-that magic they did there reduced glare to the point where I just don't notice it anymore.

Meanwhile, matte displays are actually affected more by light falling on them; it's just spread out a bit.


That is if I stay fixed in my position. If I lean forward to see some detail, I again profit from better resolution. And screens bigger than my viewing field (disregarding peripheral vision) may be unnecessary for entertainment but are certainly useful for work. Head movements are fast and precise and much better than software solutions for switching between windows.


I'm picturing a future where people get out a magnifying glass to see some detail on a digital screen.


When I was at the security office for the medical division of an at-the-time-German-company-owned software company I noticed a security officer comparing digital scans of fingerprints on an LCD monitor using a magnifying glass.


Pinch to zoom!


Higher resolution and higher refresh rates are still needed for VR. VR requires both a relatively high refresh rate (90Hz) and pretty high resolutions (current generation devices are 2160x1200). I believe that uses most of the bandwidth of HDMI 1.3; I'm not sure if any headsets use HDMI 2.0 yet.

And if you've tried VR, we're still at the point where pixels are obvious and higher densities would be an improvement. The usual comment I get when friends try my Vive is that it looks like an N64.
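A rough check on that bandwidth claim (2160x1200 across both eyes, 90 Hz, 24 bits per pixel, ignoring blanking); HDMI 1.3/1.4 carries roughly 8.2 Gbps of video data after encoding, so this really is a good chunk of it:

    print(2160 * 1200 * 90 * 24 / 1e9)   # ~5.6 Gbps of raw pixel data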


Even if the HDMI consortium, composed largely of manufacturers of home theater equipment, may not be thinking of this, I think you're absolutely right: 4K TVs are nearing, or past for many, the point of diminishing returns in terms of resolution. 8K is going to be a very tough sell next decade, but the future of VR will hinge on it.

Today's 2K VR headsets are the 480i of their nascent industry. They're good enough to kickstart a revolution. But the next generation of VR, with 4K displays, will be like the jump to HD – and the eventual adoption of 8K will be much more impactful than the TV world's move from 2K to 4K.


There are many factors besides resolution. This is why you are seeing OLED and HDR for example.


Projectors with their 100"+ screen sizes will definitely show the difference


My description of my Vive is "Nintendo DS". You can tell the quality 3D rendering is there, but it's still on a low resolution screen (compared to what you'd expect to see today). It's cool, but it's so obvious how much room there is for improvement.


Current-generation headsets are in many ways built around the limits of HDMI. With this new generation of HDMI, we might get to see 4K in each eye at a very high frame rate. High frame rates reduce VR sickness, so they are not just a nice-to-have, and good luck getting 4k x 2 at 90Hz over current HDMI.
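Same back-of-the-envelope for the 4K-per-eye case, which is why the jump to 48 Gbps matters:

    print(2 * 3840 * 2160 * 90 * 24 / 1e9)   # ~35.8 Gbps: far beyond HDMI 2.0, within HDMI 2.1's budget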


That's tough to justify for home viewing distances and sizes. I sit about six feet away from a 4k TV (at the closest) and feel like I'd have a hard time justifying greater resolution than that.

If I were guessing, this standard would be only for 1) large format projectors and 2) really large and expensive boardroom/home theater LCDs.


I used to work for a high end home theater company.

The boss' son was a total audiophile and was constantly bragging about all the high end audio they sold and put in their clients' home theaters.

I used to listen to a set of $15K speakers and then to a set of $8K speakers and I just couldn't tell the difference. It's kind of the same I think with monitor resolution, at some point, can the human eye really tell the differences between all the high end resolutions out there right now?

I have doubts most people can TBH, but would love some evidence we're not beyond the capabilities of what the human brain and eyes can actually differentiate between.


It depends on the content, but I can definitely tell the difference... I have studio-style speakers for my desktop... that said, I have a < $800 set of speakers for my surround sound. I notice the difference, but it's only subjectively better.


Where this becomes very useful is 3D 360° video. 120 FPS allows two 60FPS streams, while transmitting the entire 360° at 10K allows reasonable resolution for the portion the user is actually looking at.


I'm looking forward to standing as close or as far away as I'd like from some sweet zooming fractals, and it looking awesome in every case. If the screen is the size of a wall, even better :)


"Among other things, the new standard will support the game mode variable refresh rate (GM VRR). Nowadays, AMD already supports FreeSync-over-HDMI using a custom mode and with the HDMI 2.1 everything gets a little easier."

Oh, that'll be nice. Now that almost every interface is using 3D rendering tech under the hood, regardless of how it looks, I've noticed tearing has spread from something only gamers see very much of to something I'm seeing everywhere. Hopefully this helps drive the tech to push this everywhere and tearing will, in perhaps 10 years or so, be a thing of the past.


Silly thing is that tearing is a non-issue as long as one has vsync enabled. But games often disable vsync to get that "lovely" raw FPS number...


I don't disable VSync for FPS. I disable VSync because doing so reduces input lag.


The input lag can be a serious issue with certain monitors and video cards. It's very noticeable to me when playing an FPS.


Agreed. Overwatch in particular has felt a lot more snappy since disabling it. I'd been out of FPSes for a while, but that game made me acutely aware of it.


You should try vsync with triple buffering and make sure to set:

flip queue size / max pre-rendered frames to 1 (depending on AMD/Nvidia)


Most games I play these days are DirectX 10+, which seems to preclude the use of the pre-rendered frames setting.


Actually, to reduce input lag. You don't want vsync in multiplayer first person shooters, that's for sure.


v-sync is a workaround, VRR is the solution.

As others have pointed out, there are numerous problems with v-sync — in addition to input lag, if you want to drive (for example) a 100 Hz monitor, the GPU will need to consistently deliver ≥100 FPS, or you'll repeat frames and introduce jitter.

With VRR, you don't need to consistently hit 100 FPS to experience ≫60 FPS refresh rates (without jitter).


The problem with vsync is if the game would be 59 fps on a 60hz monitor, it's going to be presented at 30 fps because it's going to barely miss the vsync every frame. This creates large fps instability when the game runs around the monitor frequency.
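A toy simulation of that effect (hypothetical numbers; assumes a double-buffered swap chain that can only flip on a vblank):

    def presented_fps(render_ms, refresh_hz=60, frames=1000):
        """Average displayed fps when every frame must wait for the next vblank."""
        vblank = 1000.0 / refresh_hz
        t = 0.0
        for _ in range(frames):
            t += render_ms                      # finish rendering the frame
            t = (t // vblank + 1) * vblank      # then wait for the next vblank to flip
        return 1000.0 * frames / t

    print(presented_fps(1000 / 59.0))   # ~30 fps: just misses every vblank
    print(presented_fps(1000 / 61.0))   # ~60 fps: just makes every vblank

With VRR the display simply flips whenever a frame is ready, so a 59 fps render is shown at 59 fps instead of collapsing to 30.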


A somewhat related question -- why is the industry able to agree on a single standard for how data will move over a wire from a device to a display, but yet can't come up with a single standard for doing it without a wire.

I can take my Mac laptop or my PC laptop and roll up to my TV and with a single cable make pictures and sound come out of the TV.

But yet neither one can talk to the TV without a cable, unless I get a special device (which is different for both and ironically uses the same cable).

And even worse, if I had a Samsung phone, it could connect to the TV wirelessly.

Why can't we have a single wireless display standard? Is there a technology problem I'm missing, or is it really just walled gardens?


Miracast is such a standard. Thousands of devices support it. However, two major players (Apple & Google) don't. Apple has traditionally been a closed system, so it is not a big surprise. What surprises me is that Google also removed support for Miracast from Android and started promoting their proprietary casting protocol.


Miracast is heavily broken on many devices and rarely works at a reasonable level. Netflix and Google worked to create a good streaming standard named DIAL. Google took the idea and ran with it to create the Cast system, which ended up working really well.


But the Cast system doesn't send arbitrary bits to the TV. It's a protocol for handing off authorized streams which are then processed by the TV (or in most cases the device connected to the TV by, ironically, HDMI).

So it's not really a method to stream arbitrary visual and audio data to a TV.


Well, you _can_ do screen mirroring with Cast, at least on Android. In my experience though it has fairly high latency and will occasionally freeze for a few seconds at a time.


We also have had DLNA for around a decade now. It was never perfect, but it was fairly well supported in the past. Unfortunately it didn't evolve quick enough so proprietary solutions overtook it.


The problem is that Miracast interoperability is terrible and often doesn't work.


Because doing it without a wire was, until relatively recently (5 years), something only iPhones did well, or at all.

Consumers have been used to cables for decades, so they demand something that just works in this area. The same goes for data. Apple tried Thunderbolt and it failed, because consumers already have something that just works and is compatible in this area (USB), and they wouldn't let go of it. Wifi streaming is relatively new, compared to cables, but once we get used to it Just Working, there's no going back for consumers.

It's almost the same with Wifi voice calls, where, unfortunately, consumers don't demand something that just works, but are left with two incompatible protocols (FaceTime and Hangouts, or whatever Google's version is called these days).


DLNA has been around for ages but Apple and Google have NIH syndrome.


Does this mean TB3 (with "only" 40Gbps bandwidth) won't be able to support all the features/resolutions of HDMI 2.1? This is of course assuming that the HDMI 2.1 spec is designed to fully utilize the hardware level bandwidth.

Also I wonder what exactly is making the 48G cable special. Is it using additional pins while maintaining backwards compatibility?


And here I was, hoping HDMI would disappear and be replaced by USB-C.


I guess the HDMI folks thought USB-C insufficiently ambitious on the bandwidth front. 48 Gbps might hold us for a while on the video front.

[edit]I too wish USB-C was the single connector, but it is such a confusing standard with all the Alternate Modes and different cables on the market.


You don't say. Right now I'm trying to buy a new cable to charge my XPS 15; I've already bought 2 that don't work.


In their infinite wisdom, engineers replaced a large number of incompatible, different looking cables (HDMI, USB, power, etc) with a ton of incompatible, identical looking cables.

At least in the past it was easy to eyeball the connectors and pick the right cable.


It's friggin unreal. I'm trying to do a cable order and have no clue what the heck to buy. I just want something that works. I swear these alternate modes and charging-only cables are going to be the end of me.


It might be worthwhile to check out Benson Leung's (Google engineer) reviews on USB-C cables: http://amzn.to/2hX3cMZ

Previous post from last year on the subject about his work: https://news.ycombinator.com/item?id=10508494


Things have changed a bit, now I have to worry about Thunderbolt 3.


Do you have an appropriate charger? I was researching usb-c charging for my XPS 13 (9350) and apparently the charging standard allows for different voltages, so it's not all 5V as you'd expect from good old usb and therefore requires a specific charger capable of delivering the right voltage (19V in my case from what I could gather).


> 48 Gbps might hold us for a while on the video front.

Maybe, but Dell just announced their new 8K monitor today, so we may need more bandwidth sooner than you'd think.


USB 3.1 Gen 2 over USB-C can do DisplayPort 1.2 as an alt-mode in some cases (if everything supports it). Kinda the solution, not HDMI. Fun times we live in.


I wonder which causes more pain for consumers: new cables with new connectors (USB-C) or new cables with backward compatible connectors (HDMI 2.1).

While the former definitely makes consumers unhappy in the short term, it seems like in the long term the latter's confusion around what cable you have vs what cable you need is worse.


USB-C is actually the latter because there are several different flavours of USB-C cable that support different protocols.

This is going to suck.


Yeah, it's gotten to the point where I'd rather just buy Thunderbolt cables (despite the extra expense) because it's the only easy way to be sure that I'm getting a cable which fully supports all the features of the USB standard. (Thunderbolt cables are compatible with USB-C connectors, USB 3.1, and USB Power Delivery.)


Even THAT isn't good enough, since not all thunderbolt 3 cables support the necessary wattage. (for example https://www.amazon.com/Belkin-F2CD081bt1M-Certified-Thunderb...)

I'm really hoping this is just a transitional period and soon all USB-C / TB3 cables are created equal. When type-A started being used for charging there was a time with similar inconveniences.


> When type-A started being used for charging there was a time with similar inconveniences.

Meh, that problem is still there. On top of how Apple has their own take on signaling that the charger is "dumb"...


I remember in the early days you could fairly easily fry a USB port on your computer if you plugged in the wrong thing. I think they are far more tolerant these days, though the recent USB Killer doodad that's been going around shows that extreme abuse can still break things.

It is one thing for things not to work if you plug them in, but when shit gets broken by doing so then that's just inexcusable.


I don't think I have ever seen a USB port get permanently fried from having a device draw too much. I have however managed to fry a cheap thumbdrive by reversing the connector on the motherboard...


I don't even necessarily care if all USB-C cables are created equal, I just want a way to easily be able to tell from looking at the cable what capabilities it does and doesn't support.


Doesn't Ethernet have exactly the same problem though?


Nah, 1000baseT will work over bog-standard Cat5 (no need for Cat5E even) up to 100m: https://en.m.wikipedia.org/wiki/Gigabit_Ethernet

1000BaseT requires 4 pairs though (all 8 wires) whereas you could run two 100BaseT runs on a single Cat5 cable (great for adding a second drop without snaking a new wire through walls). 1000BaseTX runs on 2 pairs but requires Cat6 or better, but it doesn't matter because nothing you'd ever run into supports 1000BaseTX.

10gig on copper is likely different and I don't know enough about it to comment aside from how it's really uncommon in consumer-level gear right now. I believe 2.5 and 5 gigabit Ethernet standards are coming soon which should be a good middle ground.


Can somebody please introduce the HDMI Forum to semver? While I'm looking forward to what this will enable, it's a little overzealous for a point release.


Agreed. As a consumer, it's a bit odd that HDMI 1.x and 2.0 used the same cables, but 2.1 requires a new cable. That seems like a good reason to bump the major version number.


It's the same with USB; USB 2.0 and 3.0 were compatible, but USB 3.1 (more commonly known as USB C) is not compatible with either of them.

EDIT: Actually, from consulting Wikipedia[1], it's even worse than I remembered. Apparently there's USB 3.1 gen 1 and USB 3.1 gen 2, where USB 3.1 gen 1 is literally just USB 3.0, and USB 3.1 gen 2 is what you get with a USB C cable (where the letter evidently denotes the port size/shape and the version number denotes the rate of transfer).

https://en.wikipedia.org/wiki/USB_3.1


The "gen 1" thing was really stupid, but it's not as complicated as you think it is.

A cable can support USB 2 or 3.

A cable can end in the older A/B or in C.

These two are independent. All combinations exist. Most importantly, a USB C port or cable might only support USB 2.0

Whether you get the "gen 2" speed is also independent of whether you use A/B or whether you use C.

The only wrinkle is that micro A/B needs a special extra-wide plug to do USB 3, while everything else fits the pins without changing shape.

People get excited about using USB 3 and USB Power Delivery along with USB C, but all three of those are actually separate.


> and USB 3.1 gen 2 is what you get with a USB C cable

Nope, it's even more confusing than that. USB-C is just the connector type. Some USB-C cables only support USB 2.0.


> Some USB-C cables only support USB 2.0.

I don't even want to know why somebody thought that was a good idea...


Apple does it with the USB-C cable which needs to be purchased separately from the 87w power adapter for the 15" tbMBP. The cable is only USB 2, but since it's meant for power only, you'd probably never notice. It's quite annoying though that it's a separate accessory, and won't function as a "normal" TB3 cable or even a USB 3 cable, and likewise most normal USB-C or TB3 cables won't work with the 87w power adapter at full power delivery rates.

Such a mess.


I've always thought 2.0 was a relatively disappointing upgrade over the previous standard. It felt like it should've had at least twice the bandwidth it ended up having. This is looking much better.

Also, considering how disappointing 2.0 was, yet it still got the "2.0" name, this should be at least a 3.0, especially with a new cable coming. The only reason I see them using "2.1" is because they don't want TV and laptop makers to yell at them for making their devices look awfully obsolete just a few years later. But it's probably going to confuse users and it's going to hurt adoption of 2.1-enabled devices.


Intel has some nice cables delivering 30GB per FRAME[0]. [0] https://www.cnet.com/uk/videos/intel-demos-worlds-first-walk...


I think he said 3GB per frame, not 30GB, but that's still pretty insane.


That bit about different functionality when using different cables etc has me worried.


Good thing we're all switching to USBC for everything, apparently, since it doesn't have that problem.

http://www.informationweek.com/mobile/mobile-devices/usb-typ...

Oh.... :-/


Yeah, my head started to hurt when you add Thunderbolt 3 to the mix. Then you've got Type-C for USB 2, 3, 3.1 -- and 3.1 gen 1 is really 3.0 re-branded. You need to look for "USB 3.1 gen 2" for the 10 Gbps support in the cable, and don't forget charging requirements -- the Belkin cables they sell at the Apple Store only support up to 60 W, but others, such as https://plus.google.com/+BensonLeung/posts/TkAnhK84TT7 claim support for 100 W or more.

Then you've got Thunderbolt 3, where you can theoretically hit 40 Gbps, but where cables that support those speeds are basically 0.5m, except one Belkin that's 2m but doesn't support DisplayPort or USB 3.0.

And finally, the white USB Type-C cable Apple sells and ships with its power adapters is called a Charge Cable and doesn't support data, like connecting two Macbooks together to do Migration Assistant. But that's okay, because even if you do connect two computers together with a 0.5m Belkin TB3 cable, you won't hit 40 Gbps. Using Migration Assistant, copying 256 GB off a disk capable of 3.1Gbps should take about 11 minutes, but it always takes 2 hours when using cables between two 2016 MBPs -- Ethernet gave me the same speed as that 0.5m TB3 cable -- and it's about a half hour shorter if you use Wi-Fi for a direct connection.

Given that, I can't wait for the day when we replace all these confusing cables with new directional wireless standards...


One of the missed opportunities of WIFI Direct was that they didn't specify a transfer protocol as part of the spec...


Target Disk mode has always been much slower than a native drive for me, right back to the SCSI PowerBook days.


Why do we still have or still need HDMI when we've got USB 3.1 Type-C?


USB 3.1: 10 Gbit/s

DisplayPort 1.4: 32 Gbit/s

HDMI 2.1: 38 Gbit/s

Thunderbolt 3: 40 Gbit/s


HDMI 2.1 is 48 Gbits/s


USB 3.1 is capped at 10gbps which is a lot more limited than the capabilities in the article.


HDMI also has digital content protection built in, which is part of why it got so much support from content producers, particularly Sony.


It keeps mentioning improved frame rates (very little about refresh rates?) - will HDMI now be able to support 144hz monitors?


DisplayPort works very well for high-resolution and high-refresh-rate monitors (and also support things like FreeSync), why would you want HDMI?


For some reason all my kit comes with HDMI rather than DisplayPort. It also works with "televisions" (which these days seem to run Android)


Just curious - what's the use case for 144 Hz? It's a multiple of 24 but not 30. 120 Hz is a multiple of both.


It's even in the title of this thread. 10k progressive @ 120Hz. ;)

So, no, not 144Hz.


How about 4k @ 144Hz? I prefer DisplayPort anyway, but am curious.


The bandwidth for a higher refresh at 4k is obviously available. But all of the information available so far points to a maximum of 120 Hz at all resolutions. The actual spec is not available, yet, so there's that.

From http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

  Q: What are the support resolutions and frame rates?

  A:

    4K50/60
    4K100/120
    5K50/60
    5K100/120
    8K50/60
    8K100/120
    10K50/60
    10K100/120


More details about the spec are here: http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

Looks like it maxes out at 4k@120Hz


Didn't see any mention of power. Are the TV sticks going to be able to ditch the plugs?


Powered HDMI existed before 2.0 didn't it? I know it wasn't in 1.0.

Since these things tend to be backwards compatible I'd assume 2.1 would still provide power.

I suppose it may not be mandatory.


As for the new cable business, is there a way for the devices to detect the version of the cable (maybe resistor strapping or something)?

The reason I ask is because HDMI cables are passive, so any cable with all of the pins populated should be able to work, provided it's able to satisfy the bandwidth requirements. In other words, I'm wondering if these new HDMI 2.1 cables are just regular HDMI cables with better electrical characteristics. If so, then I bet older HDMI cables will work as long as they’re kept short.

I also wonder how this variable refresh rate thing relates to the recent AMD FreeSync over HDMI thing...


I imagine it is like PCI-Express where they link up at the lowest bandwidth possible (x1 in the case of PCIe) and then keep attempting higher bandwidth modes (x4, x8, x16) until the transceivers can no longer link, and then step down one mode.


Is this what windows was trying to do when I hooked it up to my TV the other day? For some apps it kept trying to cycle through resolutions (and never found a stable point).


It's not written in the article how the cables are different. If the new ones use more differential lanes (like USB3 does over USB2) that could of course be detected by the transceivers. If it's still the same number of lanes but only tighter electrical requirements then the old cables could still work.


Just from the title, it seems this is aimed at the emerging 8K television segment -- with a little overhead.

As such, I don't find it "wow" at all. Maybe more "just in time."


I find it really annoying that cables with the same connectors at the ends and same basic standard have a version today.

Imagine if back in the day it was like: "ohhh headphone jack 1.3 eh? Well these headphones only work on 1.4 or greater"

Or: "Wow you only have RCA 7.6 you'll have to get a TV with rev 8 to hook up your NES"

It's a friggin' cord with some connectors and some pins, how did we wind up having different versions be even possible?!


The cables aren't versioned. They're just rated in categories for attenuation exactly like copper phone wiring is and named for the connector type just like USB-A, USB-C, micro-USB, etc are. You don't have HDMI 1.3 and HDMI 1.4 cables, though; cables bought years before a HDMI version is announced can still work with any of that new version's features that work within the bandwidth that cable can carry. The version is for the HDMI spec which the receiver must support. HDMI isn't just a cable, it's also the hardware encoding and decoding signals, and the different versions define new and different ways of encoding and decoding digital media.

As for 'headphone jack 1.4' and 'RCA jack 7.6' -- back in the day we had analogue audio and video signals and invented new types of jack, connector, and cable to carry them. You had RCA jacks, and 3.5mm jacks, and 1/4" audio jacks, and component video jacks, and S-video jacks, and SCART jacks, and coax jacks. Now we have digital audio and video and consistently use HDMI to carry it; instead of inventing new connectors we update the HDMI spec and say "Systems meeting the 1.3 spec can only transmit 1080p over their HDMI cables, and systems meeting the 1.4 spec can transmit 4K over those same cables." Isn't that better, and simpler? Instead of 20 types of cable with different features, we just use one type of cable, and add features to it over time.


So they are announcing variable data rate streams and Display Stream Compression support. I thought until now HDMI had a fixed data rate except for the control channel, unlike DisplayPort which is fully packetized. Can someone more knowledgeable chime in on whether HDMI will now be packetized too, or how they support variable data rate modes?


Hmm. To make actual use of this, it seems you would need to be sitting right on top of a ginormous (120"?) screen.

You might actually need to use a whole wall, which leads into immersive environments.


How does this relate to Apple’s advanced USB connectors on the new MBPs? I thought they were supposed to replace all other types of connectors for video. Will we have HDMI for TVs and USB for monitors?


They're not "Apple’s advanced USB connectors" - they're the USB-C standard. They support (per the standard) what are called alternate modes, HDMI being one of them. HDMI 1.4b is currently supported over HDMI alt mode, which is unsurprising given that USB-C has been in the wild for a while and HDMI 2.1 is brand new.


Well they do support Thunderbolt 3 too, so they're not just USB.


Thunderbolt 3 is standardized on using the same USB-C connector.

Much like Apple used the DisplayPort connector before for Thunderbolt 1 and 2.


I know. But a USB 3 Type C port won't support Thunderbolt 3 by default. It has to be a Thunderbolt 3 port.


It's still called USB-C (or more correctly, USB Type-C, but nobody says that). Thunderbolt 3 uses the USB-C connector.


Correct.


Those switch to HDMI alternate mode and send HDMI signals over the USB cable.


Sadly, HDMI alternate mode currently only supports HDMI 1.4b.


Still no daisy chaining though, unlike with DisplayPort which uses packetized data. Why isn't DP replacing it faster? Is it too expensive to become ubiquitous?


It's a matter of targeted use case. DP is for PCs, while HDMI is for point-to-point AV. There aren't many non-PC HDMI sources that support multiple displays (mirroring or expanding), and this is by design. Chaining makes no sense in the home theater or building video distribution use cases.


The point is, DP offers a superset of functionality and is also supposedly free of license fees (from what I've heard), while HDMI requires the manufacturer to pay for the license. So why isn't everyone ditching HDMI for DisplayPort, which is better on all counts? Maybe manufacturing cost plays a role, but I have no idea.


This will be great for VR basically.


I wish companies would start rolling with HD-BaseT.


Their current chipsets top out at 10.2Gbps (HDMI 1.4a) for the video signal, which is somewhat limiting today.

It's also worth remembering that while historically it's done a pretty decent job at assisting with distance transport, it's a 'standard' made by Valens to sell their chips. The use cases for it also make up a minuscule component of the overall display source/sink device market, so there's little benefit to manufacturers having to roll the additional cost into every device they make.


Just keep making things more complicated for no good reason.

     4K   4096 / 2160 = 1.8962962962962964
     5K   5120 / 2880 = 1.7777777777777777
     8K   7680 / 4320 = 1.7777777777777777
    10K  10240 / 4320 = 2.3703703703703702
Thanks a whole lot.


> 4K 4096 / 2160 = 1.8962962962962964

What most people call "4k" is 3840:2160 (typical 16:9 = 1.77… aspect ratio). At least that's the amount of pixels in most TVs/Monitors I've seen.


True. But for consumers there is typically no 4k, only UHD (3840 x 2160), which matches the 16:9 aspect ratio of the other resolutions.


Consumer 4K is definitely 3840x2160. It's likely 10K is the same situation; there are multiple resolutions that fall under that banner, but the consumer version will be 1.77 aspect ratio.


Some of the articles speculate that 4k and 8k are what you'd expect, and that 5k and 10k have the same vertical resolution as 4/8 but a wider aspect ratio (probably something closer to what current cinemas use) and that's where the extra pixels are going.

Basically they may not all be the same aspect ratio.


One of the more ridiculous arguments in favor of the new MBPs was a refrain of "HDMI is on its way out", "HDMI is old tech", "New equipment won't use HDMI".


I've never heard this argument in favour of the new MBP.

In fact, one of the best arguments in favour of the new MBP that I've seen is that USB-C can replace every port we're using right now, and often with the same or fewer dongles as before. For example, USB-C supports 'HDMI alternate mode' which is a simple way to provide an HDMI signal over a USB-C port/cable:

http://www.hdmi.org/manufacturer/HDMIAltModeUSBTypeC.aspx


Exactly. Dongles can suck, but it's pretty sweet that you can buy a new dongle to upgrade to a later spec or different port. I'm sure we'll see TB3 HDMI2.1 dongles.


"It's turtles all the way down", or namely, "new" standards are built from old ones

Display port may not be electrically HDMI, but logically it is https://en.wikipedia.org/wiki/DisplayPort

And HDMI is based on DVI: "CEA-861 signals carried by HDMI are electrically compatible with the CEA-861 signals used by the digital visual interface (DVI). No signal conversion is necessary, nor is there a loss of video quality when a DVI-to-HDMI adapter is used." https://en.wikipedia.org/wiki/HDMI


No, DisplayPort is not "logically HDMI"! Where did you find that in the article?

You can transmit DVI or HDMI signal over a DP cable — which is used for passive adapters. If you plug the DP cable into your monitor, it's using the DP protocol.


You're right, I got it mixed up with DP dual mode



