Nvidia’s latest GPU drivers can upscale old blurry YouTube videos (nvidia.com)
190 points by arprocter on Feb 28, 2023 | 162 comments



I've been blown away by the AI upscaling on the Nvidia Shield. On my fairly large 4K TV, the Shield can produce picture quality that looks incredible out of source material that isn't all that great. On that note, the Shield has been an unexpectedly great purchase over the years. Long life of support from Nvidia, runs Google TV, works incredibly well as a gaming console for Steam Link streaming from my PC and/or playing compatible Android games, and hosting a Plex server (although I now use Jellyfin hosted on a Raspberry Pi). If Nvidia would open source their Linux driver (or at least drastically improve support), I'd be a huge fan. I highly recommend it, despite the fact that I'm mad at them for the pain they cause me on Linux. Buy AMD cards if you run Linux.


I also have an Nvidia shield that now sits unused for the simple reason that Google forced ads onto the home screen.

It used to be that the Shield was the one bastion away from ads on Android TV; then, with one update that somehow manages to reinstall itself despite the device being completely disconnected from the internet, there are now permanent ads on my Nvidia Shield TV.

I honestly don't know why it's such a big deal for me, but I basically never use it anymore. It's a real shame; I got a lot of good use out of it, because as you mention the support has been going on for years now. But I will not subject myself to ads if I can help it. I pay for YouTube to avoid ads, I pay for Twitch to avoid ads. I can't pay to remove ads from my Android TV home screen (ads that weren't even there for the past five years or so).


I was the same, really pissed off when they made the change. Luckily, it's Android, so you can install a custom launcher. I remember I had to play around with adb a bit, but after that it's smooth sailing. I use FLauncher[0], but there might be something better out there.

[0] https://gitlab.com/flauncher/flauncher


Thank you for this link! I, like a moron, kept looking on the play store for something and only saw garbage. Never thought to sideload.

The nagging in the Android launcher has gotten more and more obnoxious recently. Half the time I open an app it asks if I want to "add it to my discover", with the only options being "yes" or "later". Infuriating.


Oh wow, I was put off from getting one because of the ad update a few years ago. If this works well enough I'll probably actually get one now


Are you saying this works with a Shield TV?


Yes, I've switched out the launcher on my Shield TV to one that only shows a few app buttons. It works fine.


It is possible to plug your Shield into your PC and sideload a new launcher which doesn't have ads; it's not too difficult (though installing adb is a hassle).

I heard on Reddit that if Nvidia even hints that this is possible, they violate their license agreements with Google, and then Google is allowed to yank the Play Store...
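
For reference, the sideload itself is only a few adb commands once adb is installed. Here's a minimal sketch in Python wrapping the adb CLI; the Shield's IP, the APK filename, and the stock launcher package name are assumptions from memory and may differ on your device/OS version.

    import subprocess

    SHIELD_IP = "192.168.1.50"        # assumption: your Shield's address on the LAN
    APK = "flauncher.apk"             # assumption: APK grabbed from the FLauncher releases page
    STOCK_LAUNCHER = "com.google.android.tvlauncher"  # assumption: stock Android TV launcher package

    def adb(*args):
        # Thin wrapper around the adb CLI; raises if a command fails.
        subprocess.run(["adb", *args], check=True)

    adb("connect", SHIELD_IP)   # wireless adb; enable Developer options + network debugging first
    adb("install", "-r", APK)   # sideload the custom launcher
    # Disable (not uninstall) the stock launcher so the custom one handles the Home button:
    adb("shell", "pm", "disable-user", "--user", "0", STOCK_LAUNCHER)

If anything breaks, the same `pm` tool can re-enable the stock launcher over adb.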


Do you know if it has to be plugged in to sideload a launcher? I sideload apps through wireless adb and it was pretty easy to get set up.


Should work but I haven't tried it


That works. That's what I did


The home screen ads are infuriating. I know it's Android, by Google, but I bought the freakin device. It wasn't cheap. And they added this retroactively. Shame on Google and shame on Nvidia for allowing it.

I used a 3rd party launcher but there wasn't a way (at the time) to make it the default.


Personally, I don't see what the big deal is. I don't have an Nvidia device, but I do have an Android/Google TV. I really don't care about all the "recommended" crap on the home screen; I just ignore it all, and use my remote to skip down to the SmartTubeNext icon and open that app to watch YouTube videos ad-free (no sponsor messages either!). Or I use a different button that leaves the home screen and takes me to a file manager to view the files I've loaded on my USB stick so I can watch those (I decline to say where these files came from exactly). These are the only 2 things I need out of a TV.


I've been using SponsorBlock in addition to basic ad blocking for a year or so now. Any device that doesn't offer both is simply a complete non-starter for me.


You mean something like this?

https://github.com/yuliskov/SmartTubeNext/


I mostly stream to "dumb" devices connected directly to a display. I hate smart TVs. You can still get the TV-style interface by just opening `youtube.com/tv` in the browser, and you can even use a remote/phone to control it.

For mobile you can use ReVanced.


That app is a godsend.


What kind of ads? I see recommendations for movies that I would have to purchase, but not general ads, on my Chromecast.


Those are ads.


It's not like they're showing you ads about shoes or kitchen utensils. It's a TV appliance so.. what else did you expect? A blank screen with just the Netflix icon? Are the Netflix recommendations ads too?


> what else did you expect? A blank screen with just the Netflix icon?

Yes

> Are the Netflix recommendations ads too?

Yes


We're entering troll space, I see.


Not really.

It's just that the tolerance towards unsolicited promotional material / ads varies from person to person.


It's not trolling.

They are ads, and I also would prefer a tastefully blank screen with none of them.

Just to add support to that post.


Call me old fashioned, but my TV should display exactly what I tell it to display and precisely nothing else.


This is your brain on ads.


> I pay for YouTube to avoid ads

As Google clearly doesn't respect your boundaries, maybe stop giving them money?

eg stop rewarding them for their current behaviour


I choose to pay for YouTube Premium to support the content creators. I doubt they would get paid if I used an ad blocker.


Interesting. Most of the content creators I tend to watch (not that many really) organise their own sponsorships. eg "This video brought to you by XYZ"

For those, using an ad blocker won't make much difference. But I guess if the content creators important to you are relying on youtube's monetisation, they would indeed be affected.

Not sure how else to send a signal to Youtube though, as they only seem to care about revenue. Seems like there's only that one lever that can be pulled. eg "ad blocker or not?"


Interesting, I haven't seen any ads, although I do use Dan Pollock's hosts file on my router to block a lot of stuff: http://someonewhocares.org/hosts/


I was on the verge of buying a Shield but this has put me off.


I had 5 Shields in my family that I maintained/advised people to buy; they all went in the trash and got replaced by Apple TVs as soon as Google callously and retroactively forced tracking and ads onto the Shield.

I loved what the Shield was, but what it is now is utterly unacceptable.


How wasteful. Why not give them away or sell them?


I like the hardware in the Shield but Android TV kind of kills it for me. When I was still using my Shield, the UI would frequently get updated, and each update made the home screen worse and worse.

I wish they would just copy the Apple TV's home screen. All anyone needs is a grid of icons and NO ADVERTISEMENTS. There is no more room for innovation here. Stop trying to re-invent the wheel. Just admit someone else already perfected it and move on.


I have an Nvidia Shield Pro and I have to say it's been a significant cause of frustration for me: 1-2 second audio drops on occasion, apps with glaring bugs (such as freezes or the inability to skip forward/backwards) that I suspect are due to lack of adoption, adverts on the home screen. I haven't switched off of it, I guess, but I wish I had something better.


Fascinating, thanks. I've been considering the Pro, but my current one still does everything I need it to and more so I can't justify the purchase, but this makes me a little more content with my status quo.


No freezing or notable stuttering for me. I switched to wired internet for consistently higher-bandwidth video, as my wifi is super swampy. Some applications like Netflix can normally drop you to a lower bitrate to compensate for bad conditions, but most applications will just degrade into stutters, skips, etc.


> Buy AMD cards if you run Linux.

Experience shows that in practice they aren't much better than Nvidia with regard to Linux.


I don't feel like I share the same planet with this comment. AMD on Linux is vastly better in almost every way. Nvidia is a terrible partner; the binary/closed/proprietary drivers they offer have tons of wrinkles/issues, and whether Nvidia even bothers supporting modern capabilities everyone else agrees on is, and seemingly always will be, in question.

Nvidia's whole role in life seems to be competing against standards. Everyone else heads one way, Nvidia tries to out-compete by making their own. Oh great, there's DLSS support in the new binary driver... FSR only works because AMD did it at a share-able level. FreeDesktop is making GBM? Let's invent EGLStreams & insist on that for a decade. Nvidia has no visible cooperative spirit.

AMD has good drivers, super reliable, worked on by a wide variety of the gaming/computer-graphics industry/world. There's a huge trove of Proton games that run exceedingly well & without fail: proof positive that AMD has the situation down.


Some people's planet is graphics-only, and for them AMD is a great choice.

AMD is pretty great on Linux so long as you only ever want graphics. Once you want anything else out of their very-good-on-paper hardware, the experience starts to break down. You have to manually install/uninstall/reinstall their ever-shifting compute driver(s) and which driver you install (and the cadence at which said driver is updated, particularly for newer kernels) depends heavily on the device you have. E.g. I have a 5700XT and W5700, very similar hardware, yet supported by two very different drivers (I'll let you guess which one lags much farther behind - hint, it's for the more expensive card).

In contrast, Nvidia's driver has essentially always worked - I install the same driver on all my systems, and it gets updated frequently enough that it's always compatible with the current Ubuntu kernels. Over a dozen or so machines (personal, not counting work) over the last ~15 years, I've only encountered one or two serious issues with the Nvidia driver stack. Compare that to ~5 AMD machines over the last ~6 years where I've encountered at least one major issue per machine each year with the AMD driver stack.


Since when is FSR the standard, or even a standard at all? Apart from AMD using it.

Also, in my experience Nvidia works just fine on Linux; you just have to install their drivers. I literally never had an issue with them, and I use it professionally and for games. AMD is probably better if you want bleeding edge updates or use a rolling release distro, but the downside is that you still have to contend with some weird driver issues that aren't magically fixed just because the code is open source.


While FSR isn't a standard, AMD at least put effort into getting it working on competing Nvidia cards.

Yes, things mostly work when you install the Nvidia Linux drivers. But even on an Ubuntu LTS I've had updates break the Nvidia driver, and sleep is a 50/50 toss-up as to whether it resumes broken too. And that's on my desktop, ignoring the giant PITA of using a laptop with hybrid graphics. Using the Nvidia drivers also greatly complicates using a libre distro like Debian or Guix because obviously they're non-free. Plus, until the past year or so, Wayland has been essentially a non-starter with Nvidia even with drivers, because of their choices.


I agree that you shouldn't use Nvidia if you want any distro that prefers libre software. For most regular users it's just fine, except for Wayland as you said. I wonder if Nvidia has any plans w.r.t. Wayland support. Everything still works with X, but Wayland has finally shaped up to probably displace it.


The one major downside I'm experiencing using an Nvidia card (980 Ti) is sleep/resume not working. eg at resume time the screen rarely powers back on, then the system hard locks with the kernel traceback indicating an Nvidia problem.

Powering off/on instead doesn't take all that long, but it would be more convenient to just have working sleep/resume.


AMD decided that my APU card wasn't worth their time on the new open source driver. fglrx => OpenGL 4.1, AMD's new open source driver => OpenGL 3.3, without hardware video acceleration.

Secondly, good luck having a go at any compute framework offering from AMD on Linux, unless you're working for some research lab doing HPC.


Even with the research lab and elbow grease, the thing is hard as heck to use.


You can remove the "tries"; Nvidia is out-competing everyone.

You can hate the company but their products are second to none.


Yeah, sadly I agree, depending on what you want. If you want OpenCL, you're absolutely right. The AMD PRO driver is even worse than the Nvidia driver, as it only supports old kernels and the install process is terrible.


Anything that uses Qt will suffer on Nvidia.

There are pros and cons to each, but Radeon will get you a more Intel-like experience.


As a KDE Plasma user on a rolling release bistro, I am happy I switched to AMD!

And yes, I realize how much of a niche I inhabit.


man who use rolling release bistro, hungry for fast food


Is this true? I'm in the market for an AMD card just to switch over to linux. I'm on an old 1080.

Most discussions centered around the various linux communities seem to parrot that AMD is the way to go now because of open source drivers etc. I always wonder how much of it is an echo chamber.


I run Linux on everything, and have a system with an Nvidia card and one with AMD, so I'm speaking from real experience rather than echo chamber regurgitation (which is definitely a problem in this space, so good to be aware of it). I will go AMD every time for gaming and general dual/triple monitor setups. I would only go Nvidia if I needed CUDA (such as for machine learning).

If you need OpenCL, or if you go Nvidia and need the proprietary driver, it will make distros like Fedora (my favorite), Arch, etc. a lot more difficult, as they stay very current on kernels (which is great for hardware compatibility). Nvidia is better than AMD in this regard: if you need the AMD Pro driver, then you just won't be able to run Fedora. If you don't mind Ubuntu, then you'll be ok. It's well supported for both the Nvidia drivers and AMD Pro.

I have been amazed at how well everything "just works" with AMD on Fedora, and for that reason AMD is my default.


It depends. If you are willing to download proprietary drivers from Nvidia, it literally just works. Though I'd probably suggest not getting on a rolling release distro, or any distro that is focused on mostly sticking with libre/FOSS solutions. Ubuntu works great with non-free drivers out of the box, so just use that. Kernel updates might be tricky on some distros, so sticking with more stable releases is preferable in my experience.

(I have used AMD and Nvidia cards, including a 4x GTX 1070 cluster a few years back, with few issues on both. I've also used GCN and RDNA1, and now I use Nvidia on Ubuntu for work and it just works.)


If you want to run Linux and have your system work as nicely as possible, go with NVidia; they write better drivers than AMD and always have done.

(The counterargument is: if you're going to use a proprietary driver why are you switching to Linux at all?)


I don't agree with this. I've been using an Ubuntu system for ages with Nvidia and it regularly needs a reinstall of the drivers when it randomly decides to come up in some silly low resolution, like 320x240 or something.

My next linux GPU is AMD.


With the first-party NVidia drivers? I've never had that happen or even heard of it happening before now. Admittedly I don't use Ubuntu much, and they sound like the kind of distro that would silently auto-update your kernel or something, which would produce that kind of failure mode? (Which is at least a clean immediate failure on reboot that you can fix immediately, rather than something that crashes in the middle of your session destroying your work - as bad as it is, that still sounds a lot better than my AMD card experience).

Whereas I've used linux on several systems with AMD/ATi cards and they've always been flaky.


Yes, the .run drivers from Nvidia.

It only happens on a reboot, not while I'm using it, and only maybe once a month, but it's still annoying because I have to ssh in to reinstall the drivers every time.

And here I was all hyped that AMD would be my savior. I hope you're wrong!!

I'll find out for myself the next time I upgrade and I switch the current AMD card that I'm using on my Windows PC into the ubuntu system.


> Yes, the .run drivers from Nvidia.

If you're manually installing them outside of Ubuntu package management then you need to reinstall them every time you upgrade your kernel. (This is a design decision by Linux to make life harder for closed source drivers; there's nothing Nvidia can do about it.)


Using the driver from their site is the number one way to fuck up your system. There has got to be an nvidia package on Ubuntu.


In my experience the Ubuntu Nvidia packages were hopeless; I forget exactly what it was, but I think even basic stuff like hardware video decoding was not working.

Anyway just basic stuff wasn't working and IIRC the GPU fans would stay at 100% all the time. Just terribly broken.

I'm overall happy with the .run drivers from the site except when they come up at the wrong resolution and need reinstalling as I described.


I recently switched back from AMD -> Nvidia and couldn't be happier. Other than some issues with an inadvertent install blocking the kernel module it has been pretty much flawless.


Could you give an example of a TV show that blew you away when processed with Nvidia Shield's upscaling? I tried the feature while streaming HBO's Sopranos, going from 1080p to 4k and it didn't look any better to me, just slightly different with occasional artifacts. I didn't understand the hype around this feature.

I believe your TV's native upscaling quality will also be a factor in how much of an improvement the Shield's implementation appears to be.


I can: Star Trek DS9 (DVD rips). I actually had forgotten the Shield was supposed to be doing AI upscaling since a lot of what I watched for a while was current.

I was midway through the series when I traveled somewhere else and used a Roku on a different TV - and it was instantly noticeable. When I got home, I plugged in a Chromecast Google TV to my TV (LG OLED) and watched parts of an episode.

The Shield TV made DS9 look significantly better.


This happened to me with TNG! I was blown away by the picture quality. That's not something my wife usually notices, but even she commented about how good it looked. I later watched it on a different device and the quality was noticeably worse.


TNG did have a massive amount of time put into remastering it, though; DS9 sadly did not.


Others agree with you:

> The Shield upscaler works, but if you have an eye for quality it is terrible. The sharpening it uses leaves a very noticeable sharpening ring around edges, most noticeable on animated content. It will also cause weird artifacts to pop up on live action content.

https://www.reddit.com/r/nvidia/comments/11e7e2h/rtx_video_s...


I'm tempted to get one, but it doesn't support HDR10+ and some DV profiles, or AV1. Hopefully once the new Nintendo Switch comes out (as that's been the cadence for new Shields) an updated version will come out with updated support.


My biggest issue with the Shield is that they still have not implemented refresh rate matching across the OS; all it would take is an OS update, since Google has already implemented it.


For the newer GPUs, Nvidia's open kernel modules have caused less hair-pulling for me and are much more comparable to amdgpu than the traditional modules. Recently I read that Nouveau is working on re-clocking support on newer GPUs via GSP, which is just starting to be upstreamed; that'll be even nicer if it materializes.


It's good that there's at least some way to load the whole smorgasbord of proprietary drivers/firmware now that doesn't constantly break, but this is still so far from great.

Nvidia picks & chooses what to implement, if they deign to do the work at all. And actually open-source Nouveau is wonderful, but wow, they have like 1/20th the chip-maker resources to read from/work with & 1/20th the developer headcount of Intel or AMD open-source driver development. Phoronix is posting RADV stories every other week with cutting-edge industry folk making this or that innovation, & Nouveau shows up like quarterly, maybe.


The dearth of Nouveau interest largely has to do with the last 4 generations of cards being limited to the power-saving clock speed. The new method of using the GSP in the open driver will remove that limitation and allow work done on Nouveau to actually be usable again.


Can the open kernel modules do CUDA? I read somewhere that they couldn't, but that could be out of date now, and it would be a game changer for me.


Yes, another + from me for the Shield. After ~5 years now, it's still running great. Using it for Kodi/SmartYouTube/Steam streaming. Only had to remove the inner dust last year.


> Buy AMD cards if you run Linux

They don't do CUDA, so they're useless.


A generous translation:

AMD GPUs are toys. As long as AMD drops the ball on ROCm, etc their cards will be relegated to gaming - quite literally a toy to many people.


I know from the DaVinci Resolve forums and subreddit that everyone using AMD has all sorts of pain-in-the-arse problems that just don't happen with Nvidia, even on Windows.


Yeah the downvotes are interesting.

Many users who are happy with "I want a GPU to drive my display and maybe game" are pleased with the current state of AMD - and that's fine. However, that population rarely goes beyond "how many FPS am I getting?". That population regularly complains about the Nvidia closed-source/binary driver situation. A lot of that criticism (from their point of view) is fair - although I've never had issues with the binary driver and as seen in this thread I'm not alone.

However, if they were to expect even just a little bit more - ML, reliable and usable hardware video encoding/decoding, etc. - they'd quickly realize what we have. The closed-source Nvidia driver is remarkably capable and consistent (from a functionality perspective), supporting every single feature and function of any and every card that says "Nvidia" on it across multiple platforms. From nearly decade-old laptop GPUs to the latest and greatest H100 datacenter GPU and everything in between. This is before you even get to the dozens of libraries Nvidia supports and provides (like cuDNN, TensorRT, etc.) that have the same consistent, reliable, and cross-platform functionality.

Again, I understand their perspective and I don't look down on them or their simple use case (gaming on desktop), but in 2023 that's an extremely low bar. There's also a lot of criticism of the Nvidia price point - but those of us who are making full utilization of our hardware and supported features balk at that criticism. What's also interesting is the buggier, less supported, and also closed-source (to my knowledge) AMD Pro driver that's required to do anything more than game. So at this point you're still not going to please the open source community, and the functionality and experience are worse in every respect.

My 4090s will not only beat AMD's top of the line on just about every gaming benchmark; in seconds (or simultaneously) I can be training an ML model, encoding video, upscaling, doing realtime noise suppression, etc. The functionality fundamentally supported by that driver is amazing. When you look at it from that standpoint, an Nvidia card being (at most) 3x the price point is perfectly reasonable. I'm thrilled to spend $1000 more if it means I can actually use the card for what I need and I save many, many hours of my time. In terms of total value it's not even close.

The increase in value, utility, and time returned is astounding and (to many of us) more than justifies the driver situation and cost. I support competition and have tried several times over the years to get ROCm to perform the most basic ML tasks - it's a black hole of wasted time, energy, and effort (starting with the AMD Pro driver). AMD clearly just doesn't care - and even if they decided to throw everything they have at it, Nvidia has a decade-plus head start.

I truly wish the AMD software situation was better but I've been let down too many times to be hopeful I will be making anything approaching full use of any AMD GPU anytime in the near future.


Though the removal of Gamestream is a real pain, IMO. I'm still mad that Nvidia have done this, but am thankful that Sunshine exists.


> the Shield can produce picture quality that looks incredible out of source that isn't all that great.

So much for garbage in, garbage out.


Hmm, doesn't seem to be scaling early seasons of Seinfeld to HD. I could use that!


It is interesting that YouTube is silently removing the ability to watch old videos in HD, after making it difficult to sort by oldest on a channel.

I have videos uploaded in 1080p five or so years ago that now only give the option to watch in 360p.

Anyone else notice this?


That should be a warning to people who think YouTube, and Alphabet by extension, have any interest in archiving knowledge for the greater good. This is commercial interest and commercial cost cutting.


While I have some sympathy with this position I think it's worth remembering that there's a huge long-tail of absolute rubbish on YouTube.

The fact that they keep it around, for free, over years and years, and at no cost to the user is remarkable to me.


Yes, YouTube is quite good if your data is absolute rubbish and you’re pleasantly surprised by anything less than full deletion.


Expecting any company to store infinite content for free forever is a bit curious.


Yeah I remember when image sites all stopped their free tier a few years ago and it broke a lot of old forum posts.

There's a reason why serious archiving of knowledge is left to government institutions.


Is it actually in anyone's best interest to permanently store everything that has been uploaded to YouTube? I think it would be very reasonable to purge every video uploaded by a non-partnered channel that hasn't uploaded in 5 years, or any video that hasn't been watched in 10 years.


Let me fix it for you: "... it would be very reasonable to purge every public video uploaded by non-partnered channels that haven't uploaded in 5 years, except the very first and the very last video by that channel, along with the videos that haven't been watched in 10 years."


Personally that would delete a lot of niche music videos where it's been like 10 years since they last uploaded to that channel.


There is niche and then there is "no one has watched even a second of this in 10 years"

As much as I don't like Google or YouTube, the expectation that they will just store whatever stupid stuff you post forever is unreasonable.

Like, I posted some gaming highlights when I was a teen, and I don't think anyone but me has ever watched them, and I don't even have access to that account anymore. All of that stuff should be deleted.

Even now I occasionally have an itch to make a video and I am the only one who watches them, but at least they are good enough for me that I return to watch them.


There's some sort of change going on behind the scenes. I have a bunch of 5K (2880p) footage on YouTube that now maxes out at 4K (2160p). YouTube silently removed all 5K versions of videos (nothing between 4K and 8K now).

It really seems like they may put 4k and above into their YouTube premium offering or something?

https://support.google.com/youtube/answer/6375112

> Note: In 2022, we started removing support for playback at resolutions between 4K and 8K. For example, we may no longer support playback at 5K.


I guess it's not exactly shocking... what better way to reduce their storage requirement than to downsample old and not-frequently-watched content.

I bet 95%+ of videos on the platform fall into this bucket.


Notification of this fact by youtube would've been nice to be honest - and perhaps a way to pay to keep them in high quality.


Highly unlikely, we’re talking about Google here.


One day, the future may regard YouTube the way it regards the BBC, which recycled and overwrote Dr Who tapes to save a few bucks.

And lost handsomely as a result.


In fact, unfortunately, almost every TV station did that at the time; some of them preserved their material on film, but not all of it.


TV station?

Dr Who wasn't a daily TV-station talk show, and studios tend to archive.

The BBC gets the footnote in history, here.


No, it happened in my country too. Technology was pretty shit in the 60s; TV was a new medium for European public broadcasters and they had to budget.


The most egregious example I know of is a "60fps audio video sync test" now only available in 30fps


Yup, they are potentially ruining rare captured moments.


I should probably download all the old music videos I care about before it's too late


A local copy is a good step forward for content you truly care about. I don't think there's any service you can really count on over the decade-long term. Meanwhile, I still have the music and photo collection from all my life. Is it a worthy pursuit? The more I grow, the less I think so. Does it work? Absolutely yes.





Can you have some sort of bot do comments like these in a thread as a sort of changelog? Sometimes comments are referring to a previous link or a previous headline.


I considered submitting that instead, but unless you already know what "RTX Video Super Resolution" does the headline is less descriptive


I have watched 3kliksphilip and LinusTechTips demo AI video upscaling stuff, and it always looks really, really weird and gross. It seems really unnatural, and has the same kind of "nope, that's wrong" feeling as if you maxed the saturation on a photo. It feels more like a high schooler touching everything up in Photoshop than correctly understanding what the objects in the video should look like.

Also, in their recent demo of a video upscaling technology, Nvidia clearly kneecapped the pre-upscale footage to make it seem like the upscaler was doing more than it actually was. If you take the source video they were using and convert it using settings they claimed they were using for a low bitrate version, you get a much clearer and cleaner video than they showed.


> Nvidia clearly kneecapped the pre-upscale footage

I'm not so sure. High bitrate content like game streaming does end up losing visual quality with lossy encoding. The average 1080p Twitch stream struggles to look as clean as Nvidia's 1080p source footage.

Given the circumstances, I think this is a great option. Most videos don't look right when encoded on YouTube or blown up at 2x resolution. A destructive AI pipeline is your best shot at reducing the destructive artifacting introduced at the encoding stage.


This news is still something to celebrate, as it shows the way forward.

The information is in the video files; our brains can't just use it, but math can.

I can imagine using/buying an upscaling model optimized for an old TV show, for example. Or using a model for my previous game streams.

There is temporal information, and there are high-res pictures of sets, styles, or actors from the time of the recordings. All of that information can be used.

If it's not yet good enough, it will be.

Let's see if this leads to a global standard, or plug-and-play open-source models, or whatever.

And if you look at other Nvidia research, like where the model learns a face and then interpolates all further movements, that's also a really good use case.


Is it handled by a browser extension? Curious if it might be hacked into Firefox.

There's a suggestion to add "AMD FidelityFX or similar technology" here: https://connect.mozilla.org/t5/ideas/video-upscaling-in-brow...


It's a plug-in to the Windows ID3D11VideoProcessor API, https://source.chromium.org/chromium/chromium/src/+/main:ui/...


I'm using it right now and I didn't install an extension. I believe it's baked into Chrome. It required a chrome update to work.


It doesn't make much economic sense for millions of people to upscale vs YouTube doing it once and storing the output.


I think you would need to understand in some detail the makeup of all the content on YouTube before you can make that choice.

As a hypothetical example, say yt had one user and 1 billion years of video. It makes a lot more sense for that one user to do up scaling than it does for yt to transcode all that video. The reverse is also true i.e. 1 video, 1 billion users = yt should do a lot of transcoding.

The real solution (in the economic sense of minimising overall cost) is probably a compromise where yt transcode their most popular content, and if you're interested in the long tail of unpopular content then you can invest in upscaling.
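
As a toy break-even model (every number below is made up purely to illustrate the shape of the tradeoff, not a real cost estimate):

    # Hypothetical costs, invented for illustration only.
    transcode_once = 0.10   # $ for YouTube to upscale/transcode one video server-side (assumption)
    store_per_year = 0.02   # $/year to keep the extra upscaled renditions around (assumption)
    client_upscale = 0.001  # $ of viewer-side electricity per upscaled playback (assumption)
    years = 5

    # Server-side wins once total views outweigh the one-time transcode plus storage:
    break_even_views = (transcode_once + store_per_year * years) / client_upscale
    print(break_even_views)  # 200 views over 5 years with these made-up numbers

Above that break-even point, transcode-and-store is cheaper overall; below it (the long tail), client-side upscaling is, which matches the compromise described above.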


Course it does: they don't need to run the upscale job, don't need to store it, and they get to sell more hardware.

Maybe not for the user that pays more to upscale through having to buy hardware or pay extra for electricity (minimal, but still). At least they save on their b/w :)


This is the opposite of economic sense.

Store the 99% of videos that people watch less than a few times a year? Those people upscale.


It would be cool if there was a way to upscale old Brood War videos to look as if it was being played on the new Remastered edition. I'm not sure if the technology is quite at that point yet, but it seems like an increasingly tractable problem.


I think the likely avenue there would be using something similar to Stable Diffusion to perform style transfer. You may not need to run SD for every frame if you can intelligently generate sprites that can be re-used. We are getting there.


Currently only supported by Chrome and Edge. Other Chromium-based browsers seem not to be supported yet (e.g. Vivaldi). Firefox is probably not (yet) supported at all. Checked on Vivaldi and Edge with the Revolution OS video on Youtube.


Works on Brave for me.


Oh yeah, just updated Vivaldi et voilà! AI artifacts in blurry videos! xD


Old games like Quake always look pretty janky when upscaled (extreme example of course). The low-poly models seem to become worse the closer you can look at them.

I wonder if there's a level of detail that just fundamentally can be upscaled indefinitely and end up vaguely "looking right", or if it's just the case that something like Apex Legends was released recently enough that the lift is not so huge.


Maybe the "upscaler" just needs to be a really good CRT emulation.


Yea, was gonna say that CRTs basically gave hardware-level anti-aliasing to every game for free. Playing the same games on an LCD TV makes them look strange and jagged edges stick out a lot more.


CRT also gave extremely high smoothness for free. 60Hz there looks a lot better than on LCD etc.


That's down to the longer persist. LCDs go on and off (almost) instantly; CRTs have a certain amount of "persist" where the phosphor glows for a moment after the beam has left it.

It's also (kind of) why interlacing worked, when black-and-white CRTs had a persist about as long as one field. This would make fast action a bit more "smeary" though, and you can think of its effect as similar to a wider "shutter angle" in a camera.


From online reading and my personal observations, CRTs have lower phosphor persistence than LCDs, resulting in both the flicker and clearer motion. I think interlacing works more due to persistence of vision, and because the beam is thick enough to nearly fill the scanline gaps. You can actually simulate interlacing on a LCD by showing simulated scanlines, alternating by half a line each field/frame. There's a video demonstration at https://youtu.be/tS0cFwvDWkE?t=480.


The visual properties of CRTs are a surprisingly complex topic.

Still images:

- 240p and 480i (15 KHz) console games were meant to be played on SDTV CRTs. Most consumer TVs have relatively coarse dot pitch (white lines are actually composed of color dots/stripes spaced relatively far apart). This adds "texture" to game visuals, masking aliasing and blur to a degree.

- - I think low-res LCDs actually have a similar horizontal texture to aperture grille (phosphor stripe) CRTs, but dot/slot mask (dot/slot-shaped phosphors) CRTs look quite different from LCDs when seen close up.

- - VGA CRT computer monitors have a much finer dot pitch (adjacent phosphors of the same color closer together) and higher TVL (phosphors of the same color per screen height). This makes them look closer to LCDs, since the color dots/stripes are so close together they can disappear at a normal viewing distance. This is not necessarily a good look for 2D games, or even pre-HD (PS2/GC/Wii) 3D games.

- Unlike LCDs, CRTs have visible scanlines (lines of light traced by the three electron beams, horizontal on wide monitors and vertical on rotated/tall arcade monitors). On a properly calibrated CRT, the three colors of scanline are vertically aligned (convergence errors cause the scanlines and/or horizontal changes of brightness to be misaligned between the three colors). On many (not all?) CRTs, each color's scanline gets taller (vertically wider) when the color is brighter, creating a "blooming" effect utilized by games. The scanlines are usually narrow enough on SDTVs that there's a visible black gap between adjacent lines of light; modern emulators' "scanline" filters add black gaps between rows of pixels.

- Most consoles output a horizontally soft image (or composite video blurs it horizontally). In 480i on a SDTV, adjacent scanlines overlap. Both effects blur the image and reduces high image frequencies, acting as "hardware-level anti-aliasing to every game for free". Additionally, horizontal transitions between colors and black cause the scanline to become narrower as it fades out (on many TVs), which is another characteristic of the CRT look: https://twitter.com/CRTpixels/status/1599513805717135360

- Unlike SDTVs, high-quality properly-focused VGA monitors have scanlines so narrow that drawing 240 scanlines across a 17 inch monitor produces taller black gaps than scanlines, even at full brightness (which produces maximum scanline height). This is very much not an attractive appearance for 240p games. One solution is to display each line twice in vertically-adjacent 480p scanlines (line doubling). Alternatively you can send a high-resolution (960p or above) video signal to the VGA CRT, and add emulated scanlines in a GPU shader or hardware scaler (ideally RetroTink-5X).

- - Note that visible scanlines boosts high vertical frequencies up to (and to an extent, beyond) Nyquist, effectively acting as alias-boosting filters rather than anti-aliasing! *If you want phosphor dot texture and a smoothed on-screen image, which CRT TV/monitor you pick matters as much as playing on a CRT!*

- - Both 240p and 480i games were meant to be played on SDTVs. 240p games were meant to be seen with scanline gaps, while 480i games were not.

- - As an example of "alias-boosting", my 17-inch VX720 VGA monitor has visible scanlines when running at 480p. They're beautiful to look at, and work well in arcade-style games with large objects like Super Monkey Ball 2, or bloomy games like Mario Kart Wii. But they boost aliasing to the point that many GC/Wii games (especially Wind Waker with half-resolution DOF distance blur) look objectionably aliased (worse than a regular CRT or even a LCD display). As a result, when playing 480p games through my GBS-Control, I often configure it to upscale to 960p with vertical bilinear filtering ("line filter").

- I'm not actually sure what CRT monitors (dot pitch, beam sharpness, maximum resolution) PC games from various eras (DOS through 2000) were meant to be seen on; PC monitor resolutions have evolved substantially throughout the decade or so, while pre-HD consoles have universally targeted 15 KHz TVs. I'd appreciate input from someone who grew up in this era of PC gaming.

Interlacing:

- Console games since the PS2/GameCube era primarily output a 480i video signal, where every other frame (technically field)'s scanlines are vertically offset by half a line, causing solid colors to appear "filled" without gaps between scanlines. However, if your eyes are tracking an object moving up/down by half a line per field, then the scanlines line up again and gaps reappear. This is because each scanline is only illuminated for a small fraction of a field, and by the time the next field appears and the scanlines have moved half a line up/down, your eye has moved that distance as well.

- In 480i video mode, if even and odd scanlines are different average brightnesses in an area of the image, the image will flicker when displayed on an CRT. To address this issue, many video games enable "deflicker", which blends together (on Wii) 3 adjacent scanlines when outputting the video (technically when copying an image to XFB). This blurs the image vertically but eliminates flickering between fields.

- - Some GC/Wii games enable "deflicker" (vertical blur) even at 480p output (with no flickering), which softens the image vertically; Nintendont and Wii USB loaders come with optional hacks which hook games to prevent them from turning on vertical blur.

- - And for some reason, in certain translucent foggy/glowing/shadowed areas, some GC/Wii games actually output alternating brightness per scanline (like vertical dithering) and rely on the deflicker filter to smooth it out to a uniform color. Running without deflicker enabled results in horizontal stripes of alternating colors, which I'm (usually) fine with but some people find ugly.

Motion and latency:

- Video signals, both analog VGA and digital HDMI (and probably DisplayPort as well, unsure if DP's MST alternates screens or interleaves them), always deliver the screen's contents from top to bottom over the duration of a frame. Displays (both CRTs and flat-panels) update from top to bottom as the image arrives into the screen. As a result, the bottom of a screen has nearly a full frame of extra latency compared to the top of the screen (unless you configure the game to turn off both vsync and triple buffering, resulting in tearing and the bottom of the screen receiving a newer video frame).

- - You can reduce the latency at the bottom of a screen by delivering pixels faster, and spending more time after each frame not delivering pixels (HDMI Quick Frame Transport, https://forums.blurbusters.com/viewtopic.php?f=7&t=4064).

- LCDs and OLEDs update the state of each pixel once per frame (from top to bottom as the signal is transmitted from the video source), and pixels continue transmitting light for a full frame's duration until the next update arrives. (LCDs also have additional latency from an electrical signal to the image changing, because the liquid crystals respond slowly to electricity.) CRTs instead light up each point on the screen briefly once a frame, and the light output decays towards black (within <1ms on VGA monitors) until it's lit again a frame later. (Note that oscilloscopes and radar screens may use phosphors with far longer decay times: https://en.wikipedia.org/wiki/Phosphor#Standard_phosphor_typ...)

- - As a result, CRTs flicker (bad). But when your eye is tracking a moving object, the entire screen is not smeared by the distance of a single frame like in LCDs, resulting in a sharper image in motion (noticeable in video games and even web browser scrolling). There are techniques (black frame insertion, backlight strobing) to produce CRT-like motion properties (complete with flicker) in LCDs and OLED screens (more info at https://blurbusters.com/gtg-versus-mprt-frequently-asked-que...).

- - Sadly, 30fps console games on CRT effectively lose out on this beautiful fluid motion property, because they display the same video frame twice in a row. As a result, when your eye is tracking a moving object, the object (and screen) appears as a "double image", with the two copies 1/60 of a second apart (at your eye's rate of motion). The same "double image" appears (but with a shorter distance, I don't know if it's perceptible) if you display a 60fps video at 120fps to avoid CRT flicker (or to feed 240p video into a 31-KHz-only monitor).

- - One idea I have is "motion adaptive black-frame insertion" (similar to motion adaptive deinterlacing), where an (ideally) OLED display displays a static dim image for nonmoving sections of the image (to avoid flicker), and a bright strobed image for moving sections (to avoid eye tracking smearing). I'm not aware of any monitors which perform this trick, and I'm not sure if the 1ms or so of added latency to perform motion detection (my best guess, judging by the minimum latency of a RetroTINK-5X Pro or GBS-Control) would make this product unpalatable to gamers.


Were Wii games designed to be displayed on CRT or LCD? If I recall correctly, LCD screen were pretty widespread by the time the Wii came out — at least among middle-class Americans.


I'd say a mix of both. The Wii offers both 480i SDTV (for CRTs and LCDs, through the pack-in composite video cable) and 480p EDTV (better for LCDs, but you had to buy a higher-quality component cable) output. Unfortunately 480p-compatible CRT TVs were rare and far between. Additionally 480p doesn't actually take advantage of the full resolution of 720p/1080p LCD TVs, resulting in a blurry image on-screen (but fortunately free of composite and interlacing artifacts, and deinterlacing latency on some TVs).

Did middle-class Americans commonly have 480p LCD EDTVs, or were they a rare transitional stage of television with little uptake? My family jumped straight from CRT (technically rear projection) to a 1080p HDTV.

Early Wii games were built to output in both resolutions, adjusting the camera width and HUD to match. I think System Menu actually looks better in 4:3, since the channel icons mostly add empty featureless left/right borders when stretched to 16:9. Some later games (NSMB Wii, Skyward Sword) only display in 16:9, adding a letterbox if the Wii is configured to play in 4:3. Interestingly, in NSMBW playing in 4:3 saves two seconds in one scrolling cutscene, because the game objects are actually loaded underneath the letter box, and appear "on screen" sooner, cutting the cutscene short (https://www.youtube.com/watch?v=gkt8L3t1GEU).


> Did middle-class Americans commonly have 480p LCD EDTVs, or were they a rare transitional stage of television with little uptake? My family jumped straight from CRT (technically rear projection) to a 1080p HDTV.

Scottish and very much not "middle-class", but I had a 720p-capable CRT TV in the early-2000s or so. I only got rid of it in about 2010, and still kind of regret it, but it was huge. Secondhand it cost about 400 quid in 2002 money! You can imagine what it must have cost new.

Around about the same time I fitted a VGA input board to a spectacularly expensive Loewe TV for a local high-end home cinema shop.


> Unlike LCDs, CRTs have visible scanlines (lines of light traced by the three electron beams, horizontal on wide monitors and vertical on rotated/tall arcade monitors).

Or in a triangle, on old delta-gun CRTs.

> On a properly calibrated CRT, the three colors of scanline are vertically aligned (convergence errors cause the scanlines and/or horizontal changes of brightness to be misaligned between the three colors).

Inline gun CRTs largely eliminated the need for convergence. I do not miss converging delta gun CRTs at all. Yes, I am quite old.


Sorry I meant that the scanlines are horizontal, not the guns.


Analog video:

- RGB (including VGA) and component (YCbCr, green/blue/red RCA jacks) video are theoretically lossless encodings. S-Video blurs color information (chroma) horizontally, but has theoretically unlimited brightness (luma) bandwidth horizontally.

- Composite video (single yellow RCA plug), as well as RF, encodes color as textured patterns (a 3.58 MHz sine wave in QAM with variable amplitude and phase) added to a black-and-white signal. TVs rely on various filters to separate color and brightness information, inevitably resulting in horizontal color blur, some degree of brightness blur, and (with some video sources and TVs) horizontal changes in brightness being interpreted as color, or changes in color being interpreted as brightness dot crawl.

- - Some games were actually built to rely on composite video to blur color horizontally (https://twitter.com/CRTpixels/status/1408451743214616587), or even smooth out dithering (https://twitter.com/CRTpixels/status/1454896073508413443).
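
A minimal numpy sketch of that "colour rides on top of the black-and-white signal as a textured pattern" idea, for a single scanline. The sample rate, scanline length, and U/V levels here are arbitrary assumptions, and real NTSC adds sync, colour burst, and specific filtering; this only shows the quadrature modulation itself.

    import numpy as np

    FSC = 3.579545e6   # NTSC colour subcarrier frequency (Hz)
    FS = 4 * FSC       # sample rate; 4x the subcarrier is a common toy choice (assumption)

    def encode_scanline(y, u, v):
        # y, u, v: per-sample luma and chroma for one scanline, already sampled at FS.
        t = np.arange(len(y)) / FS
        # Chroma is U/V quadrature-modulated (QAM) onto the subcarrier, then added to luma.
        chroma = u * np.sin(2 * np.pi * FSC * t) + v * np.cos(2 * np.pi * FSC * t)
        return y + chroma

    n = 910  # roughly one 63.5 us scanline at FS (assumption)
    # A flat mid-grey line with a constant colour tint, just to see the added "texture":
    composite = encode_scanline(np.full(n, 0.5), np.full(n, 0.1), np.full(n, -0.05))

The TV's job is then to separate that texture back out again (notch or comb filtering), which is exactly where the horizontal colour blur and dot crawl mentioned above come from.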

As for my personal setup, I mostly play GC/Wii games (rather than earlier generations of consoles), and run it through a component cable (lossless) at 480p (no interlacing), through a GBS-Control transcoder/scaler, into a Gateway VX720 17-inch Diamondtron VGA monitor. This monitor is useful because it's relatively compact compared to large TVs, has excellent geometry and no audible whine (31 KHz is ultrasonic), and is sharp enough to be used as a computer monitor. The downside of using this monitor with console games is that the phosphors are too fine to add texture, the image (electron beam focusing/scanline size) is too sharp to act as antialiasing like a real TV (so I have to simulate it with bilinear scaling to 960p), and it cannot display 240p games properly unless upscaled like you were displaying on a LCD.

Sometimes I also plug this monitor into my PC (RX570 -> https://plugable.com/products/dpm-vgaf) and use it as a second monitor. I bought this DP++-to-VGA dongle because I heard that DP-to-VGA dongles are less likely to have image quality problems than HDMI-to-VGA dongles, but unfortunately most of my devices don't have DP ports so I can't use this dongle with any of my laptops.

Additionally, my monitor doesn't broadcast EDID data when the front power switch is turned off (I power it down when not in use to reduce CRT wear). And my DP++-to-VGA adapter identifies itself as an "unrecognized but present output" when the VGA cable is unplugged or EDID is missing. So for my computer to properly recognize my monitor, I have to first power on the monitor and plug in the VGA cable to the dongle, then plug the dongle into the DP port (and if it's already plugged in, unplug it first, DP latch and all).

I used to have a 24-inch flat-screen Trinitron SDTV given away by another person. Unfortunately, the screen geometry linearity was poor due to high deflection angles (objects were wider in the left and right of the screen, causing scrolling 2D games to warp and distort unnaturally) and wide pincushion at the top of the screen. Additionally, the TV only ran at 15 KHz, did not support 480p from my Wii (which improves fine detail in fast motion), and had painful levels of high-pitched whine requiring I wear headphones and/or put layers of sound-muffling clothing around the TV's vents (and set up a forced-air cooling fan to replace the obstructed air circulation). I ended up giving it away for free (after two people on Facebook Marketplace ghosted me).

Sadly it's now common for eBay and Marketplace sellers to offer CRT TVs and monitors at absurdly inflated prices, waiting for a desperate buyer to pay hundreds (or thousands for BVMs and widescreen PC monitors) of dollars for a monitor, or most likely sitting on their listing for months on end with no takers. These listings tend to clog up search results if you're looking for a CRT TV. I'd advise you to look out for the occasional "free TV" Craigslist/Marketplace listings (and mythical "free VGA monitor" offers), or see if any electronics recyclers will sell their CRTs to you at a reasonable price ($40 for a 17 inch VGA isn't a small amount of money to drop, but it's downright generous compared to the average eBay scalper).


> Composite video (single yellow RCA plug), as well as RF, encodes color as textured patterns (a 3.58 MHz sine wave in QAM with variable amplitude and phase) added to a black-and-white signal

In the 1970s the BBC transmitted colour TV programmes but archived them on black-and-white film, shooting a black-and-white monitor that actually had enough bandwidth to display the 4.43MHz PAL colour carrier. Someone wrote software to decode this and recolour footage based on what they recovered. It's not great, but it's at least as good as VHS colour.

Unfortunately the only really good examples of footage captured both on film in mono and tape in colour is an episode of Top of the Pops, presented by the infamous Jimmy Savile. In a happier example they were able to recover the colour from a couple of lost episodes of Morecambe and Wise.

https://stardot.org.uk/forums/viewtopic.php?t=16161


I think it’s the latter.

Because, as you point out with low poly models, there’s so much that goes into the fidelity of a game beyond just output resolution - there’s polygon count on the models, resolution of the textures applied to those models, shadow and light effect resolution - each operating independently.

When a game like Apex is played at high settings but output at a low resolution, at that point upscaling it isn’t much different than upscaling a standard def DVD of a movie like Frozen or Avatar. You’re not creating detail in things like individual set pieces that didn’t previously exist, you’re filling in the blanks of the entire image blown up at once per frame.


I've just tried this and I can't tell any difference. Could it be because my laptop has both an Intel GPU and an Nvidia 3060, I wonder?


Have you tried this?

> To enable it, launch the NVIDIA Control Panel and open “Adjust video image settings.” Check the super resolution box under “RTX video enhancement” and select a quality from one to four — ranging from the lowest impact on GPU performance to the highest level of upscaling improvement.


Yes thank you, I did try that. I can enable it and turn it from 1 to 4 but I wonder if it's because Chrome and Edge are both using my Intel GPU by default.


From what I can see on Reddit it's just not working for some people - I can't see the difference on my desktop no matter what I try.

You need to set the video source lower than your monitor resolution and possibly set Chrome to high power mode (details here: https://nvidia.custhelp.com/app/answers/detail/a_id/5448). Maybe that'll help you - it hasn't for me.


I had to turn it on manually under Nvidia control panel. My GPU used roughly 100W more while playing a video on any of the super resolution settings, so it is definitely working when turned on. I didn't do rigorous testing of video source quality though.

The idea is cool, but I'm going to leave it off since I don't think I'll get much benefit out of it.


I tried it on my Ryzen laptop and was able to get it to work. You need to use a 360p video, apply the settings, and then restart the video for it to kick in. To really tell a difference, you'll want to compare screen captures.


I wonder whether this will work on uncompressed video such as .AVI or .MOV?

I have some HD 8mm transfers that I would like to try out 4k on.


.AVI and .MOV are containers and can contain many different things. But to answer you: here is a renderer with instructions for MPC. https://github.com/emoose/VideoRenderer/releases/tag/rtx-1.0


I would love to use this to make some of the weird crap from the 70's I have kicking around on my NAS more watchable.

My favorite and weirdest thing from the 70's, which is only available as barely-watchable youtube vids:

https://www.imdb.com/title/tt0078697/


Is it possible to process my own videos with this? Is there a python/cuda API?


Oh god no... Hopefully it does not become some default behavior in the hardware-accelerated decoder... or even worse, AI-based codecs...

It's great until it's not. If I must choose between a blurry video where I don't know what some detail is, or a sharp video where I don't know if some detail was actually there, I'll go with the blurry one, thanks.


Will this also fix how YouTube always starts high-res videos at the lowest resolution, making them look like crap until you manually reset it?


Perhaps this[0] plugin will help? It's only for Firefox but I'm sure there are equivalents for other browsers. You can set a plethora of options, including auto-selecting playback quality of videos, playlists, pop-up players, embedded videos, when in full-screen mode and more.

[0]: https://www.mrfdev.com/enhancer-for-youtube


I've been using this extension for years and consider it a must-have. It turns the hot mess that YT has become into a decent experience. The site says it's available on Chrome and Edge, as well.


I know this might be an odd request but does anyone know if there's a way I can get youtube into a view mode where it makes the video as wide as the screen but I can still scroll the comments below, without scrolling the video off-screen?

Like "Theater Mode", or even the default view, but the video stays fixed and the rest of the page scrolls. I don't like the popup player.

I use Firefox if it makes a difference.


This must be such an easy network to train. Get high-res video. Downsample it. Add artifacts. Train the network.
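
For concreteness, here's a sketch of generating (degraded, pristine) training pairs with Pillow; the degradation model (bicubic downscale plus an aggressive JPEG round-trip standing in for codec artifacts) is my assumption of what "add artifacts" would mean, not how Nvidia actually trained theirs.

    import io
    from pathlib import Path
    from PIL import Image

    def make_training_pair(path, scale=4, jpeg_quality=30):
        """Return (degraded low-res, pristine high-res) images for one frame."""
        hi = Image.open(path).convert("RGB")
        # Downsample to simulate a low-resolution source.
        lo = hi.resize((hi.width // scale, hi.height // scale), Image.BICUBIC)
        # Round-trip through low-quality JPEG to simulate compression artifacts.
        buf = io.BytesIO()
        lo.save(buf, format="JPEG", quality=jpeg_quality)
        lo = Image.open(io.BytesIO(buf.getvalue())).convert("RGB")
        return lo, hi

    # 'frames/' is a placeholder directory of extracted high-res frames.
    pairs = [make_training_pair(p) for p in Path("frames").glob("*.png")]

The harder part is making the synthetic degradation match real YouTube/streaming compression closely enough that the network generalises.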


Are there any software solutions for upscaling content on-the-fly as you watch it in a media player like VLC or MPC-HC?

Most televisions nowadays have built-in upscaling so you'd think something would exist for the desktop PC market.


For VLC I don't know, but with MPC-HC you can use madVR [0]. My preferred setup is mpv [1] with a few shaders [2] such as FSRCNNX [3] and Krig Bilateral.

[0]: https://madvr.com/

[1]: https://mpv.io/

[2]: https://github.com/mpv-player/mpv/wiki/User-Scripts#user-sha...

[3]: https://github.com/igv/FSRCNN-TensorFlow/releases


Most televisions only render at one resolution; they don't have the ability to dynamically resize the "window" of the player. I've always just assumed upscaling on the TV is just the same as fullscreening VLC on a large monitor.



Curious about the eventual usage of this in WebRTC.


Put a big * in the title:

*for those with RTX cards.

Also funny how this is out while YouTube is rolling out higher bitrate videos.


Put a big * on "YouTube is rolling out higher bitrate videos":

*for those that pay.


Bad on YouTube for downscaling old videos.



