
Hmmm... I'd love it if this allowed me to watch Netflix using a fully FLOSSed ARM SBC.



What's the point of this semi-unusable DRMed crap when L1 releases keep getting WEB-DLed anyway?

Very likely it's not the Shield that leaks L1 content now, but an actual key recovery, since they get the stream even before it's watermarked in the secure domain.

My guess is it's Qualcomm's debugging TZ applets. They can't really revoke those keys, because doing so would take down a huge number of Snapdragon-based handsets whose manufacturers never bother to ship a single OTA.

This is also likely why Netflix resorts to such silly restrictions as refusing to run on old Android versions on some Snapdragon handsets, which are easily bypassed with root.


Lately Google has mostly stopped revoking whole device models. Instead, when someone extracts a key from a device and it leaks publicly, they revoke just that one device's key. That improves the experience for legitimate users, but it also means the person who extracted the key can simply buy another device of the same model and use the same exploit to extract a new working key.
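
To make the incentive problem concrete, here's a toy model of per-device-key revocation on the license-server side (purely illustrative; nothing here is Google's actual provisioning or revocation protocol):

    // Toy model only: per-device-key revocation. All names are made up.
    const revokedDeviceKeyIds = new Set<string>(["device-key-1234"]); // the one leaked key

    function canIssueLicense(deviceKeyId: string): boolean {
      // Only the single leaked key is refused. Every other unit of the same
      // model keeps working, including a brand-new one bought by the attacker,
      // who can run the same exploit again to extract a fresh key.
      return !revokedDeviceKeyIds.has(deviceKeyId);
    }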


Are these keys unique per-device?


Yes.


At most Netflix-like services the content key is one for the entire library; only the per-device key needed to obtain that content key is unique, and so is the watermark ID.
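
A rough conceptual sketch of that hierarchy (nothing here is the real Widevine or Netflix schema; the names and the watermark derivation are made up): the content key is shared across the catalog, and only the wrapping device key and the watermark ID are per-device.

    // Conceptual sketch, assuming a Web Crypto-capable runtime.
    interface LicenseResponse {
      iv: Uint8Array;                // sent alongside so the device can unwrap
      wrappedContentKey: Uint8Array; // the same content key for everyone, wrapped per device
      watermarkId: string;           // unique per device, used for forensic tracing
    }

    async function issueLicense(
      libraryContentKey: Uint8Array, // one key for the whole catalog
      deviceKey: CryptoKey,          // unique per device, normally fused into the chip
      deviceId: string,
    ): Promise<LicenseResponse> {
      const iv = crypto.getRandomValues(new Uint8Array(12));
      const wrapped = await crypto.subtle.encrypt(
        { name: "AES-GCM", iv }, deviceKey, libraryContentKey);
      return {
        iv,
        wrappedContentKey: new Uint8Array(wrapped),
        watermarkId: "wm-" + deviceId, // hypothetical watermark derivation
      };
    }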


What's L1? The WEB-DLs I have seen are always relatively low resolution, so at least it protects Full HD or 4K.


1080p WEB-DLs are very common if you're in the right places, but even public trackers should have plenty. 4K is slightly less common but does also happen, with frequency depending on the streaming service.


> so at least it protects Full HD or 4K.

Not really. Popular streaming-exclusive shows often get 1080p versions released within a few hours, with 2160p versions following within a few days.


The Grand Tour S04E03 4K WEB-DL is readily available on private torrent trackers within hours of its public release on Amazon Prime.

No idea what encryption Amazon uses, but suffice it to say it has been thoroughly broken by someone out there. Given the expense of acquiring those presenters, the production costs of their shows, and how much they bring people to the Prime Video platform, I suspect Amazon is quite interested in keeping that content protected.


Widevine L1 - the hardware DRM in ARM TrustZone, with an individual key for each chip.


I've never really looked deeply into these things, but now that most GPUs do the actual video decoding, how come it's still not possible to use Linux or any random OS? Isn't the GPU supposed to somehow guarantee that it only sends the decrypted stream to a compliant screen? Isn't that the point of HDCP?

When this was done in software, I understand that open source decoders could have been modified to pipe the clear stream to disk, but now the software basically just hands the encrypted blob to a "trusted" hardware decoder.

Or am I missing something?


Linux is not the issue here but the ARM processor, as the OP said. The first DRM library for ARM only came earlier this year, so I'm finally able to play DRM content on a Raspberry Pi 4 in Chromium. But I'm not going to, because Chromium is painfully slow and plugin maintainers have figured out how to play Netflix in Kodi.

edit: the DRM library still doesn't get you to "fully FLOSSed"


So then Netflix et al.'s requirement of Windows or macOS in order to play high definition video is purely artificial on x86?

Last time I checked (a few months ago), they didn't even support Chrome (on either Windows or Mac) for UHD; they required Safari, Edge, or their own Windows app.


The UHD restriction is not a technical one. Content producers and the rest of the media industry have strict requirements about streaming. UHD content uses proprietary DRM systems from Microsoft and Apple that are considered more secure than Widevine, which is why those browsers are permitted to play 4K.

You can't watch UHD content on Edge for Linux, for example, because the necessary DRM isn't implemented.
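
For the Widevine side of this, the gating is visible from the web page itself: a site can ask the browser via EME whether hardware-backed (L1) decryption is available and only offer UHD if it is. A hedged sketch (the robustness strings are Widevine's documented levels; the codec string is just an example, not what any particular service requests):

    // Probe for hardware-backed Widevine via Encrypted Media Extensions.
    const config: MediaKeySystemConfiguration[] = [{
      initDataTypes: ["cenc"],
      videoCapabilities: [{
        contentType: 'video/mp4; codecs="hev1.1.6.L153.B0"', // example HEVC codec string
        robustness: "HW_SECURE_ALL", // roughly Widevine L1: decrypt and decode in the TEE
      }],
    }];

    navigator.requestMediaKeySystemAccess("com.widevine.alpha", config)
      .then(() => console.log("hardware-backed Widevine available; UHD is an option"))
      .catch(() => console.log("software Widevine (L3) at best; cap the resolution"));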


So does this mean that the DRM is still at least partially implemented in software?

I also seem to remember that Netflix on Windows (don't know about macOS) also requires hardware decoding support for HEVC (or whatever their codec is). I never got it to work with an Intel GPU (8th-gen UHD 630, which is supposed to support it), but it worked with an AMD GPU.


Partially, for sure. However, even with hardware support you need some kind of licensed (signed?) binary interface with the DRM hardware to ensure that nobody is decoding a hundred Netflix streams at a time.

I don't know a lot about how everything works under the hood, but you can probably find some of the details on the Microsoft website [0] if you want.

[0]: https://www.microsoft.com/playready/


You used to be able to watch it in Windows Edge running under Wine, though.


Between the video decoder and the screen sits the display server (e.g. Xorg or GNOME Shell), which is untrusted.


This wasn't my understanding. If the decoding happens in hardware, I wouldn't have expected the decoded video to be passed back to the display server to be sent back again to the GPU and out to the screen.

My understanding was that there was some kind of compositing going on, in hardware, where the display server would tell the GPU to display the output between some coordinates, but the server itself wouldn't know what the actual output would be.

Here is the libva documentation which seems to support this: http://intel.github.io/libva/group__api__prot.html
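
In other words, the flow looks roughly like the sketch below (every name here is hypothetical; the libva protected-content API linked above is the real-world analogue). The key point is that the display server only ever handles an opaque surface handle, never the decoded pixels.

    // Hypothetical sketch of a protected decode path; not a real API.
    interface ProtectedSurfaceHandle { readonly id: number } // opaque to userspace

    // Hypothetical: decrypts and decodes inside the protected hardware domain.
    declare function secureDecode(encryptedFrame: Uint8Array): ProtectedSurfaceHandle;
    // Hypothetical: asks the display hardware to place the surface on screen.
    declare function compose(surface: ProtectedSurfaceHandle, x: number, y: number): void;

    function presentFrame(encryptedFrame: Uint8Array): void {
      const surface = secureDecode(encryptedFrame); // clear pixels never leave the secure domain
      compose(surface, 0, 0); // the compositor positions output it cannot read
    }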



