
>This data is becoming a huge mechanism for subsidizing TV sales and the interactivity is being looked at as a huge opportunity to recoup some of the ad spend being lost via streaming and fewer 30 second spots

So what are the options for a consumer willing to pay for privacy? Will console manufacturers be more respectful, for example? (I've considered a console to serve as a Blu-ray player / host OS for streaming apps that also plays games.)

Or are we stuck using dumb TVs and connecting our laptops to them via HDMI? (And thus no 4K, IIRC)




I was watching a ripped Spider-Man years ago on my PS3, and the PlayStation refused to play it after 10 minutes with an anti-piracy message. This was via a network video server. I don't see why Sony would have rolled back that feature since.


This is Cinavia audio watermarking. It's designed to survive lossy compression by staying within the human audible range.

> If a "theatrical release" watermark is detected in a consumer Blu-ray Disc audio track, the accompanying video is deemed to have been sourced from a "cam" recording. If the "AACS watermark" is present in the audio tracks, but no accompanying and matching AACS key is found on the disc, then it is deemed to have been a "rip" made by copying to a second blank Blu-ray Disc.

https://en.m.wikipedia.org/wiki/Cinavia

Edit: that same page says it's now a requirement for all consumer Blu-ray players to use this tech. But I don't remember seeing those messages for years. The pirates must be winning with their methods of stripping the signatures.


> So what are the options for a consumer willing to pay for privacy?

Don't buy a TV at all. Instead, buy a large monitor and hook it up to a computer to act as a media center.


Maybe this is another selling point for Asus' new TV-sized gaming monitors.


Pi-hole (whitelist DNS, etc.), or just don't connect the TV to the internet at all: block it by MAC entirely on the router.
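On a Linux-based router this is roughly a one-line firewall rule. A minimal sketch (the MAC address is a placeholder for whatever the TV's network settings menu shows; the rule is echoed rather than applied here, since applying it requires root on the router):

```shell
# Hypothetical MAC address; replace with the one from the TV's network menu.
TV_MAC="AA:BB:CC:DD:EE:FF"
# Print the iptables rule that would drop all forwarded (internet-bound)
# traffic from the TV. Run it without the echo, as root, to apply it.
echo iptables -I FORWARD -m mac --mac-source "$TV_MAC" -j DROP
```

Note this only stops routed traffic; the TV can still talk to other devices on the LAN, so a separate VLAN is the stronger option if the router supports it.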

>Or are we stuck using dumb tvs and connecting out laptops to them via HDMI? (And thus no 4K iirc)

HDMI 2.0 (2013) supports 4K at 60 Hz.

HDMI 2.1 is significantly more ambitious, with 8K/10K resolutions and variable refresh rate.
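Quick back-of-the-envelope check that 4K/60 fits in HDMI 2.0's 18 Gbit/s link rate, assuming 8-bit RGB (24 bits/pixel) and roughly 20% overhead for blanking intervals (exact CEA-861 timings differ slightly):

```shell
# 3840x2160 pixels, 60 frames/s, 24 bits/pixel, ~1.2x for blanking:
echo $(( 3840 * 2160 * 60 * 24 * 12 / 10 / 1000000000 ))  # Gbit/s, integer
```

That lands around 14 Gbit/s, comfortably under 18, which is why 4K/60 works over HDMI 2.0 while 4K/120 or 10-bit HDR at high refresh rates pushes you to HDMI 2.1.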


The 4K comment was likely in reference to streaming providers like Netflix, etc., which don't offer 4K content playback on devices that are not deemed adequately locked down, which is typically a stipulation of their content licensing agreements.



