
I have trouble understanding your use of the term DRM. Media DRM makes sense: the copyright holders want to "manage" their rights digitally. How is that relevant to Play Integrity or WEI? Whose right is being protected or managed? If I have an Android device without Play Integrity, there are certain apps that will not run, but I don't see any rights being managed here: an app developer has the right to refuse service just like I have the right to refuse running an app.

In fact I see no relationship between DRM and Play Integrity other than the tenuous connection that both are about restricting what a user can do on their device. If this is what you mean, then you have made the same mistake as the FSF by conflating unrelated technologies.




Ultimately, DRM is untenable without users also being locked out of their own devices.

Consequently pressure to support more effective DRM will always translate into pressure to restrict what users can do with their devices.

Furthermore, the only defense against this is large open device market share: once closed devices comprise most of the market, DRM proponents can announce they'll stop supporting open devices, creating a downward spiral that further decreases the availability of open devices.

And then we live in a future that's fucked.


This is an FSF-level understanding. Android devices are fully open, and you can reflash them to whatever OS you want. Some remote servers won't give you service if you do that, but nothing is locking you out of your device. As Android dominates the global market, you already live in that world where most devices are open.


>Some remote servers won't give you service if you do that

This is exactly my problem. Before ideas like this surfaced, the demarcation line between who controls what was based purely on ownership. The machine that I own acts only on my behalf and in my best interests; the server that you own does so for you (or at least for PCs this has always been the case).

TPMs, attested bootchains and whatnot trample on this whole concept. It's like your very own hardware now comes with a built-in Stasi agent that reports on your conduct whether you like it or not. It bothers me on a visceral level, and I'm constantly wondering if it's just me.


It's not just you, but what people who hate remote attestation tend to forget is that it's a sword that cuts in both directions. Servers can remotely attest to you, not just the other way around. Signal is an example of an app that demands a remote attestation from the server before uploading your sensitive data.

Attestation is just a tool. It can be used for all kinds of things and doesn't privilege one side or another. The average app developer doesn't truly care what device you use, they just want to cut out abuse and fraud, which are real problems that do require effective solutions.

Ultimately, trade requires some certainty that both sides will act as they promise to act. Attestation matters more in the direction of individuals attesting to companies, because individuals already have many ways beyond technology to hold companies to account if they break their agreements, like the legal system, which is largely ineffective at enforcing rules against individuals due to cost.
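
To make the "server attests to the client" direction concrete, here's a deliberately simplified Python sketch of a client refusing to upload anything unless the server first proves what code it is running. This is not Signal's actual protocol -- the real flow involves SGX quotes chained back to Intel's attestation infrastructure, and every name here is illustrative -- but the shape is the same: pin an expected enclave measurement, verify a signature over the reported one, and only then send data.

    # Toy sketch of a client checking a server-side enclave attestation before
    # uploading sensitive data. Illustrative only; not Signal's real protocol.
    from cryptography.hazmat.primitives.asymmetric import ed25519
    from cryptography.exceptions import InvalidSignature

    # Measurement of the enclave build the client is willing to talk to
    # (in reality a reproducible-build hash published by the service).
    EXPECTED_MEASUREMENT = bytes.fromhex("aa" * 32)

    def client_should_upload(quote_pubkey_bytes: bytes,
                             reported_measurement: bytes,
                             signature: bytes) -> bool:
        """Return True only if the server proved it runs the expected enclave."""
        if reported_measurement != EXPECTED_MEASUREMENT:
            return False  # server is running code we don't recognize
        verifier = ed25519.Ed25519PublicKey.from_public_bytes(quote_pubkey_bytes)
        try:
            # In a real deployment this key would itself chain back to the
            # attestation root of trust, not be taken at face value.
            verifier.verify(signature, reported_measurement)
        except InvalidSignature:
            return False  # quote was forged or tampered with
        return True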


> Attestation is just a tool. It can be used for all kinds of things and doesn't privilege one side or another.

It privileges the side that designs and uses it. By and large that's going to be the corporations, not individuals or those acting to maximize their interests.


> The average app developer doesn't truly care what device you use, they just want to cut out abuse and fraud, which are real problems that do require effective solutions.

I don't doubt that. But the price of attestation, if it's not properly isolated from the hosting OS (like Microsoft's completely unrealistic attempts at bringing the whole OS into the trusted computing base, kernel, applications and all), would be a homogeneity of computing that I don't think is necessarily worth the benefits.

The good news is that such proper isolation is not only possible but even desirable (it keeps the trusted computing base small), and if done well could actually replace annoying half-measures such as "root detection": Who cares if my phone is rooted, as long as my bank's secure transaction confirmation application is running in a trusted, isolated enclave, for example?


Fair points. I was aware of this anti fraud angle of WEI/attestations before.

From this point on this is more of an emotional argument than a technical one, but I feel like the negative effects way outweigh the positive ones. Giving MORE power (be it technical or political) to big tech companies is tipping the scales in their favor so much that we will be even worse off than we already are.

But if you work in anti-fraud and are fixated on solving this problem as effectively as possible, I can imagine not caring about this either if I were you...


Fully agreed on attested bootchains. General-purpose, OS-wide attestation is indeed a blight on open computing: it's ineffective, because it implies a gigantic trusted code base (what are the odds that the entire Windows kernel is completely free of vulnerabilities?), and on top of that it ties you to somebody else's more or less arbitrary kernel build.

Almost completely disagree on TPMs. A better comparison than a spy would probably be a consulate (OK, maybe an idealized one, located underground in a Faraday cage): their staff doesn't get to spy on you, but if you ever do want to do business with companies in that country and need some letters notarized/certified, walking into their consulate in your capital sure beats sending trustworthy couriers around the world every single time.

To torture that analogy some more: Sure, the guest country could try to extend the consulate into a spy base if you're not careful, and some suspicion is very well warranted, but that possibility is not intrinsic to its function, only to its implementation.


By that same logic evil is not inherent to attested bootchains either. When used to verify that the computer loaded the OS that the end user expected it is a very powerful security tool. It is only bad when the keys aren't under the control of the device owner.


You're mixing up the authentication and attestation parts of secure boot here.

You can absolutely install Linux, run secure boot (e.g. to protect you against "evil maid attack"), use your TPM to store your SSH keys, and live a happy and attestation-free life.

You can also do other things, but if you don't want to, why would you?
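
As a toy illustration of that separation (plain hashlib, not a real TPM or any real tpm2-tools invocation, so treat every name here as made up): measurements are accumulated by hash-extending, a secret is sealed against the resulting value, and it is released locally only when the current boot matches. No remote party ever sees a report.

    # Toy model of measure-and-seal: local key release gated on boot
    # measurements, with no attestation sent to anyone.
    import hashlib

    def extend(pcr: bytes, component: bytes) -> bytes:
        # TPM-style "extend": new PCR = H(old PCR || H(component))
        return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

    def measure_boot(components: list[bytes]) -> bytes:
        pcr = b"\x00" * 32
        for c in components:
            pcr = extend(pcr, c)
        return pcr

    # "Seal" at enrollment time: remember the measurement the key is bound to.
    good_boot = [b"my-shim", b"my-grub", b"my-kernel-6.9"]
    sealed_against = measure_boot(good_boot)

    def unseal(current_components: list[bytes], secret: bytes):
        # Release the secret only if today's boot matches the enrolled one.
        if measure_boot(current_components) == sealed_against:
            return secret
        return None  # tampered boot chain -> no key, but also no report to anyone

    print(unseal(good_boot, b"ssh-key-material"))  # released
    print(unseal([b"my-shim", b"evil-grub", b"my-kernel-6.9"], b"ssh-key-material"))  # None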


* If permitted by the device manufacturer


Attested boot chains aren't normally being used to attest a whole general purpose OS. They attest up to a small hypervisor that allows partitioned worlds to be created and chain attested, and then sensitive computations are done inside that.


> It bothers me on a visceral level and I'm constantly wondering if it's just me.

It's not just you.

It disgusts me so deeply I wish computers had never been invented. A wonderful technology with infinite potential, capable of reshaping the world. Reduced to this sorry state just to protect vested interests. They used to empower us. Now they are the tools of our oppression.


While I don't agree with the FSF on even close to everything regarding trusted computing, I think for a fair discussion you'd have to at least steelman their arguments here:

I think it's fair to assume that in a world in which almost every device supports attestation and makes it available to any service provider by default, without giving users an informed choice to say no or even informing them at all, service providers are much more likely to provide access exclusively to attestation-capable clients.

That, in turn, has obvious negative consequences for users with devices not supporting attestation (whether out of ideological choice, or because it's a low-cost device whose manufacturer can't afford the required audits and security guarantees, etc.): Sure, these users will always be able to just refuse to transact with any service provider requiring attestation.

But think that through: We're not only talking about Netflix here. At what availability rates of attestation will decision makers at financial institutions decide that x% is good enough and exclude everybody else from online banking? What about e-signing contracts for doing business online? What about e-government services?

I am excited about the new possibilities attestation offers to users (in that they will be able to do things digitally that just weren't economically feasible for service providers before, since providers often have to cover the risk of offering them), and at the same time I am very wary of the negative externalities of a world in which attestation is just a bit too easy and ubiquitous.

In other words, the ideal amount of general purpose attestation availability is probably high, but significantly below 100% (or, put differently, the ideal amount of friction is non-zero). Heterogeneity of attestation providers can probably help a bit, but I'm wary of the inherent centralizing forces due to the technical and economical pragmatics of trusted computing.


The ideal amount of attestation on a general purpose computer which is owned by me is zero. Any nonzero amount implies that control of the device has not actually been turned over to me. It implies not only the slippery slope to which you refer but also things about back doors and opportunity for dystopian political regimes and much more.

When it comes to financial or legal matters (and this includes online banking) a small dedicated hardware element for signing fingerprints is all that's ever been required. Anything more is an overreach.


> back doors and opportunity for dystopian political regimes

No, this is a misunderstanding of what a TPM is.

A TPM is a secure element inside your computer, similar to the chip on your credit or debit card. That's it. Without you using it (i.e. your OS or an application you installed asking it to do something), it's exactly as dangerous as a blank chip card in your house that you don't use and didn't open any account for.

If you don't want anybody to talk to it, don't install applications or OSes on your computer that do things you don't want. You have full control over that! Not running software that's not acting in your own best interests is generally good practice anyway, TPM or no TPM.

> [...] a small dedicated hardware element for signing fingerprints is all that's ever been required [...]

You might be happy to hear that that's exactly what a TPM is, then!


I am fully aware of what a TPM is. I was speaking about trusted computing - i.e. the "general purpose attestation capability" that you referred to above.

As you say, a TPM alone can't do much of anything and doesn't pose much of a threat. Of course expanding the acronym - Trusted Platform Module - is a bit of a giveaway. They were always fully intended to serve as the root of trust for much more nefarious things.


People keep saying that, yet the only thing I’ve ever seen TPMs used for is full disk encryption and user authentication.

Conversely, DRM is alive and well on almost universally TPM-less devices.

By the way, all of your comments in this thread end up dead – I had to vouch for them to be able to answer. Not sure what’s up with that.


> the only thing I’ve ever seen TPMs used for is full disk encryption and user authentication.

Aren't all device attestation schemes underpinned by authenticated boot which itself is underpinned by a TPM? This is certainly the case for Android - AVB is implemented on top of secure boot on all the devices I've ever owned (and Play Integrity, if I had ever permitted it to run, on top of that). Do I have some misunderstanding about the stack?

> Conversely, DRM is alive and well on almost universally TPM-less devices.

You mean software DRM, I assume? Because the only TPM-free, hardware-backed DRM that comes to mind is GPU-based encrypted streams, where the GPU does the decoding and final compositing locally. And even then a TPM equivalent exists; it just isn't accessible to the end user.

SGX can be used to do various interesting things without attesting the state of the broader system, but none of the examples that immediately come to mind feel much like DRM to me.

> comments in this thread end up dead

Thanks for letting me know. I guess I should email them?


You think there's no value in your laptop being able to attest its state to your phone in order to give you confidence it hasn't been tampered with? That's something that would be entirely under your control.
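For the sake of argument, a stripped-down sketch of what that could look like (stdlib Python, everything here is illustrative): the phone sends a fresh nonce, the laptop answers with its boot measurement bound to that nonce, and the phone compares it against a value the owner pinned earlier. In a real design the response would be signed by a key resident in the laptop's TPM and the measurement would come from its PCRs; the shared secret below just keeps the example self-contained.

    # Toy sketch of owner-controlled attestation between two devices you own.
    import hmac, hashlib, os

    ENROLLED_SECRET = os.urandom(32)  # provisioned by the owner, once
    PINNED_MEASUREMENT = hashlib.sha256(b"known-good boot chain").digest()

    def laptop_respond(nonce: bytes, current_measurement: bytes):
        # Bind the reported measurement to the phone's challenge.
        tag = hmac.new(ENROLLED_SECRET, nonce + current_measurement, hashlib.sha256).digest()
        return current_measurement, tag

    def phone_verify(nonce: bytes, measurement: bytes, tag: bytes) -> bool:
        expected = hmac.new(ENROLLED_SECRET, nonce + measurement, hashlib.sha256).digest()
        return hmac.compare_digest(tag, expected) and measurement == PINNED_MEASUREMENT

    nonce = os.urandom(16)  # fresh per check, prevents replay
    m, t = laptop_respond(nonce, PINNED_MEASUREMENT)
    print(phone_verify(nonce, m, t))  # True: laptop state matches what the owner pinned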


There's value in that, but it's a lesser value than the risk of normalizing manufacturer locking of computing devices.

Because the immediate next step after locking devices down is profit extraction from users.

Do you think Apple would have been able to maintain their App Store margins absent device control?


So don't normalise manufacturer locking. We're not going to prevent the bad thing from happening by arguing against the hardware that enables the bad thing - we're going to need to argue against the bad thing.


I would if I trusted them, but IMHO every device manufacturer is looking at their slim hardware margins and wishing they were fat software margins.

There's simply too much incentive for abuse.

The only way I personally could see supporting them is if there were first a legislative requirement that trusted modules always be user-modifiable.


> Android devices are fully open and you can reflash them to whatever OS you want.

It doesn't matter. Those devices fail hardware remote attestation.

> Some remote servers won't give you service if you do that, but nothing is locking you out of your device.

The device's purpose is to be used. If it can't be used without giving up things like banks and private communications, it won't be used.

The device is not locked; it just turns into a paperweight if you actually unlock it.

> As Android dominates the global market, you already live in that world where most devices are open.

Wanna know what else dominates the global market? WhatsApp. In many regions of the world, without their services, you are ostracized.


As an example of the excess here:

Marriott (the hotel brand) shipped a release of their Android app that refused to run on unlocked devices.

It probably didn't impact the majority of (locked) Android devices, so why would Marriott care?

And with one app update, a valid user configuration became less capable.


When this remote attestation business started, people tried to minimize its impact by saying only apps that really needed it would use it. Such an absurd argument. Everyone is going to use this technology. It will literally become the default.

Everyone loves cryptography and wants it working in their favor. Everyone. It's great for us when it protects our messages and browsing from surveillance capitalism and warrantless government espionage. It's extremely bad for us when it becomes the policy enforcement tool of corporations and governments.

Remote attestation means either we run the software which does their bidding and protects their interests and bottom line, or we don't participate in society or the economy. The only way it could get worse is if the government starts signing software as well. One day even the goddamn ISPs will refuse to link to our hardware if it fails attestation.

It's literally the end of free computing as we know it. Everything the word "hacker" ever stood for, it's over.


Meh. I didn't reflash my phone. I didn't root it. I didn't do anything to modify its system files whatsoever.

I just installed KDE Connect, and an open source keyboard. Banking apps refuse to run because of those (because my keyboard might see my keystrokes!!!). They don't even need a failed hardware attestation to refuse you service.

So even if you don't try to modify your device, your device might still end up like half a paperweight. I either can't do banking, or I can't use the functionality I want.


They do need the hardware attestation to prevent you from lying to them about what keyboard you use.


The ability for someone with a news article or a game to only have you experience it if you pay their fee or watch their ads -- preventing you from copying the content off your device or modifying it in some unauthorized way (removing ads or otherwise changing the behavior to circumvent protection mechanisms) -- is pretty obviously the exact same idea, not some mere metaphor. It is a protection of the exact same "right", conferred by the exact same laws, as allowing someone with a movie to only have you see it if you pay their fee or watch their ads... I am honestly having a difficult time understanding your confusion here :/.


You are still talking about DRM in the context of copyright. If someone has a news article or a game, they have copyright on that article or game and they use DRM to protect their copyright. All these are applications of DRM.

Applications like Play Integrity could be quite different: say a bank refuses to move money if your instructions come from a device deemed not trustworthy by Play Integrity. That's like a bank refusing to let you into their branch if you are dressed in swimwear. A game can also deploy this tech for anti-cheating purposes; really no different from a real-world casino turning away a customer who is known to be good at card counting.


And this is the root cause you fail to understand - the idea of copyright contradicts the idea of information freedom. You should be able to make a copy for your own purposes, such that when you go back the information is still the same and not manipulated, and you should be able to actually share this information when it's important. For example, a news story about corruption that has been taken down.

Also, why the hell do you believe that the same copyright rules that apply to a movie, which can take millions to make and stays relevant for years, should apply to a news article, for example? It's madness.


Information freedom is merely an ideal, not a right. It is an ideal held by techno-optimists. But there is no legal basis for information to be free. Indeed, I agree with you that the idea of copyright contradicts the idea of information freedom. And guess what: copyright is in our constitution, and information freedom is not.

Furthermore, there is also no legal basis in differentiating copyright by the budget involved to produce the work.


> an app developer has the right to refuse service

They shouldn't have that right any more than a tool manufacturer has the right to prevent you from buying one of their hammers.

The right of first sale is extremely important to a functioning capitalistic society and it's completely absent from the digital world - by design.


> an app developer has the right to refuse service just like I have the right to refuse running an app.

In this case it feels like an app developer having the right to punch[0] you in the face just like you have the right to refuse being punched in the face :-P.

[0] (to use a family friendly verb)


It's not about "rights". It's about power. It's about turning you into a serf in their digital fiefdom. A perpetual consumer.



