There are some differences. DP-3T proposes two systems: one with linkable tokens and one without. The first is similar to Apple-Google in that your tokens for a given day are derived from a daily key, which is uploaded to a central distribution server when you test positive. In the second system the tokens are unlinkable, and DP-3T proposes a Cuckoo Filter to reduce the space complexity. A Cuckoo Filter is a probabilistic data structure that can tell you whether an item is definitely not in a set or might be in it, so there are some false positives.
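To make the "linkable" part concrete, here is a rough sketch of the one-key-to-many-tokens structure. This is my own illustration, not the actual spec: the real protocols use their own key-derivation functions (e.g. HKDF and AES in the Apple-Google design); HMAC-SHA256 here just shows why publishing the daily key links all of that day's tokens together.

```python
import hashlib
import hmac

def derive_tokens(daily_key: bytes, count: int = 96) -> list:
    """Derive short rotating tokens from a single daily key.

    Hypothetical sketch only. The point: anyone who later learns
    daily_key can regenerate -- and therefore link -- every token
    broadcast that day.
    """
    return [
        hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(count)
    ]

daily_key = bytes(16)  # placeholder; a real daily key would be random
tokens = derive_tokens(daily_key)

# Derivation is deterministic, so the published key links the tokens:
assert tokens == derive_tokens(daily_key)
```

Once the key hits the distribution server, every phone can run the same derivation locally and match tokens it overheard, which is exactly what makes the first system efficient but linkable.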
DP-3T also explains how records are uploaded to a central server and the interactions with health-care providers. Apple-Google omit this part and focus on proximity data collection.
I'd like to mention the TCN Protocol here (https://github.com/TCNCoalition/TCN), another very similar specification. I bring it up because the readme goes into quite a bit of (easily understandable!) detail regarding the trust assumptions of such a protocol and associated rationale.
Ultimately I think Apple and Google are right to omit record upload and authentication concerns from the base protocol. The low level implementation should be as interoperable and generalized as possible in order to facilitate immediate uptake and maximum reusability. Higher level concerns such as who to trust and how to interact with users can be handled by the various app implementations.
It seems unlikely that anyone will deploy a version of DP-3T that differs significantly from the approach built into Android and iOS, due to the need for apps to obtain special permissions to run in the background. So the alternative variants that go under that brand are probably a dead letter.
"Those privacy principles are not going to change," said Gary Davis, Apple's global director of privacy. "They are fundamental privacy principles that are needed to make this work."
Can an app not simply ask the user for, and subsequently be granted, the necessary permissions? At least on Android I had understood it to work that way in theory, although in practice perhaps it doesn't always behave ideally (https://support.google.com/pixelphone/thread/6068458?hl=en).
Edit: I see now that it's specifically iOS that doesn't provide for granting the required permissions. I find such lack of control over a device that one supposedly owns highly concerning at best.
Guess: many people would tap OK without thinking about it (they don't understand or care what "background" means) and then would be unhappy that their battery drains.
It seems to me that restricting freedoms to combat ignorance is unlikely to have a desirable outcome. To your specific example, I suspect that bluntly warning that granting the permission has the potential to lead to significant loss of battery life would get even the most technically illiterate user's attention.
More generally, how are background streaming services supposed to work on the iPhone? Does Apple have to individually approve every app that wishes to do so (e.g. Spotify, Pandora, ...)?
No, they would just click anything that makes the dialog standing between them and their goal of installing the app go away, without reading the text, and then be unhappy that their battery drains. Any design that relies on a confirmation dialog is fundamentally broken. Even technically competent users will read most confirmation dialogs as "Let me do what I want [Abort] [OK]", no matter what you actually write there.
In many situations we may not have better solutions, but that doesn't change the fact that this is terrible.
> Any design that relies on a confirmation dialog is fundamentally broken
I'm having trouble interpreting this in any way other than a claim that granting users control over their devices is a fundamentally broken idea. I won't dispute that users often choose to do dumb things in practice, but it seems the two of us have a fundamental disagreement in our underlying worldviews.
> Even technically competent users ... no matter what you actually write
I'd argue that such users aren't actually technically competent then, despite the high opinion they might have of themselves. On the other hand, perhaps the users are technically competent and it's actually the relevant software developers that have done a poor job of communicating? If an actual technically competent user is experiencing significant difficulties using a program, then perhaps the program doesn't work as well as the developers thought it did.
The issue is that restrictions aimed at some "bad" users end up restricting all users. When I design a prompt dialog that gates a dangerous operation, I make the user type something in; you could even have the user type a different phrase each time to confirm they actually read the prompt text. So there are technical solutions. IMO the justification that Apple is taking your freedom to protect a subgroup of users is not the reality; the reality is that the restrictions make Apple more money. If lifting the restrictions would make them more money, you'd see a lot of praise for how smart the tech behind Apple's restriction-lifting dialog prompts is.
Could this be a feature rather than a bug here? Help maintain tracing to a very high level statistically while still giving plausible deniability for personal privacy?
This is definitely by design. The Cuckoo Filter stores hashes of the input, so there is a chance of collisions. My understanding is that a Cuckoo Filter is a recent alternative to the Bloom Filter, if you're familiar with those.
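For the curious, here's a minimal Python sketch of the idea (illustrative only, not the DP-3T implementation). Each item is reduced to a tiny fingerprint stored in one of two candidate buckets; a lookup can only answer "definitely not present" or "possibly present", and the small fingerprint is exactly where the false positives come from.

```python
import hashlib
import random

class CuckooFilter:
    """Minimal cuckoo filter sketch; not production-grade."""

    def __init__(self, num_buckets: int = 1024, bucket_size: int = 4):
        assert num_buckets & (num_buckets - 1) == 0  # power of two for the XOR trick
        self.num_buckets = num_buckets
        self.bucket_size = bucket_size
        self.buckets = [[] for _ in range(num_buckets)]

    def _fingerprint(self, item: bytes) -> int:
        # 8-bit fingerprint: deliberately tiny, so distinct items can share
        # a fingerprint -- the source of the false positives discussed above.
        return hashlib.sha256(item).digest()[0]

    def _alt_index(self, i: int, fp: int) -> int:
        # Partial-key cuckoo hashing: the alternate bucket depends only on
        # the current index and the fingerprint, and the XOR is self-inverse.
        h = int.from_bytes(hashlib.sha256(bytes([fp])).digest()[:4], "big")
        return (i ^ h) & (self.num_buckets - 1)

    def _indexes(self, item: bytes):
        fp = self._fingerprint(item)
        i1 = int.from_bytes(hashlib.sha256(item).digest()[1:5], "big") \
            & (self.num_buckets - 1)
        return fp, i1, self._alt_index(i1, fp)

    def insert(self, item: bytes, max_kicks: int = 500) -> bool:
        fp, i1, i2 = self._indexes(item)
        for i in (i1, i2):
            if len(self.buckets[i]) < self.bucket_size:
                self.buckets[i].append(fp)
                return True
        # Both buckets full: evict a random entry and relocate it
        # (the "cuckoo" step), repeating up to max_kicks times.
        i = random.choice((i1, i2))
        for _ in range(max_kicks):
            j = random.randrange(self.bucket_size)
            fp, self.buckets[i][j] = self.buckets[i][j], fp
            i = self._alt_index(i, fp)
            if len(self.buckets[i]) < self.bucket_size:
                self.buckets[i].append(fp)
                return True
        return False  # filter is effectively full

    def might_contain(self, item: bytes) -> bool:
        fp, i1, i2 = self._indexes(item)
        return fp in self.buckets[i1] or fp in self.buckets[i2]
```

In the contact-tracing context, a phone would download a filter built from positive tokens and query its own observed tokens against it: a miss is definitive, a hit might be a collision, which is the plausible-deniability angle raised above.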
Would you be okay with me reposting your paper on my site? I'm working on a piece about the regulatory implications of contact tracing apps in the U.S. -- you've done a better job outlining the pros and cons of the various approaches than I could.
Feel free to hit me up henriquez AT protonmail if you'd like to discuss!
Edit: formatting. I also wrote a survey paper on a few of the distributed protocols and how they defend against linkage attacks (de-anonymization): https://github.com/robertTheHub/ContactTracingSurvey/blob/ma...