I consider myself unusually sensitive to slippery language and have called out, on HN, several fishy denials by tech companies that I thought were laughably weaselly, including Facebook's previous statements. But I'm not seeing it here. A user data request is a request for users' data, and I don't see a way around that. Additionally, this is remarkably unambiguous:
...>this means that a tiny fraction of one percent of our user accounts were the subject of any kind of U.S. state, local, or federal U.S. government request
The only wiggle there is "the subject of", and it's not much. So right, they could always be lying, but I don't see much room for interpretation.
More broadly, at some level, we're always going to be dependent on companies to be honest about what they do with our data. Not trusting them is fine, but that's not really a question of transparency or policy; it's just a question of corporate integrity. We should instead focus on what we can concretely hope to control:
- Transparency, which means a) we know exactly what rules govern what the government can ask for, b) a detailed accounting of what they actually do ask for, and c) the range of data that can be requested and a precise account of the standard for requesting each kind of data.
- Reform, by which I mean changes to the rules about what can be accessed, by whom, and under what conditions, with an eye towards individualized requests for specific data, the way wiretaps were long handled (until recently). The domestic/foreign distinction should also be removed as part of this point.
- Accountability, by which I mean that companies should never, ever get any form of immunity from lawsuits. Companies should be liable to their customers for privacy violations, and they should face real consequences if they lie about them. I think this might be the heart of the trust issue.
- Oversight, which largely overlaps with transparency, wherein there is a meaningful adversarial process for balancing the competing interests of privacy and security, as opposed to the lame, secretive rubber-stamping we have now.
- Scale. Last but not least, the relative values of privacy and security are way out of whack, especially given how few people terrorism has actually killed and how small a threat thoughtful analysis reveals it to be (or even the extent to which "terrorism" is a coherent concept). So we need to raise the bar on what constitutes a reasonable seizure of data.
All of that is stuff you could reasonably pass laws on, so some of the nihilism in this thread is unfortunate. Maybe we'll get all of those reforms, or some of them, or maybe none of them; I don't know. But if those are genuinely our aims as a people, we should see Facebook's release as a real step in the right direction on three of those fronts. If they're lying, of course that sucks, but I'm not sure what we can hope to do there besides pushing for more accountability. But in the meantime, kudos to Facebook.