The essay I wish she had written would have said, "we don't need more 'secure messaging systems' -- we should be making all the following tools secure by design"
For example, (and whether you love them or hate them) Apple takes this seriously: your fingerprints don't leave the device and are implemented by a piece of hardware in such a way that even Apple doesn't have access to them. Compare that to the Android implementations which have fingerprints in the filesystem. I'm not intending to hold them up as some paragon, simply showing that securing information is more than just a messaging issue.
I maintain the Fingerprint stack on Google Nexus/Pixel devices.
> your fingerprints don't leave the device and are implemented by a piece of hardware in such a way that even Apple doesn't have access to them.
This is true for Google, too.
> Compare that to the Android implementations which have fingerprints in the filesystem.
They're not stored in plaintext. They're encrypted with keys that remain in hardware. You can never read the keys, only encrypt files with them, from applications running in a separate physical address space, accessible only through the hypervisor.
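The "keys that remain in hardware" model can be sketched in ordinary code. The class below is a toy Python mock, not the real Android Keystore or TEE API, and the HMAC-based stream cipher is a stand-in for whatever cipher real hardware uses; the point is purely the interface: callers can encrypt and decrypt, but can never read the key itself.

```python
import hashlib
import hmac
import os


class MockKeystore:
    """Toy stand-in for a hardware-backed keystore (illustration only --
    a real TEE enforces key isolation in hardware, not in Python)."""

    def __init__(self):
        self.__key = os.urandom(32)  # never exposed outside this object

    def _keystream(self, nonce: bytes, length: int) -> bytes:
        # HMAC-in-counter-mode stream, for illustration only.
        out = b""
        counter = 0
        while len(out) < length:
            out += hmac.new(self.__key, nonce + counter.to_bytes(4, "big"),
                            hashlib.sha256).digest()
            counter += 1
        return out[:length]

    def encrypt(self, plaintext: bytes) -> bytes:
        nonce = os.urandom(16)
        stream = self._keystream(nonce, len(plaintext))
        return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

    def decrypt(self, blob: bytes) -> bytes:
        nonce, ciphertext = blob[:16], blob[16:]
        stream = self._keystream(nonce, len(ciphertext))
        return bytes(a ^ b for a, b in zip(ciphertext, stream))


ks = MockKeystore()
template = b"fingerprint-template-data"
stored = ks.encrypt(template)          # this ciphertext is what lands on disk
assert stored != template
assert ks.decrypt(stored) == template  # only the keystore can recover it
```

Under this model, what sits in the filesystem is an opaque blob; reading it back is useless without asking the (hardware-isolated) keystore to decrypt it.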
This may not be as elegant as doing matching on the sensor, but keep in mind we have to play within the constraints of two different vendors (SoC, fingerprint sensor).
Implying that Android is somehow less secure in the case of fingerprint is disingenuous.
First of all, I understand your taking offense (though none was intended) since this is the thing you work on. I feel the same way about stuff I work on! Second, I simply used that as a straightforward example, not to say "Apple good, android bad."
Security is always a risk tradeoff. My house uses special Medeco keys (modified in violation of the Medeco license as it happens) because what I want to defend against is copying when I lend out one of my numbered keys. The house is made of glass so someone could just use a brick to get in. For someone else this is a dumb tradeoff; for me it is appropriate.
And your message includes an example of such a tradeoff:
>This may not be as elegant as doing matching on the sensor, but keep in mind we have to play within the constraints of two different vendors (SoC, fingerprint sensor).
>Implying that Android is somehow less secure in the case of fingerprint is disingenuous.
The channel between the two separate devices is an attack surface (as is the opportunity for offline attack on the encrypted data in the filesystem). The Android team has made the determination that that's an acceptable risk. I won't argue -- perhaps I agree and perhaps I don't, but it is unambiguously less secure than the single, all-encompassing approach. To claim so is hardly disingenuous.
Of course, as with my house, this just changes the class of attack. On either platform you can consider attacking whatever authentication module talks to the TEE. Both systems use other mechanisms for that.
Finally, in the Android case a vendor is free to do what they want, especially if they don't care about access to branding, Google services, etc. Google-branded devices have a more important brand to defend than, say, Asus (not to pick on Asus), much less, say, DJI's drones, for which the Android branding is utterly irrelevant (conceivably that could rebound on Google, though personally I doubt it ever would). The same obviously can't be said about iOS :-)
I think they were trying to deescalate, and your reply is more aggressive than necessary.
FWIW, "disingenuous" carries the connotation of intentional deception, i.e. making an argument in bad faith because one knows that it's wrong.
In this case, I can understand where the defence of Android's implementation is coming from: the original description amounted to "plain text in the filesystem, but the name ends in .mp3 so nobody will find it", whereas the reality seems to be a slightly less elegant solution that can probably be subverted by a dedicated nation-state, but is unlikely to ever make a difference in most people's lives.
> My house uses special Medeco keys (modified in violation of the Medeco license as it happens) because what I want to defend against is copying when I lend out one of my numbered keys.
I would like to hear more about these modifications.
If you want stringent key control at the consumer level, I've found a good one in Abloy's Protec2 system, with a Ruby Abloy dealer. Under that system, an exclusive key blank is manufactured and made only available to a specific dealer. No other Abloy dealer is able to get that blank. [1] Put a Drumm Geminy Shield in front of a Protec2 cylinder to put a long delay on typical hand-held destructive attacks against the cylinder, or force an attacker to bring in heavy equipment to break the lock, and most attackers will opt for either softer targets or a brick through the window.
That key system also lets you do the usual key control designing, like a master key that opens everything, and a loaner key that only opens the front door, for example. Like any security system, you will want defense in depth, and have other layers protecting other aspects.
> They're encrypted with keys that remain in hardware. You can never read the keys, only encrypt files with them, from applications running in a separate physical address space, accessible only through the hypervisor.
By 'in hardware', I'm guessing you don't mean real silicon / flash, but rather whatever persistent storage is provided by the SoC vendor's TrustZone implementation. In my opinion calling that 'hardware' implies a level of security and verification that is not actually present, and is disingenuous.
If history has shown us anything, it's that relying on a chipmaker's security assurances for platform software features can only end in tears. You don't need to look very far to find real world examples of exploits against TZ stacks [1].
It sounds like you are saying Nexus devices have a hsm separate from their gsm card for hardware protected keys, do all fingerprint enabled Android devices have this?
Ahh, just say SIM. GSM is the network, SIM is the module.
SIMs are basically just javacards, so having custom crypto applications on them is no big deal, but you need some control of the SIMs your users will want to use in your device for it to be practical.
Google is US-focused, so they presumably still deal with non-GSM networks and the possibility of running a fingerprint without a GSM-standard SIM.
I.e. there are a lot of issues for Google in generalizing encryption via some user-provided SIM as details come up. So maybe he is referring to an entirely different crypto module.
What? Google CTS demands that devices store fingerprints in TEE and they never can touch filesystem. Where did you get the idea that fingerprints are stored on a filesystem?! O.o
If you read through the TEE design requirements the goal is DRM, and data exchange (not just a handshake) is permitted. Apple uses a dedicated mechanism so that the entire path (sensor->authentication) is hidden from apple by physically bonding the pieces together. This is why, when you get your screen replaced, you have to teach your phone your prints again: they are gone. The Android CTS permits a path from the sensor to the trusted environment, a path which could be subject to interception.
The need for a trusted path has been well understood since the 1960s at least (and its roots run back through the "break" key on the teletype); it's reflected in the rainbow books and of course in the design of Multics. Plenty of people at Google understand this well; some of them even participated in writing those rainbow books!
You can ship a device without passing CTS and it is very common (though TBF, things like my Mavic drone don't have a thumbprint sensor).
Again, I'm not holding Apple up as some paragon, it was simply the example I used to point out the importance of security design at depth, defense at depth, and integration into all aspects of a system.
But fingerprints are so convenient. I'm using an iPhone 4S, and the fingerprint sensor is a huge reason I want to upgrade; typing a PIN code every time is tedious. And typing a passphrase every time? I'm not sure anyone would do that.
Depends on your adversaries, of course. I fear that if police want to open my phone, they'll beat me until I unlock it, and whether it's a passphrase or a fingerprint doesn't matter.
Thing is, though, that fingerprints in the Apple design are an "if you want it, it's there" feature. Yes, it's strongly encouraged, but those more security-minded could bypass its setup (if that's still possible) or remove the only enrolled finger.
Anyone who takes security seriously should open-source the security aspects of the system. Putting them in a proprietary black box is one the worst ways to guarantee security.
Also, fingerprints or few-digit PINs should only be used as casual deterrents to people grabbing and doing something with your phone before you can get to it. They should not be used to encrypt sensitive data. Good security comes from good, high-entropy passwords.
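As a rough illustration of that entropy gap, assuming uniformly random secrets:

```python
import math


def entropy_bits(alphabet_size: int, length: int) -> float:
    """Entropy of a uniformly random secret: length * log2(alphabet_size)."""
    return length * math.log2(alphabet_size)


print(f"4-digit PIN:           {entropy_bits(10, 4):.1f} bits")    # ~13.3
print(f"6-digit PIN:           {entropy_bits(10, 6):.1f} bits")    # ~19.9
print(f"8-char alphanumeric:   {entropy_bits(62, 8):.1f} bits")    # ~47.6
print(f"6-word Diceware-style: {entropy_bits(7776, 6):.1f} bits")  # ~77.5
```

A 4-digit PIN is brute-forceable offline in a fraction of a second, which is exactly why it should only ever gate access to a device, not serve as the encryption secret itself.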
That list is pure genius. I work with reporters mostly in non-OECD countries, and 80% of the stuff on that list represents real, largely unmet needs. Signal isn't perfect, but it works fine, thanks.
A special place in hell should also be reserved for those in the tech world that mix up security (a real and critical need that people have) and their software politics. If you don't like Microsoft, Google, Adobe, "the Cloud", or smart phones, that's fine. There are plenty of reasons. But don't short circuit security assessments in the interest of promoting technology you find politically superior.
Signal is honestly pretty close. The "failures" are usually design choices that would only affect actual infosec people, which the developers are explicitly not trying to court.
I disagree. Sitting on a phone and sweet-talking sources isn't high-tech, and neither is bundling up five tweets to make clickbait.
But as soon as you're talking about doing public records research, tracking the activities and ownership of a multi-national company, working in a team with people from 30 countries on a single story, try to manage your confidential evidence across five years, it becomes quite technical.
Think about the "free" apps we're all using that are on that list, and wonder what Google and Apple are doing with the personal information in there.
Calendaring system: Google Calendar, iCal
Contacts databases: GMail Contacts, Apple Contacts
Location coordination tools: Google Latitude, Find My Friends.
Some of those programs don't exist, as far as I'm aware.
A Wiki/Etherpad/Word style change tracking mashup, and Map-based storytelling and analysis systems are two that I'd like to see implemented.
I think matrix forms a nice basis for a secure chat platform. Only the apps are not really all that good. The desktop apps are electron based and the android app is just not up to par with telegram or WhatsApp.
In a lot of regards, the "not up to par" part of riot.im or matrix is the lack of a contact list (or simulated equivalent with same UX). IMHO it's the usability problem #1, and the reason for our team not to switch yet. The issue[0] should receive more attention.
I don't know if it's an app, server, or protocol problem, but the last time I evaluated Matrix (Sep 19, based on logs) as a possible Telegram/WhatsApp/Skype/Discord replacement, there were some weird issues with "Unable to decrypt: The sender's device has not sent us the keys for this message" even after both parties verified fingerprints. I used a desktop/Electron Riot client; the other party used the web version.
Restarting Riot helped me see the new messages, but not the old ones (which feels like correct behavior with regard to PFS). The actual problem is that the other party didn't know I had issues receiving their messages until I contacted them out-of-band: the conversation looked normal from their side (except for me writing "test test" and not replying meaningfully).
We ended up deciding to try Matrix again sometime later. Like, next year.
this isn't strictly true - there's a new generation of desktop apps which are native Qt and looking really nice, e.g. https://github.com/mujx/nheko. In terms of the Android app, we're working on it as fast as we can (but it's a one-man team atm).
Bandwidth and local storage considerations are missing from this. All of those features are great, but if my decentralized application uses all of my available bandwidth and disk space, then it is not very great.
This is very relevant because decentralization means all that centralized storage and bandwidth needs to be split up and amplified to account for offline users.
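A back-of-the-envelope sketch of that amplification (all numbers here are hypothetical, chosen only to show the multiplier effect):

```python
def replicated_storage_gb(users: int, msgs_per_day: int, msg_kb: float,
                          replication: int, days: int) -> float:
    """Total network-wide storage when every message is held by
    `replication` peers so offline users can still sync later."""
    total_kb = users * msgs_per_day * msg_kb * replication * days
    return total_kb / 1e6  # KB -> GB


# Hypothetical: 10k users, 50 msgs/day, 2 KB each, kept for a year.
centralized = replicated_storage_gb(10_000, 50, 2, replication=1, days=365)
decentralized = replicated_storage_gb(10_000, 50, 2, replication=5, days=365)
print(centralized, decentralized)  # 365.0 GB vs 1825.0 GB
```

The replication factor is a straight multiplier on every peer's storage and bandwidth bill, which is exactly the cost the parent comment is pointing at.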
> Because we have too many other tools we also need.
Yes, but secure messaging is a well-understood domain, I suppose, so it's the kind of app that is simple (not easy) to write, and most tools compete on UX. A lot of the tools on the list at the end have complex domains that are not well understood by everyday developers, or they might require a lot of R&D, or actually talking to teams that have a problem to solve.
Furthermore, it is extremely difficult to market a lot of these tools to corporations. IM? That's fairly easy.
The situation with secure messaging apps today isn't so different from social networks in the 2000s. It's a formula, and it's easier to raise capital on that formula at the moment.
I can't help but get an embrace, extend, extinguish vibe from keybase. It was a nifty key discovery service initially, but more and more feels like a closed platform, and gives me a sense of deja vu akin to XMPP on GTalk.
I get what you're saying, but realistically PGP will never cross the chasm, so there was not really a viable business there. I like that they have integrated it, but are moving in a direction that lowers the barrier to entry. There are risks to be sure, but it feels like the most promising of the secure messaging platforms.
It's also worth noting that there are Keybase alternatives in the works, e.g. Linked Identities [0] for proofs about social accounts (Github, Twitter etc.) that just utilizes OpenPGP, and Key Transparency [1] for tracking changes in keys.
I have to agree — I was excited about Keybase, but my interest vanished as soon as they became yet another place to have an account rather than being a way to discover and semi-trust existing identities.
If you really cared about the author's credentials, then why didn't you even bother to do the most basic of Google searches before coming here to complain about not knowing immediately who the author was? I mean you didn't even get the author's gender right. What exactly are you contributing to the conversation here?
This might be something more colloquial to Australians than [wherever you "guys" are posting from], but to me when I see someone referring to 'guys' (as in, "hey guys!", or, "you know, those guys") it isn't always gender specific. You can go up to a group of women and say "hi guys!".
I'm going to gloss over the part where OP then referred to the author as "he", but I just feel that the word "guy" is more and more moving away from meaning "men" and instead just "people".
I'll leave that two cent coin on the table for you guys
Sure, that’s often the case, and I would even have given the usage the benefit of the doubt but then, as you noted, OP said “he”.
But in fact “guy” means “man or woman” the way “he” can mean “man or woman”, or how someone will say “ethnic” to mean “non-white”. It doesn’t mean the sayer is a sexist/racist/jerk/etc., but it does subtly and casually reinforce a way of looking at the world. So it's worth correcting without making a huge thing of it.
And yes I’m an Aussie (...and as I was continually reminded as a kid, and sometime still am, I’m a wog too)