Be careful of taking large percentages of small numbers. Right above the quote you listed, the Nokia threat intelligence whitepaper says:
> In 2018 the average monthly infection rate in mobile networks was 0.31%. This means that in any given month, one out of every 300 mobile devices had a high threat level malware infection.[0]
Let's assume that sideloading is responsible for literally everything happening on Android (it's not, but let's assume it is). We're talking about a reduction of <0.5% of current devices. I don't think that's a high enough number to justify getting rid of a fundamental user right.
I'm getting my numbers from some press releases[1], and from Google's 2018 security report for Android[2]. Google reports:
> In contrast, 0.68% of devices that installed apps from outside of Google Play were affected by one or more PHAs in 2018. While this number is 8 times higher than devices that exclusively used Google Play, it’s a noticeable improvement from 0.80% in 2017.
So even when looking purely at devices that allow sideloading (assuming that everyone who sideloads on Android is doing so unwittingly and is the victim of phishing, which, again, isn't the case), we still get a possible savings of ~0.6% of current Android devices.
Is it worth allowing Apple to destroy the entire games streaming market on iOS to save 0.5-0.6% of devices (approximately 1 in 200 devices)? Is protecting 1 in 200 devices worth allowing Apple to be anti-competitive towards music streaming platforms like Spotify? No, probably not -- especially since user education around the risks of sideloading means that at least some of those users are already making an educated choice about their own personal security risks.
> we already have two decades of history of malware on Windows and Android to show us what happens when you expose non-technical users to a highly popular, but unrestricted operating system.
We also have two decades of the web showing us that sandboxing untrusted code is a viable model for application distribution. It's not an accident that the web won as an application runtime/distribution platform for most people, and it's definitely not an accident that the web is one of the few platforms where end-users generally trust themselves to execute hundreds of blobs of unverified code per person every single day.
Additionally, we're seeing data that suggests platforms like Android and Windows are becoming more secure despite the fact that they allow sideloading. So clearly there are gains to be made in this area beyond just getting rid of user rights.
> I'm not opposed to the idea of adding some sort of "developer mode" that allows advanced users to load third-party binaries after some very strict and specific warnings, so people who really know what they're doing can use it.
I think it's kind of a jump to assume that this isn't something that's mostly already happening on platforms like Android. It is very difficult to accidentally sideload an Android app unless you ignore security warnings.
And there's also a kind of double-standard here. We're assuming that every general user who buys an iPhone is doing so because they understand the underlying security model and are comfortable giving up their freedom in exchange for security. But we're not assuming that people who go through warnings to sideload apps are doing so with the understanding that there are security risks. Why is that?
We get into some uncomfortable questions about protecting users against their consent. If it could be shown that the majority of people sideloading today have no idea of the risk they're getting into, that would be something. But I'm uncomfortable assuming that. I'm uncomfortable looking at outcomes this small and saying that obviously those users need to be protected from themselves.
And I just don't buy your arguments around user education. It is possible to train people to be more secure, especially around well-defined boundaries like sideloading. The point of sandboxing and user-controlled permissions is to make it clear what developers are doing under the hood, because 'abusing legitimate APIs' is a subjective call that different users will have different standards for. Obviously there's more work to be done there, but platforms like Android, the web, and even iOS[3] are proving that users can be educated about topics like privacy and malware. I mean, even macOS allows users to disable Gatekeeper and (in most cases) bypass the store for app distribution. Do we think that's a giant security risk?
Again, perfection is not the goal. If we're talking about an extra 1 in 200 devices getting infected with malware, and it's not particularly complicated for high-risk targets, companies, and even nontechnical users to completely avoid that extra risk, and we have pretty good evidence that we can get that number even lower without taking away user rights, then I just don't see a compelling reason to take away user rights.
> We're talking about a reduction of <0.5% of current devices.
You're trying to use this number to downplay the severity of the malware problem on Android, but you need to be careful with the interpretation of this number. It's a rolling snapshot, not a measure of total devices affected.
What that means is that if you get infected this month and fix your phone, I get infected next month and fix mine, and a third and fourth person each do the same in the months after that, the snapshot only ever captures 1/4 of the total infections, even though all four of us got infected in the end.
What we really need is a metric of how many users are infected by at least one piece of malware during their ownership of the device.
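As a rough illustration of how far apart these metrics can be (assuming, hypothetically, that an infection lasts about a month), the monthly snapshot only bounds the yearly total:

```python
# Back-of-the-envelope bounds on yearly incidence from a monthly snapshot.
# Assumption (hypothetical): an infection lasts roughly one month.
monthly_rate = 0.0031  # Nokia's 0.31% average monthly infection rate

# Lower bound: the same devices are infected every single month.
lower_bound = monthly_rate

# Upper bound: each month's 0.31% is a fresh set of devices.
upper_bound = 1 - (1 - monthly_rate) ** 12

print(f"yearly incidence between {lower_bound:.2%} and {upper_bound:.2%}")
```

Depending on how sticky infections are, the same 0.31% snapshot is consistent with anywhere from ~0.3% to ~3.7% of devices being infected at some point during the year.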
Edit: I looked around and couldn't find this metric exactly, however I did find several even larger malware attacks that have individually infected way more than 0.5% of devices, which leads me to conclude the 0.5% number is extremely misleading.
Is it worth having a strictly controlled review and install process in order to help prevent hundreds of millions of malware infections on your phone, the most important device in most people's pockets that contains all their messages, emails, photos, location history, health data, etc.? I believe so.
> I don't think that's a high enough number to justify getting rid of a fundamental user right.
I take issue with framing this as a "fundamental user right". If you want to execute unapproved code on the iPhone you already have multiple options, such as using the standard developer SDKs or jailbreaking. What you are claiming is a "fundamental user right" is actually the right for third-party developers to distribute unvetted binaries for installation using platform-sanctioned infrastructure. I think it's a huge stretch to call that a "fundamental user right".
(Granted, I also think calling gun ownership a "fundamental right" is completely and utterly ridiculous, but different people have different opinions on what is truly fundamental.)
> > In contrast, 0.68% of devices that installed apps from outside of Google Play were affected by one or more PHAs in 2018. While this number is 8 times higher than devices that exclusively used Google Play, it’s a noticeable improvement from 0.80% in 2017.
So Google's own statistics say devices that use side-loading have an 8x higher risk of malware. That is significant.
> We also have two decades of the web showing us that sandboxing untrusted code is a viable model for application distribution.
I don't think it's fair to compare the two as browser sandboxing is significantly more restrictive than app sandboxing. Sure, if we restricted apps to the same degree that we restrict the browser, that would definitely improve security, at the cost of functionality.
> Additionally, we're seeing data that suggests platforms like Android and Windows are becoming more secure despite the fact that they allow sideloading.
Yes, because they've intentionally made side-loading more difficult with every release, which means fewer people are doing it, which reduces the attack vector.
> But we're not assuming that people who go through warnings to sideload apps are doing so with the understanding that there are security risks. Why is that?
Because we literally saw what happened when Epic attempted to release their app outside the Google Play Store. Non-technical users went ahead and checked the box to allow side-loading because they wanted to play Fortnite. Then they ended up downloading fake Fortnite APKs because they didn't know where to get the real one.
You're acting as if these risks are hypothetical when we've already seen this same story play out over and over again.
> And I just don't buy your arguments around user education.
I'm not sure you actually understood this argument. Consider an app that requests access to your contacts for a legitimate purpose (like messaging your friends), then secretly decides to store and transmit that data for a malicious purpose (like selling your contacts to third parties). No amount of sandboxing, education, or permissions management will prevent this kind of privacy abuse.
> I mean, even macOS allows users to disable Gatekeeper and (in most cases) bypass the store for app distribution. Do we think that's a giant security risk?
Yes, of course it is. macOS has a worse malware history than iOS.
> It's a rolling snapshot, not a measure of total devices affected.
The Google numbers I list are not monthly rolling measures, they're for the entirety of 2018. They're also specific to users who sideload. So it's not that 0.68 percent of Android users downloaded malware in 2018, it's that of the subset of devices that actively sideloaded apps, ~2/300 ended up encountering malware at some point during the year.
And this ends up mattering because it means that you can almost entirely eliminate that risk by just deciding for yourself whether or not you want to sideload.
> devices that use side-loading have an 8x higher risk of malware. That is significant.
An 8x increase that still results in less than a 1% risk over an entire year. The context matters: we are talking about extremely small numbers. The current numbers mean that if you own an Android device for 6 years and you regularly sideload applications every single year, you have about a 4% chance of getting infected during that time. And this is assuming that nothing else changes to make sideloading more secure, that none of the education measures work, and that you don't sideload one or two important apps and then just turn the feature off.
When you only focus on the percentage change, you miss the bigger picture of what the malware risks actually are for phones. 4% is a number we would like to be lower. We always want the number to be lower. But not at the cost of an entire market. That 4% needs to be stacked against the costs of market capture and anti-competitive behavior.
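For transparency, here's the arithmetic behind the 6-year figure, treating Google's 0.68% as an independent per-year risk (a simplifying assumption):

```python
# Cumulative risk over 6 years of ownership, assuming the per-year risk
# for a device that sideloads stays at Google's reported 0.68% and each
# year is independent (a simplifying assumption).
annual_risk = 0.0068
years = 6

cumulative_risk = 1 - (1 - annual_risk) ** years
print(f"{cumulative_risk:.1%}")  # roughly 4%
```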
Quick sidenote: I don't think it's that hard to explain the numbers you're seeing online. There are almost 2.5 billion Android devices in use globally. 200 million of 2.5 billion is a little less than 1 percent. I could easily see factors like repeat infections driving that number lower (Google only counts infected devices, not the number of infections per device). Those numbers are surprising to me in that they might indicate that a lot more people are sideloading than I expected. But even that is balanced out by the fact that the majority of these cases aren't exclusive to sideloaded apps; they also made their way onto official app stores.
I'm definitely interested in hearing more about them, but I'm not looking at these numbers and thinking, "Google's official security reports are lying."
> I don't think it's fair to compare the two as browser sandboxing is significantly more restrictive than app sandboxing. Sure, if we restricted apps to the same degree that we restrict the browser, that would definitely improve security, at the cost of functionality.
If we want to go down this route, iOS is also fundamentally more restrictive than Android. Android has a permission that allows apps to just directly read sections of the SD card. I think that's a stupid permission for Android to have, and I would hazard that the malware numbers you're looking at would be lower if Android didn't have all of this crap. I shouldn't need to give a photo application access to my SD card just to take a picture.
On the subject of the web: yes, the web is more restrictive than native in many ways. But it's rapidly getting less restrictive, and we're now even considering permissions like native file access. That expansion in functionality is happening because we're seeing that sandboxing works. A lot of the legitimate permissions that we're trying to prevent abuse of within native apps (contacts, advertising IDs, location, data-sharing between apps, camera/microphone access) are areas that the web has grappled with and handled, for the most part, adequately.
It's not a perfect comparison -- if the web could do everything native apps could do, nobody would be writing native apps. But the growth of the web as a platform still suggests that sandboxing is something we should be taking very seriously.
> Yes, because they've intentionally made side-loading more difficult with every release, which means fewer people are doing it, which reduces the attack vector.
Reread that. Google saw a 15% reduction in malware among phones that sideload apps. Not overall across the entire ecosystem, among the people doing the behavior you think is too risky for them to do. We can improve the malware stats among people who sideload.
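To spell out where the 15% comes from, it's the relative year-over-year drop in Google's own numbers:

```python
# Relative drop in the malware rate among devices that sideload,
# from Google's reported 0.80% (2017) down to 0.68% (2018).
rate_2017 = 0.0080
rate_2018 = 0.0068

relative_drop = (rate_2017 - rate_2018) / rate_2017
print(f"{relative_drop:.0%}")  # 15%
```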
> Because we literally saw what happened when Epic attempted to release their app outside the Google Play Store.
What's our position on cherry-picking again?
More importantly, what's our basis for saying that when people clicked the checkbox and said, "I understand the risks, I still want to take those risks so I can get Fortnite", that was an accident or that they didn't understand what they were risking?
It is possible for someone to do something risky and get malware even though they generally understood the risks. And to get back to what I'm talking about with consent, I am uncomfortable with the idea that we need to go to people and tell them what risks they are and aren't allowed to take. If we believe that everyone who buys an iPhone is doing so because they are consciously balancing their security/freedom, why do we throw that philosophy out the window when someone makes a conscious decision to sideload an app? Not every user is going to have the same risk tolerance, and it's fine for users to have different degrees of risk tolerance.
> Consider an app that requests access to your contacts for a legitimate purpose (like messaging your friends), then secretly decides to store and transmit that data for a malicious purpose (like selling your contacts to third parties). No amount of sandboxing, education, or permissions management will prevent this kind of privacy abuse.
No amount of anything will stop that privacy abuse other than extensive corporate auditing, which nobody (including Apple) is prepared to do. Apple can't prevent an app from secretly selling your data, it can only ban the app after the fact. And once it becomes public knowledge which apps are selling your data, then education and permissions management starts to matter again.
The only preemptive thing we can do is to make it obvious when apps are transmitting data and to what location. We can also train users to stick to commercially vetted apps and to do a little bit of research to figure out whether a company seems sleazy, or if they've popped up out of nowhere. But that's the most we can do. Apple's moderation team doesn't have any kind of magical ability to tell what I'm doing with user data once I've gotten it onto my servers.
> such as using the standard developer SDKs or jailbreaking
I wonder, back when Apple was arguing that distributing jailbreaks for iOS should be illegal, did they have any idea that it would someday be a core argument as to why they weren't actually suppressing user rights?
If you don't think that the user right to decide what code runs on their own devices is a fundamental right, then that might just be a disagreement we have. I think it is a fundamental right, and I don't think that the developer SDKs or the constantly shifting jailbreaking communities satisfy that right. But if you disagree with me on that, then we disagree, that's fine. There's no short argument I can come up with as to why you should believe it's a fundamental right.
> Yes, of course it is. macOS has a worse malware history than iOS.
To that point, usually people don't try to argue that sideloading should be removed from desktop computers. It's an interesting and kind of troubling shift to see this argument popping up now. You're not the first person to suggest it, but I'm still always surprised when I see it. What would the computing industry look like today if early Windows/Macs had only been able to run authorized software?
> The Google numbers I list are not monthly rolling measures, they're for the entirety of 2018.
Fair enough, I was referring to the "average monthly infection rate" from the text you quoted.
However, I am having trouble reconciling Google's numbers with the numbers from other reports. For example, Kaspersky's mobile malware evolution report (https://securelist.com/mobile-malware-evolution-2019/96280/) says 13.89% of users in the United States were attacked by mobile malware in 2019. The number is as high as 60% for Iran.
> 200 million of 2.5 billion is a little less than 1 percent.
That's 8%. I don't understand how Google can say, in the same report, that 199 million devices were infected by a single piece of malware but only a maximum of 0.68% of devices were affected. Something doesn't add up.
(I'll address your other points when I have more free time.)
> 13.89% of users in the United States were attacked by mobile malware in 2019. The number is as high as 60% for Iran.
In fairness, if the actual numbers in some smartphone markets are genuinely as high as 60% of Android users/devices infected, then... yeah. In that case, I'm underestimating the impact and it's worth thinking more about whether or not the security impact is too high for us to naively allow sideloading -- at least without building much better UX or safety measures around it.
That's a number that's high enough where it does make sense to take a step back and think about the security costs and move very cautiously. I mean, heck, to go all the way back to the original argument, if 1 in 10 people were being killed by murderers in a year, I'd be somewhat inclined to take law enforcement arguments about banning encryption more seriously.
At the same time, that number is very surprising to me and I'm kind of suspicious of it. Even the US numbers, I would be pretty surprised to find out that 1 in 10 Android devices is infected, because I'm not sure I would guess that as many as 1 in 10 Android users actually sideload apps.
I almost wonder if different reports have different definitions of malware or something.
> That's 8%.
Good catch, I am bad at counting zeros. I think I must have done the math with 20 million instead of 200 million. 8% is also a number where I start to think something is weird.
I assume that Google isn't lying, but there's a factor there I don't understand. Unless the average infected phone is getting infected 8-16 times in a row, I'm having trouble thinking about how those numbers reconcile.
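The arithmetic behind that guess, using the numbers as cited in this thread (the ~2.5 billion active-device count is an approximation from earlier):

```python
# Rough reconciliation check: the 199M infected-device figure vs.
# Google's reported 0.68% rate. The ~2.5B active-device count is an
# approximation from earlier in the thread.
reported_infections = 199e6
active_devices = 2.5e9
reported_rate = 0.0068

share_of_devices = reported_infections / active_devices  # ~8% of devices
ratio = share_of_devices / reported_rate                 # implied gap, ~12x

print(f"{share_of_devices:.1%} of devices, ~{ratio:.0f}x the 0.68% rate")
```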
Ideological differences aside, these are interesting numbers.
I've been trying to figure out these Google numbers and they just don't make sense to me. In August 2019 a cluster of apps in the Google Play Store with over 100 million total installs were discovered to contain a trojan (https://news.drweb.com/show/?i=13382&lng=en). I would expect the detection and removal of such a large cluster of malware to be reflected in Google's PHA dashboard (https://transparencyreport.google.com/android-security/overv...), but there's barely any change in August. Which leaves me wondering what exactly they are measuring.
Other points I wanted to address:
1. I don't think it's cherry picking to point out that fake Fortnite APKs are the inevitable consequence of Epic choosing to distribute Fortnite outside the Play Store. I expect this will be a problem with every popular app that decides to go fully off-store.
2. I also don't think it's likely that the people falling for these fake APKs are making a knowing decision to accept the risk of side-loading. I think it's more likely they just don't have the expertise to understand what is the correct place to download it, and they're getting lured in by the promise of free V-bucks or whatever. I mean, yes, ultimately they made that choice to check that box, but it seems a bit like handing a toddler a loaded weapon and then being surprised at what happens next.
3. I agree that we can't stop all privacy abuse, but I think the review process provides a useful deterrent that otherwise wouldn't exist if every developer was doing their own distribution and had no review guidelines to adhere to at all. If you compare the incidence of malicious apps distributed via the Play Store with the incidence on the App Store, I also think there's a clear indication of the benefit of the review-first model over the publish-first model.
Mystery partially solved: Google's security report is based on the data from Google Play Protect, which apparently has the worst performance among malware detection tools in the industry (https://www.tomsguide.com/reviews/google-play-protect). A recent evaluation by an independent institute found that Google Play Protect only managed to detect a third of the 6,700 malware samples in the test, compared to ~99% from security companies like AVG, Trend Micro, and Kaspersky (https://www.av-test.org/en/news/here-s-how-well-17-android-s...).
Based on this, I don't think the numbers coming from Google can be considered reliable. It seems the reason their numbers are so low is because they simply aren't detecting a large chunk of the malware that is being distributed on Android.
[0]: https://onestore.nokia.com/asset/205835
[1]: https://www.zdnet.com/article/google-newer-android-versions-...
[2]: https://source.android.com/security/reports/Google_Android_S...
[3]: https://arstechnica.com/tech-policy/2020/08/ios-14-privacy-s...