benologist's comments | Hacker News

Buying your neighbor's surrounding houses for privacy...


For months now I have been using youtube-dl to preserve all the concerts and workout videos I like on YouTube because I don't expect this content to remain available and unencumbered. My original plan was to store just the IDs on GitHub so I could recreate the video libraries without backing up the video files, but already some of the concerts have been removed. I expect the exercise videos will be hidden behind a subscription at some point since they provide utility.
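That ID-first workflow can be sketched roughly as follows. This is a dry-run sketch assuming youtube-dl is installed; the playlist URL and video IDs are placeholders, and the download loop uses echo so nothing is actually fetched:

```shell
# 1) Archive only the IDs of a playlist (network step, placeholder URL):
#    youtube-dl --flat-playlist --get-id 'https://www.youtube.com/playlist?list=...' > ids.txt

# 2) Simulate a previously saved ids.txt with placeholder IDs:
printf '%s\n' 'dQw4w9WgXcQ' 'abc123XYZ_-' > ids.txt

# 3) Recreate the library later: one download per saved ID.
#    (echo makes this a dry run; drop it to actually download)
while read -r id; do
  echo youtube-dl -o '%(title)s-%(id)s.%(ext)s' -- "$id"
done < ids.txt > commands.txt

cat commands.txt
```

The `--` before the ID matters, since YouTube IDs can start with a hyphen and would otherwise be parsed as flags.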


About 400 million people now pay monthly for access to music instead of listening to the radio, occasionally buying albums, or pirating, so idk if anything really backfired...


I would argue this happened in spite of their efforts. The music industry was dragged kicking and screaming into the streaming world for a variety of reasons, among them the fact that with centralized platforms, artists have no reason to use labels for "discovery" purposes any longer.

It's long been known that the way to defeat piracy is to offer a better service. It's what Steam did; it's what Netflix did.

Ironically, the fragmentation of those ecosystems is bringing it back. Piracy is the ultimate invisible hand on the entertainment distribution industry.


The commercial streaming solutions are only going to be good until they're the only game in town. Commercial entities don't simply build good things and leave them intact for people to benefit from with no profit motive. Once the leverage is there, there will be abusive monetization, consumers will be treated like crap, until perhaps some breaking point where it can be disrupted.

How would streaming become the only game in town? Piracy and any other online mischief are being stamped out the hard way, by taking control of our technology from us. Approved operating system, approved drivers, approved software, approved browser, approved websites, no piracy, no privacy, no deviance.


Labels were never about discovery. They were about promotion, production, and all of the expensive stuff you need at scale.

For example, Justin Bieber, Lorde, Billie Eilish, and Lil Nas X were all discovered independently due to their own efforts and had decent success on their own. (Farther back in time, the Beatles and most classic rock bands similarly got signed to labels after demonstrating success.)

But they're all signed to major labels now, because touring is expensive, and the scale of exposure you get with a label is very different from what you get on your own, and the income correspondingly increases as well.

In many (but not all) cases, the artists also get lump-sum advances against new albums or singles, which removes the financial risk of creating new music.

Piracy was never about discovery. It was simply about people being too cheap to pay for other people's work. Sometimes, as with Adobe and Microsoft, they were okay with it because that just locked in their market dominance and created more future customers. But for fad-driven and taste-driven industries, privacy has a notable impact on creator's earnings.


Depending on the artist and label it's a giant question mark whether the label has any hand in the live performances or tours. Touring has always been expensive (and lucrative), but the agencies that deal with it traditionally haven't been labels.

The traditional business model of a record label is almost entirely obsolete, and they need new revenue streams. That said, the people who run these businesses are antithetical to innovation and creativity in business and that's why industry groups like the RIAA exist in the first place.


> But for fad-driven and taste-driven industries, privacy [sic] has a notable impact on creator's earnings.

Do you have any data to back that claim that doesn't rest on the ridiculous and false assertion that 1 download = 1 lost sale?

Because from what I've read, piracy has the opposite effect.


If you look at a graph of music industry revenue, it peaks in 2000, collapses until 2015 (at ½-⅓ its peak value), and has been on a strong uptick since then. Peak apoplexy about the impact of piracy was around 2005, and there was a strong desire to tie the collapse in revenue directly to piracy.

From what I can tell, most of the studies establishing a clear economic cost to piracy tend to rely on numbers originating from the music industry without clear attribution to data, or by naïve analyses estimating that 1 download = X lost sales, without considering effects like budgetary limits (I'll spend at most $X on entertainment this year) or conversions to profitable sales.

Additionally, there are far fewer studies on piracy after the revenue nadir. It seems as if the rise of streaming has caused the industry to stop panicking so much about piracy, and there's some evidence that streaming has converted consumers into paying customers.

All in all, I would say that it's not so much that piracy hurt the industry as that piracy served a market segment the industry refused to serve. Piracy only hurt the industry in the same way digital cameras hurt Kodak: the existing business model was unsustainable, but they refused to pivot to take advantage of the clear coming shift in the business model.


I'm not claiming that 1 download = 1 lost sale.

But it's well-documented that piracy negatively affects music, film, and game studio income. See, e.g., https://www.ipi.org/ipi_issues/detail/the-true-cost-of-sound...

Piracy may have the opposite effect for software, but it definitely has a negative effect on entertainment-related IP.


(This user was lying, and their link is, as 'dredmorbius notes, a thinktank funded by Exxon and the Koch brothers.)

https://www.engadget.com/2017-09-22-eu-suppressed-study-pira...

https://gizmodo.com/the-eu-suppressed-a-300-page-study-that-...


> Piracy may have the opposite effect for software but it definitely has a negative effect on entertainment related IP.

High school events where I grew up were basically an iMac with the student body officers' MP3 collections and a PA system. All the pirated MP3s people were playing at those various official and unofficial gatherings of my youth led to me buying CDs once I had money of my own.

Has a non-industry-affiliated research group produced causal data (not just declining sales figures) showing that noncommercial entertainment piracy is a net harm?


If you read the fine print in that study, it computes the cost of piracy purely in terms of 1 download = 0.2 lost sales. There's not even a discussion of potential word-of-mouth effects or later sales that pirated music might generate.


What if they lied about how much music was worth, though? You can stream 10,000+ songs in one month for the price of one album today; in the early 2000s we were asked to believe that was $10,000+ worth of music, and their suffering was relative to that...


The Institute for Policy Innovation (IPI) is a think tank based in Irving, Texas and founded in 1987 by Congressman Dick Armey to "research, develop and promote innovative and non-partisan solutions to today's public policy problems."[1]

IPI is an associate member of the State Policy Network (SPN), a network of right-wing "think tanks" and other non profits spanning 49 states, D.C., and Puerto Rico.[2]

The conservative Capital Research Center ranked IPI as amongst the most conservative groups in the US, scoring it as an "eight" on a scale of one to eight.[3] IPI has received funding from corporations like Exxon Mobil and organizations like the Kochs' Claude R. Lambe Foundation, Scaife Foundations, the Bradley Foundation and others....

https://www.sourcewatch.org/index.php?title=Institute_for_Po...


This NPM cache has been a huge time-saver for me; you can run it locally or across your whole network as a shared cache -

https://guides.sonatype.com/repo3/quick-start-guides/proxyin...

https://hub.docker.com/r/sonatype/nexus/
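Once Nexus is running on its default port (8081) and you've created an npm proxy repository through its UI, the only client-side change is one registry line. The repository name "npm-proxy" below is an assumption; use whatever name you gave the proxy repository:

```ini
# ~/.npmrc - route npm installs through the local Nexus proxy cache
# ("npm-proxy" is a placeholder for the repository name you created in the Nexus UI)
registry=http://localhost:8081/repository/npm-proxy/
```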


I host mine too, because I'm not interested in paying ~$100/year for the rest of my life to access music; this is like welfare for tech giants and record companies, and very little even trickles down to musicians. I got a Synology NAS, and these days I just run the Synology operating system in a virtual machine. It's accessible via web UI or mobile apps and backs up to the cloud; this has served me very well for years now.


I wish the technology existed so they could just correctly bill you when you get back to your car and you drive away with a receipt instead of a fine... the idea that you have done something wrong and you deserve a financial punishment for it is ludicrous.


In much of NYC, parking is free and it's cheaper to eat the ticket for skipping alternate-side parking than it is to long-term park your car in the closest lot.


This is in the same vein as the trial services that start billing you if you forget to cancel. I'd like to see the model for parking but I bet it turns out it contributes a lot to the bottom line so they don't want to change it.


Copenhagen (among other places) has a pretty convenient app-based model for parking. You use one of a few competing apps on your phone and enter your car's registration plate. When you park you open the app and it pulls location and starts a timer. You pick an amount of time you'll be parked but the apps I've used have reminders and you can trivially extend parking as you feel appropriate.

When you get back in your car you open the app and 'stop parking' and the app calculates the cost of the time you were parked (taking into account varying overnight and congestion zone rates). At the end of the month you get billed for that month's charges. You can also have multiple personas if you have a work vehicle/card.

In parts of south Florida there's a somewhat similar system where there's only one app you can use and money is taken immediately for the duration of the time you choose (with a small service fee included, of course). If you move your car before the expiration of your purchased time, you lose that value unless you repark in the same area [1]. The similarity is that the app will remind you that your parking is going to expire soon and you can simply extend (and pay that small service fee again, of course). I've seen this prepay-via-app model at various private parking facilities in other US cities as well.

[1]: The system uses number codes for the area you're parked in, so you don't necessarily need location services (e.g., in a parking garage); the code changes every couple of streets.


Same. This recharger + 4 battery set has been invaluable with my remote.

https://www.amazon.com/Lithium-Battery-Charger-Rechargeable-...


WHAT!!!!???? I did not know this existed! Thank you.


I have maintained my fitness and saved so much money by switching to videos on YouTube. I think this stuff should at most be a category on Netflix etc.; it's not worth $9.99 separately. The trainers are all good; the only thing missing is being able to superimpose your Watch metrics when you want.


You can block a lot of apps individually from accessing the internet with NetGuard -

https://f-droid.org/en/packages/eu.faircode.netguard/

https://github.com/M66B/NetGuard


This, plus disabling Google apps. Surprisingly, not much of a difference in functionality; my data usage is low, and I downloaded local maps on OSM. Motorola something.


In real life on the iPhone we have had apps secretly uploading your address book, copying your clipboard, and listening for tones embedded in television ads. And "The Fappening," where many people's private photos were leaked.


If that's what happens when you've got hundreds of experts working to prevent it, then why do you think it'll be less of a problem when it's random non-experts?

edit: The imperfection of the current system does not prove that another option is better.


> hundreds of experts working to prevent it

Every major operating system has this regardless of whether they force you to download software through one marketplace. You're not less-secure if you use two marketplaces as long as both those marketplaces are kept secure. iOS is kept secure independently of the App Store as well.


In real life we also have murders and kidnappings, that just means no system is perfect. It certainly doesn't mean there's no point in having law enforcement.


Sure, but think very carefully about whether or not you actually want me to compare Apple to law enforcement. My feeling is that a different analogy would better suit your argument. Is your intention really to make me think about government 'security' talking points around encryption and terrorism?

In real life, if someone told me that murders and kidnappings were a good reason for the government to have absolute control over what computer applications are allowed to be built or what games/media are allowed to be distributed by its citizens, I would call that person an authoritarian.

That's because in real life we balance law enforcement with individual rights. We don't just claim that every single intrusion into people's privacy and autonomy is necessary because otherwise the murderers would come. We also view certain freedoms as inalienable -- we believe that protecting those freedoms is just universally more important than preventing murderers. In fact, many people believe that some degree of difficulty and inexactness and imperfection in law enforcement is necessary for the furthering of social progress outside of what the government currently believes is acceptable.

In other words, we balance between anarchy and authoritarianism.

In the same way, we don't only have two choices here. There is a middle ground between "only Apple decides what can run on your devices", and "everyone for themselves, forget trying to make anyone secure." We can get better sandboxing, we can learn more UX techniques around warnings, we can improve public education about computers, we can build out device administration tools, we can build very targeted escape hatches that don't turn the OS into a free-for-all. Even beyond that, we can decide that some user freedoms are worth an increase in malware, the same way that we've decided some security gains are worth a decrease in user freedom.

So I'm not really swayed by someone saying that the only way to prevent malware is if Apple/Google ban porn, and decide for users which payment methods they're allowed to use in an app, and decide whether or not online game streaming apps are allowed to enter the market, and decide whether or not serious games like Sweatshop can be considered art, and decide whether or not podcast apps will be allowed to include COVID podcasts in their directories.

At the very least, we could get rid of most of those restrictions, or we could move all of the security checks to a separate layer and allow people to bypass the content restrictions on their own, and none of that would impact device security.

That we want some security checks does not imply that we should never try to balance security with user freedom.


> My feeling is that a different analogy would better suit your argument.

Feel free to pick whatever example you'd like, the underlying point is the same: just because some bad actors will ignore the regulations anyways doesn't mean we shouldn't have the regulations in the first place or the regulations have no net benefit. In other words, pointing to a few counter examples and saying "gotcha! your regulation didn't perfectly prevent everything!" is not a meaningful critique.

> So I'm not really swayed by someone saying that the only way to prevent malware is if Apple/Google ban porn, and decide for users which payment methods they're allowed to use in an app, and decide whether or not online game streaming apps are allowed to enter the market, and decide whether or not serious games like Sweatshop can be considered art, and decide whether or not podcast apps will be allowed to include COVID podcasts in their directories.

I generally agree that these examples are overly restrictive and unnecessary. However, I don't think legally forcing manufacturers to open up their devices to side-loading is the appropriate remedy, because it increases the level of risk from bad actors attempting to exploit those devices.

I also think Hacker News posters have a tendency to underestimate/downplay those risks because as highly technical people they know what to do to avoid those risks - but the same does not apply to the vast majority of users.


We might be arguing past each other. I agree that cherry-picking doesn't mean that a system should be immediately discarded. But in my mind, the point of bringing up individual malware examples is not to say that all regulation is worthless, it's to drive home that perfect security doesn't exist, that we shouldn't be striving for perfect security in the first place, and that the real world is about balancing security with other concerns.

I don't understand what makes your argument different from, "I don't think allowing encryption is a good thing, because it increases the level of risk from terrorists and traffickers." There is no such thing as a malware free world, and saying, "this would increase malware" is not an immediately persuasive argument.

In other words, if your angle is that you're worried about people cherry-picking counter-examples, my angle is that I'm worried about people pointing at every single security restriction and saying it's critically important, regardless of what it costs users.

We're talking about abandoning a fundamental user right. I need to see stronger evidence that the security gain is so large that it justifies getting rid of that right. The reason your comparison to the government stuck out to me is because it's the same faulty reasoning that the government uses all the time to say that any increase in citizen security or rule enforcement is worth pursuing, regardless of what it means for citizen autonomy.

> I also think Hacker News posters have a tendency to underestimate/downplay those risks

What are those risks? You want to get rid of cherry-picking, what kind of change in malware would we be talking about if we got rid of sideloading on Android or introduced it on iOS? The best data I'm seeing online suggests possibly an impact to 0.5% of current devices based on Android statistics, and that's assuming we can't get any other gains from sandboxing and user-education.

Frankly, even assuming that we couldn't reduce that number farther, that's not a number that's big enough to justify abandoning a user's fundamental right to control what code runs on their device. Especially when we have good evidence that in the absence of that right, companies like Apple will both censor and use their power to control the market and target competitors.

> I don't think legally forcing manufacturers to open up their devices to side-loading is the appropriate remedy

I'm open to lots of solutions here, some regulatory and some market-based. We don't need to focus on just sideloading if there are other solutions other people find more palatable (<cough>Repeal the DMCA</cough>).

But even on the topic of sideloading, I'm open to the idea that this doesn't need to be a general regulation. I'm fine with saying that Apple is in a unique position because it's one part of a duopoly, and that we don't have to make a generalized rule for every company just to target Apple/Google specifically. My position isn't necessarily that manufacturers all need to be forced to open up their devices, it's that it might make sense to impose that regulation on companies in a duopoly when it can be demonstrated that they are actively harming the market with their restrictions.

Even regulatory solutions are a balance; regulating an aggressive duopoly is different from regulating an entire market.


> But in my mind, the point of bringing up individual malware examples is not to say that all regulation is worthless, it's to drive home that perfect security doesn't exist, that we shouldn't be striving for perfect security in the first place, and that the real world is about balancing security with other concerns.

I certainly agree that perfect security doesn't exist and we need to balance security with other concerns. However, I believe that a platform with strict controls directly contributes to increased security and privacy on that platform, and those factors are important to me, so the balance is worth the trade off. You are of course free to prioritize other concerns and purchase the device that best fits your concerns.

> There is no such thing as a malware free world, and saying, "this would increase malware" is not an immediately persuasive argument.

It is to me, because (as I said in my original comment in this thread) we already have two decades of history of malware on Windows and Android to show us what happens when you expose non-technical users to a highly popular, but unrestricted operating system.

> What are those risks? You want to get rid of cherry-picking, what kind of change in malware would we be talking about if we got rid of sideloading on Android or introduced it on iOS?

Nokia's latest threat intelligence whitepaper [1] says:

> Among smartphones, Android™ devices are the most commonly targeted by malware. In mobile networks, Android devices were responsible for 47.15% of the observed malware infections, Windows©/ PCs for 35.82%, IoT for 16.17% and iPhones© for less than 1%.

I think the numbers speak for themselves and side-loading is exactly the reason why.

> In 2018 Android based devices are once more the main target in mobile networks. In the smartphone sector, the vast majority of malware is currently distributed as trojanized applications. The user is tricked by phishing, advertising or other social engineering into downloading and installing the application. The main reason that the Android platform is targeted, is the fact that once side-loading is enabled, Android applications can be downloaded from just about anywhere. In contrast, iPhone applications are for the most part limited to one source, the Apple Store.

> The best data I'm seeing online suggests possibly an impact to 0.5% of current devices based on Android statistics,

I'm curious where that number came from? Individual Android malware attacks have affected up to 25 million devices [2], so that number doesn't really make sense to me.

> and that's assuming we can't get any other gains from sandboxing and user-education.

Note that most of the counter-examples in the comment I replied to were examples of developers abusing legitimate APIs. (Except the photo leak, which IIRC was based on a phishing attack.) Sandboxing is great for operating-system-level security but does nothing to help prevent these types of privacy violations, which are enforced via developer guidelines and the review process instead. Protecting privacy cannot merely be treated as a technical problem to be solved via OS-level security restrictions. User education also does not help here, because users have no idea what developers are doing under the hood.

> that's not a number that's big enough to justify abandoning a user's fundamental right to control what code runs on their device.

I'm not opposed to the idea of adding some sort of "developer mode" that allows advanced users to load third-party binaries after some very strict and specific warnings, so people who really know what they're doing can use it. I just think it's a very bad idea for side-loading to become a primary method of app distribution, especially for general users.

[1] https://onestore.nokia.com/asset/205835

[2] https://www.theverge.com/2019/7/10/20688885/agent-smith-andr...


Be careful of taking large percentages of small numbers. Right above the quote you list in the Nokia threat intelligence whitepaper:

> In 2018 the average monthly infection rate in mobile networks was 0.31%. This means that in any given month, one out of every 300 mobile devices had a high threat level malware infection.[0]

Let's assume that sideloading is responsible for literally everything happening on Android (it's not, but let's assume it is). We're talking about a reduction of <0.5% of current devices. I don't think that's a high enough number to justify getting rid of a fundamental user right.

I'm getting my numbers from some press releases[1], and from Google's 2018 security report for Android[2]. Google reports:

> In contrast, 0.68% of devices that installed apps from outside of Google Play were affected by one or more PHAs in 2018. While this number is 8 times higher than devices that exclusively used Google Play, it’s a noticeable improvement from 0.80% in 2017.

So even when looking purely at devices that allow sideloading (assuming that everyone who sideloads on Android is doing so unwittingly and is the victim of phishing, which, again, isn't the case), we still get a possible savings of ~0.6% of current Android devices.
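As a back-of-envelope check on those figures (only the 0.68% rate and the 8x ratio come from Google's report; everything else here is arithmetic):

```python
# PHA rate among Android devices that installed apps from outside
# Google Play in 2018, per Google's security report
sideload_rate = 0.0068
# Google says that's ~8x the rate for Play-only devices
play_only_rate = sideload_rate / 8

# Upper bound on infections avoided if those devices had been Play-only
max_savings = sideload_rate - play_only_rate
print(f"~{max_savings * 100:.2f}% of sideloading devices")
```

So even attributing every excess infection to sideloading itself, the ceiling on the benefit of banning it is roughly 0.6% of the devices that sideload.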

Is it worth allowing Apple to destroy the entire games streaming market on iOS to save 0.5-0.6% of devices (approximately 1 in 200 devices)? Is protecting 1 in 200 devices worth allowing Apple to be anti-competitive towards music streaming platforms like Spotify? No, probably not -- especially since user education around the risks of sideloading means that at least some of those users are already making an educated choice about their own personal security risks.

> we already have two decades of history of malware on Windows and Android to show us what happens when you expose non-technical users to a highly popular, but unrestricted operating system.

We also have two decades of the web showing us that sandboxing untrusted code is a viable model for application distribution. It's not an accident that the web won as an application runtime/distribution platform for most people, and it's definitely not an accident that the web is one of the few platforms where end-users generally trust themselves to execute hundreds of blobs of unverified code per-person every single day.

Additionally, we're seeing data that suggests platforms like Android and Windows are becoming more secure despite the fact that they allow sideloading. So clearly there are gains to be made in this area beyond just getting rid of user rights.

> I'm not opposed to the idea of adding some sort of "developer mode" that allows advanced users to load third-party binaries after some very strict and specific warnings, so people who really know what they're doing can use it.

I think it's kind of a jump to assume that this isn't something that's mostly already happening on platforms like Android. It is very difficult to accidentally sideload an Android app unless you ignore security warnings.

And there's also a kind of double-standard here. We're assuming that every general user who buys an iPhone is doing so because they understand the underlying security model and are comfortable giving up their freedom in exchange for security. But we're not assuming that people who go through warnings to sideload apps are doing so with the understanding that there are security risks. Why is that?

We get into some uncomfortable questions about protecting users against their consent. If it could be shown that the majority of people sideloading today have no idea of the risk they're getting into, that would be something. But I'm uncomfortable assuming that. I'm uncomfortable looking at outcomes this small and saying that obviously those users need to be protected from themselves.

And I just don't buy your arguments around user education. It is possible to train people to be more secure, especially around well-defined boundaries like sideloading. The point of sandboxing and user-controlled permissions is to make it clear what developers are doing under the hood, because 'abusing legitimate APIs' is a subjective call that different users will have different standards for. Obviously there's more work to be done there, but platforms like Android, the web, and even iOS[3] are proving that users can be educated about topics like privacy and malware. I mean, even MacOS allows users to disable Gatekeeper and (in most cases) bypass the store for app distribution. Do we think that's a giant security risk?

Again, perfection is not the goal. If we're talking about an extra 1 in 200 devices getting infected with malware, and it's not particularly complicated for high-risk targets, companies, and even nontechnical users to completely avoid that extra risk, and we have pretty good evidence that we can get that number even lower without taking away user rights, then I just don't see a compelling reason to take away user rights.

[0]: https://onestore.nokia.com/asset/205835

[1]: https://www.zdnet.com/article/google-newer-android-versions-...

[2]: https://source.android.com/security/reports/Google_Android_S...

[3]: https://arstechnica.com/tech-policy/2020/08/ios-14-privacy-s...


> We're talking about a reduction of <0.5% of current devices.

You're trying to use this number to downplay the severity of the malware problem on Android, but you need to be careful with the interpretation of this number. It's a rolling snapshot, not a measure of total devices affected.

What that means is: if you get infected this month and fix your phone, I get infected and fix mine next month, and a third and fourth person do the same in the following two months, any single snapshot only captures 1/4 of the total number of infections, even though all four of us got infected in the end.

What we really need is a metric of how many users are infected by at least one piece of malware during their ownership of the device.
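The gap between the two metrics is easy to illustrate with toy numbers (purely hypothetical: four devices over four months, one infection each, cleaned up before the next snapshot):

```python
# Hypothetical scenario: four devices, each infected in a different month
# and cleaned up before the next monthly snapshot is taken.
devices = 4
infections_per_month = 1

# Any single monthly snapshot sees exactly one infected device:
snapshot_rate = infections_per_month / devices   # 0.25

# But over the whole period, every device was infected at least once:
ever_infected_rate = 4 * infections_per_month / devices   # 1.0

print(snapshot_rate, ever_infected_rate)
```

The monthly snapshot reads 25% while the "ever infected" rate is 100%, which is the sense in which a rolling measure understates cumulative exposure.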

Edit: I looked around and couldn't find this metric exactly, however I did find several even larger malware attacks that have individually infected way more than 0.5% of devices, which leads me to conclude the 0.5% number is extremely misleading.

- SimBad: 150 million (https://www.zdnet.com/article/almost-150-million-users-impac...)

- HummingBad: 85 million (https://www.zdnet.com/article/this-android-malware-has-infec...)

- Chamois: 199 million (https://source.android.com/security/reports/Google_Android_S...)

Is it worth having a strictly controlled review and install process in order to help prevent hundreds of millions of malware infections on your phone, the most important device in most people's pockets that contains all their messages, emails, photos, location history, health data, etc.? I believe so.

> I don't think that's a high enough number to justify getting rid of a fundamental user right.

I take issue with framing this as a "fundamental user right". If you want to execute unapproved code on the iPhone you already have multiple options, such as using the standard developer SDKs or jailbreaking. What you are claiming is a "fundamental user right" is actually the right for third-party developers to distribute unvetted binaries for installation using platform-sanctioned infrastructure. I think it's a huge stretch to call that a "fundamental user right".

(Granted, I also think calling gun ownership a "fundamental right" is completely and utterly ridiculous, but different people have different opinions on what is truly fundamental.)

> > In contrast, 0.68% of devices that installed apps from outside of Google Play were affected by one or more PHAs in 2018. While this number is 8 times higher than devices that exclusively used Google Play, it’s a noticeable improvement from 0.80% in 2017.

So Google's own statistics say devices that use side-loading have an 8x higher risk of malware. That is significant.

> We also have two decades of the web showing us that sandboxing untrusted code is a viable model for application distribution.

I don't think it's fair to compare the two as browser sandboxing is significantly more restrictive than app sandboxing. Sure, if we restricted apps to the same degree that we restrict the browser, that would definitely improve security, at the cost of functionality.

> Additionally, we're seeing data that suggests platforms like Android and Windows are becoming more secure despite the fact that they allow sideloading.

Yes, because they've intentionally made side-loading more difficult with every release, which means fewer people are doing it, which reduces the attack vector.

> But we're not assuming that people who go through warnings to sideload apps are doing so with the understanding that there are security risks. Why is that?

Because we literally saw what happened when Epic attempted to release their app outside the Google Play Store. Non-technical users went ahead and checked the box to allow side-loading because they wanted to play Fortnite. Then they ended up downloading fake Fortnite APKs because they didn't know where to get the right one.

You're acting as if these risks are hypothetical when we've already seen this same story play out over and over again.

> And I just don't buy your arguments around user education.

I'm not sure you actually understood this argument. Consider an app that requests access to your contacts for a legitimate purpose (like messaging your friends), but then secretly stores and transmits that data for a malicious purpose (like selling your contacts to third parties). No amount of sandboxing, education, or permissions management will prevent this kind of privacy abuse.

> I mean, even MacOS allows users to disable Gatekeeper and (in most cases) bypass the store for app distribution. Do we think that's a giant security risk?

Yes, of course it is. Mac OS has a worse malware history than iOS.


> a rolling measure, not the measure of total devices affected.

The Google numbers I list are not monthly rolling measures, they're for the entirety of 2018. They're also specific to users who sideload. So it's not that 0.68 percent of Android users downloaded malware in 2018, it's that of the subset of devices that actively sideloaded apps, ~2/300 ended up encountering malware at some point during the year.

And this ends up mattering because it means that you can almost entirely eliminate that risk by just deciding for yourself whether or not you want to sideload.

> devices that use side-loading have an 8x higher risk of malware. That is significant.

An 8x increase that still results in less than a 1% risk over an entire year. The context matters, we are talking about extremely small numbers. The current numbers mean that if you own an Android device for 6 years and you regularly sideload applications every single year, you have a 4% chance of getting infected during that time. And this is assuming that nothing else changes to make sideloading more secure, that none of the education measures work, and that you don't sideload one or two important apps and then just turn the feature off.
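To make the arithmetic explicit (a rough sketch, assuming the 0.68% annual rate stays constant and each year is independent, both simplifications):

```python
# Back-of-the-envelope cumulative risk over 6 years of sideloading,
# assuming an independent 0.68% infection chance each year.
annual_rate = 0.0068
years = 6
cumulative = 1 - (1 - annual_rate) ** years
print(f"{cumulative:.1%}")  # roughly 4%
```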

When you only focus on the percentage change, you miss the bigger picture of what the malware risks actually are for phones. 4% is a number we would like to be lower. We always want the number to be lower. But not at the cost of an entire market. That 4% needs to be stacked against the costs of market capture and anti-competitive behavior.

Quick sidenote, I don't think it's that hard to explain the numbers you're seeing online. There are almost 2.5 billion Android devices in use globally. 200 million of 2.5 billion is a little less than 1 percent. I could easily see factors like repeat infections driving that number lower (Google is only counting infected devices, not the number of infections per device). Those numbers are surprising to me in that they might indicate that a lot more people are sideloading than I expected. But even that is balanced out by the fact that the majority of these cases aren't exclusive to sideloaded apps; they also made their way onto official app stores.

I'm definitely interested in hearing more about them, but I'm not looking at these numbers and thinking, "Google's official security reports are lying."

> I don't think it's fair to compare the two as browser sandboxing is significantly more restrictive than app sandboxing. Sure, if we restricted apps to the same degree that we restrict the browser, that would definitely improve security, at the cost of functionality.

If we want to go down this route, iOS is also fundamentally more restrictive than Android. Android has a permission that allows apps to just directly read sections of the SD card. I think that's a stupid permission for Android to have, and I would hazard that the malware numbers you're looking at would be lower if Android didn't have all of this crap. I shouldn't need to give a photo application access to my SD card just to take a picture.

On the subject of the web: yes, the web is more restrictive than native in many ways. But it's rapidly getting less restrictive, and we're now even considering permissions like native file access. That expansion in functionality is happening because we're seeing that sandboxing works. A lot of the legitimate permissions that we're trying to prevent abuse of within native apps (contacts, advertising IDs, location, data-sharing between apps, camera/microphone access) are areas that the web has grappled with and handled, for the most part, adequately.

It's not a perfect comparison -- if the web could do everything native apps could do, nobody would be writing native apps. But the growth of the web as a platform still suggests that sandboxing is something we should be taking very seriously.

> Yes, because they've intentionally made side-loading more difficult with every release, which means fewer people are doing it, which reduces the attack vector.

Reread that. Google saw a 15% reduction in malware among phones that sideload apps. Not overall across the entire ecosystem, among the people doing the behavior you think is too risky for them to do. We can improve the malware stats among people who sideload.

> Because we literally saw what happened when Epic attempted to release their app outside the Google Play Store.

What's our position on cherry-picking again?

More importantly, what's our basis for saying that when people clicked the checkbox and said, "I understand the risks, I still want to take those risks so I can get Fortnite", that was an accident or that they didn't understand what they were risking?

It is possible for someone to do something risky and get malware even though they generally understood the risks. And to get back to what I'm talking about with consent, I am uncomfortable with the idea that we need to go to people and tell them what risks they are and aren't allowed to take. If we believe that everyone who buys an iPhone is doing so because they are consciously balancing their security/freedom, why do we throw that philosophy out the window when someone makes a conscious decision to sideload an app? Not every user is going to have the same risk tolerance, and it's fine for users to have different degrees of risk tolerance.

> Consider an app that might request access to your contacts for a legitimate purpose (like messaging your friends), that then secretly decides to store and transmits that data for a malicious purpose (like selling your contacts to third parties). No amount of sandboxing, education, or permissions management will prevent this kind of privacy abuse.

No amount of anything will stop that privacy abuse other than extensive corporate auditing, which nobody (including Apple) is prepared to do. Apple can't prevent an app from secretly selling your data; it can only ban the app after the fact. And once it becomes public knowledge which apps are selling your data, then education and permissions management start to matter again.

The only preemptive thing we can do is to make it obvious when apps are transmitting data and to what location. We can also train users to stick to commercially vetted apps and to do a little bit of research to figure out whether a company seems sleazy, or if they've popped up out of nowhere. But that's the most we can do. Apple's moderation team doesn't have any kind of magical ability to tell what I'm doing with user data once I've gotten it onto my servers.

> such as using the standard developer SDKs or jailbreaking

I wonder, back when Apple was arguing that distributing jailbreaks for iOS should be illegal, did they have any idea that it would someday be a core argument as to why they weren't actually suppressing user rights?

If you don't think that the user right to decide what code runs on their own devices is a fundamental right, then that might just be a disagreement we have. I think it is a fundamental right, and I don't think that the developer SDKs or the constantly shifting jailbreaking communities satisfy that right. But if you disagree with me on that, then we disagree, that's fine. There's no short argument I can come up with as to why you should believe it's a fundamental right.

> Yes, of course it is. Mac OS has a worse malware history than iOS.

To that point, usually people don't try to argue that sideloading should be removed from desktop computers. It's an interesting and kind of troubling shift to see this argument popping up now. You're not the first person to suggest it, but I'm still always surprised when I see it. What would the computing industry look like today if early Windows/Macs had only been able to run authorized software?


> The Google numbers I list are not monthly rolling measures, they're for the entirety of 2018.

Fair enough, I was referring to the "average monthly infection rate" from the text you quoted.

However, I am having trouble reconciling Google's numbers with the numbers from other reports. For example, Kaspersky's mobile malware evolution report (https://securelist.com/mobile-malware-evolution-2019/96280/) says 13.89% of users in the United States were attacked by mobile malware in 2019. The number is as high as 60% for Iran.

> 200 million of 2.5 billion is a little less than 1 percent.

That's 8%. I don't understand how Google can say in the same report that 199 million devices were infected by a single piece of malware, but only a maximum of 0.68% of devices were affected. Something doesn't add up.
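For what it's worth, the raw arithmetic (using the rough figures of 199 million infected devices and ~2.5 billion active Android devices):

```python
# Share of the Android install base that a single 199M-device
# infection (Chamois) would represent, against ~2.5B active devices.
infected = 199_000_000
devices = 2_500_000_000
print(f"{infected / devices:.1%}")  # 8.0%
```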

(I'll address your other points when I have more free time.)


> 13.89% of users in the United States were attacked by mobile malware in 2019. The number is as high as 60% for Iran.

In fairness, if the actual numbers in some smartphone markets are genuinely as high as 60% of Android users/devices infected, then... yeah. In that case, I'm underestimating the impact and it's worth thinking more about whether or not the security impact is too high for us to naively allow sideloading -- at least without building much better UX or much better safety measures around it.

That's a number that's high enough where it does make sense to take a step back and think about the security costs and move very cautiously. I mean, heck, to go all the way back to the original argument, if 1 in 10 people were being killed by murderers in a year, I'd be somewhat inclined to take law enforcement arguments about banning encryption more seriously.

At the same time, that number is very surprising to me and I'm kind of suspicious of it. Even the US numbers, I would be pretty surprised to find out that 1 in 10 Android devices is infected, because I'm not sure I would guess that as many as 1 in 10 Android users actually sideload apps.

I almost wonder if different reports have different definitions of malware or something.

> That's 8%.

Good catch, I am bad at counting zeros. I think I must have done 20 million instead of 200. 8% is also a number where I start to think something is weird.

I assume that Google isn't lying, but there's a factor there I don't understand. Unless the average infected phone is getting infected 8-16 times in a row, I'm having trouble thinking about how those numbers reconcile.

Ideological differences aside, these are interesting numbers.


I've been trying to figure out these Google numbers and they just don't make sense to me. In August 2019 a cluster of apps in the Google Play Store with over 100 million total installs were discovered to contain a trojan (https://news.drweb.com/show/?i=13382&lng=en). I would expect the detection and removal of such a large cluster of malware to be reflected in Google's PHA dashboard (https://transparencyreport.google.com/android-security/overv...), but there's barely any change in August, which leaves me wondering what exactly they are measuring.

Other points I wanted to address:

1. I don't think it's cherry picking to point out that fake Fortnite APKs are the inevitable consequence of Epic choosing to distribute Fortnite outside the Play Store. I expect this will be a problem with every popular app that decides to go fully off-store.

2. I also don't think it's likely that the people falling for these fake APKs are making a knowing decision to accept the risk of side-loading. I think it's more likely they just don't have the expertise to understand what is the correct place to download it, and they're getting lured in by the promise of free V-bucks or whatever. I mean, yes, ultimately they made that choice to check that box, but it seems a bit like handing a toddler a loaded weapon and then being surprised at what happens next.

3. I agree that we can't stop all privacy abuse, but I think the review process provides a useful deterrent that otherwise wouldn't exist if every developer was doing their own distribution with no review guidelines to adhere to at all. If you compare the incidence of malicious apps distributed via the Play Store with those distributed via the App Store, I also think there's a clear indication of the benefit of the review-first model over the publish-first model.


Mystery partially solved: Google's security report is based on the data from Google Play Protect, which apparently has the worst performance among malware detection tools in the industry (https://www.tomsguide.com/reviews/google-play-protect). A recent evaluation by an independent institute found that Google Play Protect only managed to detect a third of the 6,700 malware samples in the test, compared to ~99% from security companies like AVG, Trend Micro, and Kaspersky (https://www.av-test.org/en/news/here-s-how-well-17-android-s...).

Based on this, I don't think the numbers coming from Google can be considered reliable. It seems the reason their numbers are so low is because they simply aren't detecting a large chunk of the malware that is being distributed on Android.
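As a rough illustration (assuming, hypothetically, that undetected malware is spread proportionally across devices, which real-world data may not bear out), a ~1/3 detection rate would mean the true infection rate is about triple the reported one:

```python
# If Play Protect catches only ~1/3 of malware, a reported 0.68%
# infection rate could understate the true rate by roughly 3x.
# (Hypothetical scaling, not a figure from either report.)
reported_rate = 0.0068
detection_rate = 1 / 3
estimated_true_rate = reported_rate / detection_rate
print(f"{estimated_true_rate:.2%}")  # about 2%
```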

