I know precisely who you are, and I think you're jumping to conclusions in this case. People love to read morals, ethics, and politics into analyzing bug bounty submissions, but none of that is the primary factor here.
In general, it's not possible to "stumble upon" a $200k bug like the kind Apple is looking for, you have to set out with the intention to find one and work for it. But you're not on a contract and you're not getting paid the whole time. You only get paid on delivery and there's a high chance you won't have anything to deliver. It's just too much risk for an individual or company to commit months of time at potentially $0/hour.
You alluded to it, and it bears mentioning that there is a balancing act with future revenue streams too. If you set up the process to find one $200k bug then presumably you want to use all that know-how to find a second one. But reporting the first one to Apple will, generally, kill your entire revenue stream as Apple knows all your secrets now.
In general, working on $200k bug bounties is a bad business proposition. Period.
We don't disagree on your conclusion: where we disagree is on the goal of these bug bounty programs and (thereby) how we can judge if they work. The value of a bug to the black market can be insanely high: it just isn't possible for Apple to outbid the negative ramifications a bug can have on the outside world. You are always going to get more money by going to the black market than by selling it to Apple.
So, I am going to say it again: I believe the article is not actually focusing on the same problem as Apple's bug bounty program. The issue you were quoted for in the article is "how do you get someone already doing this to give the bug to Apple instead of selling it at market rates". Well, if someone is willing to sell it at market rates, then Apple is kind of screwed as that person likely does not have much in the way of morals and Apple is never going to outbid the market rate.
Now, you are focused on a different issue: "how do you incentivize people who otherwise wouldn't work on this stuff at all to work on this stuff". This still isn't something Apple is focused on. If you want to do this but just need a steady salary, I bet Apple would love to hire you.
What a bug bounty program does is handle the situation that does sometimes come up where you stumble across an issue while working on something else, or you got really curious about a thing and find yourself two months later having "wasted your life" scavenging for a bug, or you have some other motivation (consulting or academic work) to be panning for gold, and now you don't know what to do next: rather than doing nothing, you give it to Apple for some reasonable payout.
If you were morally OK with selling your bug on the black market, Apple can't be going around trying to outbid evil, so that can't be a goal of this program and you can't judge it for not solving it.
And if you wouldn't have spent your time on this at all, and would have instead gone off to work on some other project, Apple isn't trying to replace "jobs": they have a massive security staff of some extremely talented people (many of whom at this point used to play for the other side).
> You are always going to get more money by going to the black market than by selling it to Apple. [...] then Apple is kind of screwed as that person likely does not have much in the way of _morals_ [...] If you were morally OK with selling your bug on the black market, Apple can't be going around trying to outbid _evil_,
(_emphasis_ was mine)
I agree with you that dguido has framed Apple's Bug Bounty program incorrectly. The money amounts are deliberately not structured as an incentive to build a remote "red team"[1] to do speculative work. (I would go further and say it would be folly to attempt to create such a salaried red team to uncover security bugs, since history has shown that many of the security exploits that come to light involved leaps of creativity that exceeded the imaginations of the systems' designers.)
However, I'd add some nuance to your moral scenarios by stating that a sophisticated buyer of Apple bugs would not present it to the hackers as "selling to evil entities". Instead they're actually selling to the "good guys" -- it's just a different set of good guys from Apple. (E.g. Zerodium may in some instances act as a front to buy the bug on behalf of NSA/FBI/CIA).
The article talks about hackers that are willing to go to dinner with Apple and meet-&-greet Craig Federighi. Since these "security researchers" are willing to show their faces to Apple, we'd probably classify them as "white hat" or "grey hat" hackers instead of evil "black hat" hackers. Therefore, one way to persuade these researchers to sell the secret to the black market is to convince them they are still serving "the greater good"... it's just the black market group is paying $500k instead of Apple's $200k.
Summary of my interpretation of Apple's Bug Bounty:
1) Apple did not set it up to incentivize speculative work. This looks like an economic design flaw but it actually isn't.
2) Apple is not trying to persuade immoral black hat hackers to sell to Apple. Apple already knows the amounts are too small to compete there. The audience they are trying to appeal to is the white/grey hats.
Apple could easily set aside $50+ million per year for bug bounties. They don't do this for many reasons, such as reputation hits, but cost is not the most important one.
>People love to read morals, ethics, and politics into analyzing bug bounty submissions, but none of that is the primary factor here.
Just disregarding these things and justifying it with "economics" or "it's bad business" doesn't make ethics disappear. I'm guessing you wouldn't sell exploits to ISIS even if they were the highest bidder..? What about North Korea? China? Russia? Syria? The US? The US with Trump in charge?
I personally like tptacek's comment on this from last year:
>None of them are adequate compensation for the full-time work of someone who can find those kinds of bugs. Nor are they meant to be. If you can, for instance, find a bug that allows you to violate the integrity of the SEP, you have a market value as a consultant significantly higher than that $100k bug bounty --- which will become apparent pretty quickly after Apple publicly thanks you for submitting the bug, as they've promised to do.
> which will become apparent pretty quickly after Apple publicly thanks you for submitting the bug, as they've promised to do
This is the same twisted logic that people try to use to hire photographers or graphic designers to work for them for free, because they will get "exposure". If someone does something valuable for you, you should pay, no matter how famous your brand is.
And Apple has enough cash to spare.
It's not though, since the alternative to disclosing the bug to Apple is to either hoard it for yourself or sell it to someone, both of which keep the attack vector open and millions of users at risk. That's where the ethical discussion comes in, and there's not really a parallel to the case with the photographer/graphic artist.
(And just to be clear, I do think that fair compensation is a part of that ethics discussion, but it doesn't trump other concerns.)
I was not addressing the ethical question involved - just the logic behind "Apple thanks you, therefore you should be happy".
If Apple is not providing reasonable monetary compensation to white hat security researchers, then they are willingly leaving this space open for black hats.
I read that more as "Apple thanks you, at which point you realize you were smart enough to have made more money doing something else", not "therefore you should be happy".
You are trying to read the comment as positive of Apple's thank you and exposure, when it sounded to me much more like a reality check: I would go so far as to say Thomas's point might have been "when Apple thanks you you will come to regret wasting your time--which you now know was always valuable--on them". That isn't "you should thank Apple for making you realize that"...
> hoard it for yourself or sell it to someone, both of which keep the attack vector open and millions of users at risk.
I think this logic is inherently flawed.
If there was no monetary incentive and the only ROI was a thanks from Apple, maybe the bug in question would not have been found in the first place. Becoming aware of a bug does not suddenly put people at any more risk than they were previously in, prior to bug discovery.
>Becoming aware of a bug does not suddenly put people at any more risk than they were previously in, prior to bug discovery.
I agree, which is why I said "keep[s] [...] millions of users at risk", not "puts millions of users at risk". An unfound bug is still a potential zero-day. With something as valuable as an iPhone exploit, we know multiple entities are desperately looking for it, so I wouldn't err on the side of assuming that any exploit would not be found. (Or put more succinctly: if you've found it, someone else might've too.)
There's a variety of models that others have experimented with, and they all tend to fail at bugs this deep. For instance, Google tried a thing where you open a bug ticket once you think you've found an issue and you stream out everything you try along the way. I think there were ongoing or partial rewards and it was Android-centric. I'm pretty sure approx. 0 people took them up on it.
Microsoft also had/has mitigation bounties for around the same dollar amounts and it turned out nearly the same: lower than expected interest given the price point. Most of the interested parties tend to be academics, for fairly obvious reasons when you think about the economics of it all.
I think that if Apple wants to find bugs this deep in specialized, hard-to-audit surfaces within iOS, they ought to hire experts at expert rates and provide them the tools they need to look. In my perfect world, I would hire an expert consulting firm at their market base rates and then offer bonuses for findings on top of it. I would make the engagement low intensity and long in length, to build competency and familiarity with the codebase over time.
> they ought to hire experts at expert rates and provide them the tools they need to look.
I'm really curious where this idea comes from that the existence of a bug bounty program means they must have no in-house bug hunting team. A competent, deep pocketed corporation (which I think we all agree Apple is) would certainly do both.
Want to hunt Apple bugs but not work for them? Do that. Want to hunt bugs with a steady salary? Do that.
> I think that if Apple wants to find bugs this deep in specialized, hard-to-audit surfaces within iOS, they ought to hire experts at expert rates and provide them the tools they need to look.
...Apple does this. The people I complain about and call "mercenaries" are the people from the grey hat hacking community who decide to go work for Apple. If I wanted to go work for Apple on this I could probably get myself hired there in a matter of days, if it weren't for the issue that I am morally opposed to it (and they know that at this point, so they would rightfully be highly skeptical of me finally "coming around" ;P). The bug bounty program is not focusing on this issue.
When you work to help Apple lock down their operating system, you are helping an oligopoly (Apple along with Google and Microsoft) to control the future of all software development. The mitigations that are put into the operating system serve two purposes: sure, they lock out invaders... but they also lock out the user and rightful owner of the hardware.
The moral costs of the latter do not pay for the benefits of the former. I look at people who work for Apple on security as similar to people who work for the TSA: yes, they are probably in some way contributing to the safety of people... but they are actively eroding the liberties of people in such a frightful way that the benefits are not worth the cost.
So when I see people working for either the TSA or for Apple, I ask myself "did they really really need this job? or could they have gotten work elsewhere"; and if the answer is that they didn't absolutely need to work for Apple, I model them as someone who has left the Resistance to go work for the Empire because they thought laser guns were cool and the Empire happened to have a larger laser gun for them to work on, and hell: the Empire is managing to maintain order on a large number of planets, right? :/
In case you think that this is just some future concern, it is a war which is already playing out today in countries like China, where the government knows that Apple has centralized control of what can be distributed on that platform and uses that knowledge to lean on Apple with threats of firewalls and import bans to get software and books they dislike redacted.
> Apple, complying with what it said was a request from Chinese authorities, removed news apps created by The New York Times from its app store in China late last month.
> The move limits access to one of the few remaining channels for readers in mainland China to read The Times without resorting to special software. The government began blocking The Times’s websites in 2012, after a series of articles on the wealth amassed by the family of Wen Jiabao, who was then prime minister, but it had struggled in recent months to prevent readers from using the Chinese-language app.
Having a centralized point of control at Apple is not helping the lives of these Chinese citizens, at least according to the morals that I have (and I would have guessed you have, but maybe you are more apologetic to these regimes than I; we have never really spent that much time talking as opposed to just kind of "sitting next to each other" ;P).
So, yes: I absolutely am against security experts "helping Apple secure iOS" when what we know is actually happening is that they are "helping Apple enforce censorship by regimes such as the Chinese government on their citizens". There are tons of places in the world you can go work for where your work on security will actually be used for good: go work for one of those companies, not the oligopoly.
Thank you. This sounds like a viewpoint I strongly disagree with, as I believe potential misuse of vulnerabilities is a concern which outweighs device freedom in this context (example: https://citizenlab.ca/2017/07/mexico-disappearances-nso/), but nonetheless I really appreciate the detailed explanation.
FWIW, I don't disagree with you that the security issues here are real. My issue is that Apple has managed to tie together the security of the user with the maintenance of their centralized control structure on computation. I think that people should lean on Apple to provide a device which is both open and secure. Part of this is what amounts to a bunch of people refusing to work for their company, particularly on the parts of their products which are essentially weapons (so the same argument about refusing to work for governments). Hell: even if they didn't seem to go out of their way to be actively evil about some of this stuff, it would be better :(.
I mean, here's a really simple one: they already have "free developer" account profiles. However, you can't install an app that uses the VPN API using a free developer profile. So say I build a VPN service with a protocol designed to help people in China bypass the Great Firewall: VPN services are illegal in China, China has complete control over Apple's app distribution, and Apple not only polices the use of their enterprise certificates but has in the last year or so started playing whack-a-mole on services which use shared paid developer certificates; so users in China are not going to be able to install it on their iOS devices.
Why did Apple go out of their way to block access to the VPN API from free developer accounts? I can't come up with any reasons for this that make me feel warm and fuzzy :/. So yes: the US military does a lot of good protecting people on foreign soil, as does the FBI here at home, and I'll even grant that the TSA probably does something good ;P. You can show me a ton of reports of active terrorism in the world, and say "look, this stuff is important, peoples' lives are on the line"... but as long as working for those groups is tied to mass surveillance, installing puppet regimes, and maintaining resource imbalances, the moral issues remain :(.
(I'm also going to note that I find it a lot less weird if someone is consistent and always worked for Apple, whether directly as an employee or indirectly by handing them information and bugs than if they "switch sides" and go from simultaneous disclosure to "responsible" disclosure or even forced disclosure by being an employee. That's why this thread was spawned from me noting that I have at times used the term "mercenary". It makes some sense to me that there are people who work for the Empire because they believe in the goals of the Empire; it just irks me, though, that there are people who once worked for the Resistance who get a job offer from the Empire and are like "wow, that sounds great!" and go work for them without seemingly believing that anything has changed about what they are fighting for... it tells me that, at the end of the day, they really just thought "working with lasers is fun!" and the moral issues of which side they were on never mattered.)
I get where you are coming from. It is worth keeping in mind that not as many folks see it in a political manner, so if their viewpoint is a choice between "working hard to release free research/tools and getting people angry/complaining in response" versus "working hard and getting a decent salary to do what they love" then it makes some sense as to why people would go that direction.
I'm in agreement with you regarding Developer ID. I have no idea why Apple would want to limit that, I know they recently relaxed restrictions on NetworkExtension though (except for Wi-Fi helpers) - Are you referring to the old "e-mail here to apply for the entitlement" process they had in place? Or do they still not allow the entitlement for free developer IDs now?
> Are you referring to the old "e-mail here to apply for the entitlement" process they had in place? Or do they still not allow the entitlement for free developer IDs now?
I really do mean the latter: they still block this entitlement from use by free developer IDs. If you try to activate it you get error #9999 with the following message:
> The 'Network Extensions' feature is only available to users enrolled in Apple Developer Program. Please visit https://developer.apple.com/programs/ to enroll.
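For readers unfamiliar with how this gating surfaces in practice: the entitlement being blocked here is, to the best of my knowledge, `com.apple.developer.networking.networkextension`, which a paid-account app declares in its `.entitlements` plist roughly as below. This is a sketch of what a paid account would sign; a free developer ID attempting to use this profile is what triggers the error quoted above.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Network Extension capability: the value lists which extension
         points the app may use; "packet-tunnel-provider" is the one a
         custom-protocol VPN app would need. -->
    <key>com.apple.developer.networking.networkextension</key>
    <array>
        <string>packet-tunnel-provider</string>
    </array>
</dict>
</plist>
```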
It seems to me the actual solution is to hire experts as employees. That solves the "can't put food on the table" problem. It's probably a lot more efficient and effective use of their cash. As far as I know Apple has already done this over the years.
Doesn't a high bounty create a perverse incentive for employees to introduce or leave bugs they spot so that they/some accomplice can claim the bounty?
I know Silicon Valley employees' honesty is above all suspicion and rogue employees are a monopoly of the financial industry, but one has to consider the risk.
Do you believe this is possible? Introducing or leaving such bugs would need more than one person, probably from different departments, and even if it gets through code review, there is still a risk someone notices the connection between the one who introduced the bug and the one who found it.
The thing is it often only takes one line of code running with the right privilege to make the whole system insecure. And a dishonest employee wouldn't even need to write it. Having access to source code, all it would take is to spot it but not fix it.
Developers' unimpeachable morals aside, I think the risk of getting caught would outweigh the benefit. Most developers at Apple hardly need a secondary source of income.
Is everyone in this thread being sarcastic about developers' morals? No one here really thinks that developers as a whole are more moral than any random subset of people, right?
In this case, the reward doesn't outweigh the risk.
Insider trading can earn very large amounts of money, and the higher your current position, the larger scale of insider trading becomes available.
Now, for Apple iOS core developers - does it make sense to risk a $200k/year job and possible jailtime to get a $50k bounty (the limit for kernel exploits) that you'd have to share with whoever helps you to launder it?
No, it doesn't... but the context of this subthread was "Apple should pay a lot more for this bounty" which turned into "but if they do that then it will create a perverse incentive to do this evil thing". The people you are arguing against thereby agree with you: $50k is not enough of an incentive; but they feel that at some point as that number gets higher the incentive will make sense.
Yeah, okay, with a factor of 10 increase, that would start to make sense, and there could certainly be people willing to go for it.
Many high profile bugs seem to have plausible deniability where they can reasonably be errors but might have been deliberately inserted. Anybody can make mistakes similar to Heartbleed, especially if they want to.