I had the... pleasure... of speaking to Comcast's CISO after disclosing a security risk exposure. Before talking to her, there were mentions of bug bounties, etc. (neat). After talking to her, though, she said, in a hand-wavy way, that:
1. The exposure wasn't a "bug", so it's not worth a bug bounty.
2. The amount of effort it would take to start a bug bounty program would be cost-prohibitive. In other words, "Everything's broken. We know it. If we start paying people to find what's broken, we'd go bankrupt." Heh.
The problem is that's such an easy thing to say, whether it's true or false. For a device owned by millions, it's pretty grandiose of them to think that their internal team is all it takes. There's only so much an internal team can do, so having an outside "team" is significantly better, even if it's just for a different view from a different vantage point. So, good on Apple for doing this, but I question their past reasoning. In particular, their claim that "they ran out of things to find" is worth discussing. The way this article is worded, that stance sounds incredibly naive; a response like "We don't have the same breadth as the infosec research community, and we would like to work with them" would have been more appropriate.
I think it's more like Apple is patient and waits to get things right. Bug bounty programs are relatively new (past few years). The article notes that Apple faced a more complicated landscape than your typical company, one where state actors are bidders. So they needed to craft a more targeted program.
What's the going rate for an Android vuln? The FBI paid ~$1M for an iOS one. Android has a lot more malware, unpatched old installs, etc., and there are myriad ways to attack email and web accounts, so my guess is the marketplace for iOS is on a whole different level.
I suspect that for large companies most bug bounty programs are net economic positives, especially weighed against the cost of probable breaches or the comparable in-house engineering spend required to find the bugs that a bounty identifies cheaply and quickly. The problem is social/political: it's hard for senior executives to accept that open discussion of flaws is a good thing.
> 1. The exposure wasn't a "bug", so it's not worth a bug bounty.
> 2. The amount of effort it would take to start a bug bounty program would be far too cost-prohibitive. In other words, "Everything's broken. We know it. If we start paying people to find what's broken, we'd go bankrupt." Heh.

So yeah. Don't be surprised.