> Apple is working hard to commoditize the complements to their hardware [0]
I've heard this argument before, and if it's true, it's suicidal. Apple will have failed to learn a critical lesson from Microsoft's dominance of the early PC era: it's the apps that matter. Even when a significant market of would-be platform "switchers" emerged, people who wanted something beyond what Windows offered, those switchers stayed on Windows because the critical apps in their day-to-day work (be that Office, CAD software, or whatever) only ran on Windows.
If Apple's view is truly so narrow that they kill the ability of their "complements" to make good money, then Apple is pushing hard to kill the goose that lays the golden eggs. Even if Apple unquestionably had the very best manufacturing processes, hardware design, and platform software design, it would all be worthless if no one other than Apple could afford to write and support high-quality software for that platform.
Now, I'll agree that, judging from Apple's publicly observable positions on its App Stores, they (incredibly) don't seem to understand the necessity of maintaining a virtuous cycle between developers and their platforms.
A wild guess: this may stem from a deep misunderstanding of the nature of software. Sure, a random tapping game or timer app or whatever is essentially a replaceable one-off, a fungible commodity. But software that's a fungible commodity fundamentally doesn't create any platform stickiness. If it's really so easy to recreate, it can and will trivially show up on a new platform, especially in a world where many software organizations really are getting better at delivering on multiple platforms.
Beyond the fungible stuff, there's an important category of software that I increasingly view as a "living" thing rather than a static artifact. Such software requires ongoing maintenance and care, which gives it a lifespan across changes to its underlying platform(s) and lets it absorb and embody deep problem domains. I posit that these apps, no matter the genre (games, "creative", technical, etc.), are the ones that can create platform stickiness. This, in turn, implies that humans must be able to make a living supporting that software. Undermining this is like cutting off the "oxygen" to a vital part of a platform's ecosystem.
I might be wrong, but in Apple's case it seems that to its customers it's Apple that matters, not the apps. It's a brand many people want to buy, through the same mechanisms that drive people to buy luxury brands. The only time I remember Apple's branding not being enough was in the '90s, when graphic designers left Macs for Windows machines because Adobe software ran so much better there. It was an undeniable combination of hardware and software problems. All of them came back, and more, thanks first to the iPod and later to the iPhone.
So you might be right that too many inconveniences will send faithful customers away (I gave an example), but I don't believe commoditized apps are such a problem. People buy Apple because it's Apple, and only a barren App Store could drive them away now.
Once upon a time, Apple invested to own 19% of Adobe, which went on to become a powerful software anchor of Apple's hardware ecosystem. Perhaps a bit too powerful for Jobs' liking, hence the modern strategies to commoditize ISVs on iOS.
> the modern strategies to commoditize ISVs on iOS
What modern strategies?
I've seen this theory, that Apple wants to commoditize software, proposed multiple times, but I've never seen anyone actually demonstrate ways in which Apple is doing that, just speculation that it would be in Apple's interests.
It's more about what they don't do to help developers make money. A web search will turn up research papers on software ecosystems, which discuss best practices for mutually reinforcing, virtuous-circle feedback loops between platforms and developers.
It's not that developers don't make money, but that Apple doesn't allow them to operate good businesses. Being able to offer trials to users, to handle advertising in certain ways, to have more flexibility with payments, etc.: all of these things would be beneficial for businesses to control directly, but they can't, as they're inside the walled garden.
Many are making money right now, but I would suggest they don't have terribly good businesses, in that they don't own the relationship with the end user. One change of Apple's policies can put you out of business.
Apple could attract more developers (and thus more iPad users and more corporate revenue -- reversing declining iPad growth) if developer success was more closely aligned with Apple success.
I think there's lots of evidence for that theory. For example, giving away iLife, a package that contained replacements for many of the most common app types, with every Mac. Also, if you look at their Pro apps, like Logic and Final Cut, Apple has consistently undercut the competition aggressively, likely as a kind of loss leader to sell Macs. I remember that when Apple bought Emagic (the developers of Logic), they slashed the price from something like $600 to $200, a shockingly low price at the time. All the other software makers had to respond with "lite" versions to compete in that price bracket.