I'm pulling my video editor Shave (http://shavevideo.com) out of the Mac App Store for several reasons, but most specifically the lack of paid upgrades. For a $10 video editor that, quite frankly, beats the pants off a lot of the crap out there for day-to-day editing tasks, I can't afford to devote the time and resources it takes to maintain without a baseline level of income. Not having an upgrade path to a major version number bump is ridiculous.
Also, the lack of direct contact with customers is obnoxious at best, crippling at worst. But the sandboxing entitlements kill the best bits of functionality, so I don't have much choice but to say "fuck it" and move on.
I doubt Apple will miss me, though mine is the only other worthwhile editor on there besides iMovie and Final Cut in terms of actual editing functionality. The video category is a giant pile of shit save a few select pieces, mine included.
Have you tried purchasing the "latest" version of World of Warcraft?
You can't. There is no standalone "latest version". You have to purchase the original, and each version since, and finally the current version.
Correlate those game content / game mechanics updates with In-App Purchases, and it's clear how someone could offer "major upgrades" if they stop thinking in terms of version numbers and start thinking in terms of actual upgrade features. (See "Navigon" for an app that has done this.)
And if the entire app architecture changes, make a new app.
Developers could very well re-architect their application so that when you download the update to v2, you have to unlock the new features with an in-app purchase, but to be honest, it's a lot of extra work for the developer, and there is no way to make it less painful for the users, who will feel like they are getting nickel-and-dimed by the developer.
Also, for users who are brand new to v2, it's a very bad experience when they have to immediately buy an in-app purchase just to unlock functionality that should be included. If you drop the price of v2 to accommodate this, then your original v1 userbase will complain that they paid twice as much.
Here's an example:
v1.0 of App X sells for $9.99 on the app store. Users buy it and are happy.
v2.0 of App X comes out and to unlock the new features, you must pay $4.99 in-app purchase. Developer drops the price of the app to $4.99 so that new users don't have to pay $14.99 to get all the features.
New users now complain that they have to immediately make an in-app purchase just to unlock the software - they don't notice that it's half price now because they never bought it previously.
Existing users complain that they paid $9.99 and are now being nickel-and-dimed for updates, even though the price has now been cut in half.
You can't win. Unfortunately, Apple has created a model that makes a lot of profit for them and is relatively painless for the end user, but it's a model where the developers suffer and can't really support the large applications that receive many new features every year.
The idea of shipping new features as unlockable extras is actually an advantage for users: if they don't need them, they still get updates for the feature set they bought.
So all that's missing is a way to give customers of "date X or later" unlockable features as part of the baseline sale.
Except the developer takes a hit there, too. Given that not everyone needs every feature, users end up subsidizing each other's desired improvements as part of a major upgrade. On top of that, not all features have clear lines that can be drawn around them, technically or marketing-wise.
It's a great model for games and certain specific types of apps (Paper is very clever). But to push all apps onto this model is highly unrealistic.
I'm not sure the developer necessarily takes a hit. Say I have $15 to spend on a video editing app. $10 for the initial app and $5 for the only premium feature I want results in me giving $15 to the developer (less Apple's cut etc). However, if the developer adds 5 new features to the app and I have to choose between buying features that I mostly don't want and can't afford, or not spending the money, I'll most likely give $0 to the developer.
Similar situation with luxury features on cars: it makes business sense to sell them individually.
If you wanted/needed the feature badly enough (ie, it was a mission-critical tool), you might pay much more for an update that included your desired feature in addition to stuff you don't need (or even stuff you might not yet know you want). Also, the extra "wasted" revenue helps pay the developer for the thankless but necessary task of under-the-hood improvements, whose costs have to be absorbed elsewhere.
I'm not saying the model can never work, but it's not a good fit for every product. The developer should be able to create the relationship with their customers that they think fits best; sometimes that will be IAP, sometimes subscriptions, sometimes upgrades. There will never be "one business model to rule them all", or else we'd see all marketplaces naturally converge in that direction.
1. It's an easy way to charge for new development on platforms that don't have paid upgrades.
2. It can turn out to be much more profitable, as long as you don't just take the price you'd charge for the upgrade and divide by the number of features. Games are leading the way with this business model.
For example, IIRC League of Legends is insanely popular right now. (I want to say it's the biggest game at the moment, but I could be misremembering. Anyway.) It's free to play, but unlike most games, you don't get any permanent characters or upgrades when you start. To unlock a character, you have to buy them. Buying all the characters with money would cost almost a thousand dollars. And if you want to customize your characters so they look cool – well, that can cost several times more. Suffice it to say, Riot Games is doing pretty well for itself by unbundling as much as humanly possible.
Why lower the price of the initial purchase? Just keep it simple: basic app is $10 forever. The basic version of the app comes with some basic useful features. Premium features become available via in-app-purchase as the developer makes them. Users can choose to use the premium features or not. Bug-fix updates are just free as they should be IMO.
Using a theoretical video editor as an example: $10 gets you basic clip cropping, placing clips next to each other, background music, importing, exporting and titles. So a basically useful app. Later the developer updates the app with a few bug fixes and a premium "transitions" feature that's available via IAP. This update is of course free to download, so existing users get the bug fixes for free. Anyone who wants the premium feature just unlocks it and uses it indefinitely.
I think you misunderstood. He's suggesting that instead of releasing v2.0 on the App Store, you release an in-app purchase option in the original app that upgrades it to v2.0. A similar idea would be to release new features a la carte as in-app purchases instead of releasing new numbered versions. I'm not sure if the marketing hit would be worth it, but it is one way to a consistent revenue stream.
There are a lot of problems with that idea, but the biggest I can see is that the user experience is horrendous. You're basically describing a world of itemized crippleware, where buying an app that's more than a few years old involves a half-hour-long process of adding on feature packs.
The problem is getting customers of version 1 onto version 2. There is no way to do this with MAS. I have a mechanism built into Shave so I can push "news", but it's annoying and bad user experience.
Why not a "fat" app? Always include two major versions in one binary, instead of putting the new features behind a paywall. Have an IAP purchase that just configures the app to run "Side B" from then on.
I think the following set-up avoids that problem: app is always the same price to download and has a few basic features unlocked. App has an upgrade view or screen where users can unlock additional features via in-app-purchase.
It works like this: developer releases the app for, say, $10. People buy it and use the basic features for a while. So far there's no upgrades available. Developer codes up a premium feature and updates the app in the store. Existing users can use the app as-is with the basic features that came with the $10 price tag. Existing users can also make an in-app-purchase for, say, $5 that unlocks the premium feature. When new users download the app for $10 they just get the basic features. Since another "version" of the app is available the new users can just buy the premium features if they so choose.
Fast forward to version 5 of the app. Now there are 4 upgrades available via in-app-purchase. However, users are only prompted to purchase the next version based on their in-app-purchase history.
I don't think there's a problem for new users who purchase the updated app.
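As a rough Swift sketch of that "only prompt for the next version you don't own" logic (the product identifiers are invented, and in practice the owned set would come from restored StoreKit transactions or the app receipt rather than being hard-coded):

    import Foundation

    // Upgrade tiers offered as in-app purchases, oldest first. Hypothetical IDs.
    let upgradeTiers = [
        "com.example.editor.upgrade.v2",
        "com.example.editor.upgrade.v3",
        "com.example.editor.upgrade.v4",
        "com.example.editor.upgrade.v5",
    ]

    // Offer the lowest tier the user hasn't bought yet; nil means they're current.
    func nextUpgradeToOffer(ownedProductIDs: Set<String>) -> String? {
        return upgradeTiers.first { !ownedProductIDs.contains($0) }
    }

    // Example: a user whose purchase history contains v2 and v3 only sees a v4 prompt.
    let owned: Set<String> = ["com.example.editor.upgrade.v2", "com.example.editor.upgrade.v3"]
    print(nextUpgradeToOffer(ownedProductIDs: owned) ?? "up to date")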
This does away with the everyone-is-on-the-newest-version paradigm of automatic updates that makes support easy - in fact, it makes it much worse. Instead of five old versions of the app, you might have 1 base + 12 permutations = 13 different combinations of functionality to support.
Good point. This approach probably wouldn't work for some apps. However, if the app lent itself to a very modular design it could work. It would certainly take some creativity on the developer's part. For example, premium features that are troublesome if only some people have them could be included in a free bug-fix update to make things easier.
I suspect it would be much nicer for both the dev and the users if there was just a single "Upgrade to the latest version for $5" option. Don't try to splinter the upgrade path into features or sub-steps. So, as a consumer story: you buy v3 for $10. Over the next year, v4, v5 and v6 come out and you pass on them. V7 finally convinces you to upgrade and you shell out $5 to go from 3->7 with a single click. Meanwhile, if you were a power user, you might have been impatient and shelled out $5 each for v4, v5 and v6 along the way. But I think that's a pretty reasonable way to perform price discrimination.
If a developer is going to offer a version upgrade, it's the fact of a previous version purchase that signifies eligibility. The app store clearly has this purchase information. So why not give the developer the option to set previous versions to act as "coupons" reducing the price of the new version by a developer specified amount? When version 5 is released at $20, let old versions be set so they can no longer be purchased anew. Let the developer set owners of version 4 as seeing the price of version 5 as $10. If you own version 3, then perhaps 5 is $15. Let previous versions themselves be upgrade coupons.
In-app purchases are left as orthogonal to these upgrades. The developer could perhaps be allowed to continue to push out bug fixes for older versions, or even old version DLC.
How would this work? I mean, even assuming Apple let you get away with it:
If you store the version on the device, the user would simply have to delete the app (which wipes all data the app has stored on the device) and re-download from the app store (which is free) to get the latest version. It's not going to take long for users to notice that and then you'll never sell an upgrade again.
So you'd need to move away from a straight app and also create a server component. But to store it on the server, you need something that uniquely identifies the user.
You can't get access to the UDID any more (and besides, people would catch on when upgrading their phone got them new versions of everything for free).
You can't generate some kind of GUID, because that has the same problem as just tracking it on the device. Upgrades are just a reinstall away.
And you can't force them to sign up for your site on first launch: App Store reviewers won't approve apps that make users register solely to store information about them. So you'd need to push content through the site, at which point you've moved well beyond the purview of most apps, and you ought to just give up the game and sell access to the content itself.
Anyone who has purchased any IAP content at all has an IAP receipt that you have access to even across installs.
This leaves the problem that v>1 first-time buyers will have to buy an IAP immediately after purchasing the app, which kind of pushes you toward having a free trial mode.
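A sketch of what checking that receipt can look like with plain StoreKit (Swift; the product identifier below is invented): restoring completed transactions after a fresh install lets the app tell an existing customer from a first-time buyer.

    import StoreKit

    // Sketch: after a fresh install, restore prior IAP transactions so the app
    // can re-unlock whatever the user already paid for. Standard StoreKit only.
    final class PurchaseRestorer: NSObject, SKPaymentTransactionObserver {
        private(set) var restoredProductIDs = Set<String>()

        func start() {
            SKPaymentQueue.default().add(self)
            SKPaymentQueue.default().restoreCompletedTransactions()
        }

        func paymentQueue(_ queue: SKPaymentQueue,
                          updatedTransactions transactions: [SKPaymentTransaction]) {
            for transaction in transactions where transaction.transactionState == .restored {
                restoredProductIDs.insert(transaction.payment.productIdentifier)
                queue.finishTransaction(transaction)
            }
        }

        func paymentQueueRestoreCompletedTransactionsFinished(_ queue: SKPaymentQueue) {
            // Any prior upgrade purchase found here gets unlocked again for free;
            // an empty set means this really is a first-time buyer.
            let ownsV2 = restoredProductIDs.contains("com.example.editor.upgrade.v2")
            print(ownsV2 ? "existing customer: unlock v2" : "new customer")
        }
    }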
Hopefully Apple will eventually introduce time-limited free trials and upgrade options. If their history with iOS and Mac versioning is any hint, the current simple system is not a religious thing for them, and they will listen if they see enough negative feedback. Which we should provide.
Is that necessary? For some reason I can't think of any problems with the freemium-like approach. Basic app has a fixed price and basic features. It's what everyone gets initially. "Versions" are really just premium features available via in-app-purchase. Bug-fix updates just affect the basic version of the app. And they're available for free to anyone who has the app just like the iOS store.
World of Warcraft is a bad example of this. The client you have downloaded is always patched to the latest version, even if you haven't bought the expansions.
Each expansion unlocks content. So once you buy The Burning Crusade, you get access to the Outlands zone. That only matters if you have a character that's level 58 or higher.
Blizzard redid the entirety of the Vanilla WoW experience in the Cataclysm expansion. Even if you did not buy that expansion, the level 1-60 content is still the new content.
Having version 2 as an in-app purchase means your version 2 app has to include version 1 in it. Good luck implementing that. What if my version 2 is a major rewrite and I change the software architecture or want to use new features? Can you imagine writing an application with 2 different interfaces, database formats and feature sets, just to make it possible to upgrade with in-app purchase? No, thank you.
Can't people decide not to purchase the app then? They have a content review system - 10 recent 1-star "Bug fixes cost $$$" reviews and the app is dead, no? Or am I overlooking something?
Did you just compare a game with a $15 subscription and $40 expansion packs for major updates and level bumps to a $10 app? do you realize how asinine that is and how dumb you sound?
If I want to get to current content in WoW I would have had to buy Vanilla ($50 at release) + Burning Crusade ($40 at release) + Wrath of the Lich King ($40 at release) + Cataclysm ($40 at release), PLUS every month I'm paying $15 in subscription fees.
If I just buy Vanilla I'm limited to the 1-60 content, not allowed full access to any of the later continents and cities, skills, talents, can't PvP properly, not allowed in arenas, can't do any endgame, and am generally wasting my time. How is that at all conducive to what you are saying? For any real application that does work, or major game that requires constant updates, you can't expect support and updates for $10; that's completely unfeasible.
You sound like someone who has never played World of Warcraft in his life, and it baffles me how you came up with such a nonsensical rebuttal.
>do you realize how asinine that is and how dumb you sound?
That was unnecessary and did not add to the conversation.
And I'm not sure that I understand your point. The parent comment is saying that apps can follow the WoW model to create a consistent revenue stream without paid version number upgrades. As you mention, "for any real application that does work, or major game that requires constant updates, you can't expect support and updates for $10; that's completely unfeasible."
You can't follow the WoW model; the WoW model requires constant fees every month. That's my whole point: when you play WoW you pay Blizzard constantly, and any time there is a major update you pay another large chunk. You can't do that with a video editor, and it's not like maps where you can segregate off chunks of content. Like I said, completely asinine and unequatable in this circumstance, and the fact that so many people agreed with such an obviously flawed argument speaks volumes.
The App Store sells millions of cookies. You eat the cookie. You don't upgrade the cookie.
You're absolutely right about the lack of upgrade path. Wonder how Apple will deal with this when their Keynote/Pages/Numbers sales taper off? At some point they'll want to refresh those designs and finance a fresh major version, and they'll want to charge for it.
I bet you'll see the paid upgrade features (or maintenance features) then.
One thing to realize is that the entitlement system is critically necessary and inevitable. It's really a case of taking the pain now, or taking it later. Might as well rip off that particular band-aid now. It won't get easier.
Well, Mountain Lion was released as a new 'app' rather than an update to the Lion app, so it seems Apple wants people to release their major updates as new releases.
I wouldn't be surprised if they continue to update the iLife products such as iPhoto, as they always came free with the OS and I don't think many people bought the iLife pack separately, but I'm really not sure how they'll handle Keynote/Pages/Numbers.
Apple can afford to stick to their guns on this, as most of their money is made on hardware. I highly doubt that GarageBand for iPhone makes a profit, for instance. They'll just release the new version at the same price. Most devs, even some big ones, don't have that luxury.
It supports anything that QuickTime can play, some formats better than others. But I use it to edit DivX. Note that it will always save a QuickTime movie, even if you are editing an AVI or MPEG, but it won't re-encode; it just changes containers. Email me jon@interfacelab.com and I'll hook you up with the latest version that supports AVCHD.
This bummed me out, not only for Shave, but as a user. But, you know, Apple's direction with media is a little f'd in my estimation. AVFoundation is dope, but it's a huge step back from QuickTime. Shave 2 is written using it, and so far (and it has a long ways to go), it's not half what the QuickTime version is.
Shave does less than iMovie for $10 more. Maybe that's why it's not providing decent income? It's hard to compete with free, even more so with a good free app.
iMovie isn't free. It's $14.99. It's also slow as shit where Shave is fast as you know what - for a very specific type of editing.
Trust me, I've done the comparison of the same editing tasks in both, and I can do in a minute in Shave what takes 20-30 minutes to do in iMovie. It's the reason I wrote it in the first place. And good luck opening a DivX or another weird codec in iMovie. Or editing an MPEG-2/4 movie without having to re-encode. Or editing at all without re-encoding.
So, your customers are users with very specific editing needs, willing to pay for something they already have (disregarding performance), and/or that need to edit DivX or weird codecs. That sounds like a pretty tiny market compared to users who want a step up from iMovie; building a quality app for your own needs doesn't guarantee a market for it. That's just my point of view, you have the numbers.
One of the most annoying things about the app store is the review times. With our app (http://armadilloapp.com) we had to wait over 5 weeks to get into the store: 2 weeks until the first review, then a rejection where I had to consult Apple's DTS, which took another week to get resolved, plus another 2 weeks waiting for a new review.
Now we have version 1.0 in the app store and version 1.1 has been waiting for review for almost 3 weeks. In v1.1 we have added some new features people were requesting and a few minor bug fixes.
Now all we can do is tell our customers: "Version 1.1 will be out soon - as soon as Apple decides it's time for a review".
It's really a shame that Apple takes 30% of our sales for such a bad service.
I'm in the same boat; I have a small iPhone app that's been waiting to get into the store for 2 months now. Rejections are such random nitpicks and take forever to get resolved.
They rejected the app repeatedly for the following reasons:
- didn't like one of the supplied screenshots.
- in-app purchase does not provide a restore button (which I've never seen in any other app btw)
- the restore button of an in-app purchase is not labeled restore.
The average wait between rejection and re-review is 2 weeks.
I think this is a typical example of bikeshedding. I have another (retired) app that is still in the store; it's full of bugs and crashes constantly, but the functionality is so big and complicated that it was accepted instantly without any rejection. But this is a very simple utility app, so they want to nitpick over all the small details.
The iOS App Store does the same thing with changing the rules in the middle of the game, but it is not subject to potential irrelevance because it is the only place to get software for iOS devices. So the obvious solution for Apple is to make the Mac App Store the only place to get software for MacOS.
I'm actually a little surprised this didn't happen in 10.8.
Which would instantly kill the Mac as a development platform, and render all previously purchased software irrelevant. This just won't happen, though I can see a version of OS X that does this by default (perhaps on all "non-pro" models).
I think the only viable way to do this would be for Apple to make Macs obsolete. Like boiling a frog, if they upscaled the iPad into a desktop-lite device over the next few years and eventually replaced their fully featured cousins with these devices, they would be able to accomplish it with far less resistance.
(The iMac may have been a few years too early - they could have positioned the iMac as the trojan horse in this story).
It's all about where the mass market is. Just like the low-selling Mac Pro machines, Apple could quite happily keep OS X going for the professional minority that require a POSIX-like experience. The vast majority would use a slightly upgraded iOS, where all of that is hidden away.
OS 9 -> OS X was supported by the Classic runtime for years.
PPC to Intel was supported by Rosetta for years.
I suppose if they announced that this was the plan, and gave people years to migrate across, it might be something they'd do. But it would kill basically any technical use of the platform (development, scientific computing). I find it much more likely they'll evolve iOS "up" to support users for whom a closed appliance approach is helpful, whilst adding options to OS X to cater for those who want a more traditional computer but need a bit more help.
But they didn't - they made Rosetta and supported it for a couple of years just to give developers and users time to update their apps/computers to Intel versions.
The day they do this is the day that I will partition my disk, install another OS and never look back. OSX is the only Apple product that is still attractive to me, but I fear that I will have to jump out sooner or later.
With the shift towards iOSness and sandboxing for Windows & OSX, I don't see myself using anything but Linux - or recommending anything but Linux - for the future.
* Taking something away from someone that they've always had (as would be the case with OS X) is very different from never giving it to them in the first place (as has been the case with iOS).
* People are used to doing stuff that isn't possible with the current restrictions - with iOS they've only ever had those restrictions so they're not aware of what they might be missing out on.
* Developer goodwill would evaporate overnight. I'm not even sure it would be possible to develop on a Mac with this sort of restriction in place given the low level activity you often need to play around with.
* They'd need to develop a parallel mechanism for managing machines or lose what little they have of the enterprise market, which is never going to use the app store.
They'd need to make a "developer version" of OS X, which would just be OS X with this hypothetical super-Gatekeeper shut off. Then the whole thing just gets silly.
I believe the PC will remain a PC. Take away its nature and you might as well just hand everyone an iPad.
I can see them changing the language around the various Gatekeeper options and adding additional "are you sure?" warnings to scare people off the allow everything option and unsigned / non-Mac App Store installations but I don't think it will go any further than that.
By default Mountain Lion can only run software from the App Store or by developers Apple has specifically certified. You can tune this restriction both up ("App Store only") and down ("any software can run").
There is no "special certification", it's just a matter of creating a developer account and generating a certificate.
Also, right-click -> open will bypass the verification under the assumption that if you know about that command, you probably somewhat know what you're doing.
By default, you can right click anything from anywhere and choose "Open". It will install/run, and you will never be asked about that program again. But you have to make a conscious decision to run it.
Gatekeeper was set to allow the App Store and signed applications, and I had to disable it to install something I downloaded (ironically, because OS X told me to: X11).
Here's one thing I imagine: 10.9 by default won't allow apps outside the App Store, and you have to buy the 10.9 Server upgrade to have this privilege. They would bundle Xcode with it or something to rebrand it as a "developer-friendly" add-on, and developers will be happy to pay, but all the consumers will be chained to the App Store.
The only thing that gives me comfort is that Apple itself has thousands of developers writing its own code, so they can only cripple their OS X so much. Otherwise what would they do, write all their software on Windows machines?
I thought the same thing as I was reading John Siracusa's review of Mountain Lion. My stomach sank when I saw the default 'gatekeeper' options.
[ ] Mac App Store
[x] Mac App store and identified developers
[ ] Anywhere
Seems likely at some point down the road the third option will no longer exist. My next project is getting Linux to run on an old MBP. I still love their hardware, but OSX is getting fuckin' uppity. I realize that this is just an attribute on a binary that you can set and unset, but still, the direction this is going seems clear to me. The further iOSification of OSX is driving me back to Linux on the desktop.
What's wrong with Apple trying to control which binaries it will and won't let you run on its OS? Doesn't Apple always have your best interests at heart?
Since OS X is a better/simpler choice for many people and has WAY more quality supported apps than Linux has. If Linux became the superior choice for most people, a lot of those people would switch.
The built-in terminal is basically broken:
- page-up and page-down don't work unless you hold shift
- It uses CTRL-C instead of Command-C, unlike the rest of the OS.
- I still can't find hotkeys to jump to the beginning or end of a line, or skip over entire words (home, end, ctrl-arrow in Linux and Windows)
Home and End are COMPLETELY USELESS on OS X. I have never, in decades of computing thought "Oh, I want to scroll to the top or bottom of this document with one keypress, but not even bring the cursor with me." Given how often during programming I want to select an entire line, this is broken.
The mouse acceleration is stupid. I know you can get used to it, but then try to play StarCraft or something on it, and you will be awful (or at least severely handicapped), because the mouse acceleration system just doesn't work for stuff like that. Also, Lion broke all the work arounds.
The Xcode debugger doesn't let me inspect anything that even might be out of scope. Most of the inspections are useless anyway. You can't even see a list of what is in an NSArray (isa = Class, puh-lease).
Viewing hidden files in Finder is hard enough that I can never remember how. Viewing hidden files on a remote server seems to be impossible (maybe it isn't, I don't care). Simply typing in the folder you want to go to is a huge project.
OS X is a poor network neighbor. It rarely detects my other machines, and when it does it's after waiting forever, and it still barely works.
And, oh yeah, it cost me as much as my other 3 computers combined and is the worst of the 4, spec-wise.
I'm surprised Ctrl-C is a problem for you, especially since you come from a nix background. I'm super ok with it, especially since Cmd-C is the copy command.
Mouse acceleration does suck. It's directly led to a big decrease in the number of amusing photoshops I make; on the plus side, I've started using the keyboard much more.
I haven't had any issues showing hidden files; I just always show them. Going to a specific folder is as simple as either Cmd-Shift-G or typing "open /Some/Folder/Name" in terminal.
You make a lot of good points. I hated OS X at first, but I got used to it; I could probably move to a nix machine, but there would definitely be a long and painful adjustment period, much as there was for you. And as far as the cost, I'm paying for the OS X design, not the hardware - something not everyone agrees with.
Agree with everything. The mouse acceleration curve is the deal-breaker for me because I couldn't get it close to the Windows feel. The pointer slowdown is too sharp at the end of your mouse move, and the pointer itself gets jerky and jumpy when tracking over a small area.
I've tried various free and paid tools, but no joy (OS X Lion). Funny that I have no such problems with the trackpad, but maybe that's because I appreciate the trackpad so much while using it, since every other one I've tried sucked unbelievably.
I frequently want to scroll the display without moving the cursor. As for selecting lines, you do realize that Ctrl+A and Ctrl+E work everywhere in OS X, right? Generally whenever I select an entire line, I do so to erase, move, or duplicate it, and Cocoa's Emacs-style shortcuts are much nicer than Windows/CUA-style shortcuts for these operations (and difficult to support on systems where Ctrl is used for menu command accelerators).
Far more frequently, I want to delete from the current cursor position to the end of a line, and Ctrl+K is much nicer than Shift+End, Backspace.
iOS has never had a legitimate way to install code in userspace outside of the App Store. OS X has a long and rich history of allowing people to run any code in userspace that they want.
This is one of those situations where you can't really close the barn door after the horse has already left.
What Apple may do is stop developing OS X altogether and make a variant of iOS for the desktop after OS X (or simply rebrand it). And I don't think this would be a total shock for users by then, largely because Metro apps will be following the store model too.
As a Mac developer who's also affected by the mandatory sandboxing requirement, I fully agree.
One of our applications, Trickster, doesn't work sandboxed as-is (being a system utility), and we're in this situation where we have customers on the Mac App Store who can't receive updates anymore. Needless to say, neither we nor our customers (who'll mostly blame us) are happy about this.
Same. I develop Space Gremlin and I've been sitting on an update now for 6 months because any new submission will severely cripple the usability of the current app. I really wish there was a good alternative like Steam that I could sell on instead.
The accounting software I develop [1] is a year out of date on the Mac App Store because of the restrictions that Apple have introduced.
And yet I disagree with Marco: This is a one-time problem as Apple tighten requirements for publishing on the Mac App Store. In a few years users will have forgotten this problem (if they ever noticed in the first place) and the benefits of the Mac App Store will make it the dominant distribution platform. Those benefits are: easy to make a purchase, easy to see reviews for a product, easy to find a product, easy to install on multiple machines.
Eventually I'll have to change the software to meet Apple's stricter requirements so it can be published on the Mac App Store again.
Really think he's missing the part where laypeople will see the Mac App Store as their main portal for software. Sure us geeks will always know what the best option is, but what about from a common majority consumer's perspective?
Every time that layperson goes to look at their purchases and some are missing or they update and lose a feature, they may not know or care to find out what's going on, but Apple's "it just works" marketing message gets a black eye.
They'll find out about software the same way that the "majority consumer" used to; reading magazines like Macworld, seeing advertisements in various places, and talking to friends. If anything, the App Store is the one trying to break into the old model of buying software (find something, pay for it, buy it).
In fact, the two conveniences offered by the App Store (visible to users) are ease of finding stuff and security. Security isn't much of an issue with the old model because, historically, there haven't been many security concerns. If the "finding stuff" benefit goes away too, then the App Store will be completely irrelevant.
>> They'll find out about software the same way that the "majority consumer" used to; reading magazines like Macworld, seeing advertisements in various places, and talking to friends.
I don't know many "consumerish" Mac users who read Mac magazines, Daring Fireball, etc. Hell, I know a lot of more sophisticated Mac users who don't read any Mac related content except when they have problems. Many of the consumer Mac users I know don't pay attention to ads for Mac software either.
On your third point (word of mouth), that one has a lot of validity, assuming that a consumer mac user has a mac nerd or two in his/her circle of friends.
If I look at what most of my consumer Mac friends and acquaintances use, most of their time is spent in Safari, iTunes, iLife, iWork and maybe something like Parallels or VMWare to run "work stuff". A few might install some apps here and there, but few would even know that there was anything outside the App Store that was worthwhile.
I have nightmares about endless apps having their own updaters. Adobe/Microsoft updaters really are the culprits here. It's this that makes me wish the app store would drop these product-killing restrictions. I don't trust every app developer to do updaters.
Many (most?) indie Mac apps use Sparkle (http://sparkle.andymatuschak.org) to autoupdate. It's generally easy to use, so most people don't need to go the Microsoft/Adobe route.
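For anyone who hasn't used it, the integration really is small. A rough Swift sketch from memory of the classic Sparkle 1.x SUUpdater API (the usual setup is an SUFeedURL key in Info.plist plus the shared updater; treat the exact calls and the URL here as assumptions rather than gospel):

    import Sparkle

    // Rough sketch of Sparkle 1.x from memory; the appcast URL is a placeholder.
    func configureUpdater() {
        guard let updater = SUUpdater.shared() else { return }
        updater.feedURL = URL(string: "https://example.com/appcast.xml")!
        updater.automaticallyChecksForUpdates = true
        updater.checkForUpdatesInBackground()
    }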
Sparkle is amazing. For years and years, I thought it was actually part of the OS, as almost every app I had used it. That's how ubiquitous it is on the Mac!
Edit: Very interesting! The guy behind Sparkle (Andy Matuschak) is actually an Apple employee, working on UIKit! http://andymatuschak.org/
> By day, I work on UIKit to help other people make things.
This whole article goes over my head, because it assumes the reader is already familiar with the sandboxing issue. I'm not a desktop OS X developer; I'm not familiar with the topic. I searched around and the only articles I could find are blog posts about how the sky is falling, with no sources from Apple that plainly state the app sandboxing requirements.
If you're going to write an article about how some upcoming thing is a doomsday event, please explain the event clearly and link to the proper sources, especially if your article is a criticism of a technical specification. That one or a few applications are backing out of the app store is not, in itself, evidence that they are justified in doing so.
This article has taught me nothing, but it has made me more afraid.
That's exceedingly arrogant. All I'm asking for is a link to the Apple rules. Surely there is a document that explains the rules. That is all I want to see.
There probably is and we have great tools for you to find it yourself. This is a growing issue and Marco's audience has been following it for a while. I see no need to bring readers up to speed with every single post.
(I, too, am not a developer. But I know how to use Google to educate myself.)
I'd like to understand the implications of sandboxing too. It must have some large negative impact on Instapaper for Marco to care this much. What is it?
In short, sandboxing is an idea where every app lives in an isolated environment. By default, an application can only access its own files; every other system resource is unavailable until the user explicitly allows the app to use it. It has an impact on many apps: for instance, your favorite git client won't be allowed to open the repositories you've been working on lately, just because they're located in Documents.
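To illustrate what the git-client example means in practice: a sandboxed app has to get the user to grant access explicitly (e.g. through an open panel) and can then keep a security-scoped bookmark around for later launches. A minimal Swift sketch, with a made-up defaults key (persisting these bookmarks also requires the corresponding bookmark entitlement):

    import Cocoa

    // Sketch: a sandboxed app asks the user to pick the repository folder once,
    // then stores a security-scoped bookmark so it can reopen it later.
    func rememberRepositoryFolder() {
        let panel = NSOpenPanel()
        panel.canChooseDirectories = true
        panel.canChooseFiles = false
        guard panel.runModal() == .OK, let url = panel.url else { return }

        // Only URLs the user explicitly picked can be bookmarked with scope.
        if let bookmark = try? url.bookmarkData(options: .withSecurityScope,
                                                includingResourceValuesForKeys: nil,
                                                relativeTo: nil) {
            UserDefaults.standard.set(bookmark, forKey: "RepoBookmark")
        }
    }

    func reopenRepositoryFolder() -> URL? {
        guard let data = UserDefaults.standard.data(forKey: "RepoBookmark") else { return nil }
        var isStale = false
        guard let url = try? URL(resolvingBookmarkData: data,
                                 options: .withSecurityScope,
                                 relativeTo: nil,
                                 bookmarkDataIsStale: &isStale),
              url.startAccessingSecurityScopedResource() else { return nil }
        // The caller should balance this with stopAccessingSecurityScopedResource().
        return url
    }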
His complaints are entirely from the Mac user perspective. (And are complaints that I share.) Tons of great apps can't be sold via the app store unless they degrade themselves to comply.
Future of irrelevance to whom, exactly? The 1-in-10000 users who like to play with root-level access and scripts and stuff, perhaps, but do the other 9999 even notice the change? I'm thinking not.
Granted, it's a big change from earlier times, but the strategy of sandboxing works really well for security and general OS integrity (as iOS has proved) in a world that has no patience for fragmented and exploitative software. Perhaps we lose a little something in the process, but it's in the service of a better overall integrated service and experience.
Oh and by the way: Microsoft is going the same way too with Metro.
Not exactly. Metro and its app store are only "half" of Windows 8. The other half isn't sandboxed in the walled garden style. To put it another way, Microsoft isn't building the Windows App Store on the back of existing software, but from the ground up using only sandboxed apps.
Microsoft isn't likely to create the sort of moving target for developers that Apple has recently because doing so would not be consistent with the B2B model that underpins their corporate culture.
FYI, this is not true for Windows RT (the ARM version). That version will only support applications purchased through MS's store. Given that MS is clearly pushing for Metro to be the future of Windows, I think this is somewhat concerning.
On the other hand, Windows on ARM is an extension of the Windows franchise in the same way that WP7 with its app store is. In a sense it is like talking about the iPad when discussing the Mac App Store... or would be if iOS applications ran on the Mac.
Metro is a future of Windows, just as Server Console mode is (though admittedly on a potentially different scale).
Well of course, it's all in what one believes to be the way Apple is going in regards to the Mac platform. In the context of the original article, that way was toward a walled garden in which the wall was moving and growing higher from a developer's perspective in order to effectively bring more applications in house.
I don't see Microsoft going that way because they give a shit about software developers.
The concern is not the user experience of developers. As you point out, that's a small crowd with little market significance.
But sandboxing cripples, besides our ideals and personal experience, the apps we make. Users (of powerful, hard-to-sandbox apps) will suffer from this. Those users far outnumber the developers, and have great market significance.
Again, Apple doesn't really care what developers feel. What matters are the masses, which don't care about these issues. The App Store is easy to use, accessible and allows them to download their apps on multiple Macs. It's a win-win from their point of view.
Apple's thinking is simple: "You (the developers) don't like it? PayPal is that way." Just that simple. And since the App Store is where most non-developer Mac users get their apps from... we have a simple choice.
Do I like it? Hell no. Will I play by Apple's rules? Yes.
Sure developers will play ball as long as Apple is where the money is. There's no doubt that iOS's amazing run of exponential growth and the creation of a distribution platform as frictionless as the App Store have been irresistible to developers.
However, I think it's also fair to say that early adopters and developers were key to Apple's recent success. For instance, OS X lured me back to Macs from Windows in 2000 largely on the promise of unix under the hood. Many early iPhone users were jailbreakers, either to develop apps or to use them. Developers were at the forefront of Apple's success, and I think it's impossible to quantify the impact that we had, and thus extremely easy to underestimate it.
It's true that Apple is very mass market and could not achieve its design excellence by listening to geek feature requests, but I think they'd better tread lightly on the larger issues. If they turn OS X into a toy OS that doesn't allow serious development and pushing the technological envelope, and all future innovation comes exclusively in the form of new Apple APIs, then I think that will be a harbinger of Apple's decline as the geeks and early adopters look for the next frontier. Apple without the hacker community is not nearly as strong as it is today.
All these problems could be solved if there were a way for companies to transfer your existing non-App-store licenses to the App Store, allow for paid upgrades on the App Store, and allow App Store licenses to be associated with real serial numbers (or something else) that could be used to "take" your license with you, in the event the App Store suddenly becomes too restrictive.
Basically make it seamless, monetarily and "upgradily", to go to/from the app store.
That would be a good start, although a significant expansion in the available sandbox entitlements would also very likely be needed for developers who have already fled to go back to the store.
Well I imagine you'd only ever need to move to the App Store because of an upgrade which required it (because certain features are only available to App Store-distributed applications). So Apple would get their cut of the upgrade fee, and future upgrade fees as well. Obviously it's not as much as the full app immediately, but ideally over time it would add up for them with paid upgrades.
As a customer I totally agree. I wouldn't care about simple $5 tools being removed. But if more expensive tools like PDFPen Pro (which I bought) get removed then I will certainly be very upset. And just because this danger is present, I'm being very careful and no longer buy any expensive software from the Mac App Store whenever given the choice (except Apple software, of course). That's a pity for both developers and customers.
Somewhat related anecdote in light of 10.8: I never properly upgraded from OS 10.5. I pirated 10.6 and skipped 10.7, partially because I'm a cheap college student and partially because the changes between versions never seemed that dramatic to me.
I had to re-install my OS and thus downgraded from 10.6 to 10.5. All of a sudden, none of my iLife apps are working, plus a host of others, and I can't download Apple ones designed for 10.5. I lose momentum scrolling, and realize that, hey, an upgrade once every three years is a good idea. I wait for 10.8 to come out, so I can buy it like a good, normal consumer, and soon discover I need 10.6 to install it.
So I'm buying two upgrades? Alright...wait, I can't buy 10.6 anymore. 10.8 is the only OS available, and I don't have 10.6 on a disc. So it looks like I'm pirating again.
The neckbeards will buy from the developer. The ordinary customers Apple has been relentlessly targeting for a decade will buy from the Mac App Store. Which group is bigger?
It's not about where we buy our apps, it's about where we sell them.
If the App Store doesn't work for developers, it won't work for users. The users can't buy stuff that isn't there just because they outnumber the developers.
Other developers will happily come along, work within Apple's guidelines, and sell to the giant slice of the pie chart. It's obviously a radically different model of software development, and one that's probably far less appealing to current developers, but that's what happens when industries change. The incumbents never like it.
The same thing's coming to Windows, too, thanks to the Windows Store. The whole industry's headed toward appliance computing. Maybe you can sell software to Linux users.
That makes it sound like sandboxing is a matter of personal taste, and that if I as a developer consider myself "too good to sandbox", someone else will eat my lunch.
I fully agree with your line of reasoning in many other debates about the App Store, and I'm not against appliance computing. Take the 30% cut, for instance. If Smile had discontinued TextExpander's App Store presence because they didn't like the 30-70 split, I would be the first to point out that this leaves money on the table for someone who runs a lower-margin business, and that everything is in order as per the free market.
Sandboxing is different in that many features can't be done; not because of price point or some rms-esque FOSS principle, but because policy is holding the technology back. Developers are not given a choice.
It's not about "a different model" where incumbents refuse to compete and lose their foothold because of stubbornness. It's about limiting what any developer can do; they're not given a choice.
We will have perfect competition in the realm of distraction-free writing environments, but we will have no third-party backup tools. Nobody can happily come along and write one, because nobody is allowed, incumbent or otherwise.
TL;DR: No, I'm not speaking out of fear of competition. Sandboxing doesn't just hurt incumbents. It limits products, not development models.
I don't disagree with you at all; I just think that in the appliance-computing future where your computer is basically a toaster, the vast majority of users (ie, potential software customers) won't care about the "missing" apps at all. It's not a coincidence that most of the apps running up against sandboxing are tools aimed at power users -- third-party email clients, application launchers, BBEdit, backup tools, etc. These are specialized apps for users on the thin end of the bell curve. I don't think Apple cares much if they lose all the people who care about third-party text editors if they're able to start selling computers to millions of other ordinary people who have heretofore been terrified of installing software on their computers.
I'm not happy about it either; I guess I just understand what Apple's trying to do, and why Microsoft seems so eager to follow suit by setting up its own store. The "you have to be a computer guy to use computers" era is almost over.
For ordinary customers, software that's not on the Mac App Store might as well not even exist. Developers who try to make a living selling serial numbers on the web via Paypal are going to be selling to a tinier and tinier slice of the market. We're headed toward appliance computing for almost everybody and a Linux box for the rest of us.
Apple understands this; hell, even Microsoft is starting to understand this. Does anyone doubt that we're headed in the same general direction on the Windows Store?
There are two issues that are coming up here: sandboxing and paid upgrades. They are quite different.
As a consumer I am completely for sandboxing for myself and for other consumers. In a world where malware is increasingly a problem sandboxed apps will become the norm. That's the reality we live in. Sandboxing being a requirement means that I can fairly safely install anything from the (future) Mac App Store.
The OP correctly points out that certain system utilities cannot be sold this way. He is correct but consider the alternative: to not require sandboxing means no one will bother implementing it. Of course Apple could make effort to promote apps that do (or hide apps that don't) but this puts a considerable education burden on the consumer. I'm with Apple on this one: it's simpler and better this way.
Now paid upgrades I have mixed feelings about.
On the one hand paid upgrades can produce the wrong incentive on the developer: I've seen good apps go from 18 month major version upgrades, to 12 months to 6 months with no reduction in upgrade price. I've also seen old versions abandoned for pretty lame reasons.
IMHO having all users on the same version is better for the developer and the consumer. It makes support easier. It creates a consistent experience.
But on the other hand I do feel like there is a place for paid upgrades.
Are in-app purchases a possibility here? I honestly don't know what's possible with the Mac App Store here.
I think developers do get too concerned with turning a user into a perpetual revenue stream, however. It's an old business model that is becoming outdated.
Steam provided the first evidence of this that I can recall. Some years ago they started selling older games for $5 and under. In some cases IIRC the revenue for discounted sales exceeded release date sales at the premium price. More: [1]
The iOS App Store produced and continues to produce further evidence that lower prices and a higher volume can often be a better result than selling the "old" way (higher price, fewer units, which typically also involves paid upgrades).
Often content producers (and I include developers who sell software in this) don't always know what's best for them. This all sounds remarkably like Netflix in many ways. Netflix has provided a means of monetizing old and less popular content yet Hollywood seems to view them as the enemy.
Perhaps another model worth considering is to start the price of your app low and as it improves and gains popularity, steadily (and predictably) raise the price.
Has anyone tried this? Did it work?
EDIT: sandboxing goes beyond "malware". I increasingly don't want apps making arbitrary changes to my system. Some of those changes may be what I want, but most won't be. This ranges from forgetting to untick the checkbox that installs some browser toolbar, to (on Windows anyway) apps making arbitrary (and sometimes wrong) changes to local policies, registry entries, etc. (so the Mac equivalent of that).
EDIT2: as a consumer, I want to buy through the App Store. Apple has my payment details. I have a common place to get updates. When I buy from a third-party site I have to deal with:
1. Registration;
2. A payment gateway that may or may not work;
3. Despite an automatic payment a human may need to email me a license file and/or download link that in some cases has taken days;
4. Whether or not to trust your site with my information; and
5. A completely separate process for updating.
So anecdotally as one consumer, if your app can be on the Mac App Store and isn't I'm simply not buying it with very few exceptions (eg I'd still buy Photoshop even if I don't want to).
> In a world where malware is increasingly a problem
I don't see that as the case. There is malware out there in torrent land, sure. But if you acquire software from reputable sources (like a paid app store, referral from a friend, heard about it on a forum like HN, package repository), malware just isn't a concern.
If you put malware on an app store, the world will notice, its rating will tank, and people will stop downloading it. Reputation is the sandbox.
It's funny, for years Apple were all like "Macs can't get viruses!" and now they're saying "we need app store lock-in to prevent malware!".
I guess there's a distinction to be made between actual malware and software that co-installs crap (Ask toolbar etc) but often end users do not see that difference.
Besides, if something has passed app store verification then surely Apple are happy that it is not malware? Therefore they can be somewhat more lenient with sandboxing restrictions?
>Besides, if something has passed app store verification then surely Apple are happy that it is not malware?
I'm not so sure. What is their process for verifying that an app is not or does not contain malware? If it's simply to run the software and see what it does then they can really only verify that apps aren't immediately misbehaving. What if the app is set to do its misdeeds after the 100th time it is run, or after being installed for a month? There is really only so much a reviewer can do in order to push an app out within a reasonable time frame.
Sandboxing in a way is just as much protection from liability for Apple as it is protection from malware for its users.
>Besides, if something has passed app store verification then surely Apple are happy that it is not malware? Therefore they can be somewhat more lenient with sandboxing restrictions?
It's about minimizing the attack vectors. Sure, Acrobat, for example, is not malware and could be sold in the App Store. But there are tons of viruses and malware that target holes in Acrobat. If Acrobat were also sandboxed, they could not do much harm.
>It's funny, for years Apple were all like "Macs can't get viruses!" and now they're saying "we need app store lock-in to prevent malware!".
Yeah, it's funny because:
1) Apple never said that explicitly.
2) It was (and still is) true, i.e. not that Macs could not technically get viruses, but that they had gotten no viruses, with the exception of some lame trojans. In all, a minuscule number of OS X Macs were ever affected by anything in the last 12 years, and even those users clicked and installed it themselves.
3) All the naysayers, ignoring the practical lack of any real viruses on the platform, pushed for more protection and security measures anyway.
Yes, they did. "Macs are safe and don't get PC viruses" to an expert means "it is possible that attack vectors still exist", but to the general public means "no viruses".
The very next sentence was "a Mac isn’t susceptible to the thousands of viruses plaguing Windows-based computers".
Which it wasn't.
As for custom viruses targeting OS X, none had been seen in the wild for a decade (only some trojans existed). So the general public's assumption that "Macs are safe" was grounded in pragmatic reality.
That something is theoretically possible (e.g a meteor hitting my house) doesn't make it a real threat.
Now, one could argue that an OS X virus is not only theoretically possible but, unlike the meteor example, also easily achievable.
But still, something being both theoretically possible and easily achievable doesn't make it a real threat.
E.g a neighbour setting my house on fire. I'd rather start worrying about it when it starts happening frequently (instead of never).
> I don't see that as the case. There is malware out there in torrent land, sure. But if you acquire software from reputable sources (like a paid app store, referral from a friend, heard about it on a forum like HN, package repository), malware just isn't a concern.
Are you kidding me? Have you missed the large number of browser-based vulnerabilities, from Flashback and MacDefender, to the huge number of vulnerabilities the latest Safari fixed?
Would having a sandboxed app store help with any of these? The point I think he's making is that a curated app store isn't going to be a significant malware source.
I don't think that was his whole point. He's speaking in much broader terms in his first paragraph, which is what I quoted. He does make mention of reputation and the app store specifically in the second paragraph.
Right, but if a malicious JPEG exploits a code execution vulnerability in libjpeg, a malicious Web page exploits a WebKit vulnerability, or a malicious certificate exploits a bug in OpenSSL, proper sandboxing can do a great deal to mitigate the damage. Thus if "most" applications on OS X are sandboxed, the platform and its applications become less attractive targets for those exploiting what are often cross-application, cross-platform vulnerabilities.
Sandboxing is not restricted to the App Store, by the way.
"malware just isn't a concern." - beg to differ. The fiasco over Path (of which I continue to be an avid user) shows that Malware is an issue even in the iOS store. Sandboxing, and absolute limitations over what data any given application has access to, is the future for OS X applications.
Wow. You should get out and talk to the person on the street and what they think about this.
The only reason your sources are "safe" is because they are not the popular ones.
Malware goes after the high volume targets. If your OS has 2% of the market, yes, your binary packages are probably relatively safe.
But the situation is different for the Microsofts, the Googles (Chrome will no doubt be targeted as it gets more popular) and, eventually, the Apples.
Apple was always safe because it was not the OS of choice for most of the population. It was niche. If you haven't noticed, that is changing.
It's funny, because some of the stuff I'm working on is, by design, "sandboxed", but I never think of this as its most valuable "feature".
Sometimes we do not see the obvious. I'm sure in my case I'm missing something, and I think in yours too. Malware is a huge problem, and there's no solution on the horizon. If your OS relies on people outside the OS developers contributing "apps", which users prefer to download and install as opposed to reading code and compiling themselves, then your "app store" is vulnerable.
A friend of mine recently got a virus. The kind that starts emailing all your contacts. I've seen this happen to family and friends repeatedly over the years. Nothing has improved.
When I mentioned it to him, his comment to me was along the lines of "Yeah, it was especially difficult to deal with because it was a Mac."
It's not easy to get at the innards of anything Apple makes, especially these days when they are trying as hard as ever to prevent you from understanding how it works. If something goes wrong, you're fsck'ed. Unless of course "Customer Service" can help you. But when you become the next MS, there is no such thing. Customer Service, the human kind, does not scale.
As someone who makes a living on the App Store, I have to say having an ARPU of 70 cents really sucks. That people expect updates forever because they gave you 70 cents also sucks. Had an absurd conversation today giving customer support to someone with unbelievable expectations, for half the price of a cup of coffee.
You will have to work really really hard to convince me that this race to the bottom is a good thing for software devs.
I think we need to separate the concept of sandboxing from Apple's implementation of "entitlements". From a technical and user perspective, sandboxing is a good thing: it keeps an app from doing what it's not supposed to be doing. However, Apple's entitlements are a laughable hand-me-down subset of sandboxing. If Apple expanded their list of entitlements and simply required justification for the use of those expanded entitlements during the review phase, they could still keep things safe for end users.
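To make that concrete, here's a minimal sketch of the runtime side of the problem, assuming the Security framework's SecTask calls. The helper name and the feature check are mine, not anything Apple prescribes, though the entitlement key string is one Apple documents:

    import Security

    // Ask the Security framework whether the current process was signed
    // with a given entitlement. SecTaskCreateFromSelf and
    // SecTaskCopyValueForEntitlement are the real APIs; the helper itself
    // is only an illustration.
    func hasEntitlement(_ key: String) -> Bool {
        guard let task = SecTaskCreateFromSelf(nil) else { return false }
        guard let value = SecTaskCopyValueForEntitlement(task, key as CFString, nil) else {
            return false
        }
        return (value as? Bool) ?? true
    }

    // Hypothetical usage: only expose a feature if the sandbox actually
    // grants user-selected read/write file access.
    if hasEntitlement("com.apple.security.files.user-selected.read-write") {
        // present an NSOpenPanel and work with the URL the user picks
    } else {
        // hide or disable the feature that needs it
    }

The point being: if the entitlement doesn't exist in Apple's list, there is nothing the app can check for or request, and the feature simply can't ship through the Store.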
As a consumer, I only buy from the App Store when it's absolutely required.
I vastly prefer to live outside the App store and the Sandbox. I haven't gotten malware in years, and I don't particularly expect I'll get it in years to come; if I do, it'll probably be a zero-day in Flash, OpenSSH or HTML5.
So, anecdotally, as one consumer: if you have a program in the Mac App Store, I really won't be buying it unless it's an OS upgrade or I really want it.
The MAS has a killer feature (as does the iOS App Store): one purchase for multiple users or machines, with one credential set to manage purchases and upgrades.
Handling multiple software instances and licenses is a HUGE pain point for the customer, and it gets exponentially worse if you're forced to manage licenses across machines and/or users (hence site licenses for businesses). It is tough enough even for a power user with license management software, but literally a nightmare for the uninitiated.
I'd argue the exact opposite. Case in point #1: I bought a license for Sublime Text 2 from the developer. I'm free to use it on all of my Windows, Mac, and Linux systems. (Meaning, when I'm forced to use Windows, I actually have a decent text editor I like!)
Case in point #2: I'm trying to buy 15 copies of OS X 10.8 for our older (>1 month old) Macs. I have two options: create 15 Apple IDs, or buy 20 Volume Licenses from Apple, 5 of which I will never need as we don't buy used computers.
Don't get me wrong; I love that I paid for Aperture and can effortlessly install it on multiple systems. (Photoshop, on the other hand...) But it's not everything to everyone.
I don't understand why this [1] doesn't work for you - I was able to put exactly 12 licenses in my cart. According to this page [2], the process is to download the installer once and distribute manually.
It should be obvious, but to make it even clearer:
- Centralization of control. Apple has frequently jacked around with iOS programs being allowed/disallowed; why should we let some central authority control what we have on our computing devices?
- Centralization of malware. Monocultures are subject to waves of viruses.
- Limitations. The more interesting your app, the more places it needs to touch. A "fun" limitation I noticed this morning is that Mail.app's sandbox poses significant limitations for GPGMail. That's not good; hopefully I can continue to have encrypted email at will with Mail.app. Fortunately, the open source world provides encrypting email clients.
- Should Apple know what I have installed? App stores give them that knowledge. Is there a right that App stores take away?
Obviously, I have no great faith in Apple, Microsoft, Google, Facebook, or the other centralization advocates. I don't see that I should.
"It should be obvious", followed by a semi-paranoid rant just makes you sound like a wild-eyed conspiracy nut. "Monocultures are subject to waves of viruses." Whoo! Actual waves! That is scary. And buying from the App Store lets Apple know what you have bought. I never thought of that, that is really scary too. But the following non-sequitor just baffles me.
I think he was driving at the idea that people who make malware tend to go for larger ecosystems.
The reality is that everything you do on a network involves some form of risk. You can mitigate these risks by performing tasks in a standardized way using only approved software, but a packaged Zero-Day that's tuned for your environment will generally succeed.
Finding a payload that gets past Kaspersky isn't that hard any more; preventing hackers from knowing what anti-virus you're running is your responsibility.
In short, everything is about risk mitigation. Running the same software as everyone else exposes you to the same risk.
By the way, this point is tangential to the larger point at hand which is: Apple doesn't care about its developers.
I don't see how any of what you said is relevant. The distribution mechanism (Mac App Store) has absolutely nothing to do with everyone running the same software. And in fact the required sandboxing (one of the alleged problems with the Mac App Store) goes a long way to mitigate a lot of risks in remote exploits.
If the Mac App Store had a grand total of 5 apps then I could see where you're coming from, but it launched with over a thousand apps and it's had 1.5 years since then to acquire many more. There's no monoculture.
Do you have any evidence for your claim that malware would be worse due to centralization?
Apple has a very good record on malware via both the mac and iOS app store, best I can tell.
I completely get why a dev would hate the App Store, but from an average consumer's standpoint it seems brilliant. Unless you are scared of Apple finding out that you installed "Evernote" or "Twitter". OMG!
There may be a deeper issue here: the difference between owning a computer which can do whatever you tell it, and owning a tool which only does the things you bought it to do.
It used to be pretty clear that your phone wasn't a 'computer' in the sense that it had to offer up generalized support for doing development. So pre-packaged 'apps' were the norm, and the model of privilege constraint made a lot of sense.
If my laptop is simply a web browsing/mail reading/status posting machine with a bigger screen, then it has a similar requirement set.
But laptops have evolved from desktops which evolved from personal computers, which evolved from computers in general and the flexibility knob was historically turned to 11.
I'll reiterate my surprise that Apple doesn't (yet?) have two 'loads' for the Macbook, one which is a general purpose OS (MacOS) and one which is an application hosting OS (iOS).
> I think developers do get too concerned with turning a user into a perpetual revenue stream however. This is really an old business model that is somewhat outdated.
Outdated?!?! I thought it was more relevant than ever with all the web apps out there.
>The OP correctly points out that certain system utilities cannot be sold this way.
It's not that simple. Arbitrary features of your favorite app may not be allowed on the App Store just because Apple didn't feel it worthwhile to create an entitlement for that feature. Everything they add has a cost and it's very easy to say "that's not important."
Perhaps another model worth considering is to start the price of your app low and, as it improves and gains popularity, steadily (and predictably) raise the price. Has anyone tried this? Did it work?
Huh? What apps are you installing? I've never seen an app use an increasing upgrade cycle or attempt to make arbitrary changes to my system (or toolbars).
I guess I'm just used to Mac being a professional platform.
> On the one hand paid upgrades can produce the wrong incentive on the developer
The incentive to continue development?
> ... Steam ...
You can't compare traditional software to games. A game is finished at some point. There are only bug fixes to come.
Traditional software is expected to get feature updates. If you want to see what happens when software gets to the "finished" status take a look at the outcry Sparrow created. They halted development of new features and people went crazy.
As a developer I'm perfectly fine with no paid upgrades but then don't expect updates for your already bought software - you got what you paid for.
>> You can't compare traditional software to games. A game is finished at some point. There are only bug fixes to come.
>> Traditional software is expected to get feature updates.
One might make the argument that DLC might be the gaming equivalent of a feature update. There are quite a few games that are released with the promise of new functionality via DLC.
Marco's follow-up post to this one is only technically correct because he uses words like "most", "many", "probably", and "nearly". In fact, do a Google search for "most many probably nearly" and his post is #6!
He should have just stood by his words. Instead of taking the criticism, he resorts to treating readers as idiots who didn't understand his post. We understood. And many disagreed. It happens. From his follow-up post (quotes from the original post):
I’ve gotten a lot of feedback on my Mac App Store post this morning, and I’d like to clarify some points and respond.
I did not say or intend to suggest any of these:
1. I will not buy anything from the Mac App Store again.
"But now, I’ve lost all confidence that the apps I buy in the App Store today will still be there next month or next year. The advantages of buying from the App Store are mostly gone now. My confidence in the App Store, as a customer, has evaporated.
Next time I buy an app that’s available both in and out of the Store, I’ll probably choose to buy it directly from the vendor."
2. Most Mac users will stop shopping in the Mac App Store.
"And nearly everyone who’s been burned by sandboxing exclusions — not just the affected apps’ developers, but all of their customers — will make the same choice with their future purchases. To most of these customers, the App Store is no longer a reliable place to buy software."
3. Most developers will stop putting apps in the Mac App Store.
"And with reduced buyer confidence, fewer developers can afford to make their software App Store-only.
This even may reduce the long-term success of iCloud and the platform lock-in it could bring for Apple. Only App Store apps can use iCloud, but many Mac developers can’t or won’t use it because of the App Store’s political instability."
I'm going to put this comment here even though it doesn't belong because I wanted to make people aware and there's nowhere else to put it. Some HN moderator is censoring stories about Twitter being down, probably the biggest tech story of the morning.
Then this other story, https://news.ycombinator.com/item?id=4296811, appeared briefly on the front page. But interestingly it wasn't killed; it just doesn't show up on the front page.
If you scroll through the new story submissions you'll find a bunch of dead Twitter stories.
Yet another Twitter outage does not qualify as "gratifying one's intellectual curiosity". Besides, links can get killed if too many people flag them; it's not necessarily moderation.
Anyway, such complaints should not hijack other threads, post a new thread instead.
This is one of the top 10 sites worldwide, and it has been down for over an hour now.
Why is Google Talk down, which affects a fraction of the users, on the front page but Twitter is being actively censored?
Besides, HN comments are a lot of times more insightful and interesting than the link. We might have gotten a Twitter engineer commenting on what was going on or something else interesting.
> This is one of the top 10 sites worldwide, and it has been down for over an hour now.
Sure, and it's big news on outages@outages.org (which used to be just fiber cuts but has evolved into "major webapp down" notices, too). The idea behind outages@outages.org is to notify network operators when services that may impact them are down: if my customer is complaining about things being slow to Boston, and I read outages that morning, I might remember a fiber cut that would explain it. I guess the same could be true if my customers were complaining about not being able to get to Twitter.
(personally, I'm in the camp that gets irritated when "random webapp is down" messages are posted to outages@outages.org. I don't care that twitter is down. I've chosen my customers well enough that they don't complain to me when something that is obviously not my fault like that happens. But, I am only one person, and the majority have spoken; I won't fight it. I will whine a little, though.)
News.ycombinator is not about outages, and really not about network and systems operators; most of you use IaaS or PaaS. That said, so far I've seen a lot of tolerance for interesting network and systems operator stories.
But yeah, "x is down" is... not an interesting story. "X went down earlier today; here is what happened" sometimes is.
>Besides, HN comments are a lot of times more insightful and interesting than the link. We might have gotten a Twitter engineer commenting on what was going on or something else interesting.
I think uninteresting articles ought to be voted down or flagged or otherwise gotten rid of. If the comments are more interesting than the article, then it's an uninteresting article, unworthy of the front page link.
>Why is Google Talk down, which affects a fraction of the users, on the front page but Twitter is being actively censored?
You keep using that word. I do not think it means what you think it means.
How about a content rating system? People are familiar with this from movies, television and music. A developer could declare their un-sandboxed application the equivalent of NC-17, while most users live happily in their equivalent of a PG-13 universe.
I'd like to believe Marco is right, but let's not kid ourselves. There's been a large, high-profile, five-year experiment on exactly this, and the disappointing results are in: [almost] nobody gives a sh*t about Freedom 0.
I'm wondering how long until we see OS X being jailbroken.
I suspect system-level choices like this are bound to continue in order to push users in the direction of iCloud and App Store usage (i.e. vendor lock-in).
Though such moves appear draconian from our perspective, I believe Apple times them for when some internal metric indicates a tipping point has been reached, namely that the developer backlash won't be substantial enough to affect Apple's bottom line.
It's only a matter of time before the iPhone dev team puts some effort towards serving the OSX crowd that wants to jump ship but not give up "OSX" completely.
It's bad enough that apps are leaving the App Store because of sandboxing. You don't have to make shit up. The owner of a Mac can enable root access through the Directory Utility, completely legit, and can install and run code from any source without even doing so. Jailbreaking OS X makes no sense; there's no jail.
That's quite another story. If I have to write sci-fi: perhaps an underground scene that uses and codes for older versions of OS X, patching the system themselves to get only the good parts of system updates, while staying out of Apple's redefinitions of the computing experience.
Contrast this end-user experience with Steam's. These are essentially the exact same applications, except that Steam has 1) carefully cultivated a huge, multi-platform ecosystem of apps and 2) never willingly broken the applications' environment.
Steam is arguably the only game distribution platform of significance (yes, yes, battle.net and its limited selection of widely played titles) and will continue to dominate because of the careful thought they've put into distribution.
I'm not positive about this, but it sounds like lots of the applications/utilities made by Many Tricks are going to be pulled, or will at least see slower/limited updates if any: http://manytricks.com/blog/?page_id=2208
TextExpander 4 is one of the major examples. TextExpander 3 is still in the store, but version 4 is only available direct from Smile. There are also ClipStart (I think that's the name; a video management app), SourceTree (version control) and probably some others. More will follow, I believe.
A lot of software can't be sandboxed, full stop. So they joined the App Store before sandboxing even existed, but Apple won't approve any more updates so they will have to be sold outside the store.
> This even may reduce the long-term success of iCloud and the platform lock-in it could bring for Apple.
Apple is never going to have the kind of platform dominance that would make iCloud really compelling. At least the other vendors realize that they need to operate in a polyglot world.
As much as I like my MBA and iPad I refuse to lock my essential data into a system I can only really access with hardware from a single vendor.
In that particular case he was lamenting the App Store approval process and how much it alienated developers. Here we are 3 years later with no change to that policy and no slowing down the App Store juggernaut.
I hope they will weaken this restriction, and will just not have non-sandboxed apps show up without searching for them, or without switching some "power user" option on.
Some types of applications simply cannot exist with these types of restrictions. They are removing entire classes (e.g. window managers, system utilities) of applications from the App Store with this.
But what's the real market size for window utilities? Not something your average user buys. I use SizeUp and Cinch, but I'm a developer. My dad doesn't care about fairly obscure system utilities. Besides, it isn't like those tools can't be used on ML, just not through the MAS. The MAS is a shopping mall, not a specialty shop. I don't buy rock climbing gear from Target, but I'll gladly buy an espresso machine from Target.
It seems like there's a market opening here for a third party App Store equivalent, that offers the single-point-of-payment / ease of upgrades advantages for the consumer, but eliminates the developer pain points (sandboxing and paid upgrades).
This is viable for as long as third party applications can still be downloaded and installed outside of the Apple App Store.
I don't know if it's been said or discussed before, but why don't developers just make yearly revisions to sell on the Mac App Store, or even the iOS App Store? You could have a Genericproductname '2012 (or '12), and next year release a 2013 ('13) that has the new features, then remove the prior versions.
Several of my apps won't receive any new updates because they use the Accessibility API, which isn't sandbox-compatible. It's a real shame, because developing for the Mac App Store for the past year has been awesome. Unfortunately the money/discovery isn't there for sole devs to go outside the Store.
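For anyone wondering what "isn't sandbox-compatible" looks like in practice, here's a rough sketch of the trust check such apps rely on. The AXIsProcessTrustedWithOptions call is the real Accessibility API; what you'd do once trusted is only hinted at in comments:

    import ApplicationServices

    // Ask whether the user has granted this app Accessibility access,
    // prompting them to open System Preferences if not. Window managers
    // and similar utilities need this, and a sandboxed Mac App Store
    // build has no approved path to use it.
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
    let trusted = AXIsProcessTrustedWithOptions(options)

    if trusted {
        // e.g. inspect or move other applications' windows via AXUIElement
    } else {
        // a sandboxed App Store build ends up here, with the feature dead
    }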
I think the first sign that the App Store wasn't trustworthy was Apple leaving gcc out of the free version of Xcode. I never trusted the App Store. I think without Steve Jobs, Apple has lost its ability to pretend it's a decent company. Why doesn't someone do what Red Hat did, except for consumers?
Seriously? Do you remember what company this is? Apple does this every time and then corrects it down the road. Why? Because they receive more press that way. Come on guys, they are not stupid.
The lack of paid upgrades is irrelevant. I never understood what people are missing here. Apple has shown how to do it: Phase out the old version and offer the new version as a new product.
On second thought: if your customers are satisfied, they will give good reviews to the new version and it will regain its rankings. That's a big risk for the developer, I have to admit.
This approach also forces the developer to build in a notification system to tell people the new app is out and how to find it.
I guess you could make a small update to the 'old' app that pops up a window notifying users of the new one when the old app first starts, but that seems rather kludgy.
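Roughly what I mean, as a sketch only: the URL, the JSON key and the product name below are placeholders I made up, and the only real pieces are URLSession, NSAlert and NSWorkspace.

    import AppKit

    // Kludgy "the new app exists" notice for the old, phased-out app:
    // fetch a tiny file the developer hosts and, if it names a successor,
    // offer to open its store page once. URL and key are placeholders.
    let successorInfoURL = URL(string: "https://example.com/myapp/successor.json")!

    URLSession.shared.dataTask(with: successorInfoURL) { data, _, _ in
        guard let data = data,
              let object = try? JSONSerialization.jsonObject(with: data),
              let info = object as? [String: String],
              let storeLink = info["appStoreURL"],
              let storeURL = URL(string: storeLink) else { return }
        DispatchQueue.main.async {
            let alert = NSAlert()
            alert.messageText = "A new version is available"
            alert.informativeText = "MyApp 2 is sold as a separate product on the Mac App Store."
            alert.addButton(withTitle: "View")
            alert.addButton(withTitle: "Later")
            if alert.runModal() == .alertFirstButtonReturn {
                _ = NSWorkspace.shared.open(storeURL)
            }
        }
    }.resume()

You'd also want to remember that the user dismissed it, otherwise you've just built a nag screen into the app they already paid for.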
The thing that kept me from installing even free apps from the Mac App Store is the same thing that keeps me away from Market/Play on Android: it requires an Apple ID.
I may have an incorrect understanding of the word "sandbox" in this context. So to clarify: what does the author of this blog mean by "the sandbox is restricted"?
There are a lot of cases where iCloud isn't relevant (at the moment) -- many types of apps are not document-centric and I believe that, going forward, fewer will be.
Sometimes being early is the same as being wrong -- we've seen that time and time again. If a new player were to come along and capture the momentum of this meme, I personally believe that they'd stand a better chance than a player who has been in the space a long time. It's unfortunate, but that's just the way it is -- or always has been.
There is already such an app, Bodega, which was launched before Mac App Store. There was (and probably is) another one from the developers of iusethis.com
For those looking for an alternative app store for the Mac, look at http://appbodega.com/. I think it is great. It looks good, it is easy to interface with, and they appear to be quite responsive.
I think the idea that Apple can't require MAS is wrong. Maybe they can't for everyone but I would bet they can for a lot of people and probably plan to. At some point developers will be buying the developer version of the OS that gets them around the sandbox.
Marco is accusing Apple of having made a critical strategic error. Where are all the people who call him a member of the Apple cult and all that crap? Funny how they disappear when it comes time to talk about what Apple is doing wrong.
To me the tone of the article is that he is worried that Apple is missing chances to further enhance lock-in to the Apple brand, specifically on the Mac OS desktop platform. The guy re-purchased apps he already had so that he could help with that lock-in. I see the rationale for doing that, but when you start throwing cash at duplicate app purchases to enhance lock-in, I don't know what could be more cult-like.
Picking which operating system to use enhances lock in by itself already. I've bought software both inside and outside of the mac app store and since I have that software I'm locked in to the program. If I could move all my applications to the app store I would do it in a heartbeat because it gives me one place to go for updates. Apple is certainly missing out on opportunities, but not only for "locking people in" but also for making their user experience better.
In the article he says he did it for the purposes of convenience. I don't understand why anybody other than Apple would be interested in "helping" with lock-in.