
This has to be a management problem. Apple has total control over the hardware, total control over third party developers, and $203 billion in cash. What are they doing wrong?

Apple has the resources to approach software like aerospace: formal interface specs, heavy QA, tough regression tests, formal methods. It's expensive, but Apple ships so many copies that the cost per unit is insignificant.

Microsoft did that, starting with Windows 7. Two things made Windows 7 stable. The first was the Static Driver Verifier, which examines driver source code to check if there's any way it can crash the rest of the OS. This includes buffer overflows. The driver may not work, but it won't take the rest of the kernel down with it. All signed drivers have passed the Static Driver Verifier, which anyone can run. Driver-caused crashes stopped being a big problem.

With the driver problem out of the way, any remaining kernel crashes were clearly Microsoft's fault. (This had the nice side effect that kernel bugs could no longer be blamed on third-party drivers.) Microsoft had a classifier system developed that tries to group similar crash reports together and send the whole group to the same developer. It's hard to ignore a bug when a thousand crash reports from the same root cause have been grouped together.
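The bucketing idea is easy to sketch: normalize each crash's stack trace (strip load addresses and offsets, keep the top few frames) and group reports that share the resulting signature. A toy illustration in Python - the report data and frame format here are made up for illustration, not Microsoft's actual pipeline:

```python
from collections import defaultdict
import hashlib
import re

def crash_signature(stack_frames, top_n=3):
    """Bucket key: the top N frames with +0x offsets stripped off."""
    normalized = [re.sub(r"\+0x[0-9a-f]+$", "", f.strip()) for f in stack_frames[:top_n]]
    return hashlib.sha1("|".join(normalized).encode()).hexdigest()[:12]

def group_reports(reports):
    """Map signature -> list of report ids, so one developer gets the whole bucket."""
    buckets = defaultdict(list)
    for report_id, frames in reports:
        buckets[crash_signature(frames)].append(report_id)
    return buckets

# Hypothetical crash reports: 1 and 2 differ only in the faulting offset.
reports = [
    (1, ["nvdisp.sys!DxgkDdiPresent+0x1a2", "dxgkrnl.sys!Present", "ntoskrnl.exe!KiPageFault"]),
    (2, ["nvdisp.sys!DxgkDdiPresent+0x09f", "dxgkrnl.sys!Present", "ntoskrnl.exe!KiPageFault"]),
    (3, ["tcpip.sys!TcpReceive+0x044", "ndis.sys!NdisIndicate", "ntoskrnl.exe!KiPageFault"]),
]
buckets = group_reports(reports)
print([sorted(ids) for ids in buckets.values()])  # two buckets: reports 1+2 together, 3 alone
```

Real triage systems put most of their effort into the normalization step, since inlining, recursion, and corrupted stacks all blur the signature.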

That's part of how Microsoft finally got a handle on their software products. Is Apple doing anything like this?



Nah, I think it's a perception problem.

As someone whose starry-eyed Mac obsession predated Windows 95 - Apple's software has always been buggy. It was buggy under Sculley, it was buggy under Amelio, and it was buggy under Jobs. I remember getting plenty of sad Macs under System 6 and 7, and early versions of OS X weren't any better.

We just didn't care because Steve Jobs was really good at distracting us with promises about how great things were going to be, really soon now.

The comparison with Microsoft is instructive. Microsoft software was even buggier than Apple's during their period of greatest dominance. Win95/Win98/WinME would crash all the time and were an open barn door for security. Early versions of IE were pieces of shit. Even later versions of IE (6-9) were pieces of shit. Microsoft finally got a handle on security & software quality just as the world ceased to care about them.

Apple's been driving change in the computer industry since the iPhone was introduced in 2007. New products are always buggy - the amount of work involved in building up a product category from scratch is massive, and you don't know how they'll be received by the market, so there are frantic changes and dirty hacks needed to adapt on the fly, and they often invalidate whole architectural assumptions. It's just that most of the time this work goes on when nobody's paying attention, so by the time people notice you, you've had a chance to iron out a lot of the kinks. Apple is in the unenviable position of trying to introduce new product categories while the whole world is looking.

The Apple Watch is buggy as hell, but I still find it useful, and pretty cool.


I think you've touched on something key: people's acceptance of bugs is often outweighed by the new features they get. In other words, if the "new functionality" is perceived to outweigh the costs of the lack of reliability, the product will be deemed "less buggy" because people "understand" why there are issues.

This made total sense with some of Apple's biggest products: OS X and the iPhone. When OS X first came out it couldn't even burn CDs, but we all "understood" the magnitude of the project and thus gave it some slack. Similarly, the iPhone lacked a lot and was slow, but it was such a revolution that we let it slide -- in fact we let the rest of the products slide.

The problem today, I think, is that these decisions are being made for reasons that users don't deem "worthy". Introducing some new music service is not a good enough reason to break my iTunes. The fact that a new watch was released is not considered important enough to let other platforms languish. We "get" why less attention is being paid to other products, but unlike with the phone, it's not deemed a good trade-off.

In other words, I don't think Jobs was distracting us with promises, but with actual shiny things that made the bugginess worthwhile.


As a long-time Apple user and former employee, this is exactly how I feel about the current situation. I still think that my Mac/Apple devices are more solid than my Android/PC devices, but this is exactly what I'm noticing more regularly.

I forgive most faults because, 99% of the time, everything is awesome. On Windows, that same forgiveness manifests itself as me not using my Windows machines as much as my Macs. I still love to use my PCs, but not for anything that I need to rely on the majority of the time.

Now, though, Apple is making changes to things (iPhoto/Aperture is a really good example) where it seems like the change is just to bring some sort of parity to OS X and iOS rather than introducing new features. iPhoto was buggy as hell when they added Faces and Places, but I totally forgave that because 99% of the time it was making my life way easier than before by detecting faces properly. If it crashed every now and then, at least it saved the data, so I was still better off than before the update. I still like Final Cut X (I know, I know... I'm an outlier), but convincing me that a switch like iPhoto/Aperture -> Photos is worthwhile is much harder, since there's nothing to distract me from those issues and I've somehow managed to actively lose features that they convinced me were necessities in the past.

I hope this is not an indicator of things to come. One thing that gives me some hope is that they've gone back to alternating between feature updates and stability updates. Leopard was cool, but Snow Leopard was incredible to me. If that pace comes back, I'll be happy again. Until then, Apple needs to get their software game back in line with the rest of the company.


If you think that it was bad when they removed features from iPhoto, then I'd hate to think what you thought when they removed features from Numbers!


Oh yes... That was a bad move, I think. Luckily, I rarely have to use Numbers so I didn't really care. It just annoyed me that they removed some of the features that I actually did use when I needed to use Numbers. If they added the features back as quickly as they did with other apps, I wouldn't care, but they didn't. :(


I (Apple) will one-up you with Final Cut Pro X http://arstechnica.com/apple/2012/01/more-fcpx-fallout-top-r...


I love the new FCP. As a long-time user of FC7, I'm OK with losing some of these features as long as they added them back over time, and they've done that, for the most part (at least for my uses). The old FCP really needed a facelift and was trapped in such an old mindset from when video was still mainly stored on tape and tools needed to work like real-life video editing gear. FCP X is so fast for me and such a treat to use for 99% of things that I can deal with having to jump back to FCP 7 every once in a while. As long as Apple doesn't somehow prevent me from using FCP 7, I don't care, and I love the new direction of FCP X.


Reminds me of a discussion on another HN article a few weeks back, where someone proudly stated that if a feature customers used didn't fit in with the company's strategic direction, they'd drop it, and tough luck for the customer.

Apple seems to have the same mentality. They used to get away with it, but mostly because they replaced what they dropped with something better. Now they just seem to drop features entirely. That's not a good way to go. As much as I despise Steve Jobs, he never let the quality of a product drop to the degree that big customers (or even smaller customers) left Apple without a major fight to keep them.

It's looking like Apple's obsession with making great, quality products is taking a back seat. I think they probably need to worry a little less about their schedule, and more about polish and feature completeness.

Rather remarkable that I'm actually saying this, to be honest! Apple would be the last people I would have guessed needed this advice...


I think that's an awesome sentiment if you're talking about something like an Arduino, where part of the experience is working around its quirks and limitations. If you've bought a device expecting it to basically be a transparent window into the internet (or your documents, etc.), having to deal with its quirks and limitations can put a really bad taste in your mouth. Especially if you paid top dollar for it.


"I think you've touched on something key: people's acceptance of bugs is often outweighed by the new features they get. In other words, if the "new functionality" is perceived to outweigh the costs of the lack of reliability, the product will be deemed "less buggy" because people "understand" why there are issues."

Right - which is why we have all of the Snow Leopard nostalgia: none of the newer releases have given us anything substantive that we really needed to justify the hassle and the bugs.

I am trying to think of something - anything - that compels me to upgrade from SL on my Mac Pro, and all I can think of is that nifty take-a-picture-of-your-signature feature in the Preview app that lets you insert it into PDF documents.

Ok, and maybe USB3 ?

That's all I can think of.


AirPlay; much better multi-display support; tags in Finder; Spotlight enhancements. More than anything, the iCloud/iOS continuity features were also big if you had an iPhone or iPad, everything is just much easier to keep in sync.

I'm a Safari user (better battery usage for the # of tabs I have open), and it too has improved with El Capitan, though that's irrelevant for Chrome/FF users.


OK, AirPlay I guess - although I've never used it, I do see people using it to good effect.

Worth mentioning that AirPlay is just userland software - nothing special, and no reason it couldn't have been added to SL.

I don't know about multi display, though - I've been under the impression that that is broken in new and fascinating ways with every single release...


Yeah absolutely. Snow Leopard's multi-display was great. As was Leopard's, Tiger's, and Panther's.

Then Apple broke it massively in Lion, and only finally resolved most of the (severe, productivity-destroying) issues with Mavericks.


Handoff is a really useful feature (when it works).

Also SL maimed Exposé (that weird non-proportional grid view), which was reverted to the Leopard style in Mission Control (of which Mavericks/Yosemite had the best implementation, and they've now broken its utility in El Cap thanks to hiding thumbnails by default. FFS.)

But apart from that... I think I preferred the Apple apps back in 2009-or-so.

To be honest, I think the latest Apple release cycles have been more about "remove a feature so that we can add it again and sell it to our users again". Think multi-monitor support: something that worked perfectly in SL and earlier, then broke fantastically with full-screen apps in... Lion? ML? One of the two.


True, Apple software has always been buggy; Apple calls them undocumented features.

Apple always makes up for bugs with newer devices whose faster CPUs and GPUs make the OS code run faster. That means buying a new Apple device to get better performance. Older Apple devices are eventually left out of updates, and if they do update to a newer OS version, it runs slower.

Apple is driven by an upgrade model: buy a new Apple device every three years or so. In the PC world, Windows 7 can still run on old Pentium 4 systems, and if I am not mistaken some of them can even upgrade to the 32-bit version of Windows 10 and it will still work. For example, I used to have a MacBook that only ran up to 10.7; 10.8 needed a newer Intel CPU to install. Anyone with an iPhone 4 is going to find the latest iOS slow as well.

It is in Apple's business model to sell customers a new device every few years or so and phase out old Apple devices.

Apple doesn't care if their software isn't the best quality as long as it is easy to use and will keep people buying new Apple devices to run things faster.

I myself like GNU/Linux better than OS X because it can run on older PC systems, it runs quite fast, and it has good quality to it. GNU/Linux is virtually unknown to the average consumer, and when people get tired of Microsoft they usually just buy an Apple device. Apple devices are easier to maintain and use. You've even got toddlers using iPads; that is how easy they are to learn to use.

Apple has saved up billions just in case they have problems. Apple has done well financially in an uncertain economy where other companies are struggling.

Only Alphabet, Google's parent company, seems to be doing better for some reason. Google's Android needs better quality as well, and since Oracle sued them over the Java API they have to change the way the OS works. The web services seem to earn a lot of money, and Google's AI is very advanced.


Apple is driven by an upgrade model to buy a new Apple device every three years or so.

I disagree.

My wife's iMac is 6 1/2 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iMac soon.

My iPad 2 is almost 5 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iPad soon.

It is precisely because our older Apple hardware is still working well, and Apple still supports us with the latest updates to that older hardware, that my family is not only sticking with Apple, but we've recently invested in new iPhones.

Apple has earned our trust.


Yeah, OS X and the iPhone were new products, not just upgrades of what came before (as you say, OS X had significant limitations compared to OS 9). You didn't have to upgrade right away, but if you did, you got an entirely new experience.

Compare this to today's Apple, where upgrades add "hundreds of features" but feel mostly the same (except everything runs a bit more slowly). There's no coherent vision of what the future of the software should be like.


Wow, I couldn't put my finger on it, but this is why I am starting to hate Apple. I spend a lot of time on the computer, so I like a lot of things Apple does, mostly the interface and the UI. To clarify what UI means to me: it is how simple and intuitive it is to use the device and give it the instructions to do what I want. I am willing to forgive a lot because this is good.

Apple has always been a fairly closed system, and that didn't bother me beyond not having the features I wanted. In El Capitan, it was different: things didn't work well, and Apple took over my whole system. With SIP (System Integrity Protection) I had no control. It would seem to turn protection back on after being disabled, and it takes a nontrivial amount of time to turn it off, because you have to reboot the entire system into recovery mode, wait for it to connect to the internet and download a bunch of Apple shit, select a language preference, and then type a command into bash and reboot.

Deleting apps is difficult, changing settings is difficult, having Siri take up 10% of my iPhone is annoying, removing apps destabilizes the system, and restoring my system from Time Machine reinstalls their system and settings and overrides mine.

I disabled most of Apple's applications and processes, and the system is fairly stable, although I think I went too far with disabling Notification Center. But your point is correct.

tl;dr users are willing to accept a lot for revolutionary changes. Evolutionary changes with only marginal improvements are not going to make me forget that they unpredictably disallow me from using sudo and are fucking up all my devices by doing things I don't want them doing in the first place.


A lot of people in this thread seem to think this is all about Apple not adding enough revolutionary features or something. But consider this alternate explanation: with years of experience comes more sophisticated judgement. What used to seem good enough now seems to have obvious flaws, even if it is the same as it was before. Lack of control is an example: beginners often don't notice or care much, especially if it feels simpler, but as your needs deepen it becomes more important. Wanting to set things up once and then not keep touching them is one of those tastes that develops with experience.


That is a good point as well; I definitely agree with it. The one thing I would add is that I repeatedly get update notifications on my iPhone. Due to the increasing lockdown of all features, I am legitimately afraid to update, as the:

* provides security update

* increases iTunes performance

type descriptions do not provide enough information about how an update will fundamentally change my system. Most notably, when I updated my iPhone I found I had loaded some horribly inefficient talking pseudo-AI that was not neutral, but a straight-up negative feature consuming system resources.

I think you are really correct though, as you gain more experience and skill with technology you have more needs and better judgement. You can evaluate things better because you are aware of what is possible. The biggest problem isn't that they make changes, it is that those changes are not predictable so they become difficult to mitigate.


I'm also concerned about updates. For instance, I'm currently having to route all my iPad web traffic via Charles Proxy so that any instances of style="overflow:hidden;" on a body tag are stripped out.

Why? Because in iOS 9.2 Apple released it with a bug that causes the viewport to zoom incorrectly on these web pages. This affects LibreOffice's OpenGrok, which I browsed regularly on my iPad.

They still haven't fixed this, and it's a major regression. iOS updates are few and far between. Consequently, I'm seriously questioning what their updates actually do to my iPad and iPhone.


I wouldn't hold my breath. The iOS Mail app cannot negotiate any TLS version above 1.0 (for IMAP, possibly SMTP too) even though it obviously supports TLS 1.2: it sends a TLS version of 1.0 in the ClientHello message even though that same message contains TLS 1.2 ciphers (AES-GCM).

I reported it in October and Apple's security team replied they're aware of it but it's still not fixed 2 releases later even though they probably need to fix like 1 line of code (the advertised version flag).
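For anyone curious why those two things can disagree at all: in TLS 1.2 and below, the advertised version is its own field of the ClientHello, separate from the cipher suite list. A hand-rolled sketch in Python (toy bytes only, not a real handshake; the version codes and the AES-GCM suite number are from the TLS registries):

```python
import struct

# TLS protocol version codes
TLS_1_0 = 0x0301
TLS_1_2 = 0x0303
# A TLS 1.2-only cipher suite (ECDHE-RSA with AES-128-GCM), per the IANA registry
TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 = 0xC02F

def build_hello(client_version, cipher_suites):
    """Build just the two fields we care about (toy, not a full ClientHello)."""
    body = struct.pack(">H", client_version)          # advertised version
    body += struct.pack(">H", len(cipher_suites) * 2)  # cipher list byte length
    for suite in cipher_suites:
        body += struct.pack(">H", suite)
    return body

def parse_hello(data):
    version, = struct.unpack_from(">H", data, 0)
    n, = struct.unpack_from(">H", data, 2)
    suites = [struct.unpack_from(">H", data, 4 + 2 * i)[0] for i in range(n // 2)]
    return version, suites

# The reported Mail behavior: the version field says 1.0, the suites say 1.2
hello = build_hello(TLS_1_0, [TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256])
version, suites = parse_hello(hello)
print(hex(version), [hex(s) for s in suites])  # 0x301 ['0xc02f']
```

A server that takes the version field at face value will negotiate 1.0 even though the GCM suites on offer imply 1.2 support, which is presumably why the fix really is that one advertised-version value.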


They have actually fixed it - if you get bitten by it, you can reference rdar://problem/22242515

The WebKit bug is here:

https://bugs.webkit.org/show_bug.cgi?format=multiple&id=1528...

The patch to fix it is here:

https://bugs.webkit.org/attachment.cgi?id=268394&action=diff

The workaround, FWIW (thanks Simon!), is to add "shrink-to-fit=no" to the meta viewport tag.

For me, it was too much effort to get OpenGrok fixed, so I just did a rewrite rule in Charles Proxy that gets rid of the style attribute.
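That kind of rewrite rule is essentially a find-and-replace over the response body. A rough equivalent in Python - a sketch only, since regexes on HTML are fragile, and the sample page below is invented:

```python
import re

# Strip a style attribute containing overflow:hidden from <body ...> tags only.
BODY_OVERFLOW = re.compile(
    r'(<body\b[^>]*?)\s*style="[^"]*overflow:\s*hidden;?[^"]*"',
    re.IGNORECASE,
)

def strip_body_overflow(html):
    """Remove the offending style attribute, leaving other attributes intact."""
    return BODY_OVERFLOW.sub(r"\1", html)

page = '<html><body style="overflow:hidden;" class="src"><p>OpenGrok</p></body></html>'
print(strip_body_overflow(page))
# <html><body class="src"><p>OpenGrok</p></body></html>
```

Charles applies roughly this substitution to each matching response before it reaches Mobile Safari, which is why the viewport bug never triggers.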


I agree with your summation. To add to this, there are many things going on under the hood that none of us asked for, which are taking up system resources, dialing home and draining battery life.

Some time, try this yourself:

    sudo opensnoop


The original iPhone wasn't slow at all. One of its main selling points was the speed of its menus and apps (I forget what they actually called apps before the App Store).

I think you forget how crazy slow feature phones were. Opening a GPS app and finding your location could take 5-10 minutes in 2007 on a feature phone.


They called them apps too. It's easy to remember with the infamous (quoting from memory): "You can write apps in HTML".


Guess I completely forgot that. Thanks.


I think you're right, but people tend to take for granted the features that are worthwhile and underestimate the difficulty of making anything ever work correctly at all.


>> Apple's software has always been buggy. It was buggy under Sculley, it was buggy under Amelio...

Classic Mac OS was buggy by design - it didn't have multitasking and memory protection, it was single user... Windows had that same problem before 2000 (well, NT 4, but not many people used that).

I use OS X daily and I use Windows 7 daily. I have far fewer issues with OS X, for whatever reason that may be. My computers don't magically reboot or bluescreen nearly as much: it might happen every 6 months at most, whereas with Windows it probably happens every 2 months.


I agree that OS X is more reliable. I have both a Mac and a PC in my office, and my 5K iMac has never crashed. I had some weird issues with Windows 10 after I upgraded my laptop, where it would hang after an update.


I switched from using Windows for 20 years to OS X (excluding Linux for work related stuff.) This is the first time I've been able to work from a laptop with my productivity level as good or better than a desktop. The design and usability surpassed what I expected. I haven't noticed any bugs.


I can't imagine having a different brand of computer, but there are lots of OS X bugs that make my teeth grind. The Finder doesn't remember that I only ever want one layout, and regularly resets to a random alternative setting. There are still progress bars that pop over and can't be hidden. The disk-usage display that reports nearly all space as "other" is a bug as far as I'm concerned. My Apple TV has stopped updating itself, which is not ideal. And as a genuine question: has anyone not had strange Xcode behaviour or crashes at least once per day? I currently have a slow-motion Simulator that changes views over 5-10ish seconds.


In terms of laptop OSes, OS X is by far the best. It's stable, usable and 100% desktop-OS focused. I can't stand the touchscreen features that Windows 10 still tries to force on you. Somehow my Windows 10 laptop got put into "tablet mode," and it was pretty unusable: I couldn't access the desktop anymore, it was slow, and it took a while to figure out the issue.

I think it was flipped on after an update, but why would I even want to be able to enable that mode on a laptop without a touchscreen?


If Windows 7 is "magically rebooting and bluescreening" often enough to comment on, then you have a hardware problem.


I don't use Windows very much, because I hate using Windows. But what I will give it is that I haven't seen a blue screen in the past decade for any reason short of bad memory, overheating, dead disk. I suspect that a lot of the Windows image problem is that people are free to buy really cheap hardware and fiddle with things they don't understand.


Apple merging the classic Mac OS with NeXT's OS to make a Unix OS was the best move they could make at the time. It happened during the era of the $10,000 Unix workstation, and Apple made OS X as an easier-to-use Unix. It was cheaper to buy a Macintosh than a SparcStation or some other Unix workstation.

Because of Apple making OSX Unix based, it cut into sales of other Unix companies like SGI, and also GNU/Linux cut into sales of SGI and others as well.

But making OSX Unix based solved a lot of problems that Classic MacOS had that they couldn't solve.


"...magically reboot or bluescreen nearly as much." Mine never magically reboot. Ever. (OS X)


My Macbook Pro did which reminds me the extended warranty period is ending.

https://www.apple.com/support/macbookpro-videoissues/


I regularly get asked to fix someone's Windows PC; no such problem with people who have Macs. With OS X there are waves of releases (major versions), if I remember correctly - some introduce swaths of new code/features/replacement code, while others are more speedup-and-bugfix releases. Maybe I'm wrong.


I go places on my Windows pc that I wouldn't dare take my Mac. I expect it to need repair.

My pc is my beater car, and it needs repair--regularly.

My Mac is the classic car in the garage, that only gets used for work, or safe places.


Speak for yourself. I used an old G4 PowerBook for grad school, and it travelled over 100,000 air miles, and into the various labs where I had to work, and also to far-off Asian countries for holiday. Plus, I didn't have to buy a developer kit: it came free with the machine.


I assumed that the travel referred to dangerous parts of the web.


Actually NT 3.51, which I used for dev and was great compared to my colleagues on plain windows.


Actually NT 3.1. 3.5 followed, then 3.51, then 4.0, then 5.0 (2000), and then the NT line ended as it was unified with the non-NT line.


Technically the 9x line ended, since Windows XP was NT-based and not 9x-based.


Yes, I suppose so. I didn't want to say the 9x line ended, because it's really the line of DOS-based OSes, and while the NT line is ongoing, it's no longer called NT. 2000 was the last version to mention NT, and it wasn't part of the name itself, just a tagline.


One of the things about OS X is that most of the time it's put on high-quality but non-exceptional hardware. So things like bad RAM, flaky power supplies, bad GPU drivers, etc. are almost never an issue with a stock machine.

Windows, not so much. The only stability issues I've had with windows have been related to poor drivers, almost exclusively from nVidia or ATI/AMD. The equivalent hardware for Apple machines either didn't exist at the time, or was running much less ambitious drivers.

I probably have more issues with my Macbook Air (relating to sleep, hibernate, and wake-up) than I do with my Windows machines these days.


To give you a counter-anecdote, I use MacBooks at work. Over the last five years, I've had two hardware failures and gray screens maybe once every four months. In addition, I have issues maybe once a month where the machine more or less locks up (from the logs it looks like windowserver/loginwindow has crashed and OS X is trying to run spindump on them).

Compare that with the _desktop_ Windows 7 machine. It first crashed intermittently (memory failures), but after I changed the motherboard, it has not crashed at all. But then again, I am not using, for example, the most cutting-edge graphic drivers.

I remember quite some crashes during the Windows XP times, but I've since taken a more conservative approach to hardware and drivers.


Jeez, you're talking about something designed and built in 1982-1983 (over 30 years ago), meant to run on something with 128KB of RAM, no hard drive, and a 400KB floppy disk.

You try fitting all that plus a GUI into those constraints.

What's amazing is that it had the features it had and that it worked at all.


You might like to check out MenuetOS/KolibriOS and the old QNX demo disk.

Both provide GUIs and rudimentary Web browsers. QNX was full POSIX, too, although the demo disk didn't include a terminal.


... and there goes my MacBook (2008?), where Windows 7 runs harder, better, faster than OS X, and pretty much more reliably than the monster iMac at the office.

Pity that nobody remembers Windows NT 4; it was miles better than OS 7. I stopped using the Mac altogether after starting to use it.


> Microsoft finally got a handle on security & software quality just as the world ceased to care about them.

Not quite. They got a handle on security when Linux lit a fire under their ass.

Competition, true honest to market competition, spurs improvement.

The thing about Apple is that they may have competition on hardware, but they have no competition on Software.

If you buy a Mac or an iPhone, you have already thrown money at Apple. But you can easily assemble a PC without Windows and then install Linux on it.

Keep in mind that the latest US warship is not running Windows, but RHEL. That is a very big wake up call for Microsoft, where before we have seen the likes of Win2k (US ship) and XP (UK submarine) used around the world.


> They got a handle on security when Linux lit a fire under their ass.

I have my doubts that all 0.02% or whatever of PC users who are dedicated to using Linux as their desktop OS influenced Microsoft to do anything but if you have data to show otherwise I'd be interested in it.


I'd assume parent was thinking less of PC users and more of other OS consumers: https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...


And those are the public ones.

BTW, I seem to recall that the London Stock Exchange had a spectacular failure when they went with Windows. The result was that they switched to Linux within a year or two of bringing their brand-new Windows system online.

Ah yes, found it: http://www.computerworld.com/article/2467082/data-center/lon...


Desktop smesktop. For MS, the desktop has always been a means to an end. It has been about "total cost of ownership", where they can claim people need less training before being productive at their new job.

But to manage all those desktops you need servers, and with MS the billing is per active user, etc.


"if you have data to show otherwise I'd be interested in it." ...

Some data says Linux desktop/laptop share is 1.5% (not counting chromebooks)

https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...

Please note also that Android is Linux and iOS is Darwin, a BSD-derived Unix.

Linux and Unix based kernels are more numerous than Windows based units.


The fact that iOS and Android have a UNIX-like kernel doesn't count for much if the majority of userspace apps use non-UNIX APIs and tooling.

They could replace the kernel with something else and most devs wouldn't even notice.


> 0.02% or whatever of PC users who are dedicated to using Linux as their desktop OS

Wikipedia: 1.5% [1] NetMarketShare: 1.71% [2] W3Schools: 5.6% [3]

[1] https://en.wikipedia.org/wiki/Usage_share_of_operating_syste... [2] https://www.netmarketshare.com/operating-system-market-share... [3] http://www.w3schools.com/browsers/browsers_os.asp


I don't know about Linux influencing Microsoft on the desktop, but it has surely lit a fire under them on the server. You are 100% trolling by claiming 0.02% of PC folks use Linux - it is at least two orders of magnitude higher than that, and then there's Android... :)


The obvious example would be netbooks. MSFT moved very fast to counter Linux on that front.


Not exactly: OEM PCs (mostly) come with an OEM version of Windows, which is pretty cheap (I don't know what other incentives Microsoft offers, though).

But you can install Linux on it for sure :)


I had to pay $100 extra for the Windows license that came with my Lenovo Ideapad.

And Lenovo wouldn't let me buy their laptop without the windows license.




Frankly, I suspect MS would love to ignore the consumer world, except that then they would lose their beloved "total cost of ownership" argument for doing B2B sales.


OS X and iOS are both way more stable than any pre-OS X Apple OS. I can't believe people forget it.

When people talk about OS X having issues, they often mean some new feature is a little flaky. Classic Mac OS lacked basic stability and security features like preemptive multitasking and memory protection. Classic Mac OS was just like the pre-NT Windows: crash prone.


That's not a high bar though. The only other high profile desktop OS around at the time, Windows NT, was more stable than any pre-OS X Apple OS for a long, long time.


I agree with this sentiment. We can complain all day, but the fact that we have these devices and software that have been made accessible to us by Apple is astounding from a historical perspective. My parents are in awe of the calendar app, Apple Maps, etc. As they should be!

OS X is also still the best development platform, despite its flaws.


Are you missing an /s tag there or am I temporarily dumb?


Neither? Not sure which part you are referring to. I really believe it's the best dev platform; I've tried all of them. Ubuntu can come close, but it's orders of magnitude less user-friendly - in my opinion and the opinions of folks I've discussed this with.


I would agree. I love my Macs for dev work. Web dev and app dev alike are an absolute treat with a Unix-style backend but a much more polished front-end. Ubuntu is probably the only Linux distro that comes close to giving me that terminal power without rubbing it in my face constantly when I'm just trying to manage my day to day stuff and, even then, it's not even close to OS X. Windows, on the other hand, is only usable for me with third-party software for everything and then I feel like I'm spending just as much time futzing with everything as I am doing anything productive.


Same with me. I basically do 3 things on my computer: develop code, edit pictures, and write stuff. Almost all my files are in the cloud, available through the browser, and the full-fledged terminal with lots of convenience tools just feels great.


I still can't work out how to get Xcode to load up the LibreOffice gbuild projects. When I do, I think I'll probably be a convert. Till then, I guess I remain with vi.


OK, thanks for your polite answers.


Perception problem is the right description. Let us take a devil's advocate view and try to fit the facts into a narrative that inverts the public wisdom.

Microsoft is making over 4 times what it made in its glory days, growing year by year, across a wide range of products and services. Windows and Office account for only half of that, making them a diversified company with plenty of potential for revenue growth. Windows 10 is by far the most successful Windows release ever, with more active installs than OS X (any version). Basically the only place Microsoft is truly failing is phones.

Apple, by contrast, gets two thirds of their revenue from the iPhone. They have nothing else that even comes close, and nothing that could replace it if iPhone sales start dropping. Mac sales are down, iPad sales are down, and the Apple Watch is a dud. Since 1990 Apple has basically had two hits: iPod and iPhone. I did not mention the iPad because it is just another iPhone model, which you can tell by its sales slumping as iPhone screen sizes moved up. Success for Apple is rare, and most of what they do isn't all that amazing. The Apple TV isn't going anywhere, even after the refresh. The Apple Watch distinguishes itself from other smartwatches only by its price. Basically the only place Apple is truly succeeding is phones.

Perception is everything. How you choose to look at the facts determines which facts you see. Apple is perceived as strong and microsoft as weak, but the facts give you the option of going either way.

Regardless, Apple has few excuses for any quality issues. They have the resources, and they have had enough time (given that, aside from the watch, everything else is half a decade old or more). Personally, my Mac and iPad anno 2015 have the same number of glitches as my Mac and iPod anno 2005. To me, Apple doesn't seem to be getting worse, but they don't seem to be getting any better either.


> We just didn't care because Steve Jobs was really good at distracting us with promises about how great things were going to be, really soon now.

I tend to characterize the "reality distortion field" as a magician-like talent for focusing an audience's attention on a particular subject.


Jobs was taught the RDF by his professor. This means it can be learned. The reader is encouraged to learn how to perform the field, so that they can defend themselves, and others, from its effects.

As the villain in the Incredibles said:

"When everyone's a superhero, no-one will be."


> Jobs was taught the RDF by his professor.

I didn't know that story. Who was the professor? What was the technique?

> This means it can be learned.

Oh definitely. Magicians learn all their tricks, and they are very useful for anyone performing in front of a crowd.

> The reader is encouraged to learn how to perform the field, so that they can defend themselves, and others, from its effects.

Which reader is that?


Robert Friedland, apparently. And he wasn't a professor, but rather a classmate (and Reed's student-body president).

https://en.wikipedia.org/wiki/Reality_distortion_field


We cared when we tried using System 7 Macs to control industrial machines, as I did.

If you treated 16- and 32-bit Windows nice -- typically running one program over long time periods -- they were quite stable on the plant floor.


We have two different theories here. We're smart people, right? We should be able to figure out whether we just perceive software quality to be worse or it really is.

So how do we measure this in some valid manner?


> I remember getting plenty of sad Macs under System 6 and 7

When System 7 or 8 crashed, it crashed hard: complete system lockup. And it crashed rather often. There was no recoverable, progressive crash like on Windows.


When you say the world doesn't care, you might be nearly right from a consumer perspective, but that's not really their target market. Outside of cool IT and design companies, almost everyone's business machine is running Windows, a lot of servers are running Windows too, and SQL Server and Visual Studio are at an all-time high in prominence for business software development.


As someone who regularly uses OS X and Windows, I'd say OS X is as reliable as Windows 7, 8.1 or 10, or even more so, particularly with regards to crashes. Apple did have a really annoying wifi bug that has been fixed, but that did take a while.

iOS is also in pretty good shape, but almost every time Apple releases a new version it's buggy. By now iOS 9 is a very stable and robust OS, but it needed work up front.

The biggest places where Apple is having trouble are with new products. Watch OS was slow, buggy and limited at release. It's pretty much at a 1.0 state only now. The new Apple TV is by far the best version of the Apple TV, but the OS is buggy and still needs refinement.

My take on this is two-fold:

1) Apple is doing more and more products, causing issues with the newer ones. They haven't been putting in the QA work on newer software. OS X is old and mature software, so it's pretty stable, but something like Watch OS is very new.

2) Apple's insistence on yearly OS upgrades is causing there to be a lot of 1.0 roughness each year. Just slowing down to a two-year cycle would allow for a lot more time to refine and more time where the OS has been patched and is the latest OS. iOS 10 will be announced in a few months, but iOS 9 still has at least one major point update to go.


This is more or less my take. Other than some occasional high-profile bugs, I'm not sure Apple has a declining-quality problem; their software still seems to be well above industry norms for quality (and either in line with or better than Microsoft's, largely because of their strategy of abandoning backwards compatibility).

Apple, however, is held to a much higher standard than they've ever met, and much higher again than industry norms. You raise very valid points that they're shipping shit before it's ready. I don't think yearly upgrades are terribly compelling anymore, or really needed; I want a computer that works really well and does the things I want it to do, with a minimum amount of fuss.


Apple's R&D budget seems to be mostly focused on hardware. It's hard to say definitively, given how secretive they are.

Microsoft invests a massive amount of money into MSR, and creates tools out of the most useful results. The Static Driver Verifier depends on Z3, an SMT solver developed at MSR. Other verification tooling like SAL (C/C++ annotations to assert contracts for functions) has a similar history.
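To give a flavor of the kind of question such verification tooling answers, here is a toy Python sketch. It is not SDV and not Z3; `copy_packet` and its off-by-one bug are invented for illustration. Where an SMT solver would symbolically prove that no input can cause an out-of-bounds write, this toy simply brute-forces a small input domain asking the same existential question: "does a crashing input exist?"

```python
BUF_LEN = 8

def copy_packet(payload_len):
    """Hypothetical driver routine; returns False if any write would land out of bounds."""
    buf = [0] * BUF_LEN
    for i in range(payload_len):
        # Off-by-one bug: the guard should be `i < BUF_LEN`.
        if i <= BUF_LEN:
            if i >= len(buf):
                return False  # an out-of-bounds write was about to happen
            buf[i] = 0xFF
    return True

def find_violation(max_len=64):
    """Exhaustively search the (small) input space for a contract violation."""
    for n in range(max_len + 1):
        if not copy_packet(n):
            return n  # smallest payload length that triggers the overflow
    return None
```

A real verifier answers this existential question symbolically rather than by enumeration, which is what lets it scale to real input domains; the point here is only that "can this code crash for some input?" is a mechanically checkable property.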


They should probably consider using that "static driver verifier" because Surface Pro 4 is crashy as hell.


Yeah, but that's because Skylake is buggy as hell. It's a bit of a bummer. MS cannot publicly blame Intel for the raft of Skylake bugs, but it's first-gen hardware, first-gen software, and first-gen firmware.

For what it's worth, my Surface Book (obviously a product they care a lot about) has basically gotten to a crash rate equal to my Mac's, which is maybe 1 actual problem every 2 weeks. I suspect the Surface Pro 4 will get that love next as the other Skylake power issues sort out.


If Microsoft was serious about enhancing the real and perceived quality of the brand they are establishing with the Surface Pro line (which seems to be very nice, based on my mother's earlier-generation model), they should have tested Skylake in the Surface Pro 4 prototypes, detected the problem, and delayed the launch. Intel's problem became theirs when they shipped it in their hardware.


What's interesting about this is that it appears they did! Very very early SB models exist and tested MUCH better than the first run of production hardware. You can find a lot of early reviews that praise the battery life, etc.

Then the first wave of consumer-facing SBs went out and it was a total disaster. This might be something that Microsoft can fix, because they have dramatically improved the product experience and been very receptive to trading defective hardware. Mine was traded up the instant they looked at it, with apologies and a customer care call.

> they should have tested Skylake in the Surface Pro 4 prototypes, detected the problem, and delayed the launch. Intel's problem became theirs when they shipped it in their hardware.

Did you say the same thing when Apple shipped a massive defect rate on their first-gen retina MacBooks? Because they DID test those, and they still ended up shipping a truly phenomenal number of lemon screens with huge defect and failure rates.

Oh, and Apple refused to replace all but the most egregious failures. I still have a machine with such significant ghosting that it can be difficult to use. Ironically, DaringFireball is actually unreadable. I keep this machine around because it was part of a very special segment of my life, but also because I like showing people, "Yes even Apple's legendary hardware is rife with first gen bugs, and your iPhones and hypothetical new macbooks are no different."


Yep. That's likely why there are no Skylake MacBooks yet.


The fact that Skylake was totally out of cycle with Apple's usual release efforts might have something to do with it. MS was in an unusual position.


As an engineer, this is the correct response.


Well as an engineer, I think we all know that manufacturing defects can creep in even after prototypes pass.

It's also the case that it's incredibly hard to test things like battery life, wifi connectivity and the effects of heavy processor workloads in a systematic way. You hope that your vendors do a good job (and I bet Microsoft's contract with Intel involves penalties for these major defects to try and incentivize Intel to handle these).

Look at the first iPhone 4. How did they miss something as simple as skin contact causing significant antenna interference? Most of us hit it immediately. The answer: hardware in the real world is really hard.


Just a little bit of trivia which I found interesting -- Apple actually did not miss the antenna interference problem. They knew that it was an issue, but I guess they figured it was an acceptable tradeoff for the design they wanted.

http://www.bloomberg.com/news/articles/2010-07-16/jobs-says-...

I get the impression that Jobs did not think it would be received as negatively as it was.


I suspect if they knew how badly it hurt my reception (I totally lost signal and it took a long time to get back), they would be less surprised at the reaction.


That's not what I'd call a bug that "crept in". The Surface Book I bought crashed unprovoked _several times a day_. That's deliberately shipping a completely faulty product that any self-respecting customer will take right back to the store. Which I did.


You should take your Mac to the store and have them take a look. Or at least run hardware diagnostics. I've been using Macs for well over a decade and in that time I have maybe seen 4-5 crashes total, across multiple laptops and desktops. A crash every two weeks is not a normal situation in OS X.


Depends on what you are doing.

Just yesterday, OS X was convinced that I had an external monitor. I did, but that was 2 hours back, when I was at the office. So I got to the preferences screen and... the kernel crashed.


Do you have some kind of third-party display or window manager installed? I regularly (as in "every day") have my Macbook hooked up to dual monitors and I've never had the kernel crash due to a disconnect or a change on the preferences screen. Does that happen regularly for you or was that just a one-off occurrence?


I've got an otherwise unexceptional LG monitor that with one specific generation of MacBook causes all sorts of problems. My windows machines and newer macbooks don't have this problem, and connect to the monitor fine.

So it can be hardware issues. Often subtle ones.


Connected via HDMI? And it goes into YPbPr mode because OS X thinks it's a TV? And there's no override.


Some combination of Windows on VMware, startup utilities, a corporate virus/malware protection tool, and external monitors is my bane.

I've given up keeping VMware open, and I experience very, very few issues, even on an older OS X release (again, due to corp IT).


Lucky you, VMware is getting rid of Fusion anyway.


Where did you read that?



If you use virtualization, it is.


I don't know if it's fair to compare Microsoft's efforts with Windows and Apple's with OS X. Windows runs on hardware from a variety of different vendors; OS X is pretty much commercially locked down to Apple's own hardware (unless I'm missing something?). It's actually a shame that it isn't damn near flawless.


OS X runs pretty well on non-Apple-certified hardware. I have a Hackintosh I've been running on desktop for a few years. There's a large community around these things and users' experiences are mostly positive. Of course, this is due to the dedicated efforts of a small group of hackers.

My hardware configuration matches no Apple product.


Microsoft indeed improved quality a lot, though Windows 7 still occasionally and inexplicably grinds to a halt and sometimes outright hangs on my desktop (one can blame my Wacom tablet, but that contradicts the thesis of driver verification working wonders), and Windows 8 periodically renders my laptop unusable, using near-100% of the disk bandwidth (I tried some five tweaks recommended on the web for this problem; nothing helped).

But that is not nearly as bad compared to having to rely on software developed the way they do in the aerospace business! From http://blogs.law.harvard.edu/philg/2010/02/09/public-tv-figu...:

> Who crashed Colgan 3407? Actually the autopilot did. … The airplane had all of the information necessary to prevent this crash. The airspeed was available in digital form. The power setting was available in digital form. The status of the landing gear was available in digital form. …

> How come the autopilot software on this $27 million airplane wasn’t smart enough to fly basically sensible attitudes and airspeeds? Partly because FAA certification requirements make it prohibitively expensive to develop software or electronics that go into certified aircraft. It can literally cost $1 million to make a minor change. Sometimes the government protecting us from small risks exposes us to much bigger ones.

(I agree that next to Apple's cash hoard, $1M does not sound like a lot; however, they also have much more software to tend to.) Overall, it seems that today you have to trade correctness for features and development time, and that cost cannot be borne by a single market participant unless the market is regulated so that all competitors have to bear it too, in which case the user is going to get way, way less functionality. I believe the cost of bulletproof correctness might drop significantly enough at some point to change the game - and I really, really hope formal methods will take off big time, without being sure they can - but it doesn't seem like we're there yet. (This is my opinion, not data, of course; the one thing I think $millions buy that works very well, without costing too much time or features, is automated testing.)
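As a concrete (and cheap) example of that automated-testing point, here is a minimal randomized property test in Python. The function under test and the property are purely illustrative, not from any real codebase: instead of hand-picked cases, you state an invariant ("merging two sorted lists equals sorting their concatenation") and throw many random inputs at it.

```python
import random

def merge_sorted(a, b):
    """Function under test: merge two already-sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out + a[i:] + b[j:]

def check_property(trials=1000, seed=0):
    """Property: for random sorted inputs, the merge equals sorted(a + b)."""
    rng = random.Random(seed)  # seeded, so failures are reproducible
    for _ in range(trials):
        a = sorted(rng.randrange(100) for _ in range(rng.randrange(10)))
        b = sorted(rng.randrange(100) for _ in range(rng.randrange(10)))
        assert merge_sorted(a, b) == sorted(a + b)
    return True
```

Libraries like QuickCheck industrialize this idea (input generation, shrinking failing cases), but even this hand-rolled version catches whole classes of bugs that a fixed test suite misses, at a tiny fraction of what formal verification costs.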


Tim Cook doesn't seem to care about anything other than the supply chain. He came from being a COO, so that's all he knows. Consider how many different models of iPad there are.[1] It's like he's bragging. "Look how good we are at managing suppliers. Look! LOOK!" Meanwhile, the watch, Apple Music, and everything else that reached a v1 release under his tenure so far has been buggy and broken. But hey, at least they have a "gold" Macbook now!

[1] http://www.apple.com/ipad/compare/


>>Apple has the resources to approach software like aerospace. Formal interface specs, heavy Q/A, tough regression tests, formal methods. It's expensive, but Apple ships so many copies that the cost per unit is insignificant.

They don't have the time, however.


They have the money to buy more resources and a lot of those tasks can be done in parallel.


There's this really great book you should read about myths and man-months


That's true but only if there's only one team working on software at Apple. There's no reason to assume that the iTunes team needs to be the Photos team; while there might be certain dependencies on something like iCloud or OS X, the areas everyone complains about tend to be clearly contained within a single app and there are well-practiced ways to deal with things like API changes.


9 women combined can't make a baby in 1 month


That just means involving more people _right now_ won't speed things up. But you can "make 9 babies in 9 months" if you plan early and involve enough people, i.e. know how many QA people you need and hire them upfront.


Yeah, and have them twiddling their thumbs while there's no product to test?


9 women can make a baby every month for 9 months


with an initial delay of 8 months.


Or rather, the marginal return isn't worth the investment.


>>> Apple has total control over the hardware, total control over third party developers, and $203 billion in cash. What are they doing wrong?

Nothing. And that is the problem. When you are, generally speaking, doing everything well, there is little room for massive improvement. The days of perpetually exponential improvement, and the resulting growth, are over. Apple is not a startup. Like Ford, Sony and GE, they now have to settle into the grind of incremental improvements for reasonable returns.

Or they can put their markers down on ever more grandiose schemes. They could branch into transportation by starting an airline, or a robot taxi service, but I doubt shareholders will tolerate such outflows for long. If doing so causes the neglect of the core business (iPhone) shareholders will revolt.


Apple is losing market share to Android. The gravy train may not go on forever. Apple today is in the position of GM in GM's glory days, the wonderful days of powerful V8 engines, HydraMatic transmissions, tailfins, and the "longer, lower, wider" wide-track Pontiac. GM didn't think they needed massive improvement in quality. They were wrong.

Watch this commercial for the 1967 Pontiac GTO.[1] Looks a little like one of Apple's teaser ads from the Jobs era, doesn't it?

[1] https://www.youtube.com/watch?v=tzF_CdKLTP0


I think that the Apple/Google/Microsoft/IBM quadrifecta perfectly illustrates one of the lesser-known points of The Innovator's Dilemma: customers care about different values in different points of the product's lifecycle, and that leads to differing companies becoming dominant.

When a new product category is introduced, customers primarily care about ease of use and relevance to their lives. Radical vertical integration is usually necessary to achieve this, because any friction in the product's interface is on top of the friction of getting consumers to use a product that they're completely unfamiliar with. Hence, the market is totally dominated by one company that makes everything from the chips to the hardware to the OS to the apps. This is Apple. This is the Apple II in 1976, the Mac in 1985, the iPhone in 2007, the iPad in 2010, and now the Apple Watch in 2015. (It's also Netscape in 1993, Yahoo in 1998, Amazon in the early 2000s, and AWS today.)

As the market matures, more competitors enter. An ecosystem of third-party apps develops. Hardware supplier prices drop as more hardware manufacturers develop expertise and enter the market. Customers start to value compatibility, options, customizability, and adherence to standards over raw ease of use. This is Google now and was Microsoft in the 80s & 90s. This is MS-DOS in 1981, and Windows 3.1 in 1991, and IE5 in 1999, Google Search in 2000, Chrome in 2008, and Android in 2011-present.

Eventually the technology moves up-market. Customers start to care more about security, stability, reliability, and performance. That's Microsoft now and IBM in the 70s & 80s. That's mainframes in the 80s, and MS Office and Win7 now. At this point, the technology is already being disrupted, but the disruptive technology isn't reliable enough for a segment of the market.

Finally, you get to the point where customers care about brand and compatibility with existing installations. This is maintenance work, where the company becomes a consulting outfit to keep all the technologies they invented a generation ago running. That's IBM now.


This seems like a good description of the product maturity process in B2B markets. But consumers are motivated by different things (once products are good enough): chief among them is psychological/social value/perception, mostly created via marketing or by being first, and in general pretty hard to disrupt.


I think that the emotional-value aspect slows down the disruption cycle in consumer markets, but it doesn't stop it.

Emotions, after all, are just the brain's way of processing lots and lots of information that can't be compared on a rational basis. Part of that information is "What do my friends use?", part of it is "How does it fit into my life?", and part of it is "What does it say about me as a person and what I value?"

But all of those factors are still subject to reality: if a new product comes out that fits into your life better, eventually somebody's going to break ranks and adopt it, and they'll be able to explain to their friends, authentically, why they believe it's better. All of the catalysts I mentioned in the original post reflect changes in the ecosystem: the shift from ease-of-use to features & compatibility reflects more things you can do with the product, the shift from features to reliability reflects using the product in more consequential situations, and the shift from reliability to branding & maintenance reflects how you're perceived for choosing the product.

The Tipping Point describes the mechanism for this in consumer markets well. Product adoption starts off with Mavens, people who like trying & evaluating new technology on its own merits. It spreads through Connectors, who have a wide circle of friends and enjoy telling them about interesting new things that might benefit their life. Finally, the holdouts are convinced by Salespersons who explain, point-by-point the benefits and answer objections.


>> I think that the emotional-value aspect slows down the disruption cycle in consumer markets, but it doesn't stop it.

That may be true.

But in the context of iOS vs Android:

1. Most features come from apps. Both have strong app ecosystems, and iOS probably has the stronger one because it serves wealthier people. To a certain extent that applies to reliability too.

2. Some features are native to the OS. So you see competition, and Android is certainly faster there, via the rooting community, competition between OEMs, etc. But Apple usually responds, at least when things appeal to the mainstream and don't negate their strategy. As for the question of reliability, I'm not sure Android is viewed as more reliable (think security vulnerabilities like Stagefright). But yes, maybe Google can lead Apple here, because they seem stronger technologically. The only question is whether they can do this permanently, or whether it will just buy them some time, and whether that would be enough.

Also, let's not forget the network effect embedded in iOS via iMessage (which many users say prevents them from moving to Android).

>> the shift from reliability to branding & maintenance reflects how you're perceived for choosing the product.

I'm not sure that's true. It all depends on how psychologically important that product is to you, versus how important the features/reliability differential is.


Thanks, that was a really interesting read. However, like my sibling states, consumer markets are subject to the whims of marketing, which may distort this somewhat.


> Apple is losing market share to Android

Who cares? They are siphoning profits from the market and selling more phones than they ever have before. They have peripheral businesses, e.g. Apple Pay and the App Store, that are doing very well, and I am sure more will come in the future. They are never going to be the company that goes for market share above all else.

> Apple today is in the position of GM in GM's glory days

I don't think so. Apple seems to be quite happy just to acquihire their way out of any innovation slump. There are a ridiculous number of companies, especially in the VR space, that they have acquired that we have seen no evidence of in their products.

Pretty exciting times ahead I imagine.


They're losing market share in the phone market as a whole, but in the high-end, premium market they're doing quite well. And that's the market where the profit is, not in the low-end, free-on-contract devices.


Fantastic commercial!

"The Great One"

I can almost see Don Draper standing in the shadows behind the car.


2 possible causes for this:

1) Apple still develops the OS using waterfall over the year. Entire sweeping changes are made only at x.0 releases that trickle down to teams that have to work around the instability all year long and there's no other approved way to get in significant changes.

2) They keep adding more apps to the core OS image that can only be updated with a full software update now. This makes delivery of quick fix updates near impossible since they have to go through the OS release management teams.

It certainly sells better to have a huge list of changes at WWDC that then become reasons to upgrade, but software delivery has moved on from waterfall, so in that respect Apple's OS teams are behind.


>They keep adding more apps to the core OS image that can only be updated with a full software update now.

This is a huge drawback for Safari, both on desktop and mobile.


#2 is just not true. They deliver point releases that add new core functionality all the time. For example, Photos for OS X was delivered in 10.10.3. A point release that came mid-year and delivered a huge amount of new functionality, including photo streams shared between iOS and OS X.

They also deliver many bug fix releases throughout the year, on both platforms. The 9.x releases have seen them add support for WatchOS 2, and many other things.

It's a huge ecosystem, and many teams, that all have to line up their product release schedules, and now 4 operating systems (OS X, iOS, tvOS, WatchOS) that have features that all work together. This is not trivial.


I would be skeptical about agile on something like OS development which is on a different scale from your average software project.

Not saying it won't work, but I would like to see some comparisons of OS level projects that have gone agile and compare it to the waterfall approach.


Well, Linux is run like that, right? Releases come very often.

Not to mention Facebook itself... We all know FB has fallen flat many times, but it's never been busted for weeks at a time, to my knowledge.


One possible answer is that near-perfection (the perception of "Jobs's" products), including recognition (widespread adoption), is attainable at any given moment in time, but typically unsustainable long-term, given human and technological constraints...

It could be entropy, as some have suggested, or simply the difficulty of maintaining a level of quality one has become associated with producing...

Maintaining the (high) level of quality one has reached is difficult enough...

Incremental gains on a level attained become much more difficult...opportunities become infinitesimally smaller...


>Apple has the resources to approach software like aerospace. Formal interface specs, heavy Q/A, tough regression tests, formal methods.

While it's true that they currently have these resources, I would think that approaching software that way would directly go against their product roadmap schedule, the device consumption that schedule drives, and thus their bottom line ($203B in cash).



