
Nah, I think it's a perception problem.

As someone whose starry-eyed Mac obsession predated Windows 95 - Apple's software has always been buggy. It was buggy under Sculley, it was buggy under Amelio, and it was buggy under Jobs. I remember getting plenty of sad Macs under System 6 and 7, and early versions of OS X weren't any better.

We just didn't care because Steve Jobs was really good at distracting us with promises about how great things were going to be, really soon now.

The comparison with Microsoft is instructive. Microsoft software was even buggier than Apple's during their period of greatest dominance. Win95/Win98/WinME would crash all the time, and were an open barn door for security. Early versions of IE were pieces of shit. Even later versions of IE (6-9) were pieces of shit. Microsoft finally got a handle on security & software quality just as the world ceased to care about them.

Apple's been driving change in the computer industry since the iPhone was introduced in 2007. New products are always buggy - the amount of work involved in building up a product category from scratch is massive, and you don't know how they'll be received by the market, so there're frantic changes and dirty hacks needed to adapt on the fly, and they often invalidate whole architectural assumptions. It's just that most of the time, this work goes on when nobody's paying attention, and so by the time people notice you, you've had a chance to iron out a lot of the kinks. Apple is in the unenviable position of trying to introduce new product categories while the whole world is looking.

The Apple Watch is buggy as hell, but I still find it useful, and pretty cool.



I think you've touched on something key: people's annoyance at bugs is often outweighed by the new features they get. In other words, if the "new functionality" is perceived to outweigh the costs of the lack of reliability, the product will be deemed "less buggy" because people "understand" why there are issues.

This made total sense in some of Apple's biggest products: OS X and iPhone. When OS X first came out it couldn't even burn CDs, but we all "understood" the magnitude of the project and thus gave it some slack. Similarly, the iPhone lacked a lot and was slow, but it was such a revolution that we let it slide -- in fact we let the rest of the products slide.

The problem today, I think, is that these decisions are being made for reasons that users don't deem "worthy". Introducing some new music service is not a good enough reason to break my iTunes. The fact that a new watch was released is not considered a good enough reason to let other platforms languish. We "get" why less attention is being paid to other products, but unlike with the phone, it's not deemed a good trade-off.

In other words, I don't think Jobs was distracting us with promises, but with actual shiny things that made the bugginess worthwhile.


As a long-time Apple user and former employee, this is exactly how I feel about the current situation. I still think that my Mac/Apple devices are more solid than my Android/PC devices, but this is exactly what I'm noticing more regularly.

I forgive most faults because 99% of the time, everything is awesome. On Windows, that same forgiveness manifests itself as me not using my Windows machines as much as my Macs. I still love to use my PCs, but not for anything that I need to rely on the majority of the time.

Now, though, Apple is making changes to things (iPhoto/Aperture were a really great example) where it seems like the change is just to bring parity of some sort to OS X and iOS rather than introducing new features. iPhoto was buggy as hell when they added Faces and Places to it, but I totally forgave that because 99% of the time it was making my life way easier than it was before by detecting faces properly. If it crashes every now and then, it at least saved the data, so I was still better off than I was before the update. I still like Final Cut X (I know, I know... I'm an outlier), but convincing me that a switch like iPhoto/Aperture -> Photos is worthwhile is much harder since there's nothing to distract me away from those issues and I've somehow managed to actively lose features that they convinced me were necessities in the past.

I hope this is not an indicator of things to come. One thing that gives me some hope is that they've gone back to alternating between feature updates and stability updates. Leopard was cool, but Snow Leopard was incredible to me. If that pace comes back, I'll be happy again. Until then, Apple needs to get their software game back in line with the rest of the company.


If you think that it was bad when they removed features from iPhoto, then I'd hate to think what you thought when they removed features from Numbers!


Oh yes... That was a bad move, I think. Luckily, I rarely have to use Numbers so I didn't really care. It just annoyed me that they removed some of the features that I actually did use when I needed to use Numbers. If they added the features back as quickly as they did with other apps, I wouldn't care, but they didn't. :(


I (Apple) will one-up you with Final Cut Pro X http://arstechnica.com/apple/2012/01/more-fcpx-fallout-top-r...


I love the new FCP. As a long-time user of FC7, I'm OK with losing out on some of these features as long as they add them back over time, and they've done that, for the most part (at least for my uses). The old FCP really needed a facelift and was trapped in such an old mindset, from when video was still mainly stored on tape and software needed to work like real-life video editing tools. FCP X is so fast for me and such a treat to use for 99% of things that I can deal with having to jump back to FCP 7 every once in a while. As long as Apple doesn't somehow prevent me from using FCP 7, I don't care and love the new direction of FCP X.


Reminds me of a discussion on another HN article a few weeks back where someone proudly stated that if a feature customers used didn't fit in with the company's strategic direction they'd drop it, and tough luck for the customer.

Apple seem to have the same mentality. They used to get away with it, but mostly because they replaced what they dropped with something better. Now they just seem to drop features entirely. That's not a good way to go. As much as I despise Steve Jobs, he never let the quality of a product drop to the degree that big customers (or even smaller customers) left Apple without a major fight to keep them.

It's looking like Apple's obsession with making great, quality products is taking a back seat. I think they probably need to worry a little less about their schedule, and more about polish and feature completeness.

Rather remarkable that I'm actually saying this, to be honest! Apple would be the last people I would have guessed needed this advice...


I think that's an awesome sentiment if you're talking about something like an Arduino, where part of the experience is working around its quirks and limitations. If you've bought a device expecting it to basically be a transparent window into the internet (or your documents, etc.), having to deal with its quirks and limitations can put a really bad taste in your mouth. Especially if you paid top dollar for it.


"I think you've touched on something key: people's acceptance of bugs is often outweighed by the new features they get. In other words, if the "new functionality" is perceived to outweigh the costs of the lack of reliability, the product will be deemed "less buggy" because people "understand" why there are issues."

Right - which is why we have all of the Snow Leopard nostalgia: because none of the newer releases have given us anything substantive that we really needed to justify the hassle and the bugs.

I am trying to think of something - anything - that compels me to upgrade from SL on my Mac Pro, and all I can think of is that nifty take-a-picture-of-your-signature feature in the Preview app that you can then insert into PDF documents.

OK, and maybe USB 3?

That's all I can think of.


AirPlay; much better multi-display support; tags in Finder; Spotlight enhancements. More than anything, the iCloud/iOS Continuity features were big if you had an iPhone or iPad: everything is just much easier to keep in sync.

I'm a Safari user (better battery usage for the # of tabs I have open), and it too has improved with El Capitan, though that's irrelevant for Chrome/FF users.


OK, AirPlay I guess - although I've never used it, I do see people using it to good effect.

Worth mentioning that AirPlay is just userland software - nothing special, and no reason it couldn't have been added to SL.

I don't know about multi display, though - I've been under the impression that that is broken in new and fascinating ways with every single release...


Yeah absolutely. Snow Leopard's multi-display was great. As was Leopard's, Tiger's, and Panther's.

Then Apple broke it massively in Lion, and only finally resolved most of the (severe, productivity-destroying) issues with Mavericks.


Handoff is a really useful feature (when it works).

Also SL maimed Exposé (that weird non-proportional grid view), which was reverted to the Leopard style in Mission Control (of which Mavericks/Yosemite had the best implementation, and they've now broken its utility in El Cap thanks to hiding thumbnails by default. FFS.)

But apart from that... I think I preferred the Apple apps back in 2009-or-so.

To be honest, I think the latest Apple release cycles have been more about "remove a feature so that we can add it back in and sell it to our users again". Think multi-monitor support, something that worked perfectly in SL and earlier, and then broke fantastically with the full-screen apps in... Lion? ML? One of the two.


True, Apple software has always been buggy; Apple calls them "undocumented features".

Apple always makes up for bugs with newer devices whose faster CPUs and GPUs make the OS code run faster. That means buying a new Apple device to get better performance. Older Apple devices are eventually left out of updates, and if they do update to a newer OS version, it runs slower.

Apple is driven by an upgrade model to buy a new Apple device every three years or so. In the PC world, Windows 7 can still run on old Pentium 4 systems, and if I am not mistaken some of them can upgrade to Windows 10 - the 32-bit version, but it still works. For example, I used to have a MacBook that only ran up to 10.7; 10.8 needed a newer Intel CPU to install. Anyone with an iPhone 4 is going to find the latest iOS slow as well.

It is in Apple's business model to sell customers a new device every few years or so and phase out old Apple devices.

Apple doesn't care if their software isn't the best quality as long as it is easy to use and will keep people buying new Apple devices to run things faster.

I myself like GNU/Linux better than OS X, because it can run on older PC systems, runs quite fast, and has a good quality to it. GNU/Linux is virtually unknown to the average consumer, and when people get tired of Microsoft they usually just buy an Apple device. Apple devices are easier to maintain and use. You've even got toddlers using iPads; that is how easy they are to learn to use.

Apple has saved up billions just in case they have problems. Apple has done well financially in an uncertain economy where other companies are struggling.

Only Alphabet, Google's parent company, seems to be doing better for some reason. Google's Android needs better quality as well, and since Oracle sued them over the Java API they have to change the way the OS works. Their web services seem to earn a lot of money, and Google's AI is very advanced.


> Apple is driven by an upgrade model to buy a new Apple device every three years or so.

I disagree.

My wife's iMac is 6 1/2 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iMac soon.

My iPad 2 is almost 5 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iPad soon.

It is precisely because our older Apple hardware is still working well, and Apple still supports us with the latest updates to that older hardware, that my family is not only sticking with Apple, but we've recently invested in new iPhones.

Apple has earned our trust.


Yeah, OS X and the iPhone were new products, not just upgrades of what came before (as you say, OS X had significant limitations compared to OS 9). You didn't have to upgrade right away, but if you did, you got an entirely new experience.

Compare this to today's Apple, where upgrades add "hundreds of features" but feel mostly the same (except everything runs a bit more slowly). There's no coherent vision of what the future of the software should be like.


Wow, I couldn't put my finger on it, but this is why I am starting to hate Apple. I spend a lot of time on the computer, so I like a lot of things Apple does, mostly the interface and the UI. To clarify what UI means to me: it is how simple and intuitive it is to use the device and to give it the instructions to do what I want. I am willing to forgive a lot because this is good.

Apple has always been a fairly closed system, and it didn't bother me beyond not having the features I wanted. In El Capitan, it was different. Things didn't work well and Apple took over my whole system. With SIP (System Integrity Protection) I had no control. It would seem to turn protection back on after being disabled, and it takes a nontrivial amount of time to turn it off because you have to reboot the entire system into recovery mode, wait for it to connect to the internet and download a bunch of Apple shit, then select a language preference, type a command into bash, and reboot.

Deleting apps is difficult, changing settings is difficult, having Siri take up 10% of my iPhone is annoying, removing apps destabilizes the system, and installing my system from Time Machine reinstalls their system and settings and overrides mine.

I disabled most of Apple's applications and processes, and the system is fairly stable, although I think I went too far with disabling Notification Center, but your point is correct.

tl;dr users are willing to accept a lot for revolutionary changes. Evolutionary changes with only marginal improvements are not going to make me forget that they unpredictably disallow me from using sudo and are fucking up all my devices doing things I don't want them doing in the first place.


A lot of people in this thread seem to think this is all about Apple not adding enough revolutionary features or something. But consider this alternate explanation: with years of experience comes more sophisticated judgement. What used to seem good enough now seems to have obvious flaws, even if it is the same as it was before. Lack of control is an example: beginners often don't notice or care much, especially if it feels simpler, but as your needs deepen it becomes more important. Being able to set things up and then not keep touching them is one of those tastes that develop with experience.


That is a good point as well. I definitely agree with it. The one thing I would add is that I repeatedly get update notifications on my iPhone. Due to the increasing lockdown of features, I am legitimately afraid to update, as the:

* provides security update

* increases iTunes performance

type descriptors do not provide enough information about how they will fundamentally change my system. Most notably, when I updated my iPhone I found out I had loaded some horribly inefficient talking pseudo-AI that was not neutral, but a straight-up negative feature consuming system resources.

I think you are really correct though, as you gain more experience and skill with technology you have more needs and better judgement. You can evaluate things better because you are aware of what is possible. The biggest problem isn't that they make changes, it is that those changes are not predictable so they become difficult to mitigate.


I'm also concerned about updates. For instance, I'm currently having to route all my iPad web traffic via Charles Proxy so that any instances of style="overflow:hidden;" in a body tag are cleared out.

Why? Because Apple shipped iOS 9.2 with a bug that causes the viewport to zoom incorrectly on these web pages. This affects LibreOffice's OpenGrok, which I browsed regularly on my iPad.

They still haven't fixed this, and it's a major regression. iOS updates are infrequent. Consequently, I'm seriously questioning what their updates actually do to my iPad and iPhone.
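
For what it's worth, the same rewrite can be scripted instead of clicked together in Charles - here's a rough sketch as a mitmproxy addon (the script name and regex are mine, not what I actually run):

    # strip_overflow.py - drop overflow:hidden from body style attributes
    # Run with: mitmdump -s strip_overflow.py
    import re

    from mitmproxy import http

    # Matches overflow:hidden inside a <body ... style="..."> attribute
    BODY_STYLE = re.compile(
        r'(<body[^>]*style="[^"]*)overflow:\s*hidden;?([^"]*")',
        re.IGNORECASE)

    def response(flow: http.HTTPFlow) -> None:
        # Only touch HTML responses; leave everything else alone
        if "text/html" in flow.response.headers.get("content-type", ""):
            flow.response.text = BODY_STYLE.sub(r"\1\2", flow.response.text)

You'd point the iPad's Wi-Fi HTTP proxy at the machine running mitmdump, same as with Charles.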


I wouldn't hold my breath. The iOS Mail app cannot negotiate any TLS version above 1.0 (for IMAP, possibly SMTP too): it sends a protocol version of 1.0 in the ClientHello message, even though that same message contains TLS 1.2 ciphers (AES-GCM), so it obviously supports TLS 1.2.

I reported it in October, and Apple's security team replied that they're aware of it, but it's still not fixed two releases later, even though they probably need to fix about one line of code (the advertised version flag).
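
If you want to confirm the ceiling really is on the client side, a few lines of Python will show what a modern client negotiates against the same server (the hostname below is a placeholder - point it at your own IMAP server):

    # tls_check.py - report the TLS version a mail server will negotiate
    import socket
    import ssl

    HOST = "imap.example.com"  # placeholder; use your own IMAP server

    context = ssl.create_default_context()
    with socket.create_connection((HOST, 993)) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            # A current client prints e.g. 'TLSv1.2' here; iOS Mail stays
            # on 1.0 because its ClientHello advertises protocol 1.0.
            print(tls.version())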


They have actually fixed it - if you get bitten by it then you can reference rdar://problem/22242515

The WebKit bug is here:

https://bugs.webkit.org/show_bug.cgi?format=multiple&id=1528...

The patch to fix it is here:

https://bugs.webkit.org/attachment.cgi?id=268394&action=diff

The workaround, FWIW (thanks Simon!), is to add "shrink-to-fit=no" to the meta viewport tag.

For me, it was too much effort to get OpenGrok fixed, so I just did a rewrite rule in Charles Proxy that gets rid of the style attribute.


I agree with your summation. To add to this, there are many things going on under the hood that none of us asked for that are taking up system resources, dialing home and draining battery life.

Sometime, try this yourself:

    # opensnoop uses DTrace to print each file open() as it happens
    sudo opensnoop
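
opensnoop ships with OS X and prints the process and path of each file opened while it runs, which makes the background chatter easy to see. Fair warning: on El Capitan, SIP restricts DTrace, so it may refuse to trace protected processes unless SIP is (partially) disabled.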


The original iPhone wasn't slow at all. One of its main selling points was the speed of its menus and apps (I forget what they actually called apps before the App Store).

I think you forget how crazy slow feature phones were. Opening a GPS app and finding your location could take 5-10 minutes in 2007 on a feature phone.


They called them apps too. It's easy to remember with the infamous (quoting from memory): "You can write apps in HTML".


Guess I completely forgot that. Thanks.


I think you're right, but people tend to take for granted the features that are worthwhile and underestimate the difficulty of making anything ever work correctly at all.


>> Apple's software has always been buggy. It was buggy under Sculley, it was buggy under Amelio...

Classic Mac OS was buggy by design - it didn't have multitasking and memory protection, it was single user... Windows had that same problem before 2000 (well, NT 4, but not many people used that).

I use OS X daily and I use Windows 7 daily. I have far fewer issues with OS X, for whatever reason that may be. My computers don't magically reboot or bluescreen nearly as much. It might happen every 6 months at the most, whereas with Windows it probably happens every 2 months.


I agree that OS X is more reliable. I have both a Mac and a PC in my office, and my 5K iMac has never crashed. I had some weird issues with Windows 10 after I upgraded my laptop, where it would hang after an update.


I switched from using Windows for 20 years to OS X (excluding Linux for work-related stuff). This is the first time I've been able to work from a laptop with my productivity level as good as or better than a desktop. The design and usability surpassed what I expected. I haven't noticed any bugs.


I can't imagine having a different brand of computer, but there are lots of OS X bugs that make my teeth grind. The Finder doesn't remember that I only ever want one layout, and it regularly resets to a random alternative setting. There are still progress bars that pop over and can't be hidden. The storage display that shows used disk space as nearly all "Other" is a bug as far as I'm concerned. My AppleTV(8) has stopped incrementing itself, but it's not ideal. And as a genuine question: has anyone not had strange Xcode behaviour or crashes at least once per day? I currently have a slow-motion Simulator that changes views over 5-10ish seconds.


In terms of laptop OSes, OS X is by far the best. It's stable, usable, and 100% desktop-OS focused. I can't stand the touchscreen features that Windows 10 still tries to force on you. Somehow my Windows 10 laptop got put into "tablet mode," and it was pretty unusable. I couldn't access the desktop anymore, it was slow, and it took a while to figure out the issue.

I think it was flipped on after an update, but why would I even want to be able to enable that mode on a laptop without a touchscreen?


If Windows 7 is "magically rebooting and bluescreening" often enough to comment on, then you have a hardware problem.


I don't use Windows very much, because I hate using Windows. But what I will give it is that I haven't seen a blue screen in the past decade for any reason short of bad memory, overheating, dead disk. I suspect that a lot of the Windows image problem is that people are free to buy really cheap hardware and fiddle with things they don't understand.


Apple merging the classic Mac OS with NeXT's OS to make a Unix OS was the best move they could make at the time. It happened during the era of the $10,000 Unix workstation, and Apple made OS X an easier-to-use Unix. It was cheaper to buy a Macintosh than a SPARCstation or some other Unix workstation.

Because Apple made OS X Unix-based, it cut into sales of other Unix companies like SGI, as did GNU/Linux.

But making OSX Unix based solved a lot of problems that Classic MacOS had that they couldn't solve.


"...magically reboot or bluescreen nearly as much." Mine never magically reboot. Ever. (OS X)


My MacBook Pro did, which reminds me that the extended warranty period is ending.

https://www.apple.com/support/macbookpro-videoissues/


I regularly get asked to fix someone's Windows PC; no such problem with people who have Macs. With OS X there are waves of releases (major versions), if I remember correctly: some introduce swaths of new code/features/replacement code, others are more speedup-and-bugfix versions. Maybe I'm wrong.


I go places on my Windows pc that I wouldn't dare take my Mac. I expect it to need repair.

My pc is my beater car, and it needs repair--regularly.

My Mac is the classic car in the garage, that only gets used for work, or safe places.


Speak for yourself. I used an old G4 PowerBook for grad school, and it travelled over 100,000 air miles, and into the various labs where I had to work, and also to far-off Asian countries for holiday. Plus, I didn't have to buy a developer kit: it came free with the machine.


I assumed that the travel referred to dangerous parts of the web.


Actually NT 3.51, which I used for dev and which was great compared to what my colleagues had on plain Windows.


Actually NT 3.1. 3.5 followed, then 3.51, then 4.0, then 5.0 (2000), and then the NT line ended as it was unified with the non-NT line.


Technically the 9x line ended, since Windows XP was NT-based and not 9x-based.


Yes, I suppose so. I didn't want to say the 9x line ended, because it's really the line of DOS-based OSes, and while the NT line is ongoing, it's no longer called NT. 2000 was the last version to mention NT, and it wasn't part of the name itself, just a tagline.


One of the things about OS X is that most of the time it's put on high-quality but non-exceptional hardware. So things like bad RAM, flaky power supplies, and bad GPU drivers are almost never an issue with a stock machine.

Windows, not so much. The only stability issues I've had with windows have been related to poor drivers, almost exclusively from nVidia or ATI/AMD. The equivalent hardware for Apple machines either didn't exist at the time, or was running much less ambitious drivers.

I probably have more issues with my Macbook Air (relating to sleep, hibernate, and wake-up) than I do with my Windows machines these days.


To give you a counter-anecdote, I use MacBooks at work. For the last five years, I've had two hardware failures and gray screens maybe once every four months. In addition to that, I have issues maybe once a month where the machine more or less locks up (from the logs it looks like windowserver/loginwindow has crashed and OS X is trying to do a spindump on them).

Compare that with the _desktop_ Windows 7 machine. It first crashed intermittently (memory failures), but after I changed the motherboard, it has not crashed at all. But then again, I am not using, for example, the most cutting-edge graphic drivers.

I remember quite some crashes during the Windows XP times, but I've since taken a more conservative approach to hardware and drivers.


Jeez, you're talking about something designed and built in 1982-1983 (over 30 years ago) and meant to run on something with 128KB of RAM, no hard drive, and a 400KB floppy disk.

You try fitting all that plus a GUI into those constraints.

What's amazing is that it had the features it had and that it worked at all.


You might like to check out MenuetOS/KolibriOS and the old QNX demo disk.

Both provide GUIs and rudimentary Web browsers. QNX was full POSIX, too, although the demo disk didn't include a terminal.


... and there goes my MacBook (2008?) where Windows 7 runs harder better faster than OS X, and pretty much more reliably than the monster iMac at the office.

Pity that nobody remembers Windows NT 4; it was miles better than System 7. I stopped using the Mac altogether after starting to use it.


> Microsoft finally got a handle on security & software quality just as the world ceased to care about them.

Not quite. They got a handle on security when Linux lit a fire under their ass.

Competition, true honest to market competition, spurs improvement.

The thing about Apple is that they may have competition on hardware, but they have no competition on Software.

If you buy a Mac or an iPhone, you have already thrown money at Apple. But you can easily assemble a PC without Windows and then install Linux on it.

Keep in mind that the latest US warship is not running Windows, but RHEL. That is a very big wake-up call for Microsoft, where before we had seen the likes of Win2k (US ship) and XP (UK submarine) used around the world.


> They got a handle on security when Linux lit a fire under their ass.

I have my doubts that all 0.02% or whatever of PC users who are dedicated to using Linux as their desktop OS influenced Microsoft to do anything but if you have data to show otherwise I'd be interested in it.


I'd assume parent was thinking less of PC users and more of other OS consumers: https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...


And those are the public ones.

BTW, I seem to recall that the London Stock Exchange had a spectacular failure when they went with Windows. The result being that they switched to Linux within a year or two of bringing their brand-new Windows system online.

Ah yes, found it: http://www.computerworld.com/article/2467082/data-center/lon...


Desktop smesktop. For MS the desktop has always been a means to an end. It's always been about "total cost of ownership", where they can claim people need less training before being productive at their new job.

But to manage all those desktops you need servers, and with MS comes the billing per active user, etc.


"if you have data to show otherwise I'd be interested in it." ...

Some data says Linux desktop/laptop share is 1.5% (not counting chromebooks)

https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...

Please note also that Android is Linux and iOS is Darwin, a BSD Unix.

Devices with Linux- and Unix-based kernels are more numerous than Windows-based units.


The fact that iOS and Android have a UNIX-like kernel doesn't count for much if the majority of userspace apps use non-UNIX APIs and tooling.

They could replace the kernel with something else and most devs wouldn't even notice.


> 0.02% or whatever of PC users who are dedicated to using Linux as their desktop OS

Wikipedia: 1.5% [1] NetMarketShare: 1.71% [2] W3Schools: 5.6% [3]

[1] https://en.wikipedia.org/wiki/Usage_share_of_operating_syste... [2] https://www.netmarketshare.com/operating-system-market-share... [3] http://www.w3schools.com/browsers/browsers_os.asp


I don't know about Linux influencing Microsoft on the desktop, but surely it has lit it up on the server. You are 100% trolling by claiming 0.02% of PC folks use Linux. It is at least two orders of magnitude higher than that, and then android... :)


The obvious example would be netbooks. MSFT moved very fast to counter Linux on that front.


Not exactly; OEM PCs (mostly) have an OEM version of Windows on them, which is pretty cheap (I don't know what other incentives Microsoft has, though).

But you can install Linux on it for sure :)


I had to pay $100 extra for the Windows license that came with my Lenovo Ideapad.

And Lenovo wouldn't let me buy their laptop without the Windows license.




Frankly I suspect MS would love to ignore the consumer world, except that then they would lose their beloved "total cost of ownership" argument for doing B2B sales.


OS X and iOS are both way more stable than any pre-OS X Apple OS. I can't believe people forget it.

When people talk about OS X having issues, they often mean some new feature is a little flaky. Classic Mac OS lacked basic stability and security features like preemptive multitasking and memory protection. Classic Mac OS was just like the pre-NT Windows: crash prone.


That's not a high bar though. The only other high profile desktop OS around at the time, Windows NT, was more stable than any pre-OS X Apple OS for a long, long time.


I agree with this sentiment. We can complain all day, but the fact that we have these devices and software that have been made accessible to us by Apple is astounding from a historical perspective. My parents are in awe of the Calendar app, Apple Maps, etc. As they should be!

OS X is also still the best development platform despite its flaws.


Are you missing an /s tag there or am I temporarily dumb?


Neither? Not sure which part you are referring to. I really believe that it's the best dev platform; I've tried all of them. Ubuntu can come close, but it's orders of magnitude less user-friendly. In my opinion and the opinions of folks I've discussed this with.


I would agree. I love my Macs for dev work. Web dev and app dev alike are an absolute treat with a Unix-style backend but a much more polished front-end. Ubuntu is probably the only Linux distro that comes close to giving me that terminal power without rubbing it in my face constantly when I'm just trying to manage my day to day stuff and, even then, it's not even close to OS X. Windows, on the other hand, is only usable for me with third-party software for everything and then I feel like I'm spending just as much time futzing with everything as I am doing anything productive.


Same with me; I basically do 3 things on my computer: develop code, edit pictures, and write stuff. Almost all my files are in the cloud, available through the browser, and the full-fledged terminal with lots of convenience tools just feels great.


I still can't work out how to get Xcode to load up the LibreOffice gbuild projects. When I do, I think I'll probably be a convert. Till then, I guess I remain with vi.


OK, thanks for your polite answers.


Perception problem is the right description. Let us take a devil's advocate view and try to fit the facts into a narrative that inverts the public wisdom.

Microsoft is making over 4 times what it made in its glory days, growing year by year, across a wide range of products and services. Windows and Office account for only half of that, making them a diversified company with plenty of potential for revenue growth. Windows 10 is by far the most successful Windows release ever, with more active installs than OS X (any version). Basically the only place Microsoft is truly failing is phone.

Apple by contrast gets two thirds of their revenue from the iPhone. They have nothing else that even comes close, and nothing that could replace it if iPhone sales start dropping. Mac sales are down, iPad sales are down, and the Apple Watch is a dud. Since 1990 Apple has basically had two hits: iPod and iPhone. I did not mention the iPad because it is just another iPhone model, which you can tell by its sales slumping as iPhone screen sizes moved up. Success for Apple is rare, and most of what they do isn't all that amazing. The Apple TV isn't going anywhere, even after the refresh. The Apple Watch distinguishes itself from other smartwatches only by its price. Basically the only place that Apple is truly succeeding is phone.

Perception is everything. How you choose to look at the facts determines which facts you see. Apple is perceived as strong and microsoft as weak, but the facts give you the option of going either way.

Regardless, Apple has few excuses for any quality issues. They have the resources, and they have had enough time (given that, aside from the watch, everything else is half a decade old or more). Personally, my Mac and iPad anno 2015 have the same number of glitches as my Mac and iPod anno 2005. For me, Apple doesn't seem to be getting worse, but they don't seem to be getting any better either.


> We just didn't care because Steve Jobs was really good at distracting us with promises about how great things were going to be, really soon now.

I tend to characterize the "reality distortion field" as a magician-like talent for focusing an audience's attention on a particular subject.


Jobs was taught the RDF by his professor. This means it can be learned. The reader is encouraged to learn how to perform the field, so that they can defend themselves, and others, from its effects.

As the villain in the Incredibles said:

"When everyone's a superhero, no-one will be."


> Jobs was taught the RDF by his professor.

I didn't know that story. Who was the professor? What was the technique?

> This means it can be learned.

Oh definitely. Magicians learn all their tricks, and they are very useful for anyone performing in front of a crowd.

> The reader is encouraged to learn how to perform the field, so that they can defend themselves, and others, from its effects.

Which reader is that?


Robert Friedland, apparently. And he wasn't a professor, but rather a classmate (and Reed's student-body president).

https://en.wikipedia.org/wiki/Reality_distortion_field


We cared when we tried using System 7 Macs to control industrial machines, as I did.

If you treated 16- and 32-bit Windows nice -- typically running one program over long time periods -- they were quite stable on the plant floor.


We have two different theories here. We're smart people, right? We should be able to figure out if we just perceive software quality to be worse or if it really is.

So how do we measure this in some valid manner?


> I remember getting plenty of sad Macs under System 6 and 7

When System 7/8 crashed, it crashed hard: complete system lockup. And it crashed rather often. No recoverable, progressive crash like Windows.


When you say the world doesn't care, you might be nearly right from a consumer perspective, but that's not really their target market. Outside of cool IT and design companies, almost everyone's business machine is running Windows, a lot of servers are running Windows too, and SQL Server and Visual Studio are at an all-time high in prominence for business software development.



