The void left by Apple (hackernoon.com)
68 points by theaeolist on Nov 27, 2016 | 101 comments


What's more worrying for me is that entire product categories are either left abandoned (Mac Pro) or updated with inferior products (the latest MacBook Pro), and it seems that Jony Ive's form-over-function approach encounters no opposition within the company. We lose good functionality for no good reason other than that products become "simpler" and "more pure".

This in itself would not be a problem for me, if it weren't for the lack of competition. Mac OS is so far ahead in terms of "getting things done" and "just working" that it isn't even funny. And yes, I do know; I also regularly use Windows (10) and Linux (Ubuntu 16 LTS). I feel locked in: both "competing" platforms are huge time-wasters for me. I know some people feel differently; past discussions have shown that most people do fewer things with their computers than I do (for example, if all you need is a web browser, Ubuntu is a great choice).

Now that Apple is abandoning me as a customer (I'm not the mainstream), I feel squeezed: where do I go from here? What's even worse is that the cost of entry is now so high: developing an environment that works, and then convincing developers to embrace it is nearly impossible. And the only companies that have the resources to do it (Microsoft and Google) do it badly.


So I'm sitting here, having just unlocked my Mac by putting my finger on the Touch Bar, reading your post, and wondering who that "pro" is that I keep hearing about in conjunction with Apple over the last 6 months. I'm a dev, drinking the Kool-Aid for a $100bn+ company, together with the designers on my team. I always thought devs and designers would be that pro user group people are referring to. I've read all the recent complaints that "Apple is abandoning the pros", but funny thing: I've only heard that on the internet and in hypothetical terms; nobody actually complained about the lack of Kaby Lake irl. I'm very fine with the current lineup and I'm wondering if that "pro" you're referring to is actually just an extremely narrow niche.


This is a misguided rebuttal. It fixates on a subset of users who are clearly and unanimously voicing a complaint, when the problem here is Apple.

PROs can work with anything, no matter how bad, and get the job done.

However, that does not mean we are so STUPID that we can't tell when a tool is made for us or not.

Is it about a specific spec? NO!

It's about the trend Apple is following across the whole ecosystem.

Just now, after years of waiting for a refresh, what we get is stronger confirmation that the trend is not only unstoppable, but that the trade-off is not good enough.


> PROs can work with anything, no matter how bad, and get the job done.

I picture this being written on a Commodore 64.


My 2013 MBP is great; I can see it lasting a few more years even. It was that forward-looking.

A 2016 MBP... I can't see it holding up. The cap on RAM? I'm already butting up against it...

Photoshop, VMs, site crawls, loading large spreadsheets and trying to crunch the numbers... 16 GB taxes my current system.

I've sort of just given up; I have a gaming desktop I built myself a year or so back with 64 GB of RAM. I just use that when I want to do "work" now... everything loads faster. Freakin' hate that I can't just buy a new MBP for that stuff, but... it's not top of the line, isn't built to last like previous generations were.

I'll keep my 2013 MBP until it gives out on me, I guess... the AppleCare on it is running out and the battery cycles are high... also not wild about trusting work to a 3-year-old SSD... but I'll cross my fingers, I guess.

My sister still uses my old MBP from 2008 or so -- it holds up for most of her needs after a replacement battery and SSD; it's still a solid computer. I don't see these new MBPs holding up like that at all.

Not going to say the new MBPs are shit, but they aren't built to be light-years ahead like they once were. That's what I think everyone is upset about.


I think what's most upsetting to me is that not only are they not built to be light-years ahead, but the fact that they aren't is explicitly due to self-imposed constraints by Apple. And then because of this, they now sell a machine that's effectively the same as the one they sold 3 years ago, but now at a higher price.

This is virtually unprecedented in the history of computing, and the last time these kinds of shenanigans happened was at the tail end of the 8-bit era, and all of those companies, even ones who had a virtual monopoly on the market, don't exist anymore.


Think it's an intentional move to shorten product lifespan and sell more computers? I got to thinking about that more... seems like they benefit from making less-great computers than they used to.


I guess the "pro" is anyone who wants more than 16 GB of RAM?


More than 16 GB RAM or CUDA or a multiplatform graphics API that wasn't 6 years out of date...


The lack of capacities above 16 GB has been covered to death... if Apple put non-low-power RAM in their machines, they would be crucified for the drop in battery life. Until Intel actually hits a release date (increasingly unlikely), there's not a lot they can do before the next refresh.


Apple could have kept the extra millimeters of thickness for a larger battery. High-power >16GB RAM plus long battery life would be "pro". Nobody would be crucifying them if the MBP were the same thickness as before.


It would be nice, yeah, but I doubt the niche is big enough to justify the R&D and logistical burden when those limitations will be resolved by Intel in next year's chipset.

It sucks but IMO the memory line of argument is probably the weakest in this whole discussion.


Apple created its own dilemma of having to choose between RAM or battery life when it decided to reduce the battery capacity.


Even with a bigger battery, doing so would have required them to develop and maintain another board/chipset, which is nontrivial.


those "pros" complaining about the 16gb limit, I wonder how they've been getting any work done up until now?


Because they bought their computers 3 years ago, when you asked why anybody would ever need more than 8 GB.


It really depends on what you are doing; I spend a lot of my time doing dev on a 512 MB system, and my main system, where I do 80% of my work, has 4 GB (with an SSD though!). For the work I do (JS dev, Android dev, most image recognition, C/C++ dev, Python dev, and just playing around with esoteric langs), that works fine. I do not use heavy window managers (I prefer Ratpoison or i3; both are incredibly light), and when I need graphical software that I'm not developing myself at that moment, I start it, bundle my tasks, and close it again. It works well for battery life (15-17 hours on 2011 tech) and for me it is perfect. I was an avid Mac user before that, but I like Linux more for some reason; a matter of taste.

It seems, however (and this is an assumption I can only base on personal experience of what I see), that people like all the 'stuff' during their work that eats battery and memory and cycles.

All the animations and things that go on in the popular WMs and OSes and browsers deliver no 'direct value' to me (like I said, this is a personal thing, I believe), yet they seem a must for most people, even though they are devving.

Why would one use the incredibly heavy and animated WMs over something fast and simple that even works faster (for me!)? When I use Unity (latest), my laptop gets 5 hours of battery; switch to i3 and, plop, 10+ hours more.

Do people like that so much or am I just completely wrong?

(One other reason I see with colleagues is very heavy IDEs, but let us not go into that.)


I think you're confusing cause and effect. Heavy WMs are only possible because there is so much RAM to spare on the average machine. The RAM is originally needed for applications.

For example, I'm working on a Macbook, but inside a Linux VM. It was running at 3-4 GB RAM for a long time, but I had to bump it up to 6 GB when I was working on an interconnected set of Rails applications. (Ruby and Rails are notoriously memory-inefficient.)

Another example: In 2010, I had to upgrade my notebook from 1 GB to 2 GB of RAM, not because of KDE 4 (which was considered absurdly resource-intensive at the time), but because of Git. (Git is designed to use the filesystem cache extensively, and will become absurdly slow when the cache is too small.)


>I think you're confusing cause and effect

I was responding to the many people who 'cannot work' because they have less than 16 GB, the latest discrete GPU and CPU, which is what all the MB Pro whining now revolves around. You are appealing to the 'if you have more, you use more' side of the human condition, right? If you have more money, memory, speed, whatever, you will find some way to use it fully; that is true. But you do not /need it/. I was the same a few years ago, until I dropped Mac OS in favor of Debian and bought a stack of old (2011) laptops. After tweaking everything but the stuff I do not need on it, it works as well /for the dev I need it for/ as the last (2015) MB Pro I touched, for a fraction of the price, and i3 does make me more productive.

Anyway, I am not trying to convince or sway anyone, just saying that, imho, it is nonsense that most (by very far margins) really need that kind of metal. But sure, if you can get them, then get them. In my slightly older age I try to get back to simpler and more optimal. The cycle-waste/bit-waste is getting annoying to me (IF it is waste; I do deep learning and play/fiddle with games on heavy metal).


For comparison, the minimum amount of RAM required by NeXTSTEP, the precursor of Mac OS X (32 MB, and that's megabytes), seemed insane. But then again, 90% of computer science, including experiments in AI, had been done (at least up until the mid-80s) on computers that had less than 1 megabyte of RAM. (IIRC, the first version of UNIX was designed to run on a computer with 24 kilobytes of RAM.)


"640K ought to be enough for anybody."


I'm clinging for dear life to my beloved five-year-old MBP. I don't know if my next laptop will even be Apple at all.


Gave my Macbook Pro 13 (2013) to the missus a few months ago as I was looking forward to the new one.

As it is, I've just spent £1400 on a fully loaded XPS 13, and instead of an iMac I'll be buying a Surface tablet. The missus, after having owned every iPhone since the first, is going to be buying a Pixel.

Apple have lost the plot and are taking the piss out of their users. We are voting with our wallets.


A friend of mine has a Surface Pro, and I have a not-insignificant amount of envy. It's the first time in almost 20 years that I've looked at a Windows laptop and felt it was in any way ahead of Apple. Great design, and it doesn't have the inevitable pointlessly-cheap compromises that plague every other "high end" Windows laptop.

Of course, it's still Windows underneath. Sooner or later, I'd get tripped up by the pathetic reality of 8.3 case-insensitive filenames buried in a not-deep-enough grave.


Yeah Android really has iOS beat right now on UI/UX. It's pretty embarrassing.


Same here. New battery, new secondary SSD, reconditioning of the MagSafe power supply and maxed out RAM are underway. Performance-wise it should be about the same as the latest Macbook, which doesn't feel like an upgrade for software development.

Slow hardware helps one write good software, after all...


I know multiple people who are buying 2014 and 2015 model years or even PCs instead of the new one.


Isn't this (form-over-function) a similar, though not exactly identical, phase to the one Microsoft went through, which resulted in the now-publicized "Raymond Chen camp vs. MSDN camp" mantra?

https://blogs.msdn.microsoft.com/dareobasanjo/2004/08/25/the...

Is this ever avoidable once you gain a large swath of market share, but are still under pressure to grow even more?


>We lose good functionality for no good reason other than that products become "simpler" and "more pure".

You can disagree with the reasons, and certainly many do, but it's just silliness to allege that moving to USB-C was done for "no good reason".

It's the exact same silliness that surfaced in the griping about removing 9-pin serial ports, and optical drives, and Ethernet ports in laptops, and FireWire in favor of something more than an order of magnitude faster, and every other innovation by Apple.

It's always for a good reason.


Actually, I never mentioned USB-C.

But let's take it as an example: adding USB-C was a great idea.

Removing every other kind of port, including the fantastic MagSafe, on the other hand, was a bad idea. Many people try to justify it by saying that it had to be done so that we can "make progress". Well, I'm fine with progress, but I don't want to be held hostage. And it certainly feels like it: if I buy the new MacBook, I will be able to connect exactly ZERO of my devices to it.

But what I actually meant was things like MagSafe, the charging light on MagSafe, function keys, the Esc key, the little wire organizers on the charger, keyboard key travel, and the headphone jack on the iPhone. You could easily make the same case for battery life (I'd happily trade 2 mm of thickness for longer battery life) and ports: I really don't care that much about thickness, so don't sacrifice ports to make my machine 2 mm thinner. Again: we lose good functionality for no good reason.


Sorry, I don't find the MagSafe fantastic at all, at least not in the MagSafe 2 incarnation. The thing detaches far too easily for my liking, leaving me ever more frustrated. It's the only thing I dislike about my 15" rMBP.


Agree. The first MagSafe from 2009 worked, but I went through multiple chargers. The MagSafe on my new MBP (previous gen) doesn't work nearly as well, and just disconnects every time.


Why would you want to move to USB-C, other than not having to flip the cable when you inevitably put it in the wrong way?

If your answer is power delivery, remember that not all cables and not all devices support power delivery on USB-C so it's nothing that couldn't have been done with "USB4" utilising the same plug we've always had.


Few reasons for me:

* HDMI and DP alt modes make laptop docks a lot cheaper and more universal

* Don't need an OTG cable to use a Type-C device on my Type-C phone

* Potential Thunderbolt 3 port reuse


Well, not always. Removing the iPhone headphone jack was dumb and pointless.

Losing the Ethernet jack on Pro laptops is also pretty bad. The reason is thinness; I dunno if that's good or not. WiFi is sufficient for most people, but it sure as hell isn't better when you need high bandwidth and consistent low latency.


> Removing the iPhone headphone jack was dumb and pointless.

I suspect there was a point. This is part of Apple's standard MO.

It's a way to wean people onto wireless (BT). Apple's expensive BT headphones point the way to cheap third-party BT headphones that aren't painful to connect (right now Bluetooth infrastructure is pretty sucky).

Once people are comfortable without the headphone wires, the iPhone will get inductive charging like the Watch and be a water-resistant device like the new Watch.


> This in itself would not be a problem for me, if it weren't for the lack of competition. Mac OS is so far ahead in terms of "getting things done"

I wouldn't touch macOS for that. For me, it's nowhere near comparable in productivity to something like the current KDE Plasma 5 on Linux. Not sure if you've ever used it, but Linux is in no way limited to Ubuntu's Unity.


I didn't want to get into these kinds of discussions, but I'll venture a guess that you don't often connect/disconnect multiple monitors to your machine, and especially not one HiDPI and one HD. That experience alone is enough to ditch Linux altogether.

BTW, Windows 10 is slightly better, but not by much. The major difference being that one can predictably get a signal on both monitors, I guess, which is good, but then the HiDPI story is still pretty bad.


As a keyboard user, I find Windows more polished than macOS when working with multiple screens.

To name a simple example, pressing Win + Shift + Right Arrow moves an application to the next monitor; with macOS, you have to pay for a window manager to get similar functionality, or give up the keyboard and embrace a touchpad-centric gimmick like Exposé, which I find distracting with its animations.

It is certainly an example of YMMV.
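(Aside: third-party window managers do this kind of thing through the macOS Accessibility API. A rough, untested Swift sketch of the idea, with the AX/NSScreen coordinate conversion deliberately glossed over, so treat it as an illustration rather than working code:)

    import Cocoa

    // Move the frontmost window to another display, the kind of action a
    // window-management tool binds to a hotkey. Requires the Accessibility
    // permission. Note: AX positions are top-left based while NSScreen
    // frames are bottom-left based; a real tool converts between them.
    func moveFocusedWindowToNextScreen() {
        var app: CFTypeRef?
        guard AXUIElementCopyAttributeValue(AXUIElementCreateSystemWide(),
                kAXFocusedApplicationAttribute as CFString, &app) == .success
        else { return }

        var window: CFTypeRef?
        guard AXUIElementCopyAttributeValue(app as! AXUIElement,
                kAXFocusedWindowAttribute as CFString, &window) == .success
        else { return }

        // For brevity, just target the second display's origin.
        let screens = NSScreen.screens
        guard screens.count > 1 else { return }
        var origin = screens[1].frame.origin

        // Hand the new origin back to the window via the AX API.
        if let position = AXValueCreate(.cgPoint, &origin) {
            AXUIElementSetAttributeValue(window as! AXUIElement,
                    kAXPositionAttribute as CFString, position)
        }
    }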


Spectacle is free and does this and a lot more should you need it.


Ah, multi-monitor support. I always joke that I ditched hardware random-number generators because the window placement of macOS is a superior source of randomness.

Meanwhile, multi-monitor support in Plasma 5 has worked for me for years without any trouble whatsoever.


Well, it was buggy until not very long ago:

* https://bugs.kde.org/show_bug.cgi?id=365455

* https://blog.martin-graesslin.com/blog/2016/07/multi-screen-...

This also was rather annoying and took a long time for distros to catch up: https://bugreports.qt.io/browse/QTBUG-42985

Since Plasma 5.8.x, it's much, much better.


I do. Plasma nailed down multi-monitor setup in 5.8.x. Before that it was significantly worse.

For different DPI on each monitor you'd need to use Wayland though.


Can it work with a high-DPI laptop monitor? I want to put Linux on an old MacBook Pro Retina, but it appears that large parts of the interface are still hard-coded for certain resolutions. I've also read that battery life will take a big hit.


Yes, it can. Plasma 4 wasn't really suitable for it, but Plasma 5 is OK. I can't say much about battery life; it can differ a lot depending on the laptop and how well the various power features of the chipset are documented. A lot of the time, Intel for example are pretty sloppy and don't document things like that, so there isn't much Linux developers can do.

mjg59 wrote a number of comments on this subject.


I'm not sure about hardware and design teams, but I'm noticing that the entire software org is going downhill at Apple fast.

Budget cuts, delaying releases for arbitrary reasons, more and more technical debt on key portions of iOS seem to be the norm rather than the exception. Each subsequent release of iOS is more bloated than ever, and teams are reluctant to port their stuff over to Swift because of the aforementioned technical debt.

Fundamentally, the problem is that the software org at Apple is a third class citizen. In order for that to change, there needs to be a shift in resources toward the SW org. But I don't see it happening any time soon :)


Actually, although I believe the hardware is overpriced, and personally have no gripes about new designs -- I just don't care about them -- I would agree this is where the core problem lies with Apple right now. Almost squarely in software. I could drone on about the reasons for this, but to me it feels like something fundamental is missing in that area now. Something that only Apple used to have.


I see this as a huge problem too. Apple just doesn't seem to have high level advocacy for software quality. This is driving me to Ubuntu at the desktop.

I am sticking with my MacBook, but my next laptop purchase will come down to whether Apple hardware quality makes up for their software failings. I haven't looked closely in 2 years, but back then I didn't see any hardware close to Apple's laptop quality.

I do agree with those disappointed with the MacBook Pro (or whatever they call it now) but those disappointments are not as great as my disappointment with Apple software quality the last 5 years.


From experience (retina mbp 2015, 15"), I would highly recommend you do not try to run Ubuntu (or other Linux) on newer Apple hardware. Take the money you'll save by buying a "lower end" laptop/desktop x86 machine and put it toward buying more RAM, a better SSD, nice peripherals, etc...


Two other areas for Apple to boost software: the App Store backend seems inflexible to change (why can't I buy Apple TV apps from my iPhone?). And Siri is going downhill with each iOS release (or rather, the competitors are gaining much more with each release).


In addition to the technical problems, the usability has gone downhill too.

My work iPhone automatically sent balloons to a coworker when I messaged "happy birthday, send me the file". I also have to cancel out of the "send with effects" screen each time I press the send button a fraction too long. If I turn my phone sideways, I get the handwritten-message screen. All this in addition to the hidden buttons and other UI problems. Ugh.


Most of my career has been in Java and C and similar, but recently I was asked to work on an iOS GPS plotting application. The client provided me with a Mac laptop and an installed copy of XCode.

That was the most frustrating, poorest experience with an IDE and software development environment I have ever had, and I've dealt with most operating systems, Visual Basic, Pick, Caché, etc. I even keyed in 6502 assembly back in the day, which was more fun.

Crashing constantly, bizarre "4GL" drag-and-drop of controls onto source code to link them, atrocious documentation, awful GUI builder/layout managers, random behaviour... ugh. I cannot understand how anyone would be enjoying iOS development.


The author says Apple's leaving a void because Google and Amazon are inventing new product categories instead of Apple, but Apple's successes haven't been new inventions. The iPod wasn't the first MP3 player, the iPhone wasn't the first smartphone, the iPad wasn't the first tablet, and the MacBook wasn't the first laptop. What Apple does well is waiting to add enough incremental upgrades to products until it surpasses the "amazing" threshold. This ends up looking like Apple is either late to market (Apple TV) or so incredibly successful that it appears as if the market didn't even exist before Apple entered it (iPhone/iPad).

Based on Apple's history, we should really be expecting them to crush the Echo and Chromecast sometime in the next few years.


They've already had a stab at the Apple TV, so I doubt we're going to see a revolutionary Chromecast-like competitor.

An Echo competitor would be possible but has Siri made any noticeable revolutionary leaps recently?


Siri does now have an API (although limited) on iOS, but I haven't seen many uses of it yet.
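(That API is SiriKit, new in iOS 10: you ship an Intents app extension that handles one of a fixed set of domains, messaging being one, which is where the "limited" comes from. A minimal sketch of a handler, from memory, so treat the exact method names as approximate:)

    import Intents

    // Sketch of a SiriKit handler for the messaging domain. Siri does the
    // speech parsing; the extension only resolves parameters and performs
    // the action. Apps cannot define arbitrary new voice commands.
    class MessageIntentHandler: NSObject, INSendMessageIntentHandling {

        // Siri asks us to resolve each parameter before handling.
        func resolveContent(forSendMessage intent: INSendMessageIntent,
                            with completion: @escaping (INStringResolutionResult) -> Void) {
            if let text = intent.content, !text.isEmpty {
                completion(.success(with: text))
            } else {
                completion(.needsValue())  // Siri prompts the user for text
            }
        }

        // Hand the message to the app's own backend (stubbed out here).
        func handle(sendMessage intent: INSendMessageIntent,
                    completion: @escaping (INSendMessageIntentResponse) -> Void) {
            completion(INSendMessageIntentResponse(code: .success,
                                                   userActivity: nil))
        }
    }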

As for the Apple TV and Chromecast, I've been using a Chromecast (1st gen) for the last few months since I gave my Apple TV to my parents, and I miss my Apple TV. The Chromecast is great when it works, but it's extremely hit or miss depending on the app you use to stream. YouTube and Netflix work perfectly, but HBO Now and FX are terrible. There's no option to scrub backward/forward, and often when you hit pause the app will disconnect. With HBO Now and the FX app, it usually takes a good 3-4 tries to connect and actually watch something. With the Apple TV, everything worked as expected.

I think the major factor stopping the Apple TV from crushing the Chromecast is price. The old Apple TV is $70 - they should really be pushing for a much lower price point for this to compete with the Chromecast, but Apple doesn't even list this one on their site as far as I can tell.


I never got on with the Chromecast but the FireTV stick on the other hand is great.

Our AppleTV and PVR have both gone now.

The funniest thing with the Apple TV OS was when my 7-year-old complained that it was broken because pressing the right arrow on the remote (as indicated by the right arrow on screen) didn't work. I showed him you had to press OK, but for a business that's supposed to pride itself on great UI, it's a spectacular fail.


You would not have seen the Siri leaps that matter; that's not how Apple operates. They develop things in secret and then drop them on the world. I'm not saying that's for sure what will happen with Siri, but that's Apple's track record and mode of operation.


>Based on Apple's history, we should really be expecting them to crush the Echo and Chromecast sometime in the next few years.

My impression is that Apple is far behind in the machine learning/AI/personal assistant space, and does not have the engineering resources to catch up.

I'd like to be wrong about that, because I like Apple products and I believe that competition is good, but I'm pessimistic.


How can a company with cash reserves as large as several nations "not have engineering resources"?

They don't want to get engineering resources on par with competitors, for whatever reason.


Not necessarily. A company can't just magic the best or even good/adequate "engineering resources", AKA human beings, out of thin air, as anyone who's tried to start a startup even with serious cash in hand can attest.

Apple's corporate culture has a variety of traits that I would find hard to imagine not suppressing its ability to hire and retain the best; I'm sure others have a better idea than I do, but the #1 thing staring us in the face is that software is not respected.


I hear this regularly, and I don't buy it.

It's not that black and white. No one is suggesting or implying that Apple throws some money at HR, Monday morning comes around, and they're sitting down a new multi-hundred-person engineering team like magic.

But at this point, OS X/macOS improvement has been in obvious decline (since 10.6? 10.8?) for _several years_.

No, Apple can't just magic an engineering team out of thin air.

Apple can, however, create engineering teams given several years of it being a pressing priority. They were able to for iOS, for watchOS, etc.

But therein lies the rub - with every month that goes by, it's clear that to call it a "pressing priority" to Apple would be to give it an importance that they clearly don't feel it has.


> But therein lies the rub - with every month that goes by, it's clear that to call it a "pressing priority" to Apple would be to give it an importance that they clearly don't feel it has.

The question is, what is Apple's priority right now? The latest iPhone and Apple Watch models are incremental upgrades, and in the case of the iPhone, everyone expected a redesign (and Apple had two years to make that happen, but they either chose not to, or failed). macOS or whatever they call it now hasn't changed much in several years, as you pointed out. The Mac Pro is stagnant. The Mac Mini is stagnant. The new Macbook Pro model has the touch bar, after several years of sameness.

The only recent Apple products that are a notable upgrade from their previous versions are the Apple TV with Siri, and potentially the touch bar Macbook Pro. That's all I can think of since the release of the iPhone 6 in 2014. I guess you could throw in the giant iPad Pro too, but I've never actually seen one outside of an Apple Store, so I don't know if they sell that well.

I hope Apple has some exciting products lined up for 2017, because the past few years have been incremental at best.


> The latest iPhone and Apple Watch models are incremental upgrades, and in the case of the iPhone, everyone expected a redesign

Pretty much the only thing that didn't change on the iPhone in the latest rev was the form factor. Apple is also very well known for continual incremental upgrades. Yes, I'm going to quote Gruber:

In every regard, from performance to battery life to camera image quality to haptic feedback to water resistance to wide color gamut displays to sound quality from the speakers, the iPhone 7 and 7 Plus are impressive year-over-year improvements over the 6S/6S Plus, and stunning improvements over the two- and three-year old iPhone 6 and 5S — which are the models most people considering the new iPhones will be upgrading from.

https://daringfireball.net/2016/09/the_iphones_7

Does the lack of a form-factor change in the iPhone 7 make you think the iPhone is not (still) a priority for Apple?


Every two years, there was a new iPhone form factor. Everyone expects that to continue. Apple is aware that this expectation exists, because Apple created that expectation by releasing a new iPhone form factor every two years. However, Apple did not meet that expectation this year, and did not offer any explanation. If that was deliberate, it was probably a stupid business move and definitely a failure to manage expectations. If it was not deliberate, it indicates that Apple can no longer finish a complete iPhone redesign in two years, which it was able to do in the past. Either way, it's not good.

Frankly, John Gruber is a poster boy for rationalization of Apple's mistakes. He's like a dog that gets kicked and keeps coming back to its owner. If the iPhones were exploding instead of Galaxy Notes, Gruber would have defended Apple. If Apple decides to release smallpox in 2017 instead of a new iPhone, Gruber will eventually come around to Apple's side, like he always does.


Yup. John Gruber is well known as a fanboy. That doesn't negate the points he makes in the article. Do you contest his analysis of the changes between the iPhone 6 and iPhone 7? After all, that's what you're arguing, correct? Show something that lines up the spec differences between at least the iPhone 6 and 7 (and even better, over the past few generations) and demonstrates that the improvements between the 6 and 7 represent a slower pace of improvement than previous revs.

At what point do you approach a near-perfect design? Look at the other phones out there, pretty much mimicking Apple's lead (with some exceptions, granted; the Samsung wrap-around screen is pretty cool). The whole attitude of "Apple isn't innovating fast enough, but I don't have any ideas as to what they might come up with, but I want something new and shiny" gets old. For me, I'd like to see a return to something like the 4S (I have small hands), but I'm okay with the 6. I'd also like a more camera-like camera, as I have some difficulty lining up the lens with the subject since the lens isn't centered on the body. I'm used to a DSLR. I don't take a lot of pictures, so this isn't a big deal for me, and a lot of people seem perfectly happy with the current setup. A lot of other phones have similar lens positions.

What aspects of the iPhone form factor would you like changed?


> Yup. John Gruber is well known as a fanboy. That doesn't negate the points he makes in the article. Do you contest his analysis of the changes between the iPhone 6 and iPhone 7? After all, that's what you're arguing, correct?

That's not what I'm arguing. I don't care if the insides of the iPhone 7 are 100% different than the insides of the iPhone 6S.

The bottom line is that Apple created an expectation, and failed to meet it. The particular expectation in the case of the iPhone was for an external redesign, but that's not really what matters. What matters is that Apple created an expectation and failed to meet it. That's not what good businesses do. Either they don't care, which is bad, or they can't meet the expectation, which is worse.


Okay. In addition to updating the form factor on a regular basis, how else did Apple create or reinforce this expectation? Frankly, I think that would make for a very interesting blog post.

Looking back at the history of iPhone releases:

Source: http://www.techradar.com/news/phone-and-communications/mobil...

- iPhone 1 June 2007

- iPhone 3G July 2008 (13 months)

- iPhone 4 June 2010 (23 months)

- iPhone 5 Sept 2012 (27 months)

- iPhone 6 Sept 2014 (24 months)

There have been 3 cycles of the biennial updates, the majority. Honest question: is that alone enough to set expectations?

Should they iterate even when they really don't have any idea how to improve it? Iterate for iteration's sake? Wouldn't that mean meaningless cost at best, and possibly a decrease in the value of the design?

Given that the iPhone is a major source of Apple revenue, this is an important question. How much can we read into Apple's prospects for the future based on the lack of a significant form factor update with the iPhone 7?

By the way, I agree with you on the Mac Pro and the Mac mini. I'd really like to see strong updates in these areas. However, I don't know what the market looks like for these form factors these days. Of the two, I'm only interested in a mini. I have been for quite a while (it would replace a PowerPC model, FFS) yet I haven't pulled the trigger, so am I really in the market? Same with pro-level software such as Final Cut and Aperture (pretty significant when it came out).


Most of those things are also things that require (relatively speaking) little engineering effort from a software perspective, which goes back to my original point.


Since Apple is partnered with IBM, why wouldn't they explore using Watson as a backend for Siri?


When has Apple ever used anyone else's core technology?

Apple either develops their own stuff, or acquires a company and brings it in house (that's how they got Siri in the first place, as well as their own processor designs). Apple is not going to buy IBM.


In the end he nails it without fully realizing it. The problem is that the iPhone is now everything for Apple. Having one single mega cash cow tends to result in the whole company organizing itself around that cash cow. Microsoft did it for Windows and Office and missed a generation or two of developers as a result.

This is why Apple is stagnating and only seems to care about phones and locking down that ecosystem. They are the new Nokia.

Of course Amazon and Google have their cash cows too, so they are not much different. Their home speakers are ways for them to more aggressively market and capture customer buying behavior. Personally I consider them deeply creepy for this reason and will not get one.

The true innovation is going to come from hungrier players: startups or older companies that are on the ropes like Apple once was. Microsoft is getting interesting again, and I'd also keep an eye on old PC vendors like Dell and HP or even old IBM. These companies now have nothing to lose.


Bingo. Also, let's not forget that Apple has a history of not releasing the first of something, but waiting until the market is there, and then releasing a much more user friendly product that redefines the market and grows it. They're not a tech first company so much as a UX first company.


+1. I wrote about this a few weeks ago, and the team building the Linux subsystem at Microsoft took note: https://jhatax.blogspot.com/2016/11/developers-windows-linux.... Microsoft is truly hungry to win back developers, and the team is willing to do what it takes to bring non-Windows developers into the fold.

Apple seems to still be operating as if it can only focus on one core platform at any given time as opposed to two or three. The talk on HN is only about the core OSes -- macOS and iOS -- but what about their productivity apps? They haven't seen a meaningful update in years (Pages, Keynote, etc. in case you are unaware).

With their huge cash reserves, Apple needs to recognize that it can build multiple platforms in parallel without sacrificing the quality of any of them. It could very well be that Jony Ive not being a core software guy is the issue. Or it could not; I don't have insider access. I will say this: Apple needs a hardcore software leader, and it doesn't seem to have one today. That's the 'real void'.


While I do understand the author's point about the void - the particular example of the Amazon Echo and Google Home is something I really don't get. I find those devices invasive and irritating - I'm not sure I see the attraction.

That being said, I also hate talking on the phone, so this may be more of an issue I have with interacting vocally with a device. It's somewhat lessened when I use FaceTime on my phone, but only because I can see who I'm talking to.


It seems to me that Apple attempts to avoid the "invasive" use of deep learning, while Echo and Home utilize it aggressively. Perhaps this is why Siri is seemingly lagging behind Google Now.

Another example of this is the Photos apps: Google Photos constantly calls back to the cloud, showers you with computer-curated albums, auto-slideshows and "remixes", and automatically tags people's faces, while Apple Photos does its learning on-device and buries the "creepy" facial/background recognition features deep in its menus.
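(The on-device flavor isn't new, either: Core Image has shipped a purely local face detector since iOS 5 / OS X 10.7. A minimal Swift sketch, with a made-up image path for illustration:)

    import CoreImage

    // Local face detection of the sort Photos can build on; no pixels
    // leave the device, unlike Google Photos' server-side processing.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

    // "/tmp/example.jpg" is a hypothetical path, not a real asset.
    if let image = CIImage(contentsOf: URL(fileURLWithPath: "/tmp/example.jpg")),
       let faces = detector?.features(in: image,
                                      options: [CIDetectorSmile: true]) as? [CIFaceFeature] {
        for face in faces {
            print("face at \(face.bounds), smiling: \(face.hasSmile)")
        }
    }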


On top of that, calling them category-defining and comparing them to the iPhone is kind of crazy, since they are niche products so far. Nobody I know has an Echo. The few people I know who have Chromecasts bought them only because it was a cheap way to wirelessly stream video from their computer to a TV for presentations. They didn't buy it as a substitute for a Roku or Apple TV.

I share the author's concerns about Apple's devices since about 2012 being underwhelming, but I don't agree with their conclusion that Google and Amazon are filling the void. The distressing part for me is that, as far as work computers go, nobody is.


I have an echo. My parents have an echo and I'm giving one to my sister and her family this xmas.

It is absolutely a category, especially if you have a larger home where your phone isn't near you. Instead, you just speak to it and it responds, without your having to find a device. Its integration with an ecosystem in your home means it's the new remote -- one that's far more powerful.


Out of curiosity, did your parents seek out the Echo or did you buy it for them? I'll believe you regardless, but the comment about you giving one to your sister tends to be more of the story I hear when I hear about niche products "everyone has".

There's a slight difference in a lot of the tech that gets pushed as AppleProduct-Killers in that most of the time the products usually kill the Apple equivalent only on paper; historically, when push comes to shove, it's been iPhones, iPads, and MacBooks people seek out or ask for because they're still considered hot items. This isn't to dismiss the quality of other products, but it's also not the same as everyone wanting them specifically.


My parents sought one out after seeing/using mine. My brother-in-law commented a lot on mine after being over recently, and I watched my friend's 3 young girls ask it to play various Katy Perry songs every 7 seconds (they now have one). It's a new, somewhat expensive device, so it'll take time for it to make its way into the mainstream. It's also not for everyone, but it is for a lot of people.


WARNING: Snarky Response

I just write it down on a notepad rather than buying a $150 device.

That said, you do you. If y'all like it, have at it and I won't be one to criticize the way you spend your money.


Wait, the point was to control things around the home, how do you do that by writing into a notepad?


Or play music, or set timers (especially in the kitchen), convert between units, set alarms (at night), or get random information like the time in various places? (I do all these things on a frequent basis)


I see the Amazon Echo and Google Home as research devices, where the form factor is just whatever is easy to stick microphones and hardware inside without looking like a student project. The fact that Amazon is building a significant ecosystem where devices are controlled by the Echo is a happy accident, but one where the article alleges Apple is also failing.

If voice control were significantly more capable, it would make devices that are constrained in other input methods much more useful (e.g. the Apple Watch doesn't have a physical keyboard or the space for much on-screen input, and there are legal restrictions on touchscreen use when driving).


I've been a Mac user for over ten years, drawn to the platform due to the combination of OS X and solid hardware. Since the disappointing MacBook Pro announcement, I've been preparing to transition myself back to the world of Windows and *nix. I bought myself a refurbished ThinkPad T430 to get myself used to Windows 10 and modern Linux desktop environments, as well as the software running on them. I'm trying to wean myself off OS X and the Mac ecosystem. If it turns out that I'll be able to transition to either Windows or Linux/BSD without too much pain, then sometime next year I plan to buy something like a ThinkPad P50, which has options for Xeon processors and 64 GB RAM. Apple sadly makes nothing like this.

I love OS X and I still believe it's the best desktop OS out there, but I'm very disappointed in Apple's direction, and this experience has made me realize the dangers of being locked to a specific hardware platform: if Apple doesn't make the hardware you want, then there are no alternatives if you want to stay on OS X legally. It was a fun ride while Apple made the hardware I wanted. But Apple has changed direction, from powerful, expandable, upgradeable Macs to non-upgradeable, disposable computers for those whose computational needs aren't very intensive. I considered jumping ship back in 2013, but due to the controversies around Windows 8, GNOME 3, and KDE 4 at the time, I held my nose and bought a MacBook Air despite its non-upgradeability. This time I plan to jump ship. I learned my lesson about relying on closed platforms for my computing needs: you're at the mercy of the provider of the platform. For this reason alone I'm interested in returning to Windows and Linux/BSD, although I hope they will one day have the same level of polish as OS X.


Why do we say Apple came up with this and Apple came up with that? Steve Jobs came up with those things. The man left Apple, Apple tanked, requiring a bailout from Microsoft. Jobs came back, very much Return of the King-esque, and Apple rose again. Meanwhile, in his exile, he helped start a little studio called Pixar. This is very much a "Built to Last" situation. Great book. Companies with cult-leader CEOs lose direction when that CEO leaves or dies. Cook is Apple's Ballmer: a great businessman, not a great innovator. At some point, my guess is that, like Ballmer, Apple will fall far enough behind that someone else will get the job. Hopefully, Apple finds someone like Satya. But none of this has been surprising. You can't lose someone like Jobs and expect the progress to continue. The guy understood how to translate technology to the everyday man, woman, and child like nobody else. These are the Ballmer years at Apple. Hopefully, they go by faster than they did at Microsoft.


Pixar had started well before Jobs entered the picture. His role at Pixar was that of a mercurial money-man who constantly threatened to shut down the whole operation, and he almost killed the studio on the verge of its big break because he couldn't stand the thought of giving out the shares that all the workers had been promised. Not much of a help; more like a hindrance who was tolerated because of the money he brought to the table.

He certainly did great at Apple but Pixar succeeded despite him rather than thanks to him.


They had been trying to find buyers for almost 2 years. It was going to be Jobs or nothing. Give the man credit for being the only one to see at least part of the vision.


It was always clear to me that the direction Apple was taking was not a good one for people outside the "2 sigma" chunk of the bell curve. Apple's focus has been on creating the smallest number of products for the largest possible audience.


He's not sure why they bought Beats? Is there any proof that this deal didn't work out for them? They clearly integrated some of the IP into Apple Music, and it seems like the products are still selling like hotcakes.

I'm scratching my head.


It seems like the Echo is a repackaged Siri with an optional wake word. The Echo Dot lets you buy stuff through it, much like the Kindle, which seems to be Amazon's thing.


Ya -- one that permanently has a place in your home so you don't have to worry about where your phone is, one that integrates with popular 3rd-party home IoT, is always on/listening, works better than Siri, and comes with better, free music integration (better still if you're a Prime member). The Dot is just Echo junior; there is no difference in its utility.

The result is a very meaningful improvement and a change to how you interact with technology in your home.


The Echo is terrific and (unlike Siri) I wouldn't be without it. Sadly, Amazon's management app for the Echo is an unstable, slow, buggy piece of unusable garbage that someone should be fired over.


But luckily you don't really need to use the app much after initial setup.


Not so. It is how you access the to-do and shopping lists. So I use it, or try to, almost every day.


Many would agree that the new MacBook Pro is overpriced. Yet despite its huge profit margin, hardly anyone complains that the iPhone is overpriced.


How many more of these articles do we need?

Yes, we understand that Apple is no longer innovating at all, much less at the clip that they used to.


Current article title: What happened to Apple?


Since changed (reverted?) to "The void left by Apple"


Moaning about Apple leaving a "void" because of Google Chromecast is pretty hilarious. Especially when Apple's version, the Apple TV, is a superior device.

Chromecast is the best Google can do? Color me un-worried.


Perhaps they're considering allowing third parties to produce Macintosh compatibles, with macOS sold through OEMs like MS Windows?


I can't tell if you're joking or not. The first time around, licensing the OS was part of Apple's near-bankruptcy. Apple is a hardware company that also makes software that makes the hardware compelling. I don't see it happening; I think it would be more likely that Apple gets rid of Macs and macOS entirely (relying on iOS products) rather than license macOS to third parties.



