Apple is already on its way to the trash heap. There's a reason innovation has essentially stopped and now amounts to "a few megapixels more in the camera", while quality control - in both their hardware and their software - has taken a big hit.
I guess they will succumb faster than the old Xerox-age market leaders, because they are not focused on consulting (= convincing other companies that constantly siphoning money from them brings value).
> How is literally designing your chips to a point where they could be desktop class to replace x86 not innovation?
That's more a sign that the company wants more profit by owning the top to bottom stack.
Perhaps they saw the existing chipsets as not delivering what they wanted or not scaling to fit demand, but it's still an investment not directly tied to product (their core competency).
> That's more a sign that the company wants more profit by owning the top to bottom stack.
Is bringing mobile/embedded and now desktop-class CPU design in-house really something one does as a cost-saving measure? Apple wants control over their entire stack, and sure, that relates to their business as a whole, but if this were solely about profit maximization, surely there would be better strategies.
> it's still an investment not directly tied to product
I'm not sure I follow your reasoning here. Are you arguing it's not a direct investment because the CPUs aren't products in and of themselves, but rather components for other products? If so, I don't agree -- Apple's investment in, say, case tooling/manufacturing processes and equipment exclusive to their products is surely an investment directly tied to those products, right? The CPUs are likewise components exclusive to Apple products. That seems to me to be a pretty direct investment.
I don't understand this response at all. Does innovation not count if you're not doing it for charity? For several years I was reading articles about how Moore's Law was totally over and we couldn't expect any more improvements in chips, and then along comes Apple to blow x86 out of the water.
> but it's still an investment not directly tied to product (their core competency)
I don't even agree with this -- Apple's core competency is the top-to-bottom customer experience, which they (almost certainly correctly) think they can improve by making their own silicon. But even if it were true, so what? Again, "investment not directly tied to product" doesn't make innovation "not count".
> How is literally designing your chips to a point where they could be desktop class to replace x86 not innovation? What other company is doing this?
Facebook, Google and Amazon are known to do their own server design; it wouldn't surprise me if any or all of them were doing custom processors (e.g. better virtualization features for their clouds, or processors more oriented towards their workloads). It doesn't sound like innovation, more like cost cutting; these processors aren't delivering a step change to end users -- at best they're squeezing out a little more battery life. (By contrast, the sapphire screen that was rumoured would have been innovative, because sapphire can do things that glass simply can't.)
Those companies don't need third parties to develop anything, so why would we know? Facebook was building custom server hardware for years before it became public knowledge that they were doing it.
You're right — we don't know. That gives us two options. Either we acknowledge the innovations we do know about and have proof of, or we use wild guesses and assumptions to dismiss those innovations.
I'll be happy to praise Google or Facebook for advancements in CPU tech if and when they show us such a thing. Until then, publicly available facts are that Apple is innovating in that field and they're not.
The very fact that we can't tell shows that this isn't any significant innovation. Even the part about handing out dev kits doesn't actually show anything - using an off-the-shelf ARM would create the same impact.
They're designing CPUs - something that many companies have done and many companies will do. Big whoop.
Both of IBM's current chip architectures, Z and POWER9, are indeed interesting.
Apple hasn't shipped their "desktop class" architecture yet.
Perhaps you are frustrated by the design constraints of low power mobile chips. Ok.
But if you're paying attention, architecture-wise, it may be of interest to note that Apple's ARM chips have so far delivered good performance in their handheld applications through careful attention to sustained memory bandwidth. Competitors went with more CPU cores.
So there's some fun chip architecture to be had, even in 2020.
A desktop Apple architecture might use something like HBM for main memory, rather than DIMMs.
There's lots of room to innovate, out there in consumer computing.
Memory bandwidth is generally kind of needed to take advantage of more cores.
Last time I checked, Z had 500 GB/s of memory bandwidth; more than 10x Apple's. Kind of wish IBM had won the processor wars. Generally speaking, whenever I look at their mainframe doodads and then look at the hot garbage being slung over at Amazon or whatever FAANG shit hole, it makes me sad. The company with the best engineers is an also-ran that mostly sells consultant hours. Maybe they'll sell off the mainframe business independent of the rest of the horse shit and it will undergo a renaissance. Doubt it though.
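A rough back-of-envelope of the "bandwidth per core" arithmetic behind these two comments, purely for illustration: the 500 GB/s figure is the one quoted above, while the Apple bandwidth figure and both core counts are assumptions, not verified specs.

    # Back-of-envelope bandwidth-per-core comparison.
    # The 500 GB/s figure is the one quoted in the comment above; the Apple
    # figure (~42 GB/s, i.e. roughly "10x less") and both core counts are
    # assumed for illustration, not measured or official numbers.
    chips = {
        "IBM Z (figure quoted above)": {"bandwidth_gb_s": 500, "cores": 12},
        "Apple A-series (assumed)": {"bandwidth_gb_s": 42, "cores": 6},
    }

    for name, spec in chips.items():
        per_core = spec["bandwidth_gb_s"] / spec["cores"]
        print(f"{name}: ~{per_core:.0f} GB/s sustained bandwidth per core")

The point being that raw bandwidth only matters in proportion to the cores trying to share it.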
IBM has been building chips since day one. I'd expect IBM to divest their mainframe business eventually. The distinction you are missing is that "finance-driven" companies don't usually decide to pour billions of dollars into bringing an already-outsourced component in-house, especially one they have little experience with.
Furthermore, I don't see IBM using the Z to "innovate" -- they aren't pushing mainframes to anyone other than people who are already buying mainframes.
> Z is a more interesting architecture by far than the turd Apple is shipping.
The Z, an architecture for people who are pretty much already buying mainframes, is more interesting than a desktop-class chip with what will probably be completely unmatched performance per watt? I don't see how the Z is more interesting than a chip that is finally attempting to challenge 30 years of x86 dominance in desktop computing.
There's 2 trillion USD betting that you're wrong, and virtually zero betting that you're right (AAPL Short Percent of Float = 0.00% as per Nasdaq). Buy some puts and you'll make a killing.
Not sure why this is being downvoted; Apple's hardware reliability has become horrible.
Every Apple device I’ve purchased in the past few years has suffered from a defect: AirPods Pro, iPhone X, 2019 MacBook Pro, iPad. And this doesn’t include Batterygate.
That's why. Even if they've lost some of their lustre (and, if we're being literal, could be "on its way" in the sense that it was the most valuable company in the world and might've dropped a few percentage points).
Not only the past few years -- heck, they launched a phone you couldn't hold properly and people lapped it up. That hasn't stopped their offerings from being not only the best in their class, but often the only products of note (AirPods, iPads, Watch...).
Apple doesn't have to beat Apple. They just have to beat the best of the rest - and apart from perhaps Samsung and Huawei in phones, they're looking pretty peachy still.