My argument is that they were able to develop the chip because of their control. The constraints gave them that freedom, and those constraints come from top-down integration and control.
I'll bow out here because I can just tell this won't be a worthwhile thread.
> But what other advantage did this give them? Like name specific examples. Feel free to leave, but I honestly don't see where you're coming from.
Back when Apple used Intel processors, they were at the mercy of Intel's roadmap; if Intel missed a deadline for a new chip, Apple had to change plans. Obviously, that's no longer the case.
Back in the Motorola/IBM days, their G5 processor ran so hot that Apple had to create a special case with 7 fans to cool it. It was an amazing engineering feat, but something Apple would never do unless they had no choice. I've used a Power Mac G5—it sounded like a jet taking off, and the fans stayed on. [2]
They get to integrate new technologies more quickly, rather than being constrained by the rest of the industry's pace.
Apple launched the first 64-bit smartphone, the iPhone 5s, in 2013—at least a year before any Android manufacturer could. And when Qualcomm finally shipped a 64-bit processor, no version of Android supported it. [1]
There are dozens of examples where Apple's vertical integration has allowed them to stay a step ahead of competitors.
The latest is the C1 modem that shipped in the iPhone 16e. Because the C1 is much more efficient than Qualcomm's modem, the 16e gets better battery life than the more expensive iPhone 16 with Qualcomm's modem. [3]
And because Qualcomm's licensing fees are a percentage of the cost of the device the modem goes in, shipping the C1 makes cellular laptops economical: a percentage-based fee on an expensive laptop would dwarf the fee on a phone. The Qualcomm fee is significant even on cheaper devices: an iPad Air starts at $599, while the same iPad Air model with one of Qualcomm's modems costs $749.
Customers have wanted MacBooks with cellular modems forever; now Apple can finally deliver that, since the modem will become part of Apple's SoC in the near future.
That's what you can do when you're not constrained by off-the-shelf components.
They've been able to reap some real technological efficiencies because of their vertical integration. Notable ones I know about:
- The integrated on-chip RAM dramatically speeds up memory access. Your full 16 GB of RAM on an M1 functions at cache speeds; meanwhile, the L3 cache on an Intel processor is 1-8 MB, more than 3 orders of magnitude smaller.
- Apple takes full advantage of this with their software stack. Objective-C and Swift use reference counting. The advantage of refcounting is that it doesn't have slow GC pauses; the disadvantage is that it has terrible locality properties (assigning a variable means updating refcounts on several different cache lines), which often makes it significantly slower on real-world Intel hardware. But if your entire RAM operates at cache speeds, this disadvantage goes away.
- Refcounting is usually significantly more memory-efficient than GC, because with the latter you need to set aside empty space to copy objects into, and as that space fills up your GC becomes significantly less efficient. This lets Apple apps get more out of smaller overall RAM sizes. The 16GB on an M1 would feel very constraining on most modern Wintel computers, but it's plenty for Apple software.
- The OS is aware of the overall system load, and uses that to decide whether to run work on the performance or efficiency cores and how to allocate workloads across them. The efficiency cores are very battery-efficient; that's why MacBooks often have multiple times the battery life of Windows laptops.
- The stock apps are all designed to take advantage of efficiencies in the OS and not do work that they don't need to, which again makes them faster and more battery efficient.
Apple M1 (or any M-series) RAM absolutely does NOT function at cache speeds. Do you know how expensive that memory would be? The RAM is not literally "in the CPU", but co-located in the same SoC (system-on-a-chip) package as the CPU.
It feels like a core part of your claim, at least half of it, relies on most software for "Wintel computers" being written in garbage-collected languages, which would be shocking if it were true.
I'll bow out here because I can just tell this won't be a worthwhile thread.