How Apple Built a Chip Powerhouse to Threaten Qualcomm and Intel (bloomberg.com)
223 points by walterbell on Jan 30, 2018 | 146 comments



I have only skimmed the article but two things jump out:

> Recently the company got a fresh incentive to go all-in on silicon: revelations that microprocessors with components designed by Intel Corp., Arm Holdings Plc and Advanced Micro Devices, Inc. are vulnerable to hacking.

Apple isn't immune: https://support.apple.com/en-us/HT208394 (indeed, it seems iOS devices suffer from Meltdown while AMD devices do not).

> The result: a chip powerhouse that could one day threaten the dominance of Qualcomm Inc. and even, eventually, Intel.

Surely to threaten their dominance Apple would need to start selling its chips to other people? While it's possible, it would run counter to the way Apple likes to do things, namely getting as much in house as possible for maximum control and seamless vertical integration. They've never shown any interest in selling bits of their technology to others (well, not recently anyway).


Apple may be a (large) minority purchaser of modem chips overall, but it's a hugely dominant chunk of the market for high end devices. If they do their own high end modems, it would gut the high end modem business for Qualcomm. That could also drive up prices and jeopardise supply for other high end phone manufacturers. It might even make some high end features uneconomical for Qualcomm to include.

Right now Apple is several years ahead of other high end phone manufacturers in CPUs. If they cripple Qualcomm's high end modem business by taking most of it away from them, they could extend the same lead to modems.


> Right now Apple is several years ahead of other high end phone manufacturers in CPUs.

I hear this statement a lot... how is this determined?


Performance and features. In single-threaded performance, Apple's A chips are way ahead. That's because Apple highly customizes and optimizes their core designs, while Qualcomm and Samsung use largely vanilla ARM reference designs with maybe a few tweaks. Apple chips also have considerably more cache. That's difficult for the competition to counter because they are very price sensitive and cache size eats up expensive die space.

The counter argument is that the other systems make up for this by having more cores, which is really a cop out. Single-threaded execution has a far more direct effect on user experience, gaming performance, etc. They can't compete in core engineering, so they make up for it by slapping on more fairly generic cores. It does appear that Samsung is responding to this and investing in more advanced core designs.

The final area is custom features like the neural engine behind real-time face recognition, real time 3D face lighting effects and such. The competition don't really have these at all, so we don't know how far behind they are.

There's a pretty good article on this linked below.

https://seekingalpha.com/article/4138071-apple-cpu-advantage...


>That's because Apple highly customizes and optimizes their core designs, while Qualcomm and Samsung use largely vanilla ARM reference designs with maybe a few tweaks

Here's the AnandTech article on the "few" tweaks Samsung has made to its 6-wide-decode custom M3 cores in the upcoming Exynos 9810.

https://www.anandtech.com/show/12361/samsung-exynos-m3-archi...


The Samsung M1, M2 and soon the M3 are all fully custom.


Yep, it does look like Samsung may be in a position to catch up soon. That could create a real problem for Qualcomm and the other Android manufacturers. Samsung are the biggest manufacturer of high end Android phones. If Qualcomm can't sell high end CPUs to them any more, it might make manufacturing any truly high end CPUs uneconomical, or at least drive up costs even further.


Watch out for what comes out of ARM Austin. Their last core was the A72 from a couple of years ago. Rumour has it they've been working on a big core to better compete with Apple and Samsung.

Looking back, it makes sense that Qualcomm discarded their underperforming custom design efforts (at least for the mobile market). If ARM can deliver a competitive design, why not fully commit to their roadmap and save significant amounts on R&D costs?


Because if ARM cannot deliver they are fucked.


A lesson Apple learned at the hands of Motorola.

I sometimes wonder how much of It Just Works was influenced by having slower machines. When doing everything takes longer, if you do it right the first time then it’s still faster than doing it twice.


ARM is not just one company - it's a semi-standard.


Would you mind backing up your claim? To my knowledge, ARM is one company that licenses their designs to other companies, who are free to parameterize and modify as they please.


I think the idea is that it probably matters very little to Apple or Samsung or any other company that has an architectural license how well the ARM-originated core performs, because they are building their own core and ARM is basically a standards committee to them. Obviously companies that are using the actual ARM cores care about ARM's ability to perform.


There are two types of ARM licenses. One is an architectural license. This allows licensees to design and implement the ARM ISA. Apple and QCOM, among others, have architectural licenses. They can add customizations such as the number of pipeline stages. This license is very expensive.

The other license is getting ARM's implementation of a CPU (i.e. A57, A72, etc.). Customers can parameterize the number of cores, cache size, memory width, etc., but the basic architecture is fixed.


I don't know what the above comment was referring to but they are more than what you describe. I'd submit AMBA as an example of this; I've seen this tech utilised by IP for ASICs that don't actually contain any ARM processor cores themselves.

https://en.m.wikipedia.org/wiki/Advanced_Microcontroller_Bus...


Why don't companies just come up with their own designs?


First off, designing a whole core architecture from scratch is a huge and very expensive undertaking. You'd need to be able to get a really big competitive advantage to make it worthwhile.

Secondly, there's a large ecosystem of add-on components designed to work with ARM and be dropped into ARM SoCs, such as GPUs, Wi-Fi modules, gyroscopes, wireless modems, GPS, etc.

Thirdly, there are a lot of SoC engineers very familiar with ARM. You can hire them straight from competitors, or from college, including PhDs who have done research on it. You'd need to train up any new hires from scratch on your architecture.

Finally, there's a huge software development toolchain built around the established processor architectures. To support a new architecture you'd also need to build a set of compiler backends, bearing in mind the existing ones benefit from many years of tweaking and optimization.


Yes, SW is the key; especially Linux. ARM is way ahead of other embedded CPU architectures in terms of Linux support.


Because it gives you no competitive advantage.


>Performance and features.

I think both of these metrics are now adequate; battery life is the single metric I'm interested in now.


Battery life is, realistically, part of performance; it's just performance per watt. Apple is a leader in that area as well.


Apple made the decision to pursue performance at the cost of device life.


Apple's CPUs win benchmarks, but a lot of this is more down to economics. They're making CPUs for a single client, themselves, mostly for the high end, and they don't need to make a profit on them.

Qualcomm probably could make better CPUs, but it needs to sell them for a profit, and they might not fit into every phone's physical size or power budget.

So, Qualcomm ends up making somewhat lowest-common-denominator chips. Economies of scale make it difficult to make a run of super good chips when Samsung uses its own in many markets, Apple uses its own--the world of high-end smartphones outside of Samsung and Apple is just too small.


Isn't Apple cheating?

They're taking their CPUs and clocking them at speeds the device can't support, except when brand new, then letting the device slow down over the course of its usable lifespan.

It would be interesting to compare an Apple and Qualcomm CPU after a year and a half, to see how the benchmarks have changed.


It takes quite a while for the battery to degrade to the point where it can't run the CPU at max capacity anymore. And it still could, but it would be potentially unstable.


That's incorrect. The CPUs don't degrade, the batteries do.


You're completely ignoring the question. Do other smartphones start out fast due to new batteries and slow down due to battery degradation like Apple, or do they account for the battery degradation and keep performance stable?


Other smartphones either do the same thing (slow down) or just plain crash (most of them do this). It’s a common complaint.


The CPU speed decreases, so it doesn't matter if it's technically still able to reach higher clock speeds. In reality it runs slower.


Right, as an alternative to random crashes. Bear in mind this is not an Apple problem, it's a battery problem. All the other manufacturers have this problem, they just don't attempt to detect battery power fluctuations and mitigate it with throttling. They just eat the system crashes. It's got nothing to do with the CPU design, otherwise replacing the battery wouldn't fix the problem.


> All the other manufacturers have this problem, they just don't attempt to detect battery power fluctuations and mitigate it with throttling. They just eat the system crashes.

Really? Because I have not heard of any other manufacturer with this problem.


Then you have not looked into it. Android forums have been filled with complaints over “random crashes” when doing something CPU heavy for years.



Source: basic understanding of how batteries & CPUs work while following tech forums for decades.

This isn’t considered controversial


Twitter pundits commenting on Geekbench benchmarks, mostly.

Snark aside, Apple's CPUs are pretty (really pretty) far ahead of Qualcomm's in single-threaded performance.


Qualcomm's chips tend to struggle to keep up with whatever Apple released two generations ago.

But it doesn't matter because if you're building an Android device you can't buy an A chip for it.


Given how highly multithreaded mobile OSes are, that is only advantageous for lazy coding.

Any proper iOS application will be doing lots of GCD dispatches, for starters.

Likewise on Android, Google had to change the OS behavior to just kill apps that insist on misusing the UI thread for long-running tasks.
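
A rough Swift sketch of the GCD pattern being referred to (a minimal illustration; processImage and updateUI are hypothetical placeholders, not real APIs):

    import Foundation

    // Hypothetical stand-ins for real work.
    func processImage(_ data: Data) -> Data { return data }
    func updateUI(with result: Data) { print("done: \(result.count) bytes") }

    let imageData = Data(repeating: 0, count: 1_000_000)

    // Concurrency via GCD: push heavy work onto a background queue,
    // then hop back to the main queue for UI updates.
    DispatchQueue.global(qos: .userInitiated).async {
        let result = processImage(imageData)   // runs off the main thread
        DispatchQueue.main.async {
            updateUI(with: result)             // UI work back on the main thread
        }
    }
    // In an app the run loop keeps the process alive; in a standalone
    // script you'd need something like dispatchMain() here.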


Uh nope, single-threaded performance is the key parameter. It's what made Intel (et al.) blow up with Meltdown and Spectre. It's what they'd still be pushing if it weren't for physics.


Single threaded programming isn't 'lazy coding'


It is, in the age of ubiquitous multi-core hardware.

I don't remember the last time any of my applications had only a single thread of execution, beyond shell and Python scripts.

Maybe around 2000.

And even then, the OS is juggling processes across all cores every few ms, so outside of benchmark-winning games, there isn't much real-world value in single-thread performance.


Are you serious? Multithreading is not parallelism. It's not about processing an image in the UI thread, it's about making that processing faster on the background thread that picks up the work. Don't get me wrong, but lots of people still write iterators, for loops and single-threaded functions. Parallel algorithms are hard.
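
To make the distinction concrete, here's a minimal Swift sketch (processChunk is a hypothetical stand-in for real work): parallelism means splitting the work itself across cores, for example with DispatchQueue.concurrentPerform, rather than merely moving it to one background thread.

    import Foundation

    // Hypothetical stand-in for per-chunk processing work.
    func processChunk(_ index: Int) -> Int {
        return (0..<1_000_000).reduce(index, &+)
    }

    let chunkCount = 8
    let results = UnsafeMutableBufferPointer<Int>.allocate(capacity: chunkCount)
    results.initialize(repeating: 0)

    // Parallelism: the work is carved into independent chunks that run on
    // multiple cores at once. Each iteration writes only to its own slot,
    // so no locking is needed; finding that decomposition is the hard part.
    DispatchQueue.concurrentPerform(iterations: chunkCount) { i in
        results[i] = processChunk(i)
    }

    print(Array(results))
    results.deallocate()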



What I see is the quality of a JIT compiler implementation, not the use of cores.


"Right now Apple is several years ahead of other high end phone manufacturers in CPUs. If they cripple Qualcomm's high end modem business by taking most of it away from them, they could extend the same lead to modems."

Err, certainly this would just result in vertical integration (i.e. one of these companies buying Qualcomm), or some other outcome, rather than "everyone sucks except for Apple" as you posit.


> Err, certainly this would just result in vertical integration (i.e. one of these companies buying Qualcomm), or some other outcome, rather than "everyone sucks except for Apple" as you posit.

Or, it could result in vertical integration and "everyone sucks except for Apple."


>Right now Apple is several years ahead of other high end phone manufacturers in CPUs

Does Apple even integrate a modem into their SoCs? I find it hard to believe that they can be years ahead of other high end phone SoC manufacturers when they haven't even accomplished that. Additionally, Samsung claims a 2x increase in their new Exynos 9810 SoC [1], which means that it should have single-threaded performance in the ballpark of Apple's SoCs.

> With a clock speed of up to 2.9GHz, a 3rd generation custom CPU offers higher computing power so that its single-core and multi-core performances are improved around two-fold and 40 percent respectively when compared to its predecessor.

[1] http://www.samsung.com/semiconductor/minisite/exynos/product...


They probably have milk in the fridge bought from a store even though they have had time to breed their own cows. They probably don't want the hassle of it when the whole herd could get foot and mouth disease. They let the milk supply chain deal with such things.


>The result: a chip powerhouse that could one day threaten the dominance of Qualcomm Inc. and even, eventually, Intel.

If by dominance you mean being the largest semiconductor company, then yes, possibly. But not because Apple will continue to grow indefinitely; rather, both competitors are shrinking.

Apple is already close to Intel in terms of processors shipped, and is certainly larger than Intel in terms of CPUs. And when Apple decides to make its own modem, Intel will ship another ~150M fewer units.

Apple has roughly 20% of worldwide market share. Qualcomm, with a shrinking share of the ~300M modem units going to Apple, may one day lose all of that when Apple decides to make them itself. Of the other 80%, there is roughly 20% at the bottom that Qualcomm will not compete in. That leaves less than 60% of the market, and about 40% of that comes from Samsung, Huawei, BBK, and Xiaomi. Both Samsung and Huawei are planning to use their own SoCs in their high end flagship phones, and BBK is thinking of making its own as well. And you have MediaTek making gains at the bottom to mid end of the market. All of a sudden Qualcomm has less than 50% of the smartphone market to work with.

If Apple does actually have a team working on a modem, I wonder when Apple will become the largest purchaser in the semiconductor industry.


My understanding was that Samsung (Exynos) and Huawei (Kirin) do release global models of their phones based on their own SoC for GSM markets.

But they can't overlook the North American market, where Snapdragon dominates the US with CDMA2000 support.


That is assuming CDMA2000 is still relevant. You have to wonder if anyone using a flagship smartphone cares about dropping CDMA support. The world has moved to LTE, and will only move further that way in the future.


CDMA matters less and less in the US, but for some reason Qualcomm is seemingly the only company making a modem with comprehensive US LTE band support.


Say what? Intel's modems cover all the US LTE bands that Qualcomm covers. Qualcomm has an edge on max bandwidth and supporting CDMA, but not on LTE bands (for the US anyway).


And Intel isn’t? Intel’s modems have substantially all LTE bands.


Beyond posing a major threat to suppliers who rely heavily on them, it would be a surprise if Apple sold its chips to anyone. Their chips are certainly great, if the responsiveness of their iOS devices is anything to go by. The OS and its restrictions on background processes don't hurt either. I wish they could somehow share all their advances with everyone.

This brings me to my main problem with Apple, as a consumer of technology in the developing world. For Apple, I don't exist, or at least the vast majority of us don't. The technology is always priced beyond our reach. Some individuals can afford it, but the vast majority have to use alternatives. Any success in slowing down Android would have meant no smartphones for us. Most people I know can't afford an iPhone. This is what I consider the other side of the Apple story.


I don't think that's the 'other side' of Apple's story.

Respectfully, Apple is/was/tries to be a luxury and aspirational brand. They're expensive, everyone knows it. It's like saying that BMW's fault is that their cars aren't cheap.


I suppose the whole "luxury and aspirational" bit is my problem. Maybe an even bigger part of this is the fact that I don't look at technology the same way I look at cars. More and more, technology is becoming a necessity, more than a luxury. So I find myself rooting against Apple, even if I love their phones and tablets. I hope this doesn't make me a hypocrite.

Totally unrelated: car manufacturers share technology quite a bit.


But cars are a necessity as well (for some parts of the world), and there are cars to fit all sorts of price points, in the same way that you can get smartphones and tablets at any price.


They might have trouble selling their chips. They optimize far more towards performance over price than Qualcomm does. Apple can afford to do that, because of its high profit margins. Potential buyers probably couldn't afford that tradeoff.


The only market I can think of where higher prices wouldn't be as much of an issue is the server market. Imagine that: high-density, Apple-designed ARM SoCs for servers. Or maybe they'll bring back the Xserve.


But if you look at their chip design roadmap, they started with an ARM SoC, and now they have a motion co-processor, integrated graphics, and an ML chip. The roadmap of their SoC has been to specialize towards phone applications exclusively; the majority of that stuff just wouldn't make any sense in a server.

So what you're asking is not only for them to sell their IP to 3rd parties (a business model whose partners they've repeatedly crushed) but also to build out an entirely separate chip design team focused on a new market.


On the other hand, servers are running more GPU / ML workloads these days and semiconductor designs are more automated. It’d be interesting to see if they could cost-effectively sell those designs with, perhaps, capacity tweaks similar to what they’ve done with the iPad X-series releases where the chip design is very similar but they’ve added extra GPU cores, cache, etc.

That’d also be consistent with their past history of trying to pick some edge for a particular market: optimized for media processing, ML, VR/AR, etc. rather than just competing with x86 for the generic workloads. I still don’t see it as a high probability but I’d expect a niche if it happens at all.


I just want Apple to team up with Nintendo and make a home console + Apple TV.


I’ve always felt Apple and Nintendo have a common thread that rarely gets appreciated.


That would be awesome, but it’s difficult to see how that would fit in with Nintendo’s vision, which seems to be merging console and mobile gaming. How do you do that without competing directly with iPhone/iPad?


Most of Nintendo's hardware obsession ends up being accessories more than core system tech. I would like to believe that Nintendo bringing ideas for hardware with Apple executing would work really well. Plus, Apple could use a bit more, "Friendly for children aesthetic" and not so much "Future Techno Glass World".


It would IMO totally make sense in markets where they cannot lose anything, like servers, workstations and PCs. The question is whether Apple has what it takes to disrupt completely new markets (with their current management).


I would suggest Apple has absolutely no interest in those markets. Even during the years when Steve Jobs was gone, Apple was focused on products for the end user, not components for others to assemble from.


Apple won't sell their components unless it's their MFi DRM chips. They aren't in the same market segment as Intel, because they don't sell CPUs into the consumer or server markets.


The SoC timeline is not uninteresting, but it's lacking at least one of the major events: Apple switching from ARM-licensed to internally designed cores with the A6. It should also say more about the A7: that wasn't just Apple's first 64-bit SoC, it was an industry-wide first in shipping 64-bit cores in a product, and it took the industry very much by surprise (64-bit products were planned for 12-24 months down the line before that, IIRC).

These are the events which genuinely placed Apple on a serious footing as a chip designer; the A4 was Apple-branded but only a preparatory step.

The essay also omits Apple's other silicon acquisitions, like Intrinsity, whose work actually showed up as early as the A4 (before it was acquired).

It's also oddly padding the timeline with X-variants.


I think the A7 being the first 6-wide core in the mobile space was far more important than the fact that it was 64-bit. That gave it unprecedented single-threaded performance, and competitors are struggling to keep pace on that front to this day, despite them also being 64-bit for several generations now.


It's true that 64-bit wasn't as big a difference in terms of performance or user facing features, but it was an utter humiliation for the rest of the industry.

Nobody could pretend that Apple was not way ahead of everyone else in terms of chip design capability and design sophistication. Right up to that point Samsung and Qualcomm had been telling everyone they had all the experience and Apple was a newbie at this chip design stuff that didn't know what it was doing.

All of a sudden they were faced with the reality. If they could do 64-bit a year before anyone else, they could do anything a year before anyone else. They could ship working products that other teams barely had on the drawing board, which means nobody could reasonably predict anything they could do next. How can you design a flagship phone for next year to compete with them, when you have no clue what features are even possible for them to have? That's what it brought to the table.


Why? Did Apple achieve something that was difficult or unexpected?

IIRC, it was Apple who asked Samsung to collaborate with Intrinsity, best known for their FAST tech, to develop the Hummingbird core, which was used in both Samsung's and Apple's APs. Only after having tested Intrinsity's tech did Apple acquire it -- so it's a bit of a stretch to agree with your narrative that Samsung didn't see what was coming. Apple also took their time developing and releasing their own first AP about 4 years later. It also seems like while Apple is focusing on single-core performance as their marketing point, Qualcomm and others are focusing on power saving, having first implemented multi-core, then octa-core chips. I'm not sure if that's a surprise or unexpected, since Apple's AP remained a single-core AP for a while.

As for 64-bit, was there any compelling reason to go 64-bit mobile? I remember when I used to work for Wall Street banks in the 1990s, the shortcomings of the 32-bit arch limited our ability to scale, and Sun's UltraSPARC, which was widely used in the industry, came to relieve that problem.


I don't think Intrinsity's tech had much to do with 64-bit; all their work before the acquisition was on 32-bit architectures.

The main attraction of 64-bit ARM is that, apart from the obvious future-proofing, it has a redesigned and much more efficient ISA. It also has the optional secure enclave feature which Apple used for TouchID and now FaceID. That feature doesn't actually require 64-bit per se, but it is only a feature of the 64-bit ARM architecture. As an aside, is anyone else using it for anything? I'm only aware of Apple using it, but technically it's not an Apple-exclusive feature.

Your 'focused on power saving' point is correct as far as it goes, that was their intent, but unfortunately the power saving benefits of big.LITTLE turned out to be much more modest and full of caveats than hoped. It turns out that a fast, efficient single core that can complete an instruction in less time, then quickly shut down into a power saving mode, is more efficient in power terms for most cases than a slower, lower-powered core taking longer. As a result Apple's fewer, faster cores approach turned out to actually offer better power efficiency as well. Since then Apple has also adopted big.LITTLE; it's not a failed technology, it's just that the inflection point where it becomes worth doing was at a very different place than previously realised. Whether that's down to luck or judgement on Apple's part is moot.


I'm pretty sure the Secure Enclave is an Apple technology. It's similar to ARM's TrustZone but not the same thing.


It started off as an implementation of TrustZone but has likely diverged significantly from it since. In any case, 64-bit ARM had architectural features to support the implementation of such technology. Doing so on 32-bit ARM would have been a huge amount of work, and really not feasible.


The Secure Enclave never used TrustZone. It's a physically separate processor. (Not that this matters.)


No, Intrinsity acquisition was in 2000, years ahead of Apple's first 64-bit release.

> The main attraction of 64-bit ARM is that, apart from the obvious future-proofing ...

ok

> unfortunately the power saving benefits of big.LITTLE turned out to be much more modest and full of caveats than hoped ... As a result Apple's fewer, faster cores approach turned out to actually offer better power efficiency as well.

Sure, I'm guessing you are comparing some very early implementation of big.LITTLE by Samsung in 2012 versus Apple's single core. Or are you denying the power efficiency benefit of multi-core architectures in mobile processors (or non-mobile processors, for that matter)? It's one thing to criticize Samsung's very early, first and second iterations of big.LITTLE chips in 2012, which were highly workload-dependent, but it's completely another to claim Apple's single-core superiority over multi-core power efficiency, including those non-octa-core Samsung APs. Perhaps you can substantiate this with some references?


> No, Intrinsity acquisition was in 2000, years ahead of Apple's first 64-bit release.

You are off by a decade[0] on the date of the acquisition.

[0] https://en.m.wikipedia.org/wiki/Intrinsity


We're talking about the same thing; this is a historical discussion about the evolution of these processors that led to this point. How can you suggest that I'm denying the advantages of multi-core when I pointed out Apple has now also adopted a variation of big.LITTLE?


Even if they're focused on power saving, the devices their chips end up in are still less efficient than an iPhone AND they have lower performance. This might mostly be a software issue, but it still speaks to how far ahead Apple is by most metrics.

I use my phone relatively heavily, and I still get better battery life with an iPhone because the passive energy usage is just so damn low.


64-bit ARM has similar advantages to 64-bit x86. It cleans up some cruft, but more importantly it doubles the number of registers.


Sure, I get that. What I'm asking is if there were any technical challenges (even to this date) in mobile computing that necessitated a 64-bit mobile chip.


I guess it depends on your perspective. Is performance a technical challenge? IMHO I say yes.

64-bit enables a number of other things. It has a better ABI for argument passing. It allows more extensive use of tagged pointers (the ObjC runtime can fit many common 11-char or shorter strings in a single tagged pointer). It's also a chance to make any other ABI-breaking changes you want to make in the runtime.

If you're forward looking you might say that mobile devices will exceed the 2GB/4GB limit soon enough so why not make the change now rather than waiting for it to become a problem? You'd get all the performance benefits listed above and a future-proof foundation on which to build.
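
As a rough, purely illustrative probe of the tagged-pointer point (a sketch, not a guaranteed observation: whether a given value actually gets tagged, and what the bits look like, is an internal runtime detail that varies by OS release):

    import Foundation

    // Compare the raw pointer bit patterns of a very short and a long
    // NSString. On 64-bit Apple platforms the short one is often a
    // "tagged pointer" (the characters live in the pointer value itself,
    // with no heap allocation), while the long one is a normal heap object.
    let shortString = NSString(string: "hi")
    let longString  = NSString(string: String(repeating: "x", count: 64))

    let shortBits = UInt(bitPattern: Unmanaged.passUnretained(shortString).toOpaque())
    let longBits  = UInt(bitPattern: Unmanaged.passUnretained(longString).toOpaque())

    print(String(shortBits, radix: 16))
    print(String(longBits, radix: 16))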


What's a 6-wide core?



Fetched & decoded instructions per cycle.

Parent is wrong, though: the A7 has 3-wide cores (and is dual-core like previous chips).


The Cyclone core has 6 decoders. It can decode, rename and retire up to 6 instructions per cycle [1].

[1] https://www.anandtech.com/show/7910/apples-cyclone-microarch...


All Apple A series chips were branded as theirs, although it was fairly clear that prior to the A6, all A series chips were designed and manufactured by Samsung. The A4 chip was also designed by Samsung (in collaboration with Intrinsity) to compete with Qualcomm and go beyond the 1GHz clock speed barrier.

Bloomberg also seems to think that the A4 was Apple's first processor designed in-house, when Intrinsity was not yet part of Apple. So I'd like to know how much of Apple's own engineering went into the design of the A4 that wasn't Samsung's or Intrinsity's.


The opening paragraph also glosses over the fact that some of Apple's chips are also vulnerable to Meltdown and Spectre.


I'd missed that, but you're perfectly correct.

It's also stating that Apple could topple Qualcomm… despite Apple not selling to third parties, so the only way that'd happen is if they obtained a complete monopoly on the smartphone market. I don't see either happening.


If Qualcomm sits around and does nothing. However, Qualcomm are competing for the nascent ARM server market with the (incredible, IMO) Amberwing, and Qualcomm would still have 50%+ of the mobile market for selling their modems and Snapdragons to fund development.


Theoretically, if the premium smartphone brand dropped Qualcomm and managed to make a go of it, that does put a lot of pressure on them and, perhaps, encourages others to do the same (Samsung, for example).


The BBC News breaking article on Meltdown and Spectre was hilarious. The title was something along the lines of "Apple phones have bug", then in a paragraph down near the end it noted that oh, by the way, almost all chips used everywhere by everyone are also affected.


Apple were one of the last of the major companies to come out with a statement regarding Meltdown and Spectre. The BBC articles for Intel and the rest were written a day or so earlier.


Well, media production orgs have a massive myopia for Apple tech, so it would not surprise me...


Yeah, it's vague at best on the detail.


I can only hope for Apple to revive the Mac Mini, or the "Bring Your Own Display, Keyboard and Mouse" concept.

I think there is a (small) market for Apple, but it is also a way to bring new users into the Mac ecosystem (although the iPhone/iPad do this).

I want a small but affordable Mac to have as a backup and to toy around with... and as a Mac user, the Mac Mini was perfect. I'll cross my fingers.


Tim Cook supposedly confirmed that a new Mac mini is in the works by way of email:

> I'm glad you love the Mac mini. We love it too. Our customers have found so many creative and interesting uses for the Mac mini. While it is not time to share any details, we do plan for Mac mini to be an important part of our product line going forward.

https://arstechnica.com/gadgets/2017/10/the-mac-mini-isnt-de...


Maybe it's just the mishmash of corporate speak but that statement says the exact opposite to me.


Maybe I'm just more used to corporate speak, but "we can't share details yet but this product will be important to us in the future" couldn't be more clear in indicating that a redesign is in progress and they're not dropping the line.


I think my point is more: what would Tim Cook say if he didn't care about the Mac Mini at all going forward? I feel it would be similar to that statement.

It hasn't been updated in 3 years, and the phrasing he used didn't mention the "future", just "an important part of our product line going forward", which is pretty non-committal IMHO.


If he didn't care about the Mac Mini at all he wouldn't have said anything.


At this point in time (3 years after the latest release), not saying anything would be construed as abandoning the Mac Mini.

I would bet that he was forced to say something to assuage concerned Apple users.


I see it as their inevitable first step toward an iOS-based computer. With their CPUs and GPUs being where they are now, I'd imagine it won't be too long.


Right. I see the iPhone with external displays & keyboard. See, it's revived!


I think Apple should have a roadmap where all of their Macs will be SSD by default. The current iMac and Mac mini are still on super slow HDDs.

But NAND prices per GB aren't coming down. I think this might be one of the most important reasons why Apple hasn't released a new Mac mini.

This is especially true when Apple has said APFS is specifically designed for NAND, not HDDs.


The middle 21.5" 4K iMac still comes with an HDD. In Canada this is a $1,729.00 machine. This is simply ridiculous.

The 27" models all have Fusion Drives, but I'm not familiar with them, so don't know their performance level.


NAND prices per GB are actually starting to come back down, now. Most of the 3D NAND tooling change-overs are completing at the various NAND fabs, so NAND prices should only continue to fall in the coming months/years in $/GB pricing metrics.

You can get consumer grade 500GB-class SATA 2.5" SSDs for $130 (or less!) now from multiple vendors. This time last year these were up around $200 (EDIT: this may not be accurate!). Consumer SATA SSDs were some of the first to transition to 3D NAND.

For example: https://camelcamelcamel.com/SanDisk-Ultra-2-5-Inch-Height-SD...


Note that many of these high-capacity SSDs are able to sell at lower prices because they don't have any onboard DRAM, meaning the mapping data for wear leveling is either stored on the NAND itself or in the system RAM, which causes random reads/writes to slow down a significant amount.


DRAM is still extremely expensive, so you're probably on point about part of the low price being the lack of DRAM. It would be interesting to see teardowns of current consumer and enterprise SSDs to better understand how they work.


Well, it is coming down now only because Samsung is trying to lock in as much NAND market share as possible this year, before the Chinese have their NAND fabs running and start a race to the bottom.


> but it is also a way to bring new users into the Mac ecosystem

Does Apple want new users in the Mac ecosystem? I thought Tim Cook's future was "Post-PC"


I do hope they keep the Mac. It's the main reason why I am in Apple's ecosystem. If that goes, so does my interest in their mobile devices.


Until you can run Xcode on iOS, you'll need a Mac to make apps for Apple's various platforms.


Who knows what WWDC 2018 might bring.


Yeah, I think Xcode for iOS is going to be here sooner rather than later. They are already testing the waters of compiling (I think) in Swift Playgrounds.


I'd like this too, but can we enhance it with a standard monitor mount where you can clip a small box onto the monitor, which supplies power and a cable-less HDMI port? Kinda like a componentized all-in-one.


They have that already: the MacBook, or MacBook Pro if you need additional power. These can act as a hub you connect externals to, with the added benefit that they can be used offsite.


iPad is converging on this space IMO.


> Apple has wisely focused on designing its silicon (for its system on a chips, Apple uses reference designs from Arm Holdings Plc).

I was under the impression that the current A cores are not reference designs from ARM... That it was a purely Apple-designed core implementing the ARM ISA. Am I mistaken or is the article mistaken?


Apple is one of very few companies that has an ISA license. This allows them to implement the ARM ISAs using their own microarchitecture and physical implementation.

Digital was, AFAIK, the first ISA licensee, which allowed them to design the StrongARM. https://en.wikipedia.org/wiki/StrongARM

StrongARM ended up at Intel when they bought Digital. The design team for StrongARM split up into two companies: SiByte and Alchemy. The design work on StrongARM in Silicon Valley was led by Daniel W. Dobberpuhl. Daniel was co-founder of SiByte and (more relevant) founder of PA-Semi.

https://en.wikipedia.org/wiki/Daniel_W._Dobberpuhl https://en.wikipedia.org/wiki/P.A._Semi

PA-Semi was a fabless company that designed really powerful and power-efficient CPU cores based on the Power (PA) architecture. PA-Semi was really good at architecture, but also at physical design for low power. PA-Semi was acquired by Apple in 2008 (as mentioned in the article; "boutique chip maker" is an interesting phrase, btw). PA-Semi basically became the processor design team at Apple.

Looking at the timeline, the first A-series CPU (A4) was released in 2010, which the PA-Semi team could plausibly have developed.

(If my understanding of the history is correct.)


Small correction: Intel didn't buy Digital, but got StrongARM along with its design team (and IIRC some tangentially related products like the 21152) as part of the settlement of some kind of patent dispute.

Also of note is that the success of HTC is to some extent also rooted in DEC and its StrongARM reference platforms (mainly Itsy), which Compaq then commercialized as the iPaq, which was manufactured by HTC (with a presumably HTC-designed "HTC ASIC" containing most of the various PDA-specific glue required for StrongARM).


> Intel didn't buy Digital

Compaq did. Rose to glory as an IBM PC clone maker, had enough cash to buy a busted-out DEC, then dwindled away in turn to be purchased by HP, if memory serves.

Sic transit.


https://www.bloomberg.com/news/articles/2018-01-30/google-is...

"Google officially closed its $1.1 billion deal with HTC Corp., adding more than 2,000 smartphone specialists in Taiwan to help the search giant chase Apple Inc. in the cut-throat premium handset market. The deal will help Google design more of its own consumer hardware and could set it up to wade deeper into special-purpose chips -- like Apple. Google’s most recent Pixel model came with a new image processor to improve the device’s camera. More of this "custom silicon" will come in the future, Google’s hardware chief Rick Osterloh said in an interview."


I'm surprised they don't mention Jim Keller, the engineer who designed the A4/A5 chips, as well as the AMD Zen and K8 (Athlon 64) microarchitectures: https://en.wikipedia.org/wiki/Jim_Keller_(engineer)


I'm surprised that you're surprised; this article clearly isn't meant to be a serious piece of journalism. Regardless, thanks for posting a reference to Jim Keller's wiki page.


Do you know what he worked on? According to WP those didn't have an in-house CPU or GPU yet.


I wonder if this all started with Apple buying the company behind this PowerPC processor:

https://en.m.wikipedia.org/wiki/PWRficient

I rarely see that company mentioned when people talk about Apple getting into hardware. What I don't know is whether it's because they were uninvolved in the ARM-related work done later or just not well known.


I feel like this article is just an advert for Apple - in the wake of the recent processor flaws we've collectively suffered.


I am not worried but instead welcome the integration of their chips to protect the integrity of my machine's data from hackers or even government agencies. That I fully support and expect to see.

I do seriously doubt they will move away from an Intel-compatible platform; at most, if they did, perhaps to AMD chips. If Apple took their Macs away from Intel compatibility I doubt I would buy a new one, and it would call into question the ability to maintain the software library that exists for the platform.


>If Apple took their Macs away from Intel compatibility I doubt I would buy a new one, and it would call into question the ability to maintain the software library that exists for the platform.

I'm not sure there's much reason to be that tied to the chip anymore... When you load up MariaDB on your Fedora box, the commands to install it and run your software are identical whether it's Intel or ARM. Apple of all people, just like (for example) the Linux community, are perfectly capable of writing device drivers for each processor architecture, and once that work is done, what else really makes a macOS machine on ARM vs Intel any different from a Linux distribution found on multiple architectures?

Particularly since Apple focus so heavily on a positive customer experience, I can't help but feel they'll make it as painless as possible. Most likely it's not a concern at all.


>> I'm not sure there's much reason to be that tied to the chip anymore...

How about virtualization? Desktop virtual machine software seems to be a pretty healthy industry on Mac.


Apple already provides a hypervisor framework (and has previous experience with Rosetta); it's not beyond reason that they could solve that issue themselves.


I'd be cool with it.

The main reason I was convinced to get a Mac was the move to Intel CPUs. I figured if there was a problem, I'd just install Windows, and I even bought a license for Parallels.

Since then I've occasionally spun up some Linux VMs in VirtualBox, and I even installed Windows using Boot Camp on my first Mac for a while before deleting it to get the space back, but I've never made serious use of Intel compatibility. It's just never really been necessary, only occasionally useful, and not enough that losing the option would be in any way a deal breaker.


Windows compatibility was always a checklist feature meant to convince anxious Windows users that switching would be fine. In practice hardly anyone bothered with it. And this was when the PC mattered more. If they're switching today I don't think they'll care at all about Windows compatibility (and it might not even be an issue since Microsoft seems to be switching to ARM as well).


I guess it depends on what you do. For example, if you're deploying to x86 servers and you're testing on local VMs using Vagrant or whatnot, it helps to be using the exact same platform and not worry about unexpected bugs/issues that might be platform-specific.


They could buy AMD and then have a license to implement the x86_64 ISA in a similar fashion to how the A-series implements the ARM ISA. A bit far-fetched, but not completely implausible.


On desktops or notebooks, where the power budget is much bigger and cooling more efficient, they might push for a high-performance ARM chip coupled with an x86 translation layer. They have pulled this off successfully twice already (m68k -> PPC and PPC -> x86).


I've been expecting an iPadTop ever since the A series chips started beating mid-range Intel chips in browser benchmarks. Just imagine how thin the device could be; how could Apple not do it? The A12X should be more than powerful enough to make a notebook feel snappy. It would be sort of an Apple Chromebook.


If you add in the heritage of NeXT, you will find more transitions. And while Apple only supported two platforms at a time, NeXT simultaneously supported various architectures: x86, PA-RISC, and SPARC. (And maybe 68k, but I'm not sure if that was concurrent with the others.)


Considering that Microsoft is doing it, it's not unlikely. But they also seem to be pushing the iPad forward.


Didn't Intel start passive-aggressively signalling that they'd start enforcing their patents with lawsuits if companies did that?


Yes, but that's solvable by buying AMD.


Who knows, but I think the most logical move for Apple would be to converge on a universal OS with the same architecture for all their products. If that happens, it would also make sense for Apple to move all their hardware onto their own chips.


"But Apple is just good at marketing"


We've banned this account for posting too many unsubstantive comments. If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future.

https://news.ycombinator.com/newsguidelines.html



