What I find interesting about Apple designing their own chips is the notion that the whole is greater than the sum of the parts. Apple's value proposition with Macintosh is that by owning the entire experience--hardware, OS, even distribution to the store--Apple can deliver an optimized product that is superior to products where company H makes the hardware, M makes the OS, and CC distributes the resulting PCs.
Designing chips is the logical extension to this model. The chips are designed with the end product in mind, and everything works together to deliver the product's value proposition.
Other manufacturers end up hostage to whatever chips Intel and AMD feel like selling to everyone. They are hostage to whatever OS features Google feels like adding to Android, whether it integrates with the chips or not.
It's not a given that Apple will succeed with this strategy; it requires an ability to juggle multiple balls at once, a very rare trait. There's a reason most businesses try to do just one thing well and commoditize everything else.
But it is certainly beautiful to watch them try to sail the opposite tack.
Many car manufacturers design and build the entire car themselves, internal and external parts included, sell the cars at single-brand dealerships, and service them there too. They might contract out things like tires, audio speakers and stereo systems, and the occasional part here and there, but by default they build everything else themselves.
I just want to frame the discussion a bit by differentiating “purchasing” from “subcontracting.” If a company uses a part from a supplier, there’s a continuum from “We picked this part from the supplier’s existing catalog” through “We designed this part in partnership with the supplier, but they sell it to anyone” to “We designed this part and our supplier makes it exclusively for us and no-one else.”
Apple obviously subcontracts a lot of its manufacturing, and that is very similar to what most automobile companies do with parts: The parts are manufactured for them, but they are specific to that car and not sold to other manufacturers. OTOH, a tyre is a tyre is a tyre in most cases, just as most of Intel’s chips are sold to anybody who is prepared to place an order.
I’m not making any specific point, just suggesting we be clear about what we mean when drawing comparisons between designing CPUs and outsourcing part or all of the manufacture of an automobile.
Apple used to have some parts that were standard, but with the direction they've taken the Macbook Air, I don't think it's got a single one left. Both the 2.5" hard drive and the optical drive are gone.
It's still made of standard components like the webcam and processor, but beyond some electronics it's all designed in-house. And the electronics may not stay standard on the low end, though I don't see them competing with Intel's chips in the Pro lineup.
Most car manufacturers do not. For instance, how many of them do their own ECUs? And large swaths of the American car industry have long been defined by outsourcing relationships. Remember Delphi?
True. Being around Detroit you can see the entire industry that supports the "Big Three". There are companies that either supply or assist in the build of nearly every component of a car.
I immediately thought of Delphi as well, but there are also numerous other counter-examples to the grandparent post. For example, I had a Plymouth in the 90's with an engine built in collaboration with Mitsubishi.
Of course that kind of thing happens. Honda had a V6 engine with mitsubishi in their accord too, but the typical behavior is to do most things in house, not all things all the time.
Indeed. Ever owned a European car? Heard of Bosch?
Ever owned a Japanese car? Heard of (Nippon-)Denso?
You would be astonished how many bits of a car is subcontracted. What the auto manufacturer typically focuses on is engine design (at the mechanical level), chassis design and suspension topology. Most things that "bolt on" are made by somebody else.
This was true at one point in time, in particular at Ford's River Rouge plant, which was pretty amazing. Iron ore, raw material for making glass, and the like--all processed there.
For good or bad, this is no longer how it works. Entire subassemblies are outsourced.
Hyundai are advertising that they have an ultra-modern (relatively) environmentally friendly steel plant in order to lower GHG emissions and control quality.
An interesting play.
I had a chance to visit it before they stopped the tours. The most impressive part was the plant where they took in a huge chunk of steel and heated it red hot, then repeatedly rolled it until it was thin, then rolled it up.
Ever heard of Bosch? I bet they have tons of products in almost any car around, especially European ones. There's a lot of subcontracting going on in the car industry too.
Actually they build their cars much like we developers build web sites around a web framework. There are very few base platforms (most shared by several manufacturers) that get customized to look like a Saab, BMW or whatnot. The difference is mainly in looks and some parts; the total solution is engineered once in joint projects between manufacturers, since it's not economical to build a car from scratch for just one manufacturer.
They also contract out things like brakes, engines, spark plugs, and the whole electronic shebang (which is now the majority of the engineering effort). And basically all car companies now are conglomerates of several different brands, precisely so that Cadillac and Chevy don't have to build and design most of the car — and, more to the point, the production lines for it — themselves.
This is interesting because the computer industry Jobs left had little specialisation for a large part of his time in it. I remember reading on folklore.org how Steve would hang around while Hertzfeld, Burrell Smith and the like wire-wrapped their Mac prototypes.
Now, Jobs has brought even more technical capability in-house, figuring that he has never needed the technical know-how himself; what he needs is the ability to manage people, to recruit the best, and then to give them the opportunity to do something beyond what they would have been capable of on their own.
Sometimes I think the hype around Apple's A4/A5 chips is a bit much. Does anyone know if these chips are significantly different from e.g. Samsung's Exynos chips or for that matter any other chip on this list: http://en.wikipedia.org/wiki/ARM_Cortex-A9_MPCore ?
Well, there's a spectrum where on one end they just paint a big "A5" on the package and hype it as their own. On the other end they completely architected everything on their own (which, clearly, they have not done). Somewhere in the middle there is the reality where they have not only made some power management tweaks, but have (according to the last keynote) added an Image Signal Processor that implements face detection, white balance, and image stabilization algorithms that are, presumably, not on the stock chips you mention.
I don't know why this was voted down. The A4 is very similar to the contemporary Samsung SoC and I wouldn't be surprised if the A5 is very similar to Exynos. So far, Apple doesn't appear to be getting any dramatic advantages from having their own chips, especially considering the risk of not being able to switch chip vendors.
The A4 is very similar to the Samsung Hummingbird. The A5 isn't particularly like the Exynos, though, except in that it uses A9 cores; the A5's GPU is _much_ faster.
I think the one big advantage of having their own chips is the marketing angle. It's much cooler to say an Apple iPhone comes with an Apple A5 chip than with a Samsung Exynos chip.
I don't think customers care much about that, not even those who defended their non-Intel Macs until the bitter end.
To be more than anecdotal, the iPhone 4S site lists the "Dual-core A5 processor" as both the first bullet item, and later as its own slide.
Surprising given that tech specs are said to be obsolete. But I think it's more about the dual-core, and it doesn't mention that it's an Apple processor.
Let me rephrase: What is the main difference (feature-wise) between the Exynos 4210 and the A5? What technical backing does their marketing have (9x better this and 100% faster that)?
The Exynos uses an ARM Mali 400 GPU; the A5 uses an Imagination Tech SGX543MP2. The latter is far faster; at least twice as fast in most benchmarks. Also, the A5 is known to have a lot of extra DSP hardware on chip, but the purpose of that is somewhat unclear for now (it has been claimed that it's to do with facial recognition, fast camera operation and/or Siri, but really no-one knows for now).
The text of the article doesn't seem to explain how the assertion is made. Until last year (when I left Apple), the PA Semi team (which is the A4/A5 team) was about 20 people (and they were hiring 3-4 more), so this is somewhat difficult to believe, but I might have outdated information. The designs are mostly modified ARM designs.
1,000 is probably wrong, counting every possible person who is vaguely related to hardware.
But the PA Semi team is way larger than 20 engineers, that's for sure. Even before the acquisition by Apple, they had way more engineers than 20. I'm the guy who filmed their product launch at the Power Conference.
Wikipedia says 150 people for PA Semi before they got acquired.
1,000 has to be wrong for two reasons: 1) that's an explosion in staff which your organization can't handle, and 2) 1,000 people on one chip is ridiculous and only something Intel gets away with.
A good estimate for a SoC would top out at around 200 people for the design, implementation and a 100% NIH syndrome.
If this is true, I have to start wondering whether Apple plans to port OS X to ARM. Granted, they’d probably have to evolve ARM into a 64-bit architecture, but the power savings would be incredible. A MacBook Air with the battery life of an iPad 2.
(We already know Apple is more than capable of executing CPU transitions on the Mac. I suppose the only other question is how much people value Parallels and Boot Camp.)
They already did all the groundwork when they forked OSX into iOS, and they're notoriously paranoid about keeping their options open. I bet you they already have those prototypes lying around – but the economics won't make sense for another five years.
No way it'll take 5 years to move to ARM. I bet we'll see ARM Macbook Airs by the end of next year.
Here's what I think will happen - the Macbook Pro line will be slimmed down to Air specs, the Air that exists today gets discontinued to make way for an all-ARM version.
The Air brand will be the ARM laptop line; the Macbook Pros will be x86 for a few years before they too switch.
>the Macbook Pro line will be slimmed down to Air specs
Wow, I hope not. Right now I use MBP as a desktop replacement. I've been able to get away with that so far because the iMac isn't normally dramatically far ahead of the 17".
The heavy lifting was done decades ago by NeXT. They had NEXTSTEP running on different architectures, even different endian ones, in the 1990s. I'm guessing they already have OSX running on ARM in the lab.
I wouldn't be surprised if they pulled out a high-performance ARM chip (more cores) to completely replace x86. Someone's got to do it, right? Windows 8 is also headed in this direction with support for both ARM and x86, but all current ARM chips are low-power, low-performance and can't compete with x86.
We can only hope. x86 has so much backwards-compatibility baggage at this point — it’s too complex for its own good and all that extra silicon is just a power drain.
It's worth remembering that this has traditionally been offset by the fact that x86 chips are manufactured in such epic volumes and on such advanced processes that there was no way for more "efficient" silicon to dislodge it.
And unlike the transition from Power chips to x86, ARM won't be so much faster that something like Rosetta will be a workable transition solution for real applications -- effectively everything will have to be ported, or it won't work.
The epic volumes of ARM chips make it possible to overcome the manufacturing and scale issues, but doing complex super-custom ARM chips for PCs drops the chips back out of the mass market again unless an enormous number of PCs get converted all at once -- though this is something Apple is capable of.
Overall, there are enough logistic issues involved that I'm not convinced that ARM is the future of PCs unless the current ARM devices like iPads and Android tablets grow up to be our PCs of the future.
While we're at it, why not jettison the whole paradigm, and start over? ARM itself is getting quite complicated, and there are ideas to be picked from, say, the B5000, or Transmeta.
How can you even have 1000 people designing a chip? A microprocessor is not THAT complicated. Perhaps you can have people doing research on particular aspects, but research teams are typically small (maybe 5-15 people). It seems to me difficult to believe that it's even possible to have that number of people designing a single, insular hardware product whose manufacturing is being outsourced (so they don't have to build the making-machines).
I interviewed with their ASIC design team last year (they made me an offer, which I did not take).
The 1,000 number might be large, but it's definitely in the ballpark. Designing a microprocessor is not that easy. Besides the ARM core(s), you have the graphics cores, custom DSPs (I heard they have Siri-specific cores in the A5), power-control logic, various IO macros and other random stuff. You need an architecture team to figure out what needs to go there and how things interact, a microarchitecture team to develop the RTL, a test team to make sure the RTL behaves as planned, a physical implementation team to transform all this mess into polygons that run at the desired speeds, various tool/methodology support teams to develop internal flows, evaluate tools and make sure everybody is following the rules and nobody forgot anything, plus various levels of management.
Note that they are most probably working on a couple of different designs at the same time.
Though it makes sense that 1000 people on one component is an unrealistic estimate, your logic here is a caricature. How many different groups of people are, for instance, involved in a single game title? Similar dynamics in chip production: you have design teams, verification teams, tools teams, and support staff for 3-4 levels of functionality.
And that's before you start picking apart the system-on-a-chip stuff, which involves design and verification for all the components that the processor talks to, and tooling, layout, &c.
You'll likely have multiple R&D teams competing against each other, with support teams for each. Don't forget to include operations, assembly, shipping, and any teams directly needed to support this research and development of new chips.
It's not inconceivable that Apple would have many teams working on many different ideas and designs that compete with each other. Apple is brilliant at understanding you need to disrupt yourself, before other companies disrupt you.
Beyond the fact that they're probably working on multiple chips, a modern CPU is incredibly complicated, and SoC architectures add another level of complexity to the equation. The idea of having 100s of people involved in the design is not at all surprising.
Applied Materials (etc.) build the making-machines anyway.
As for "how can you have 1000 people involved", well, a modern chip can have a billion transistors on it. The MOS Technology 6502 design team was nine people, although maybe you could argue that only five of them were design engineers. The chip contained about 3,500 transistors, about 700 per design engineer.
If a modern design team applied the same amount of attention per transistor as the 6502 team, it would need to contain about 1.4 million people.
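That ratio is easy to sanity-check. A quick back-of-the-envelope sketch using the rough figures quoted above (3,500 transistors, five design engineers, a billion-transistor modern chip):

```python
# Back-of-the-envelope: transistors per design engineer on the 6502,
# scaled up to a modern billion-transistor chip.
transistors_6502 = 3500            # approximate 6502 transistor count
design_engineers_6502 = 5          # engineers doing the actual circuit design
per_engineer = transistors_6502 / design_engineers_6502   # 700 transistors each

modern_chip_transistors = 1_000_000_000
engineers_needed = modern_chip_transistors / per_engineer

print(round(per_engineer))         # 700
print(round(engineers_needed))     # 1428571, i.e. roughly 1.4 million people
```

Of course the point of the comparison is exactly that nobody works at that level of per-transistor attention any more; synthesis tools and reused IP blocks do most of the polygon-pushing.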
They probably don't all work on the same chip even if the final results are worked down into a SOC. They probably have teams working on the graphics side as well as the audio processor.
Agreed. This article is ridiculous. But hey, people love to read stories about Apple, no matter how fabricated and unrealistic they may be. Every article posted from TechCrunch should get a warning label placed on it.
I would never count Apple out based on their track record over the last decade, but I think it remains to be proven that they can execute on new devices without Jobs.
None of their big successes, from the iPod to the iPhone to the iPad, were anything brand new. They were better execution on an existing concept. We had portable music players, we had mobile phones, we had tablet computers... Jobs and Apple just out-executed everyone else in the market at identifying and delivering a better implementation.
Here they are apparently talking about really new stuff. Stuff we've never had before. Stuff that's "mind blowing." Last time I remember something like that from Apple it was called Newton.
I'm curious if anyone knows of any developers who now regularly use something like an iPhone/iPad (more likely the latter) to program. I know Paul Graham has written about this stage in the evolution of mobile devices, but is anyone doing it yet? For myself I've played with a Lisp interpreter written for the iPod Touch, but of course it was something of a ridiculous activity. If we can start doing that then I'll feel a bit more "post-PC".
I'm thinking Apple may start bringing out their own TVs/lounge-room hubs.
They're known for their high quality displays and they already have a lot of the other technology that would be coupled with it - FaceTime, Siri, iCloud/iTunes, App store.
In fact, I think Siri and FaceTime are both the key parts here. If you can get your TV to do things for you (e.g. book flights) and also use it as a simple communication point (FaceTime), it's in the realm of "future technology" that people think of when they imagine what's coming.
Exactly. There is nothing that would stop us from making smartphones water resistant and shock resistant and sewing them into clothes. It’s not that we don’t do it because it’s impossible; we don’t do it because it’s a stupid idea.
It's got to be something like that. Otherwise it would be a complete waste of resources. (5% of non-retail staff would be something like 10%+ of all engineering staff...)
Will they sell their chips to other vendors or will Apple treat the chips like their OS and not allow access beyond their sphere of control? Will they put in hardware-level controls to prevent jailbreaking?
The devices people referred to as "mobile" for all those years weren't ever really intended as PC replacements... except perhaps in the developing world, where there weren't many PCs to replace.
"The post PC era" is meant to refer to a world where (except in some specific professional contexts) people largely stop buying PCs because of a new class of mobile devices that takes over their workload entirely.
If I say "the mobile era" to people, that means something very different than "the post PC era". This is what language is for. We create new words so we can talk about new things, even when they're small changes.
Larry Ellison whines about the word "cloud" and insists it's just "servers". But "cloud" is about instantaneous provisioning and other infrastructure that lets you target an undifferentiated set of servers, rather than having to administer each server one by one. And unlike a "cluster" you don't generally own the whole thing. What word should we use, if not "cloud"?
Yes, it's still servers. Yes, the iPad is a mobile device. Yes servers and mobile devices are old. But whats your problem with using new words for the parts of these phenomena which are new?
>"The post PC era" is meant to refer to a world where (except in some specific professional contexts) people largely stop buying PCs because of a new class of mobile devices that takes over their workload entirely.
Then I think we're a long way from seeing this era, simply because anything that requires more than a couple lines of text isn't just better on a keyboard than other input devices—it's vastly, insanely better (at the moment), a problem I don't see being solved in the nearish future. Rather, I suspect many if not most people will have one PC-esque device for keyboard heavy things and one or more "mobile" devices, using your definition.
This also assumes you're not counting laptops under your "mobile" rubric, which I don't think you are.
"Then I think we're a long way from seeing this era"
Agreed, but we're just starting, maybe in a few years, who knows...
IMO, the term post-PC is a nice way to differentiate Android/iOS/WebOS/etc tablets from previous generation tablets that ran Windows or some Desktop version of Linux.
I think the laptop, rather than tablet, form factor is going to dominate in the medium-term future, because the portability vs usability tradeoff just doesn't encourage giving up your keyboard.
That doesn't necessarily mean that the laptops of the future will have a full-featured OS rather than a locked-down iOS-style OS. I mean, I'm sure yours will, and mine will, but will your mother's?
They'll just put their iPads into a keyboard dock, or bring a keyboard attachment like the HP TC1100 had. The keyboard could be something like 100g by itself too.