Has AMD's Clock Run Out? (thirdyearmba.com)
72 points by dlevine on Oct 23, 2012 | 59 comments



Times are tough, but I sure hope not. First to 64bit, first to multicore x86, first to on-chip memory controller, first to APU. We need them to keep pushing the envelope.

I'd love to see AMD at a 22nm process with Intel and then compare them. They've managed to stay close even two process generations behind (the chip that just came out today is at 32nm).


Sad.

If AMD hadn't been there in 2004, forcing Intel to (effectively) drop the Itanic and do what customers actually wanted - the AMD64 platform - we'd either be much further back now in terms of x86, or much further along in terms of ARM (or both).

AMD's guts to do AMD64 helped us all.


Not so much the Itanium (which simply nobody cared for) as the P4 dead-end, forcing Intel to come back to the P6 architecture (via the Pentium M).

They showed the path to better integration and performance (on-chip memory controller, significantly better interconnect, better multicore integration) as well.

It's just sad how much they lost their way since the Athlon 64, and how Intel's Core just curb-stomped them.


Actually, it was very much the Itanium. At the time, Intel had no plans whatsoever to introduce a 64-bit x86 architecture - they were the only game in town for a long time, and believed they could force the 64-bit market to be Itanium. HP and Compaq/Alpha had already given up on their 64-bit offerings at that point, and either blessed Intel as the 64-bit heir (HP) or sold it to them (Alpha). The only other game in town was SPARC, which Intel wasn't really facing in the 32-bit market.

While it is true that the P4 was going nowhere, it was the 64-bit market that forced Intel to reconsider their road map; If it wasn't for AMD, 32-bit might have sped up to Core, or stayed at P4, but we'd be nowhere close to where we are today.


Seeing the words "Athlon 64" reminds me of the halcyon days of the 4400+, one of the greatest innovations at its price point in the consumer-grade processor market. It really does bring a tear to the eye.


I am planning on building a mini-ITX system with a 65-watt A10 chip sometime soon, but AMD's Trinity didn't launch with mini-ITX mobos and the 65-watt chips aren't on Newegg yet.

The small form factor gaming/HTPC market could be a good one for AMD, but they haven't been able to get any other companies to build such machines in volume. Usually you get the EeeBox or something underpowered using an E-350. Heck, HP is sticking those in full desktop cases, which is absurd.

All the innovation left in the PC market seems to be on tablets and ultrabooks, not on the desktop at all. AMD hasn't pushed as hard on either form factor and it's hurt.

Also, why hasn't AMD done what Nvidia did and become an ARM chipmaker? Tegra has sold well enough for Nvidia, and an ARM desktop box could be quite competitive in the next few years for the average user.


The problem with AMD making ARM chips is that they would be cannibalizing their own market.

Their best move would probably be to license the ARM platform and use their chip designers to make custom chips (like Apple and Qualcomm have), but I don't think this is realistic. Also, I'm not sure whether any of their GPU technology is low-power enough to be useful in ARM designs.


Since they didn't cannibalize their market themselves, someone ate it out from under them.


More like cannibalizing Intel's market, these days at least.



ARM chips are cheap so it's a big business model switch. Not easy to do.


I don't know if AMD has anyone to blame but themselves. Over the past few years, I kept listening to the statements of their CEOs, and I got the impression that they "don't get it" and aren't very visionary.

When they were supposed to do something about the mobile market, they said they would "wait and see". That was even after Nvidia made the right move and created Tegra. Nvidia was clearly a more visionary company than AMD, and Nvidia will survive because of this. It might even out-survive Intel because of their move to ARM. AMD won't. They'll be crushed by both Intel and the ARM chip makers.


Who in their right mind, in the chip industry, would consider taking the CEO job at AMD? In order to thrive, you'd need to be smart, lucky, and able to convince Intel to let you. Intel, on the other hand, seems interested in the survival of AMD, but only inasmuch as it keeps the FTC and Justice Dept. from sending nasty letters about monopolistic business practices. Add to that, they both have to worry about ARM eating their lunches.

tl;dr Nobody with any sense would take the CEO job at AMD.


Not true, considering how much CEOs are paid regardless of whether the companies they manage ultimately succeed or fail. It would be financially irresponsible to turn down a CEO job offer at a high-profile company.


If game consoles do well in the next few years, AMD will do okay. The Wii U has AMD graphics, the PS4 will, and the Xbox successor is rumored to have AMD too.


Yes, it turns out that the ATI acquisition may save them. Nice hedge against struggles on the CPU side.


It turns out ATI's death sentence was effectively pronounced when it was acquired by AMD. ATI is now far behind Nvidia, and there is no catching up now that they depend so much on AMD's situation.


"Far behind"? Please elaborate. This may apply to the CPUs, which is unfortunate. But the GPUs are very competitive.


The entire history of the GPU business is one of companies leapfrogging each other. And unless you have more specifics of how 'they depend on AMD's situation so much', then it sounds like you're just making stuff up.


I wonder what the market effect will be if AMD exits the picture entirely. Once Intel has near-total market share for PC chips, wouldn't they become a prime antitrust target?


AMD has been second-sourcing Intel CPUs since the 1980s, and I've been hearing "how will AMD survive now?" for just as long.

Having a second source for strategic parts was said to be a requirement for US government/military procurement. My guess is that if AMD is really on the ropes, they'll get a DoE contract for a new supercomputer or three to hold them over for a while longer.


The question would be, would Intel be guilty of anything with AMD out of the picture that they are not guilty of now? Being the only player in the market is not illegal, the illegality is in keeping others out. If Intel didn't continuously crush upstarts, they should be fine.

I guess a question would be, would Intel be found guilty of illegally causing the demise of AMD?


>> I guess a question would be, would Intel be found guilty of illegally causing the demise of AMD?

I don't think so. However, it's worth mentioning that Intel certainly acted in anti-competitive ways to severely damage AMD's ability to gain market share.[1] The suit was eventually settled, though, with Intel paying AMD a huge sum.

http://en.wikipedia.org/wiki/AMD_v._Intel


The only reason Intel would be even close to anti-trust worthy is because they use their own fab tech that they keep ahead of the competition to make their processors hard to beat.

And it isn't even really their fault that they are building new fab plants a year ahead of anyone else. It would seem wrong to me to force Intel to sell off yields from their own plants on the open market if they don't want to, but that is the major reason they always dominate the PC space (besides the fact that most software is for x86 and they license the architecture - but I don't really buy that argument anymore; hardware virtualization has come a long way, and I can pretty effectively emulate x86 on ARM under qemu with dynamic binary translation and hardware instructions supporting it, which every major architecture now has).


ARM processors are starting to move up into the higher end as well, like servers. On an energy/computation basis they make more sense than x86 processors right now.

The only issue is that software needs to be optimized for ARM processors but if the savings are there, this will happen pretty quickly.

So I'd actually be worried for Intel as well. Not just AMD.


>> On an energy/computation basis they make more sense than x86 processors right now.

Does anyone have ANY numbers that back this up? I've heard this refrain many, many times, but I've never seen any hard numbers to prove it. On a processor-per-processor basis, sure, an ARM SoC beats an Intel Xeon. But per FLOP or per Dhrystone, x86s destroy ARM processors. In a virtualized world, where the number of physical systems doesn't have to match the number of servers, x86 still appears to hold the lead.


ARM cores tend to be more power efficient for three main reasons: they operate at a lower clock frequency, they don't have the CISC legacy baggage, and the low-level software guys have spent a lot more time on power management than on x86.

It has to do with the evolutionary heritage of both systems. Most x86 systems are still sold to individuals or small businesses that will plug them into the wall and forget about power dissipation. A typical x86 desktop machine will draw between 300-500 watts. ARM evolved more for the cell phone and tablet market, and typical power consumption for one of those systems would be under 5 watts.


>> A typical x86 desktop machine will draw between 300-500 watts.

With a beefy GPU, perhaps. The new i3-3220T CPU is only 30W max TDP, flat out - a typical (i.e. non-gamer) rig is looking more like 150W max, and a lot less idle (20W should be possible). Not 5W, but nowhere near 500W.


Yeah, that metric seems bizarre. It is worth mentioning, though, that the lowest-TDP chips Intel is selling are 17W mobile ones; comparing a desktop to a mobile chip seems like apples and oranges.

You also need to consider that Intel is at least a fab generation ahead of ARM chips. I have a Tegra 3 Transformer tablet and that chip was fabbed at 40nm, two generations behind Intel's 22nm, and it has a TDP of 15 watts. Of course under load it would run higher than that, but so would an Intel chip.


20W, 200W, what's the difference? Either way, you can't get that out of the battery on a mobile device. And consumers don't generally buy PCs based on power dissipation. Sad, but true.

And if you think your typical beige-box PC can handle a power supply that is specc'ed for 150W-- go ahead and put one of those in there. I DARE you.


The lower figure of 20W is pretty standard: go take a look at the battery in your laptop. Most Dells have 65 Wh batteries - i.e. 20W for about 3 hours of battery life (to a first approximation; Li-ion is a bit more complicated than that).
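
(Back-of-envelope check of that arithmetic, as a tiny C sketch - the 65 Wh and 20 W figures are just the ones quoted above, and it ignores the Li-ion discharge curve:)

    #include <stdio.h>

    /* Rough battery-life estimate: capacity in Wh divided by steady draw in W. */
    int main(void)
    {
        double capacity_wh = 65.0;   /* typical Dell laptop battery, per the comment */
        double draw_w = 20.0;        /* assumed steady system draw */
        printf("%.2f hours\n", capacity_wh / draw_w);   /* prints 3.25 hours */
        return 0;
    }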

The ubiquitous small form factor PCs like the Optiplex 780 (http://www.dell.com/downloads/global/corporate/environ/compl...) use a 235W (max) power supply, which will be overspecced to trade off failure rates for manufacturing cost. Those machines actually draw less than 150W flat out. And they're everywhere. A certain large e-tailer with an emphasis on frugality used to use them as developer desktops(!).

Who knows what's in a typical consumer beige box, but it isn't pulling 500W continuously, unless they're playing, say, Skyrim 24/7 with a big graphics card - in which case of course one would specify the correct (safe) component for the design. I'd argue that they're not typical by that point; most people won't spend £300 on a graphics card (I do).


So to recap:

* You point to a 235W power supply as an example of the bare minimum PC power supply - not too far from my 300W round number.

* You point out that a power supply rated for X isn't drawing X continuously-- a true statement, but it's responding to an argument nobody made. You have to pick a power supply which is rated for your max load-- everyone knows that, or should. It still doesn't change the fact that both max load and average load for X86 are orders of magnitude greater than for most ARM devices.


500 watts is a seriously overclocked system with an SLI/CrossFire gfx setup, or one of the super-high-end offerings like a dual-GPU single card. It's probably less than 1% of all desktop PCs.

My aging desktop with an E8400 and a Radeon 4850 draws 270 watts at full load, including the display - a number straight from the UPS.


What CISC legacy baggage? Most of the old and underused operations are emulated with microcode on modern x86 processors. Chip makers use profiling tools just like software makers, optimize the most common cases and your benchmarks improve. There are decades worth of engineering optimizations in Intel and AMD's chips. Intel has a massive engineering and R&D budget and competing with that is pretty damn hard.

It has nothing to do with evolutionary heritage and more to do with you repeating ancient myths that haven't been remotely close to the truth for a decade.


One example of legacy baggage is the fact that x86 doesn't reorder loads and stores as freely as ARM. It would be better for optimization, but it would break the creaky old software that is the lifeblood of the platform.
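
To make the ordering difference concrete, here's a minimal C11 sketch (the producer/consumer names are made up for illustration): the release/acquire pair below compiles to real barrier instructions on ARM, while on x86's stronger ordering model the same stores and loads are already ordered and cost essentially nothing extra.

    #include <stdatomic.h>

    int data;               /* payload written by the producer          */
    atomic_int ready;       /* flag the consumer polls, initially zero  */

    void producer(void)
    {
        data = 42;
        /* Release store: on ARM this emits a memory barrier so the write
           to 'data' becomes visible first; on x86 it is a plain store.  */
        atomic_store_explicit(&ready, 1, memory_order_release);
    }

    int consumer(void)
    {
        /* Acquire load: pairs with the release store above. */
        while (atomic_load_explicit(&ready, memory_order_acquire) == 0)
            ;               /* spin until the flag flips                */
        return data;        /* guaranteed to observe 42                 */
    }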

Another example of "evolutionary heritage" is the fact that x86 chips require a northbridge and/or southbridge, whereas with ARM chips everything is integrated on the chip. This was one reason why Atom-based designs often weren't that low-power - the CPU itself might be low-power, but the glue logic was thirsty. There is evidence that Intel is trying to change this and put everything on one chip.

I'm not trying to say that x86 will never succeed in mobile. I don't have a crystal ball. I'm just saying that the burden of proof is on Intel to prove that it can be cost and performance-competitive in that space. And I am not the only skeptic-- Apple and Microsoft use ARM for most of their mobile offerings.


There are certainly no server platforms that "make more sense" on a power/performance metric "right now". There are a few startups hawking their upcoming wares out there, but nothing I can buy with a credit card. And in any case at the very high end power/performance is still owned by Intel boxes (check the current record holders on Joulesort, for example).


I think that ARM-based chips will probably have significant market share by the time AMD exits completely.


The thing about AMD is that for every advantage it has two or more disadvantages. For example, Trinity rocks, but the number of laptop models available with it is staggeringly low, and in most cases you have to make do with low-end specs like crappy screens, no SSD option, and a case that feels like it was made from recycled Compaq PCs from the 1990s.

Which is ironic, since Trinity could drive a retina-like display without a discrete GPU on the side, yet I could only find a handful of Trinity laptops with optional 1080p displays, and two of them weren't available stateside. The only performance unit I could find was made by MSI.

There's nothing like the Zenbook or the ENVY 15 available, so in the end the problem isn't a compromise on CPU power alone but on nearly everything else too. So you have to choose: either you get a good laptop or an AMD laptop, and that's not fair.

I guess AMD should start working more closely with OEMs to make sure its APUs are available in products that are not all bargain-bin units - at least some mid-to-high-end units with good features and build quality.

That, or do like MS with the Surface and make their own high-end laptops and tablets.


I don't buy this story at all, mainly because AMD never could fight Intel in a straight-up fight. AMD is a company at least an order of magnitude smaller than Intel - so much so that Intel spent more money on R&D last year (http://newsroom.intel.com/community/intel_newsroom/blog/2011...) than AMD made in total revenue (http://phx.corporate-ir.net/phoenix.zhtml?c=74093&p=irol...).

That isn't even about monopolistic business practices, decisions, or market forces. You are comparing two companies operating on effectively different planes of existence. Intel owns the instruction set, has the most advanced silicon fabs in the world (and still makes their chips in house) and spends more on R&D than AMD even makes. And all Intel does is make CPUs.

Meanwhile, AMD bought ATI and took a tremendous gamble on APUs. They are just starting to mature their APU line with Trinity in the last few weeks, and are still reeling from integrating two large companies like that. They had to sell off their own fabs and couldn't even make their most recent generation of GPUs at GlobalFoundries because it isn't keeping up anymore. On that front, the 7000 series graphics cards (from my objective viewpoint) basically crushed Nvidia for the first time in a while. They were first to market, as a result didn't have major shortages, and cut prices at the appropriate times to keep their products competitive. It took Nvidia almost half a year to get their GPU line out after AMD, and their chips, at competitive prices, are almost exclusively OpenGL/graphics devices, beaten in GPGPU workloads by the old 500 series and easily by the 7000 series, because they went with many simple cores and a limited pipeline instead of the more generic cores in the 500 and 7000 series that were better at arbitrary GPU compute tasks.

So they are doing really well in graphics. And their APUs are really good graphics chips too. The only flaw in AMD right now is that they are floundering on the CPU fabrication front as badly as Nvidia did with their graphics line (only with their CPUs). Their chips eat power, they are effectively 1.5 generations of fab tech behind, and the Bulldozer architecture is weak in floating-point and serial operations.

That doesn't ruin a company. Hopefully next year is the year they really start moving forward, because I really think AMD is the company to finally merge GPU and CPU components into some kind of register/pipeline/ALU soup that could really revolutionize the industry (imagine SIMD instruction extensions to x64 that behave like OpenCL parallel operations, with the normal processor cores working on register ranges and vectors like a GPU, rather than just having a discrete GPU and CPU on one die).
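
For contrast with that idea, here's roughly what explicit SIMD on x86 looks like today with SSE intrinsics - a minimal sketch with made-up names, where the programmer picks the vector width and shuffles registers by hand rather than expressing the work as OpenCL-style data-parallel operations:

    #include <xmmintrin.h>  /* SSE intrinsics */

    /* Add two float arrays four lanes at a time.  Assumes n is a multiple
       of 4 and the pointers are 16-byte aligned, to keep the sketch short. */
    void vec_add(float *dst, const float *a, const float *b, int n)
    {
        for (int i = 0; i < n; i += 4) {
            __m128 va = _mm_load_ps(a + i);
            __m128 vb = _mm_load_ps(b + i);
            _mm_store_ps(dst + i, _mm_add_ps(va, vb));
        }
    }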

Even barring that kind of pipe dream, Steamroller is shaping up to be sound. It finally gets the die shrink AMD desperately needs to stay competitive, if only to 28nm, and finally puts GCN into their APU graphics instead of the 6000-series-era VLIW architecture.

They can't really stand up and fight Intel head-on anymore, because Intel got on the ball again, and their CPUs are crushing AMD in a lot of use cases, especially on power usage. But AMD still has significantly better graphics and is leveraging it, and they are finally getting over the ATI growing pains, so I'd wager they are still in the game, if only barely. They have a lot of potential still.

Footnote: I really think the market is a big reason AMD is falling behind. The Ultrabook campaign is stealing wealthy PC buyers from them, and that is where chip makers get the majority of their profits (look at the high-end mobile i7 chips selling for a thousand bucks). Desktop sales are abysmal outside of OEM systems and businesses. Intel wins business contracts by size alone; they just have more reach. Desktop enthusiasts can bank on AMD being a cost-effective platform, but the wow factor lies in Intel chips, even at the premium, so Intel steals that market too. AMD doesn't even do well in the cheap HTPC market because their chips burn so much power. They are at a crossroads where all their target markets are either becoming obsolete or they are losing ground - not because they have bad products, but because perceptions of them are getting worse and their influence is shrinking.

Right now, AMD is really strong in the mid-range. Mid-range laptops with a Trinity APU are really good and extremely cost-effective (I had a friend buy an A8-based Toshiba because it was $500 cheaper than a comparable Intel machine that could run League of Legends). Piledriver is good enough in the desktop space that I'd recommend one of the 4- or 6-core variants to friends looking for a budget PC gaming experience, because with a proper overclock they are pretty much more than enough for anything major. But AMD has (from my experience) a bad image right now as a dying company and a maker of budget goods, even though their GPUs kick butt and their desktop CPUs can (at least according to the recent Phoronix Piledriver FX benchmark) hold their ground against even Intel's best Ivy Bridge offerings in some cases, at almost half the price.

TLDR: I guess after graduating college I had withdrawal on writing essays. This is a really long wall of text, holy bacon.


Actually AMD owns the x86-64 instruction set and licenses it to Intel in return for the x86 license.

Also, doesn't Intel make GPUs too?


They made GPUs, and they are getting back into it, but any review of the HD 4000 graphics vs anything from the Trinity line reinforces that Intel is still way behind on the graphics front.

They are saying Haswell will be an improvement, but AMD has the architectural cohesion to pair discrete and integrated cards in their Hybrid CrossFireX, and they have a decade's worth of GPU experience from the ATI acquisition, so they are better positioned to exploit heterogeneous cores. It's the same way Nvidia's Tegra is a gaming/video powerhouse in the mobile world: because the GPU is so strong.

Also, x86-64 was/is a specification extension by AMD, not an instruction set, so I don't think Intel licenses it. They even spent a few years calling them AMD64 and Intel 64 even though they were written to the same spec. AMD still licenses x86 from Intel, though. It is like how SSE and other instruction set extensions are not cross-licensed between the two. https://en.wikipedia.org/wiki/X64#History_of_Intel_64


The market for computing devices has shifted. Desktop devices aren't as useful as mobile devices, which also use less energy to achieve the same goal. Consumers will leave behind companies which aren't making compelling mobile devices.

Intel makes SSDs too. Intel made chipsets and wifi chips for centrino in the past. I think Intel has made every major internal PC component, just not all at the same time. Now, they're getting into ARM chips and fighting to put x86 into mobile.

Likewise, nVidia has made nForce chipsets and other major PC components. And doesn't the Tegra also do general-purpose processing? Either way, they're becoming entrenched in mobile devices more firmly than Intel.

AMD is better positioned to come up with a unified product, but after 10 years of fanboying for both brands, I'm still not convinced that their merger made any sense. A unified product could have been made from a technical partnership, without having to merge the companies. I haven't seen a compelling product since the merger. I haven't even seen a roadmap for a mobile device component.

The AMD/ATI merger seemed only to happen because both companies were based in Canada. At this point, it only makes sense for AMD to swallow RIM, another Canadian company -- at least that way ATI would be a mobile device manufacturer.


I don't know how you reached the conclusion that AMD is doing really well in graphics and the 7000 series "crushed" NVidia's 600 series. Aside from numerous inaccuracies in your analysis (for example, the 600 series was released three months after the 7000 series, not six - March 22nd 2012 vs December 22nd 2011), in most benchmarks Nvidia fared really well against AMD with equal framerates and lower power consumption/temperatures.


The 680 came out really quickly, but was pretty consistently out of stock for about 2 months after it came out.

But I'm talking more about the 77xx vs the 66x lines, which are the most mainstream discrete cards in the series, where the 77xx came out in February and the 66x cards came out in August. So about 6 months.

For all the hype around the 7970 vs the 680, in the end the vast majority of their OEM card sales will be of less expensive hardware, which is why I think AMD "won" even if the 600 cards give better FPS at lower power usage in video games. They basically controlled the market for mid-range cards for almost half the year, and judging by the price cuts they have been making, they have been turning a really sizable profit on sales until Nvidia brings out a competitor.

I just want to mention I'm not an AMD fanboy - I have an i7 920 and a GTX 285 right now. My "last" build was around 2006 and was an Athlon X2 with a 1950.


Perhaps, but we don't know yet. We don't know if "Hondo" (AMD) will compete with "Clover Trail" (Intel) for Win 8 tablets, because they might really be competitively priced. Further, we don't know why those two are Windows 8-only currently. There might be a Linux tablet in the future for "Hondo", but "Clover Trail" might not get one because of the PowerVR SGX graphics core.

Also I've spent most of my evening reading reviews of the new AMD CPU they released and it is looking good for budget enthusiasts.


I agree with this article. I don't even consider buying AMD-equipped machines these days, whereas in the past I always bought AMD.


I hope not; I just purchased an AMD FX-8150 8-core processor, a Gigabyte motherboard, and 16GB of RAM for $399. So far it seems like a good machine. Performance is not as good as an i7, but it's quite a bit cheaper.


At the moment Intel owns the high end, but AMD is a pretty good value if you're not going to use your computer mostly for lightly threaded things like video games.

http://techreport.com/review/23750/amd-fx-8350-processor-rev...


Did Athlon and Opteron really do that well, or did NetBurst just do really badly?

I want AMD to continue to compete and push but one bad architecture takes a while to overcome. Bulldozer++ needs to deliver.


NetBurst was bad because they hit a MHz ceiling on the architecture and couldn't push the thermal envelope any higher, while AMD went with dual cores and x64, which proved to be the correct path. They just hit a home run.

What nobody ever seems to consider is that AMD has less revenue than what Intel spends on R&D. The fact that they can even compete on the scale they do with Intel is a testament to their success. Intel was always way too big for AMD to compete against directly, and after the Athlon's success they tried to move into the big leagues and fight Intel one on one, and lost just due to raw funding (I'd argue, at least). It is why they had to sell off Global Foundries to buy ATI, and such.


This is great news - clockless architectures are the way of the future.


Another way of looking at it is that AMD was always David to Intel's Goliath. Intel screwed up big time with the Pentium 4; they designed the architecture for an extremely high clock rate and ran headlong into a thermal-dissipation brick wall. The Itanium, Intel's strange attempt to kill the x86 architecture that had brought them so much money, was another huge blunder. AMD exploited these opportunities. But unless Intel makes another big mistake, capitalism will do its thing and force AMD out of the market.

There was an announcement that AMD will make ARM-based chips in the near future. That niche might have some more air supply than the one they're in currently. However, they'll have a lot of competitors in the ARM space, so who knows.


I don't think Itanium was Intel's attempt to kill x86. They legitimately thought its architecture would represent an improvement and a good way to move to 64 bits.

Like Bulldozer, it was taking a risk that didn't pan out as hoped. Thank goodness CPU companies are still taking interesting risks.


Wonder if AMD would be able to fill or partially fill a custom Apple chip order, given that rumours suggest Apple is shopping around for a supplier?


Well, AMD will be making the chips in the next Xbox and the next PlayStation, and the graphics for the next Wii, so that's something at least.


I don't see Apple switching to AMD; the performance and efficiency just aren't there.


It's kind of confusing because technically GlobalFoundries is separate from AMD. So if Apple wanted to manufacture their own custom ARM chip, would they be talking to GF, AMD, or both?

I also wonder what role AMD proper (as opposed to GF) will have in some kind of theoretical future world where ARM provides all the chip designs. Wasn't architecture sort of their main thing previously? I know a lot of companies add their own little things on to the base ARM designs, but it still seems like AMD will have to scale back their design team considerably in such a scenario.


I see what you did with that title.


lol I came here to post that. Upvotes for you!



