The B580 being a "success" is purely a business decision: it's a loss leader meant to get Intel's name into the market. A larger die on a newer node than either Nvidia or AMD means their per-unit costs are higher, yet they are selling it at a lower price.

That's not a long-term success strategy. Maybe good for getting your name in the conversation, but not sustainable.


It’s a long-term strategy to release a hardware platform with minimal margins in the beginning to attract the software support needed for long-term viability.

One of the benefits of being Intel.


Well yes, it's in the name "loss leader". It's not meant to be sustainable. It's meant to get their name out there as a good alternative to Radeon cards for the lower-end GPU market.

Profit can come after positive brand recognition for the product.


I was reading this whole thread as being about technical accomplishment and non-Nvidia GPU capabilities, not business. So I think we're talking about different definitions of "success". That definitely counts, but it's not what I was reading about.


I don’t know if this matters, but while the B580 has a die comparable in size to a 4070's (~280mm^2), it has about half the transistors (~17-18 billion), iirc.


Tom Petersen said in a Hardware Unboxed video that they only reported “active” transistors, such that there are more transistors in the B580 than what they reported. I do not think this is the correct way to report them, for two reasons: one, TSMC counts all transistors when reporting the density of its process, and two, Intel is unlikely to reduce the reported transistor count for the B570, which will certainly have fewer active transistors.

That said, the 4070 die is 294mm^2 while the B580 die is 272mm^2.
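
For a rough sense of what those numbers imply about density, here is the napkin math in Python. The ~35.8 billion transistor figure for the 4070's AD104 die is the commonly cited spec, not something stated in this thread:

    # Transistor-density comparison; AD104's ~35.8B count is the commonly
    # cited spec sheet figure, not something stated in this thread.
    b580_transistors  = 17.8e9   # Intel's reported "active" count
    b580_area_mm2     = 272.0
    ad104_transistors = 35.8e9
    ad104_area_mm2    = 294.0

    b580_density  = b580_transistors / b580_area_mm2 / 1e6    # ~65 MTr/mm^2
    ad104_density = ad104_transistors / ad104_area_mm2 / 1e6  # ~122 MTr/mm^2
    print(f"B580 is {1 - b580_density / ad104_density:.0%} less dense by reported count")

That comes out to roughly 46% lower density by reported count, which is consistent with a lot of transistors going unreported under the "active only" convention.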


Is it a loss leader? I looked up the price of 16Gbit GDDR6 ICs the other day at DRAMeXchange, and the cost of 12GB is $48. Using the Gamers Nexus die measurements, we can calculate that they get at least 214 dies per wafer. At $12095 per wafer, which is reportedly TSMC's price for 5nm wafers in 2025, that is $57 per die.
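
For anyone who wants to check that arithmetic, here is a sketch using the standard dies-per-wafer approximation (die geometry and scribe-line losses are simplified away, which is why the measurement-based 214 figure above is a bit lower):

    import math

    # Standard dies-per-wafer approximation for a 300mm wafer.
    wafer_diameter = 300      # mm
    die_area       = 272      # mm^2, B580
    wafer_price    = 12095    # USD, reported 2025 TSMC 5nm wafer price

    wafer_area = math.pi * (wafer_diameter / 2) ** 2
    edge_loss  = math.pi * wafer_diameter / math.sqrt(2 * die_area)  # partial dies at the rim
    dies = int(wafer_area / die_area - edge_loss)
    print(dies, round(wafer_price / 214))  # ~219 dies by this formula; $12095/214 = ~$57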

While defects ordinarily reduce yields, Intel put plenty of redundant transistors into the silicon. The amount of redundancy is ordinarily impossible to estimate, but Tom Petersen said in his Hardware Unboxed interview that they did not count redundant transistors when reporting the transistor count. Given that the density based on reported transistors is about 40% less than what others get from the same process, and the silicon in GPUs is already fairly redundant, they likely have a backup component for just about everything on the die. The consequence is that they should be able to use at least 99% of those dies even after tossing unusable ones, such that the $57 per die figure is likely correct.
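
A toy Poisson yield model shows why heavy redundancy makes near-total salvage plausible. The ~0.1 defects/cm^2 for mature TSMC N5 is my assumption, not a number from the interview:

    import math

    defect_density = 0.10   # defects per cm^2, assumed for mature TSMC N5
    die_area       = 2.72   # cm^2 (272 mm^2)
    lam = defect_density * die_area

    # Fraction of dies with at most k defects (Poisson); if redundancy can
    # route around up to k defects, those dies are all sellable.
    def usable(k):
        return math.exp(-lam) * sum(lam**i / math.factorial(i) for i in range(k + 1))

    print(usable(0), usable(1), usable(2))  # ~0.76, ~0.97, ~0.997

Tolerating even two defects per die already puts usable yield above 99%.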

As for the rest of the card, there is not much in it that would not also be part of the price of an $80 ASRock motherboard. The main exception is the bundled game, which they can likely get in bulk at around $5 per copy. This seems reasonable given how much Epic Games pays for its giveaways:

https://x.com/simoncarless/status/1389297530341519362

That brings the total cost to $190. If we assume ASRock and the retailer both have a 10% margin on the $80 motherboard used as a stand-in for the cost of the rest of the card, then it is $174. Then we need to add margins for board partners and retailers. If we assume they each get 10% of the $250, that leaves a $26 profit for Intel, provided they have economies of scale such that the $80 motherboard approximation for the rest of the card's cost is accurate.
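
Putting that whole chain in one place (all figures copied from above; the 10% margins are the same assumptions):

    gddr6 = 48    # 12GB of 16Gbit GDDR6 at DRAMeXchange pricing
    die   = 57    # per-die cost at $12095/wafer and 214 dies/wafer
    board = 80    # $80 ASRock motherboard as a proxy for everything else
    game  = 5     # bundled game at an assumed ~$5 bulk price

    total = gddr6 + die + board + game    # $190
    # Strip the assumed 10% ASRock and 10% retailer margins from the proxy.
    cost  = total - 2 * 0.10 * board      # $174
    msrp  = 250
    channel = 2 * 0.10 * msrp             # $50 for board partner + retailer
    intel_profit = msrp - channel - cost
    print(total, cost, intel_profit, intel_profit / msrp)  # 190 174.0 26.0 ~0.104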

That is about a 10% margin for Intel. It is not a huge margin, but given enough sales volume (to match the volume ASRock gets on its $80 motherboards), Intel should turn a profit on these versus not selling them at all. Interestingly, their board partners are not able or willing to hit the $250 MSRP, and the closest they come is $260, so Intel is likely not sharing very much with them.

It should be noted that Tom Petersen claimed during his Hardware Unboxed interview that they were not making money on these. However, that predated the B580 becoming a hit and likely assumed low production volumes based on low sales projections. Since the B580 is a hit and the napkin math says it is profitable as long as they build enough of them, I imagine they are ramping production to meet demand and reach profitability.


That's just BOM. When you factor in R&D, they are clearly still losing money on the B580. There's no way they can recoup R&D this generation with a 10% gross margin.

Still, that's to be expected considering this is only the second generation of Arc. If they can break even on the next gen, that would be an accomplishment.


To be fair, the R&D is shared with Intel’s integrated graphics, as they use the same IP blocks, so they really only need to recoup the R&D needed to turn that IP into a discrete GPU. If that was $50 million and they sell 2 million of these, they would probably recoup it. Even if they fail to recoup their R&D, they would lose more money by not selling these at all, since no sales means recouping $0 of it.
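
Using the ~$26/unit figure from the napkin math upthread against that hypothetical $50 million (a sketch, not Intel's actual numbers):

    # Break-even on the discrete-specific R&D; $50M is the hypothetical
    # figure above, $26/unit is the napkin-math profit from upthread.
    rnd             = 50e6
    profit_per_unit = 26
    print(rnd / profit_per_unit)  # ~1.92 million units to break even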

While this is not an ideal situation, it is a decent foundation on which to build the next generation, which should be able to improve profitability.
