Jensen has said for years that 30% of Nvidia's R&D spend goes to software. Needless to say, as they continue to crush it financially, that software budget keeps racing further past AMD's.
Turns out people don’t actually want GPUs, they want solutions that happen to run best on GPUs. Nvidia understands that, AMD doesn’t.
Lisa Su keeps talking about “chips chips chips” and MAYBE “Oh btw here’s a minor ROCm update”. Meanwhile, Nvidia continues to masterfully execute deeper and wider into overall solutions and ecosystems - a substantial portion of which is software.
Nvidia is at the point where they're eating the entire stack. They do a lot of work on their own models and then package them up nice and tight for you with NIM and Nvidia AI Enterprise, on top of stuff like Metropolis, Riva, and countless other things. They even have a ton of frameworks to ingest/handle data, finetune/train, and then deploy via NIM.
Enterprise customers can be 100% Nvidia for a solution. When Nvidia is the #1-#2 most valuable company in the world “no one ever got fired for buying Nvidia” hits hard.
The people who say “AMD and Nvidia are equal - it’s all PyTorch anyway” have no view of the larger picture.
With x86_64, day one you could take a drive out of an Intel system, put it in an AMD system, and it would boot and run perfectly. You can still do that today unless you build something for REALLY specific/obscure CPU instructions.
Needless to say that’s not the case with GPUs and a lot of people that make the AMD vs Intel comparison don’t seem to understand that.
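For what it's worth, here is roughly what the "it's all PyTorch" crowd means, as a minimal sketch (sizes are arbitrary). The ROCm build of PyTorch maps AMD GPUs onto the same torch.cuda API, so this particular script really is vendor-portable. The lock-in shows up one layer down, in custom CUDA kernels, cuDNN/NCCL-tuned paths, and the surrounding tooling, which is exactly the part the drive-swap analogy doesn't cover.

    import torch

    # Plain PyTorch code is device-agnostic: the ROCm build of PyTorch
    # exposes AMD GPUs through the torch.cuda API, so this runs unmodified
    # on Nvidia (CUDA) or AMD (ROCm) hardware.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    model = torch.nn.Linear(4096, 4096).to(device)
    x = torch.randn(64, 4096, device=device)
    y = model(x)

    print(torch.cuda.get_device_name(0) if device.type == "cuda" else "cpu")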
What needs to happen for GPUs to be commoditized so that an infrastructure builder is relatively indifferent to using GPUs from Nvidia, AMD or any other provider?
Either Nvidia has to open up CUDA to third-party providers, or someone (OpenCL?) has to finally manage to create a high-level, cross-hardware abstraction that has all the features of CUDA, can match CUDA's performance on Nvidia hardware, and is as easy to use as CUDA. Honestly, I'm not sure which is more unrealistic.
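To be fair, a basic cross-hardware abstraction already exists today. Here is roughly what a vendor-neutral kernel looks like with OpenCL via pyopencl, as a hedged sketch rather than a benchmark: the same host script and kernel source run on Nvidia, AMD, or Intel devices. What it doesn't give you is CUDA's library ecosystem (cuBLAS, cuDNN, NCCL), tooling, and tuned performance, and that gap is the whole problem.

    import numpy as np
    import pyopencl as cl

    # Vendor-neutral vector add: the same kernel source runs on whatever
    # OpenCL platform is installed (Nvidia, AMD, or Intel).
    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(1 << 20).astype(np.float32)
    b = np.random.rand(1 << 20).astype(np.float32)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    prg = cl.Program(ctx, """
    __kernel void vadd(__global const float *a,
                       __global const float *b,
                       __global float *out) {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)
    assert np.allclose(out, a + b)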
Heh, what a weird story, I didn't know that website ran stealth ads.
Nvidia has been trying to catch up to the AMD juggernaut for years, including having sockpuppet accounts on even tiny websites like ours, and it seems to have been paying off, I guess? They claim larger revenue, but most of it is just a by-product of rent-seeking and price gouging, combined with a moat that they sell to the user as a software stack. They can't keep up with the actual hardware side of the business, and rely on a good PR team to shore up the difference, including getting universities to teach "how to CUDA" classes instead of actually useful classes with transferable skills.
When it comes to actually important things, like perf/watt, perf/slot, and perf/$, unless you were stupid and let yourself be locked into a CUDA-only solution, why would you pick Nvidia? And let's say you were a gamer, not some compute customer: why would you buy Nvidia, unless you really wanted to spend $1600 and 600 watts on a card that barely outruns a $1000/450 W 7900 XTX? Even in games that are "Made For Nvidia" (i.e. botched PC ports that border on an easy-to-win antitrust suit), RDNA3 across the board is still better than Lovelace.
And to be completely clear, many gamers aren't PC gamers, they're console gamers: the Xbox One, XSX, PS4, and PS5 are all AMD, and Nvidia didn't win the contract for any of them, purely for technological reasons (they couldn't meet the perf/watt requirements for even remotely the same $). What did they win? The Switch, because they had a warehouse full of chips meant for the gaming-tablet revolution that didn't (and wouldn't have) come; it was easier to convince Nintendo to buy them at a loss (they didn't even break even on the original 20nm run of the X1) than to make literally zilch on the run.
Given how ubiquitous AMD hardware is, both inside and outside of gaming and enterprise, I just find it utterly baffling to think Nvidia somehow has so much brand and goodwill that Wall Street would value it in the super-exclusive $T club.
Well, their lack of software spend also hurts their GPU driver quality. I have been an AMD graphics fanboy since they were ATI, and yet I finally bought Nvidia because I like it when my video games don't crash for stupid fucking reasons. This affected Unreal Engine games way worse. Now I can finally play whatever game I want, even running the GPU full bore, struggling its absolute best to give me any frame it can, and not have any worry about stability.
Meanwhile, for several years, the default GPU driver for my RX 5700 XT used a fan curve that was hard-capped at 20% fan speed, such that the card consistently overheated itself and died. Every single game I played that pushed that card hard would crash it. Despite the card having suitable grunt, I would have to turn down graphics settings in games to avoid crashes. That just doesn't happen with my modern Nvidia card.
But will people keep buying AMD GPUs as gaming tech advances and AI tools like LLMs start getting integrated into games to make their worlds more realistic and expansive?
AMD should be much more successful, with Tesla, Microsoft, and Sony all using their chips exclusively in their top devices, but it looks like the big corps milked those deals to the point that AMD makes no profit at all on them. Nvidia's salespeople and marketing department are clearly a decade ahead.