Nvidia to Challenge Intel with Arm-Based Processors for PCs (bloomberg.com)
153 points by marc__1 11 months ago | hide | past | favorite | 134 comments



I think AMD is the most likely to win this race, as they are far more reasonable than Nvidia and Qualcomm on margins and better at working with vendors.


That is a reasonable assessment.

Nvidia and Qualcomm are just so... Nvidia-and-Qualcomm-like. They can't help themselves.

That being said, history also suggests AMD will bungle the software side.


I’ve always been baffled by this, why hasn’t AMD brought in an army of software devs


So much easier said than done. Have you met those kernel/SW people? They are hard to poach and hard to manage well.


I did kernel dev and drivers for over a decade on embedded systems; people in the space are not easy to find, especially with experience. Because of that it is not hard to make 400-500k/year working remotely from my garden in the EU (and I have been for the past 20 years). Why would I move and go to an office for similar or less? I would accept a position at Nvidia if the conditions were similar, and I know my colleagues (who have also been working from home for most of their careers) would, but that's not on offer.


Where are you making 400-500k a year doing systems programming? Embedded or kernel dev usually doesn't pay so great.


Having professionally been doing embedded/kernel/systems development since 2011 and a variety of software and hardware engineering before that, I'd say embedded software pays decently, but I'm definitely not making $400-500k USD/year, not even half that. Maybe I'm doing it wrong?

In my experience, pretty much all software engineering positions pay better than hardware engineering positions with equivalent experience. I don't fully understand why, as hardware is hard and if you have a really good hardware engineering team you can shave months off your time to market, so those engineers should be just as valuable to a business as systems/embedded software people.


I've found it mostly true during my 20+ years career in IT (mostly devops, but also "solution architecture", cloud migration of legacy software, and even integration early on, etc) that if one wants to make more money there is almost always an opportunity to do so if one is willing to work hard and take risks for it. Switching jobs every 2~3 years and doing it with a plan in mind is one way of doing it.

Based on that, people end up earning what they consider "a comfortable wage". Some people are not going to feel comfortable until they can buy a jet for their acro-flying hobby; some are happy when they can easily save a year of income and afford a new PC every 3 years.

Other people have no idea these opportunities exist and they are miserable working in one place for a decade. Or they are not willing or can't take risks.

That's how we end up with two people doing similar jobs, one earning $100k and the other $500k. In these days of remote work, the biggest differentiating factors in earning potential IMO are your English (if not a native speaker) and willingness to work in timezones 6h+ away.


I have had a company for 30 years (only me in it, but having an actual company and looking pro does make you more money and gives you more freedom, like WFH for all the years I have worked after/during uni) that works for other companies, creating products for/with them. If you work like this for a company like Gemalto or for startups with VC investment, 250-300/hr is on the table. I have almost no overheads and invest almost everything, so gross has been close to net for practical purposes.

But you are right; the hardware market is harder than making web stuff. I just enjoy hardware/firmware more. Currently I am starting a new hardware/firmware project for a US company that will last 3 years with this income.

It depends where you live, how you spend and how you count; I optimised everything for the past 25+ years around making sure I never have to think about money if I dropped everything today. Your situation might be very different of course.


Don't you hit the problem that after six months you can't be a contractor anymore and have to be classified as an employee (on employee rates)?


No, the entire setup I have is geared to function as a company: I have many clients (who keep coming back because they can kick me out any time, which is a bit of an issue in the EU, but then they don't), and I also make products, which further shields me from being classed as a 'de facto employee'. I had many discussions (especially) with the Dutch tax authorities, who tried to catch me out; it never worked.


Pretty cool, thanks for sharing with us! It originally sounded like you were earning this from a specific job, but yes if you can build a successful business then your earnings can get arbitrarily high. I'm not sure most of us would describe it as "not hard" though! Congrats!


> I don't fully understand why

I think it has to do with the VC faucet, a completely irrational capital allocation mechanism.


I wish to know the same... I've been doing it as a hobby for years, but now I'm having financial FOMO lol


Yeah, I have — and I got the impression that stability and long-term goals are paramount. You need a good manager, but who doesn’t?


Pay them more money maybe?


Because of stock appreciation, they'd need to pay up to 10x what the other company is already paying.

Consider a senior SWE at e.g. NVIDIA making $300k, let's say halfsies split between base and RSU. The $150k RSU grant from a few years ago that still hasn't fully vested is worth $1M now. So even for a 50% increase in nominal total comp, it doesn't make financial sense to switch.

I don't know that anyone can afford to pay their engineers 10x more than the competition, and then also offer a cheaper product with thinner margins.


What percentage of the cost is the engineering?


Issues like this typically boil down to management. If they had a driven, forward-thinking manager who could deliver value early and impress their superiors, then they could have the leverage to push for more devs, a change in philosophy, or more resources for the software arm. These things take time though, of course.


Hardware never does software well because software is more expensive on a per person basis.

That disrupts the organization because the hardware people are arguably smarter and more talented than software people. They have actual engineering degrees.

A good software org has to value its people. A good hardware org is a manufacturing operation that tries to minimize personnel costs.


They have, and that's the problem.


And AMD is so AMD.


It pisses me off that AMD goes into either low-end/cheap laptops or extremely gaudy gaming machines. There's this high-performance ultrabook niche that I wish they would make inroads into.

It is impossible to find: AMD + high resolution screen (3840x2400) + 120hz + OLED. In the last year even the Intel ones downgraded though, everyone went 3200x2000 @ 120hz OLED.

It's just frustrating when you're looking at upper-end AMD laptops and you cannot find 16GB of memory, or a decent screen. I'm glad they can dish out $500-$800 laptops, but I don't know why OEMs are always pairing the good components with Intel.



Two years ago I got the AMD Advantage edition HP Omen: 5800 CPU, 6600M Navi + embedded Vega, 32 GB RAM @ 3200, 16” QHD @ 165 Hz, 1 TB NVMe + an extra NVMe slot. This was stock; I just added an extra 1 TB NVMe drive. Cool and quiet. It even has full-size cursor keys and a row for home/end and friends, including print screen. And lots of expansion ports.

AMD does offer solutions; it's the vendors that ignored Navi and insisted on supplying the power-guzzling 3060 instead. Blame the customers for that.

Good laptop with 3 flaws: it should have been 16:10, the touchpad driver is terrible, and the power supply is a brick. Fix those and you would have had a perfect laptop.


Have you seen the modular offerings from https://frame.work/

Given the nature of their laptops, even if the laptop you want isn’t there yet it’s only a matter of time before you get the module that you want


Really I have the most trouble with getting a high-end GPU and high resolution screen on an AMD laptop. Framework is great, but doesn't offer what I'm looking for.

Need me some 2160p or 2400p @ 120hz OLED. I'm often in hotel after hotel after hotel and I don't have a docked setup. I just have the 1 laptop. Certainly not toting around a high-end portable monitor for this need, you can't find 120hz OLED anyway. Another poster linked how Intel influenced what OEMs were offering with AMD CPUs and I think that's exactly what's going on. When you go to Dell.com you can very easily see the many options for Intel-CPU'd laptops, and the handful or less of AMD options for screen configurations, hard drives, memory, or graphics cards. It's just very fishy.


> It's just frustrating when you're looking at upper-end AMD laptops and you cannot find 16GB of memory, or a decent screen.

This is not correct. Just looking at Lenovo (and surely, there are similar machines from other brands), even a mainstream laptop like the Yoga 7 14" AMD ships 16 GB, and it has a 2.8K OLED display as option. If you want more, the T14(s) can ship with 32 GB and matte display.


You are right, but 2.8k is them again gimping AMD. We should be expecting 2160p minimum.


Also check the Asus Zenbook line; they have models with AMD CPUs, OLED hidpi displays and 16GB/1TB.


AMD doesn't make laptops, take it up with Acer, Asus, Lenovo, etc etc.


If AMD makes the Linux versions of the various software surrounding their ARM CPU open source, it'll likely win by default. Even if Nvidia's and Qualcomm's offerings perform a bit better, nobody wants to put up with those two's nonsense if they don't have to.


Yes. The only reason I'm on Intel for my main (general purpose) computer is that I want reliable software that will just work on a clean install, without having to modprobe anything or modify GRUB.

AMD is cheaper (and uses less power for the same performance), so for my NAS and laptop it's my choice, and if I had to build a gaming computer I would probably use AMD too.


Plus their product is an almost one-to-one replacement for Intel's. There's no question of if X will work, and you don't need to port anything before doing performance tests.


Qualcomm already ships SoCs for Surfaces, Chromebooks and Galaxy Books. Those chips lag in performance, and I’m curious how much of them catching up will just be “updating to the latest ARM IP”

NVIDIA also shipped Tegra SoCs with their own Denver cores, but they scrapped the project


Apple's ARM switch worked because they have tight control over the whole software stack, and actually worked really hard to provide transparent support for older x86-64 software. Windows for ARM has none of this support and has basically zilch market share as a result.

If Microsoft releases an equivalent of Rosetta at the same level of quality, it will be a different story. But I can't see what influence Nvidia has over this.


> If Microsoft releases an equivalent of Rosetta at the same level of quality it will be a different story.

Microsoft has had their own Rosetta equivalent for a few years now. In theory application compatibility is more comprehensive than Rosetta as it supports much older applications, including 32-bit applications. In practice (likely as a result of trying to support so much more stuff) application compatibility isn't as rock solid as Rosetta for every application.

https://learn.microsoft.com/en-us/windows/arm/apps-on-arm-x8...


Apple's Rosetta uses some special hardware in the M1 and later chips to make some x86/x64 translated instructions much faster. I believe it was the load and store instructions.

Microsoft doesn't have access to these instructions for its own implementation (because they are not ARM-standard), but theoretically it could have Qualcomm or other SoC providers add custom hardware that does the same. In a fragmented landscape that can be hard, though: some Windows laptop SoCs might have it, some might not.


Microsoft could still work with their partners to implement this. Yes, the PC market tends to be fragmented, but what I assume would happen is that some ARM PCs would have the hardware acceleration, and for those it would lead to much better performance. The ones without hardware acceleration would just fall back to software emulation.

And as time goes on, more and more OEMs would probably favor adding the hardware support because people would prefer to buy the ones with faster x86 emulation.


Rosetta on the M1 enables the total store ordering from x86, which has a stronger memory model than that of ARM. It also can compute the parity flag in hardware <https://bytecellar.com/2022/11/16/a-secret-apple-silicon-ext...>.


If I remember right, total store ordering is actually a standard ARM extension, and not something Apple came up with on its own. I'm not sure about the flag computations, I suppose those are custom?


I think it’s slightly more complex than that. Apple included in their processor design custom extensions to ARM that alter things like memory order to make emulation easier [1]. It’s not just control over the software stack that makes Rosetta so performant.

[1] https://github.com/saagarjha/TSOEnabler


Nvidia could do the same


Which would only help them if they'd roll their own emulation layer instead of relying on what Microsoft has.

Nvidia is a software company, but I doubt they want to deal with making decades of Windows binaries work.


Microsoft and Nvidia have a history of collaborating to adapt Windows APIs to Nvidia hardware. DirectX 10 was practically designed around Nvidia's next-gen hardware spec.

If Microsoft is serious about ARM, there's no reason they couldn't work together to ensure the CPU is optimized for Microsoft's emulation layer.


In general Microsoft are in favour of doing anything that sells more copies of Windows and I have no doubt supporting an extremely fast Nvidia CPU on ARM is possible. It might never run software as well as x86 but maybe if it ends up being faster for games (which could be possible with MS and NVDA support) then people will buy it.


> Nvidia is a software company, but i doubt they want to deal with making decades of windows binaries work.

Isn't Nvidia already doing this with their drivers? Games are notorious for suboptimal or downright incorrect graphics API calls that get patched in the graphics driver on a per-game basis, at least for the AAA games of the past.


Apple has fat binaries, and made Xcode produce them by default, with seamless cross-compilation. AFAIK Windows has nothing like that. There are no fat binaries. Cross-compilation is fiddly and needs to be done manually.

There is a massive difference between developers just needing to press the "Build" button again (with no code or config changes), and requiring developers to manually add support for both architectures the hard way.


I think it might be interesting though.

Microsoft is a software company. Nvidia is a hardware company.

Although they don't have the tight integration that Apple has, there might not be the limitations Apple adds, either.


What do you call the Surface line and the Xbox?


Also, Rosetta isn't perfect. Apple doesn't have gaming. Windows does. Even if MS were to release a Rosetta equivalent, one of their core market uses would be crippled.


This is where close partnering helps. Microsoft can lead the way in building all their games (a considerable library, especially after the Activision acquisition) with universal binaries, and provide great tools for third parties to build games the same way.

The games that aren't recompiled will (after 2 years or so) fall far enough behind in requirements compared to the state of the art, that the emulation will be fast enough to play those older games.


I doubt Starfield will ever be emulatable on current gen hardware.

HOWEVER, MS does have an ace: they can make the next Xbox an ARM machine. That would guarantee that all game devs either migrate to ARM or face being on an emulator with reduced capabilities, which would create the needed tech. But it would also help Apple... so MS is likely to try to avoid it.


This person got Overwatch 2 working at a decent framerate (100+ FPS) on an M1 Pro:

https://www.reddit.com/r/macgaming/comments/129jzv4/overwatc...

Starfield should be barely playable if someone manages to get it to run (the game requires x86 instructions Rosetta doesn't currently support). With the M3 around the corner it should be very playable, which falls in line with the 2-year timeline.


Apple doesn't have (tight) control over the whole software stack. This is something people say and probably believe.

They don't write Haskell, Postgres, Rust, Java, Scala, Firefox, Spotify, VS Code, Postico or any number of the other software that I run.

They (Apple) released _their_ software and the necessary tools to make it possible for all the software vendors to port all their code in their own time, basically. But it all ran (runs!) on both Intel and ARM, still.


Most of the software mentioned was working fine on ARM-based SBCs long before Apple switched to it.


I know! And it was not under Apple's control in any sense. This is my point!


Why has this been downvoted? Who is upset by this?


I really think Intel will be hard to displace on the desktop.

There is a very low tolerance for incompatibility and ARM, whilst great in controlled environments, just cannot compete with Intel architecture in compatibility.


For better or worse, the last 15 years saw the industry move to a highly-abstracted platform-agnostic software ecosystem. People perform many of their tasks using applications running in a browser, in a wrapped browser engine like Electron, or in products using other portable runtimes like .NET or the JVM.

Meanwhile, much legacy software that hasn't made this transition often comes from an age of much lower performance and so can afford the penalties of emulation against modern hardware. Other legacy products are continually aging out and being retired for other reasons.

There are always exceptions, but Intel doesn't have nearly the security it once did.


Don’t forget all the tricks Apple did with Rosetta to make emulating x86 fast (hardware consistency support, 4K pages, etc.). I think the same applies to CHPE.


I reckon if you send a gamer kid down to the store to buy a PC they will come back with an Intel architecture machine, even if there were ARM machines there.


They don't really care as long as the games work.

With Nvidia, AMD, and MS pushing, most new releases will probably have an ARM release, and older games can run with translation. The main problematic niche is older releases that are very CPU bound (like some sim games).


Have you got a gamer teenager? I do.

They seem to care a great deal about their hardware, and gamer teenagers also seem to spend a lot more time analysing hardware and millisecond performance advantages than anyone else. They literally care about mouse latency in milliseconds.

They spend a huge amount of time on YouTube and online forums talking gaming hardware. If the word on the street is that ARM is milliseconds slower than Intel then they won't buy it. There's lots of huge YouTube channels that do nothing but measure game performance on various hardware.


And yet even dedicated gaming keyboards often have 50ms latency or more internally, and the last YT video I've seen on "system latency" actually didn't look at any latencies and just measured frame render times. I don't know of a single channel that does actual keyboard-to-screen latency measurements.

The gaming community may care about numbers but I don't know if most actually know what those numbers say.


Yeah, there is a ton of FUD in the tech YT channels.


Yet the Steam Deck is selling great....

Gaming is no longer dominated by affluent teenagers. At least not to the degree it once was.


> older games can run with translation

Games are the most optimised code there is.

Translation layers might cut it with vintage games but not with anything modern.


FEX-Emu would like a word. The Asahi people recently used FEX to run Portal on ARM laptops.

They JIT the x86 code to arm and their codegen is good enough to maintain most of native performance. Each new release brings them closer to closing the gap.


Portal is, like, 16 years old...

Where's your cutoff for vintage?


Portal 2 https://twitter.com/linaasahi/status/1638603205570613253

Portal 2 is demanding enough of a real-time FPS to demonstrate the principle and FEX is only becoming more optimized. It won't be long until it's running AAA games on low settings.


Yeah, and it was relatively fast even back then.


Seems to be working alright for Apple. I can’t remember the last game I played that was bottlenecked on the CPU anyway; it always seems to be the GPU.


My friends love the new Baldur's Gate game which now runs well on a little M1 laptop


Rimworld, Stellaris are big ones for me.


Older games are often already flaky on the current versions of the platforms they were first made for, there's no great easy business in taking legacy PC games and fixing them to be well emulated/translated.


Well if you go far back enough (like games that need DirectX8 or games that don't work on W10), the compatibility layers get really good, and overhead doesn't matter at all because they run at a billion fps.


Except for MS's DirectDraw implementation under Windows 8 and beyond, where your 1999 game runs like a dog on an i5 and you need a wrapper to translate DDraw to OpenGL.


Really? If a gamer saw an Nvidia "gaming3000" ARM CPU next to the GTX 5090 he was about to buy, I'm sure he'd seriously consider it instead of the obvious boring Intel.


And then he'll come back for an exchange when he realizes his games won't run on his new PC.


Depends on if they keep up with the news. Nvidia has repeatedly burned a lot of goodwill with the gaming community in the last few years. They catered to crypto miners (resulting in a shortage of affordable cards for gamers for years), tried to sell a 4080 that was effectively a 4070, and then released the 4060 with what is perceived to be insufficient memory to last through the current console generation.


The 4060 doesn't seem to have any performance improvement over the 3060 either, so people can just opt for the older card. Or even buy a 3070 instead. ;)


Don’t forget them cutting the 256-bit wide bus of the 3060ti to 128-bits in the 4060ti.

Oof.


But gAmErS love Nvidia more than Intel, don't they?


> For better or worse, the last 15 years saw the industry move to a highly-abstracted platform-agnostic software ecosystem. People perform many of their tasks using applications running in a browser, in a wrapped browser engine like Electron, or in products using other portable runtimes like .NET or the JVM.

A lot of the people who only need those things have largely already moved away from the PC desktop. (Especially if we don't include "laptop" in the "Intel will be hard to displace on the desktop" statement, which I'm not sure if the OP meant to include or not. But even if we include laptops, a lot of people have already moved away to tablet or non-Wintel-land.)

The remaining things, though? Generally not highly abstracted. There's the highly-tuned-for-amd64 performance side of things, and then the legacy/slow dev cycle side. I'm skeptical on the legacy side that there's a good business to be built on trying to move those users to a different architecture. Everybody's special weird setup is gonna be different enough with lots of random one-off external hardware or other such requirements I don't see good universal solutions that could scale.

It's not a growth business anymore, but its decline is slowing, I think.


> highly-abstracted platform-agnostic software ecosystem

It also moved towards Docker and Rust.


> I really think Intel will be hard to displace on the desktop.

It will be hard but not impossible. If (for example) Nvidia were to produce a product with tight integration between CPU and GPU, both technically and price-wise, that could be used to create a very competitive gaming PC (perhaps low to mid end), then worked with a few gaming companies so the major games ran on it, and made sure Windows and the basic office apps, Chrome, Firefox etc. worked well, there would be a market, especially if the ARM chip gave a noticeable performance boost over a similarly priced Intel.


Plus Microsoft is not super invested in making ARM shine like Apple is


I am not sure where you got that from. Windows runs great on ARM. I got one of the Windows Dev Kit 2023 units (it's an 8-core ARM PC with 32 GB of RAM and a 512 GB SSD). I can run Windows, Office, Visual Studio, Git, etc.

I suspect a lot of games will not run but everything else works fine. This includes x64 (x86 64-bit software).

Also, MS has made a lot of their software ARM compatible. Office 365, Visual Studio, and Visual Studio Code all have native ARM binaries. In Visual Studio's case, it has fewer features than the x64 version (mostly fewer supported SDKs).

Overall, I can't tell I am even running on an ARM system because it feels like an x64 system. It is responsive, fast and just works.


> Windows runs great on ARM. I got one the Windows Dev Kit 2023 (it's an 8 core ARM PC with 32 GB of RAM, 512 GB SSD).

Which you cannot even buy anymore, because they discontinued it after 9 months (10/2022-07/2023) [1]

Microsoft's commitment to Windows on ARM (WoA) is a joke.

[1] https://en.wikipedia.org/wiki/Windows_Dev_Kit_2023


Because current ARM SoCs have the power of Intel Atoms but cost 4 times as much.


It's like no one remembers those Windows RT tablets...

https://en.m.wikipedia.org/wiki/Windows_RT


We remember them. They were terrible.


Microsoft has a history of trying to use architecture transitions as an excuse to lock down the platform. But when both of the architectures are still available, that puts the newer architecture at a competitive disadvantage.


How is your experience with printers?


HP's SUPD has support for ARM-based platforms, which I have found to work well on the more modern printers I have tried


They are actually,

https://blogs.windows.com/windowsdeveloper/2023/10/16/window...

However, the Windows developer community couldn't care less if it doesn't bring more money home, and Microsoft isn't Apple: backwards compatibility matters.


Alas, this feels like signaling to support the overheated hype and trillion-dollar valuation.

Real competition in this space would be really good. The shape of mass computing is increasingly a tortured collusion game rather than a true market.

Anybody who has used a Raspberry Pi knows that the desktop computing landscape is not what it used to be. For peanuts you can have a rock-solid Linux PC that covers the computing needs of like 99% of people.

With a convergent OS you'd also cover the mobile use case, where conveniently Arm is already the norm.

The future will not look like the past but speculators beware, timing earthquakes is not possible.


I think another important message from the same article is: AMD is also preparing Arm-based CPUs.



DOA. It's one thing for Apple with full vertical integration to succeed, quite another for NVidia to make any inroads where they have no control over the OS stack. The Windows ARM version is not going to save them.


Linux ARM works perfectly well.

Unless you want to play videogames, Linux ARM is probably preferable to Intel.


Nvidia can have full vertical integration in the data center. If they can get high memory bandwidth and integrate it with their next-gen GPU (e.g. with a custom high-bandwidth fabric allowing for more shared memory), they'll sell a lot of them to hyperscalers and the ML training crowd. Lower cooling costs may be a co-headline item.


Most likely, but Jensen knows how important software is. Maybe there is a chance he can pull a rabbit out of his leather jacket.


I really want a server with the absurd memory bandwidth/latency that an M1/M2 has.

Just not a lot of benchmarks here that reflect what happens in the real world unfortunately


I'm sure this will be dependent on lots of signed closed-source firmware like Intel and AMD platforms, but there's always room for NVIDIA to surprise me, break character and do the opposite. It would attract a lot of people to the platform who want or need open firmware.



Maybe NVIDIA is aiming for servers and the like here, not the mainstream market. So the flawless support for Windows and drivers and their own GPUs for gamers etc is less of a concern?

Fascinating developments though... we would all benefit from chipsets converging towards a similar architecture.


Just wondering how many hugabytes we will have to download just to get Hello World to link and run…


It seems Nvidia was trying to unify cloud computing (GPU + CPU + whatever-PU), and put those beasts in data centres.

Is this "PC" processor still aiming for data centres or desktops? It's not surprising at all if it's the former one.


Something like an Apple M-Series, but with enough RTX cores bolted on to provide, say, PS5 levels of GPU... that could be pretty compelling. I like my M1 Studio a ton, but I do wish it had a bit more GPU.


They've been building Tegra for the Switch and the Shield.

I'd be interested in more recent SoCs from them


You can already buy Orin dev boards, they just cost a fortune.


And have terrible software support. The Nvidia SDK Manager that you need to flash the OS image only runs on specific old versions of Ubuntu. Then you basically get stuck with whatever Linux kernel you get at release.


I have an AGX Xavier running the ESXi ARM edition. This makes it easy to run any Linux aarch64 distro that provides a UEFI ISO installer. It even supports running Windows-on-ARM VMs. Great way to put the 32 GB of memory and the NVMe to use.


The M2 improved the GPU considerably and the M3 is rumoured to improve it even further.


I look forward to it then. I actually bought mine through a best buy program where after, I think it was 24 months I basically have the option of making either a balloon payment and keeping the machine I have, or turning it in but applying at least a substantial percentage of payments so far (might even be 100%) towards the latest generation. So, I'll probably do that and get whatever M2/M3 Studio is available, and hopefully spring for the Ultra double the everything version instead of the base.

Even still, my basest of base 32GB M1 Studio actually games pretty well in the few games that are really properly optimized for it (e.g. arm native, vulkan)


Can any of the downvoters care to say why they did so?


Makes me wonder if this could be on the next-gen Playstation or XBOX


I wouldn't hold my breath too much. Before even talking about software and backward compatibility which is quite important for windows users (at least the one who are going to buy at high price/margins) it must be competitive from a performance standpoint.

If we look at what Apple has done with their ARM chips, outside of power consumption they rarely win on performance. It is mostly software optimized to use their specialized hardware compute blocks that gets any advantage from Apple Silicon. Their ARM CPUs are not that special, especially since Apple makes it hard to separate CPU from GPU and the other hardware co-processors. When you look in detail at the various benchmark numbers, you realize the CPU part performs at most like a mid-range laptop CPU. And their GPU (which gets used to "win" some CPU benchmarks) is weak when you consider the price, though admittedly that is not going to be a problem for Nvidia.

People say Apple successfully transitioned, but it is mostly because they left their customers no choice. In other words, they do not have to compete. I doubt they would have sold many Apple Silicon desktops if those had to compete with equivalent Intel + dGPU machines, particularly considering the price premium. Compared to the golden iMac era, it seems like they do not sell many of them in the first place, which sort of makes my argument.

Nvidia will not have this luxury that Apple has. They will have to compete on both performance and price. Apple did not really succeed; I doubt Nvidia will. But maybe their chip designers are that much better than Apple's, or a lesser focus on power consumption will allow them to unlock more performance from ARM. This is old history, though, and it seems to just repeat itself. Apple went back to their Motorola "G" days, where they pretend to be competitive, but when you run software that exists cross-platform and is not unfairly optimized for the Apple platform, it is rarely true. Nvidia won't be able to create a marketing spin around this and still sell their stuff at a massive premium. It is unlikely that PCs will switch to ARM, because there are not that many benefits but quite a bit of hassle and cost involved.

It is like the rotary engine, which has some advantages over traditional piston engines but in the end is very rarely worth the tradeoffs, which is why it is stuck in niche applications (at best; mostly unused, actually).


Hopefully they target Linux


By the way, https://en.wikipedia.org/wiki/Project_Denver : "Project Denver was originally intended to support both ARM and x86 code using code morphing technology from Transmeta, but was changed to the ARMv8-A 64-bit instruction set because Nvidia could not obtain a license to Intel's patents"


bypass paywall : https://archive.ph/5oX5Z


Does anyone else just get stuck in a “I’m not a robot” loop with these links now?


I hope Nvidia makes an OS too. Anything to threaten Windows.


What kind of benefits do you reckon they'd get from doing their own OS?

Guessing you mean something more than their own Linux distro too?


Some of the Nvidia Jetson kits seem to be pretty good little PCs. They run Linux fine, but Windows seems to be lagging on ARM at the moment.


The Jetson Orin kits have decent performance and good documentation, but they are extremely overpriced in comparison with equivalent computers with Intel or AMD CPUs.


Windows is most valuable because of all the software for it.

They need some sort of emulation.


Which they have! I use it daily on my MacBook Air, via Parallels, and it works great.

https://learn.microsoft.com/en-us/windows/arm/apps-on-arm-x8...


WINE and Proton work very well for a lot of things.

It's rapidly reaching the point where Windows is the least useful, most hobbled way to run Windows apps. No one can make a Steam Deck competitor that works a third as well, and that's 90% because everyone else relies on Windows.

If you need anything other than a pure desktop form factor, it feels like Windows is a bad fit. It's unknown how long Windows can remain the top desktop option. And app emulation is great elsewhere.


Wine and Proton do nothing to port across architectures from x86 to ARM. Whenever anyone uses Wine to run Windows software on ARM Linux, they do so using QEMU/FEX via Hangover.


office 365?


The ARM version of Windows runs ARM and x64 software. It works very well.



