
And Microsoft just announced ARM support in Windows 8...



I think it's also interesting when taken in the context that a lot of the Nvidia-Intel feud stems from Nvidia's development of an x86 CPU that had to be shelved due to Intel patents. The chipset licensing disputes, etc., all seem to stem from Intel being upset about Nvidia attempting to encroach on their turf.

It makes me wonder if Nvidia was lobbying Microsoft to expand Windows support beyond x86 as a direct result of all this.

Either way, Nvidia's pretty clearly wanted to expand into the desktop CPU market for a while now, and with Intel blocking their entrance to the x86 market, it only makes sense they'd start examining other options. ARM seems like a pretty logical place for them to end up given the options.


> Nvidia's pretty clearly wanted to expand into the desktop CPU market for a while now

Meanwhile, just trying to hold position as a supplier of GPUs for x86 PCs would be very difficult for nVidia now that both of the big x86 CPU manufacturers are pushing their own GPU systems hard (and increasingly integrating them into the CPU). This is probably quite an up-or-out situation for them.


Or, what if they just need to expand to be able to afford continued development of cutting-edge graphics? The profit margins must be getting thinner.


They also announced a database filesystem for Windows ... 6? Maybe earlier?


Windows NT has been ported to PowerPC, DEC Alpha and Itanium in the past, so an ARM port shouldn't be especially hard. Longhorn was probably a much more ambitious change to WinNT than a CPU port. MS would probably want to introduce some kind of universal binary format as well, but that should be doable too.


To ensure portability, NT was originally written for the DEC Alpha and later ported to Intel. (It was also originally created with a Pig-Latin UI and later localized to English.) Also, the Xbox 360 runs a stripped-down branch of NT on big-endian PowerPC. A native ARM port should be relatively easy.


Actually MIPS was the original architecture target for NT (the DEC Alpha didn't exist yet when they started).


Apparently the very first target was ... the Intel i860: http://www.winsupersite.com/article/reviews/windows-server-2...


MS already has a universal binary format: http://en.wikipedia.org/wiki/.NET_assembly

I wouldn't be too surprised if one part of MS's .NET push was a desire for architecture independence.


That's not a universal binary. It's a bytecode package. It still needs to be run through an interpreter or a JIT compiler, just like a JAR file for Java.

A universal binary contains actual machine code for multiple architectures. Universal binary support requires deeper changes to the OS and can complicate testing, but it's basically a requirement for making cross-platform high performance code because a JIT compiler can't spend as much time optimizing code as an ahead-of-time compiler.

Machine-independent bytecode paired with a high-quality VM allows you to ship a cross-platform executable, but it's not going to be enough when ARM PCs are facing an uphill battle to prove their performance is acceptable to a market that isn't particularly satisfied with Intel's Atom.
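
To make the distinction concrete, here's a minimal sketch (in C, assuming a POSIX system) of what a universal binary actually contains. The field layout follows Apple's 32-bit Mach-O fat header, which is the best-known example of the format; the reader program itself is just illustrative, not any particular tool.

  /* Illustrative sketch: what a "universal binary" container looks like.
   * Field layout follows Apple's 32-bit Mach-O fat header (<mach-o/fat.h>);
   * the point is simply that the file is a small index plus one complete
   * machine-code image per architecture -- no bytecode, no JIT. */
  #include <stdint.h>
  #include <stdio.h>
  #include <arpa/inet.h>   /* ntohl: fat headers are stored big-endian */

  #define FAT_MAGIC 0xcafebabeu

  struct fat_header {      /* start of the file */
      uint32_t magic;      /* FAT_MAGIC */
      uint32_t nfat_arch;  /* number of fat_arch records that follow */
  };

  struct fat_arch {        /* one record per architecture slice */
      uint32_t cputype;    /* e.g. PowerPC, i386, ARM */
      uint32_t cpusubtype;
      uint32_t offset;     /* where that architecture's executable starts */
      uint32_t size;       /* how long it is */
      uint32_t align;      /* required alignment, as a power of two */
  };

  int main(int argc, char **argv)
  {
      if (argc < 2) { fprintf(stderr, "usage: %s <binary>\n", argv[0]); return 1; }

      FILE *f = fopen(argv[1], "rb");
      if (!f) { perror("fopen"); return 1; }

      struct fat_header h;
      if (fread(&h, sizeof h, 1, f) != 1 || ntohl(h.magic) != FAT_MAGIC) {
          printf("not a fat binary (single-architecture, or something else)\n");
          fclose(f);
          return 0;
      }

      unsigned n = ntohl(h.nfat_arch);
      printf("fat binary with %u architecture slice(s):\n", n);
      for (unsigned i = 0; i < n; i++) {
          struct fat_arch a;
          if (fread(&a, sizeof a, 1, f) != 1) break;
          printf("  cputype=%u offset=%u size=%u\n",
                 (unsigned)ntohl(a.cputype), (unsigned)ntohl(a.offset),
                 (unsigned)ntohl(a.size));
      }
      fclose(f);
      return 0;
  }

The OS loader picks whichever slice matches the running CPU, so each slice still gets the full ahead-of-time optimization treatment, at the cost of a bigger file.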


That depends on the market, I think. AFAIK most of the dissatisfaction with Atom is due to its power consumption rather than its performance, so an ARM with comparable computing power and lower power consumption would be quite satisfactory for most users when teamed up with a solid GPU.


IIRC, optimized, compiled versions of .NET assemblies can be cached by the system, allowing higher performance than JIT compilation.


It's a smart, easy optimization, so I'd be surprised if .NET weren't using it, but ultimately it has the same flavor as reducing the frequency of GC pauses: it trims overhead rather than producing faster code. It doesn't change the fact that the code never passed through a more thorough analyzer/optimizer. How good are JIT compilers at automatic vectorization, for example? Opportunities for automatic vectorization could be encoded into the bytecode so that the SIMD capabilities of different architectures could be used, but I don't think .NET does that.
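
For illustration, this is the sort of loop a static compiler's auto-vectorizer (e.g. gcc at -O3) will typically turn into SSE code on x86 or NEON code on ARM, while a JIT working under a time budget is much less likely to attempt the analysis. It's just a C sketch to show the pattern; the function name is made up and nothing here is .NET-specific.

  /* Illustrative only: independent iterations, unit stride, and no aliasing
   * (restrict) -- exactly the pattern ahead-of-time auto-vectorizers look for,
   * letting them process several floats per iteration with SIMD instructions. */
  #include <stddef.h>

  void saxpy(float *restrict y, const float *restrict x, float a, size_t n)
  {
      for (size_t i = 0; i < n; i++)
          y[i] = a * x[i] + y[i];
  }
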


A universal binary format would be something like what Apple uses now to support both PowerPC and x86 versions of an application within the same file.



