Ask HN: Did CPU choice in the past matter as much as NVDA vs. AMD does for DL/LLM dev?
7 points by hedgehog0 10 months ago | 3 comments
Hi everyone,

We know that today a lot of DL and LLM research and development relies on NVDA's CUDA. AMD is trying to catch up with its ROCm, but it seems it's not there yet. The gap is large enough that people who want to do DL and/or LLM research or development usually buy NVDA's products instead of AMD's.

I thought of the following comparison: the Altair 8800 used the Intel 8080, and the IBM PC used the Intel 8088 (and PCs usually came with Intel CPUs until AMD caught up), whereas the Apple I and II used the MOS 6502, and the first Macintosh used the Motorola 68000 (with later Macs moving to PowerPC).

So I was wondering: for people who experienced or studied that era, was there a productivity gap between Intel machines and non-Intel machines as large as today's gap between NVDA and AMD? Some (small) businesses seemed to get by on Apple machines even before Apple's switch to Intel. Do you think this is, or can be, true today, e.g., for (small/medium) companies/studios or individuals that can only afford AMD GPUs, or is it more a bet on the future?

Thank you for your time.




There has always been a tiny subset of people whose work really did go better with a specific CPU, but in general my experience is that people have not cared: if they get the OS they want and the CPU is speedy, they are happy.

The one exception was the mid-90s, because of this: https://en.wikipedia.org/wiki/Pentium_FDIV_bug - that one had an impact across the industry.


Here is how I remember it.

I got my TRS-80 Color Computer for about $400 around 1980. At that time the typical "home computer" was built around the video system: there was a master clock whose frequency was chosen so it could be divided down to make the horizontal and vertical scan frequencies for a TV set, and the same timebase controlled the CPU clock. With a few notable exceptions like the TI-99/4A, memory bandwidth was split evenly between the video system and the CPU, which meant that both ran at half the potential they could have had. Computers like that struggled to support more than 40 columns of text, and it was generally thought at the time that if you were serious about word processing and other business apps you needed 80 columns, although I ported all sorts of software to work on the CoCo with 32 columns.
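To make the clock-division arithmetic concrete, here is a back-of-the-envelope sketch (my own illustration, assuming the 14.31818 MHz NTSC colorburst-derived master crystal that many of these machines used; the divisors are illustrative):

    #include <stdio.h>

    int main(void)
    {
        /* Many NTSC-era home computers derived everything from one
           crystal at 4x the NTSC colorburst frequency. */
        double master_hz = 14318180.0;          /* 14.31818 MHz */

        /* Dividing by 910 gives the NTSC horizontal scan rate. */
        double hsync_hz = master_hz / 910.0;    /* ~15.734 kHz */

        /* 262.5 scan lines per field gives the vertical rate. */
        double vsync_hz = hsync_hz / 262.5;     /* ~59.94 Hz */

        /* The CoCo's 6809 ran at master/16. */
        double cpu_hz = master_hz / 16.0;       /* ~0.895 MHz */

        printf("hsync: %.3f kHz\n", hsync_hz / 1e3);
        printf("vsync: %.2f Hz\n", vsync_hz);
        printf("cpu:   %.3f MHz\n", cpu_hz / 1e6);
        return 0;
    }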

The real weakness of those systems, though, was that you could not upgrade them. They couldn't come out with an "Apple 2 1/2" with a 50% increase in CPU speed. The result was that technical progress seemed to hang in the air for a long time; it was a really big deal to go from a Commodore 64 to a Commodore 128.

Note there were also more expensive business computers that ran the CP/M operating system and had 80 column text displays but rarely had any graphics.

Two things were revolutionary about the IBM PC: (i) the video and CPU were less closely coupled, and (ii) it was an open system which other vendors could clone.

Because of that you quickly started getting computers that increased the clock rate or used a better CPU, like an 8086, NEC V20, or 80286 instead of an 8088. Fierce competition meant costs went down quickly and performance went way up.

Home computers could address 64K of memory; the 8088/8086 could address 1024K, but architectural limitations of the PC left you stuck with 640K, which was a lot at the time (few people could afford to fully populate a PC at the beginning). After 1985 home computers started to support more memory with awkward bank-switching schemes that little software used; the PC used a segmentation system that was awkward too, but I thought it was fun to program in assembly language.
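For readers who never touched real mode: the 8086 formed a 20-bit physical address by shifting a 16-bit segment left 4 bits and adding a 16-bit offset. A minimal sketch of the calculation (my own illustration, not from the comment above):

    #include <stdio.h>

    /* 8086 real-mode address translation: segment * 16 + offset. */
    unsigned long physical_address(unsigned int segment, unsigned int offset)
    {
        return ((unsigned long)segment << 4) + offset;
    }

    int main(void)
    {
        /* Different segment:offset pairs can name the same byte. */
        printf("0x1234:0x0005 -> 0x%05lX\n", physical_address(0x1234, 0x0005));
        printf("0x1230:0x0045 -> 0x%05lX\n", physical_address(0x1230, 0x0045));

        /* The top of the range, 0xFFFF:0xFFFF, is 0x10FFEF, just past
           1 MB, which is why later machines needed the A20 gate. */
        printf("0xFFFF:0xFFFF -> 0x%06lX\n", physical_address(0xFFFF, 0xFFFF));
        return 0;
    }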

The 68000 was a 32-bit architecture that could hypothetically address 4 GB of memory, though early chips had a 24-bit address bus and could only address 16 MB. The memory space was flat and very easy to program, but somehow the 68k line never manifested its potential, and Motorola gave up on it and went to PowerPC.

Circa 1988 I paid $1200 for an 80286-based computer which ran at 12 MHz and absolutely dominated the old home computers; I remember running a Z80 emulator on it so I could develop CP/M software, and it was multiples faster than any real CP/M machine! (Note: I spent way more on disk drives and other peripherals for the CoCo than I spent on the 286.)

When I went to college we had "workstations" from Sun Microsystems, based on the 68k and then Sun's own SPARC architecture. These ran UNIX and were way better than a PC. PCs were catching up with the 80386, but you couldn't get a mainstream OS that could take advantage of it. In my senior year of college Linux came out, and it was the talk of all the profs that a Linux computer was cheaper and faster than the Sun workstations; I bought one my first year in grad school. At that point Intel was producing chips in such large volume that nobody else could compete, which continued until smartphones came out and ARM hit even bigger volumes.

The PC had some real disadvantages compared to home computers. Updating the video was slow; even writing text to the screen was slow, and video games were a joke, whereas you could code pretty good animation on home computers even without the specialized hardware used in arcade games. The C-64 had way better sound.
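The usual workaround on the PC was to skip the BIOS and poke characters straight into text-mode video memory at segment 0xB800. A sketch in old real-mode Turbo C style (MK_FP is from Borland's dos.h; this won't build on a modern compiler):

    #include <dos.h>  /* MK_FP: build a far pointer from segment:offset */

    /* Write one character directly into CGA/EGA text-mode memory,
       bypassing the slow BIOS teletype routine. Each screen cell is
       two bytes: the character, then an attribute (color) byte. */
    void put_char_fast(int row, int col, char ch, unsigned char attr)
    {
        unsigned int offset = (row * 80 + col) * 2;
        unsigned char far *cell = (unsigned char far *)MK_FP(0xB800, offset);
        cell[0] = ch;
        cell[1] = attr;
    }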

---

Now, portability was another story. (The real AMD-NVIDIA gap is not in raw performance but in software support.) Most computers used some kind of Microsoft BASIC, and I routinely ported BASIC programs to the CoCo. C and Pascal and some other languages were around, but people still coded a lot of assembly. Some companies had brilliant ideas for making their code portable, like the virtual machine Infocom used for their adventure games. Overall, though, you could not expect software to be portable, and I was often resentful that popular software for other machines was not out for the CoCo. The VisiCalc spreadsheet was ported to many machines, but when Lotus 1-2-3 came out and really used the memory capacity of the PC, VisiCalc never caught up.
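The Infocom trick generalizes: compile the program once to a compact bytecode, then write a small interpreter for each machine. A toy sketch of the idea (nothing like the real Z-machine; the opcodes are made up):

    #include <stdio.h>

    /* A toy stack-machine interpreter. Port this loop to each CPU and
       the same bytecode runs everywhere; that is the whole trick. */
    enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

    void run(const int *code)
    {
        int stack[64], sp = 0, pc = 0;
        for (;;) {
            switch (code[pc++]) {
            case OP_PUSH:  stack[sp++] = code[pc++];          break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp];  break;
            case OP_PRINT: printf("%d\n", stack[--sp]);       break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void)
    {
        /* Bytecode for "print 2 + 3"; you ship the bytecode, not a
           per-machine binary. */
        const int prog[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
        run(prog);
        return 0;
    }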


Thank you for the story! I will look more into the history.



