Hi everyone,
We know that much of today's DL and LLM research and development relies on NVDA's CUDA. AMD is trying to catch up with its ROCm, but it doesn't seem to be there yet: the gap is large enough that people who want to do DL and/or LLM research or development usually buy NVDA's products instead of AMD's.
I thought of the following question: the Altair 8800 used the Intel 8080, and the IBM PC used the Intel 8088 (and PCs usually came with Intel CPUs until AMD caught up), whereas the Apple I and II used the MOS 6502, and the first Macintosh used the Motorola 68000 (with later Macs moving to PowerPC).
So I was wondering, for people who experienced or have studied that era: was there a productivity gap between Intel machines and non-Intel machines as large as today's gap between NVDA and AMD? Some (small) businesses seemed to get by with Apple machines even before Apple's switch to Intel. Do you think this is, or can be, true today, e.g., for (small/medium) companies/studios or individuals that can only afford AMD GPUs, or is it more of a bet on the future?
Thank you for your time.
The one exception being the mid-90s, because of the Pentium FDIV bug (https://en.wikipedia.org/wiki/Pentium_FDIV_bug) - that one had an impact across the industry.