As manufacturers continue to make increasingly questionable design choices, it's natural that users become hesitant to upgrade. You get a possibly not-that-much-faster system, at the risk of losing functionality that has worked for a long time.
In the same timeframe, we went from a 200 MHz Pentium MMX to a 1 GHz Pentium 3.
...and most if not all of your existing hardware and software would continue to work unchanged, and on top of that you got new features. I remember going from a 233 MHz Pentium to a 2.4 GHz P4 and pretty much everything was the same, just faster. The situation isn't so clear now.
To summarise: from the 90s to the early 2000s, each upgrade really was an upgrade. Now, not so much.
The only time I really notice the difference from an SSD to an M.2 NVMe drive is with a Node project build, or when running certain disk-heavy background services... For general use, not nearly so much.
I'm not sure what you have in mind... generally speaking, Windows software keeps working. PCs aren't speeding up much anymore, but what functionality is lost? I can still run (and I do run) software from the mid and late 90s on the Windows 10 PC I bought four months ago.
On Windows 10, I was unable to install Flight Simulator 2004, which I bought in 2007. Have you tried to install Word 6 (mid 90s) or Visual Studio 1.52 on Windows 10?
There are plenty of good reasons to bash Microsoft, but 'lack of backwards compatibility' just isn't one of them. If anything, I think their obsession with backwards compatibility drove a lot of compromised design decisions at that company.
The history of Microsoft is very long. Backwards compatibility was excellent up to and including Windows XP. That focus on compatibility has largely disappeared in more recent versions. I am not bashing Microsoft. I love Visual Studio Code. I love the fact that I can edit Office documents (PowerPoint and Word) for free using Chrome on Ubuntu. Maybe the work Microsoft is doing on the Windows Subsystem for Linux will make me come back to Windows one day.
It seems like you're talking about the fact that they dropped 16-bit compatibility in 64-bit editions. XP was the first consumer version of Windows with 64-bit support, so that timeline would fit. There were various significant technical challenges which caused them to drop 16-bit support in 64-bit Windows, and I don't think it's fair to imply that they lost their focus on backwards compatibility just because of that one regression.
> There were various significant technical challenges which caused them to drop 16-bit support in 64-bit Windows
That's the official answer. The more nuanced and correct answer is that DOS (16-bit real-mode) apps won't work because the hardware lacks the functionality --- blame AMD, not Microsoft, for that --- but there's no such limitation for Win16 (16-bit protected-mode) apps. Thus, WINE on 64-bit Linux will handle 64-, 32-, and 16-bit Windows apps, with no emulation involved, unlike DOSBox and the like.
To add even more nuance, even protected-mode 16-bit Windows ran Windows apps in v86 mode. Essentially, protected mode was a hypervisor you could launch from DOS, and your actual Windows session would run on top of that. Before the VT-x extensions to x86-64, there was no access to virtual-8086 mode while the processor was running in 64-bit long mode.
Considering that, it's no surprise that 16-bit Windows apps no longer run on 64-bit Windows, especially since NTVDM was needed to run them even on the 32-bit NT-based versions of Windows.
Why do you think backwards compatibility dropped? Just recently they decided not to build newer versions of .NET Framework on top of .NET Core (even though they have open-sourced practically everything in .NET), to ensure that older programs built on .NET Framework will keep working and won't face issues from any incompatibility with .NET Core.
IMO they are making their life a lot harder by introducing all those frameworks and platforms every five years or so, but despite that they still make sure that stuff keeps working.
16-bit Windows software will not work natively on 64-bit Windows. However, this isn't a recent change; it has been the case since the introduction of 64-bit Windows, and consumers started getting 64-bit Windows more than a decade ago.
There are solutions, however.
For 16-bit Windows software you can try https://github.com/otya128/winevdm which provides 16-bit emulation and passes Windows API calls through to a mix of Wine and native Win32 calls. It isn't perfect, but it can run a lot of Windows 3.x software. You can either drag and drop the executable onto the emulator's exe file or use the supplied registry files to install it system-wide (see the rough launch sketch at the end of this comment).
DOS software tends to work fine under DOSBox or vDos (a DOSBox fork explicitly made for text-based DOS applications).
Now, I'm not saying you can't find cases where stuff doesn't work, but in my experience most stuff works out of the box or with small tweaks.
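As a concrete illustration, here's a minimal Python sketch of launching a Win16 program through winevdm from a script instead of drag-and-dropping it. The paths are made up, and it assumes winevdm's otvdm.exe accepts the target executable as a command-line argument (the same thing the drag-and-drop does); adjust to wherever you unpacked it:

    # Minimal sketch: launch a 16-bit Windows program via winevdm's otvdm.exe.
    # Paths below are hypothetical examples, not winevdm defaults.
    import subprocess
    from pathlib import Path

    OTVDM = Path(r"C:\tools\winevdm\otvdm.exe")   # where you unpacked winevdm
    OLD_APP = Path(r"C:\oldsoft\SETUP.EXE")       # some Win16 executable

    # Equivalent to drag-and-dropping SETUP.EXE onto otvdm.exe.
    subprocess.run([str(OTVDM), str(OLD_APP)], check=True)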
> and most if not all of your existing hardware and software would continue to work unchanged
It’s funny how backwards compatibility back then allowed us to have massive progress, by making gradual steps forward possible.
These days, when we are seeing less and less real progress, we hear many programmers claiming we have to sacrifice and shed backwards compatibility if we want further progress. (I'm looking at you, Apple and Google!)
Something is very very backwards. And yes, an upgrade may no longer be an actual upgrade.
I think this is partly because things have gotten vastly more complex & abstract, and none of that complexity and abstraction has been managed well. We basically always say "Yes, right away" to increased complexity/abstraction these days when a PM wants something.
Tech debt was lower back then. Maybe all of our practices were worse, but the code bases were thousands of times smaller, which made up for a lot.
Heck, Agile hadn't even been invented yet back then. You got requirements and got to design the core technology first. There was no such thing as an MVP hack where all of a sudden you had a back end with millions of lines of code and 10 layers of abstraction, all built on a bad design.
It's an industry-wide Duke Nukem Forever situation.
If you keep re-laying your foundation, then you're also going to have to keep re-building whatever was built on top of it, and that's just a huge time suck.
We have accomplished the ideal of "wei wu wei" to an exquisite degree, but, unfortunately, we got our wei and our wei mixed up.