It's declining perhaps, yes, but it's not there yet. The fact that modern workloads have shifted mostly towards gaming, crypto-mining and web browsers doesn't mean general-purpose computing has declined. It only means it hasn't developed very fast, and we have Intel and Microsoft to thank for that.
In my opinion, the OS needs to go fully virtual and distributed for general-purpose computing to survive, or else all hardware will end up as web browser accelerators. Even now, browsers already ship what amounts to their own hardware video drivers.
What I find is that people of my generation have some basic idea of at least some computer jargon because of what we had to learn just to get our PCs working in the 80s and 90s. What in fuck's name is a parity bit? We learned because we wanted to get our modems configured to play Doom together when our moms weren't on the phone. I'm not talking about people who went into computers, I'm just talking about the people who used them.
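(For the kids: a parity bit is just an extra bit tacked onto each byte so the other end of the line can spot a single-bit error. Roughly, in Python, with a function name I made up:)

    # Even parity, the "E" in serial settings like 7E1: the extra bit is
    # chosen so the total number of 1 bits comes out even.
    def even_parity_bit(byte):
        ones = bin(byte & 0xFF).count("1")  # count the 1 bits in the data byte
        return 0 if ones % 2 == 0 else 1

    print(even_parity_bit(0b1101001))  # already four 1 bits, so the parity bit is 0

The receiver recounts the bits, and if the total comes out odd it knows the phone line garbled something.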
Nowadays everything just works so damn well. My phone works with my printer without any driver installation, let alone any mucking around with settings. So nobody has any idea how anything works.
The people who grew up knowing what a parity bit was are in the IT consulting business now, myself included. I think that's how things naturally change. IT and computer hardware have become so much more complex and so much more popular at the same time that you now need dedicated people and layers of abstraction to deal with it. It doesn't have to be a bad thing, though.
I really don't think it's any more complex than it was 20, 30, or 40 years ago. At the end of the day, none of this GUI abstraction does anything more than what the computer can already do at the command line, and that interface has practically been etched in stone at this point; some of those commands are older than most of the people using them. People just don't have to bother learning anything beyond the abstraction to get work done anymore. It's just like cars: you don't need to know how to drive a stick shift, or any of the mechanical theory behind shifting gears smoothly, if you have an automatic transmission.
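To make that concrete: the file manager's drag-and-drop copy, the cp command, and a couple of lines of Python are all the same operation wrapped in different amounts of abstraction. A sketch (the file names are placeholders):

    import shutil

    # the same thing the file manager's "copy" button does, minus the progress bar
    shutil.copy2("report.pdf", "backup/report.pdf")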
I think consumer devices are going to remain consumerified, but we will see an increase in people across disciplines who need and make use of general compute power. More industries are starting to recognize the importance of statistical modelling and interpreting their data, and that means being able to develop bespoke analyses on general-purpose computing hardware. Right now it's mostly data scientists and engineers doing this work, but I wouldn't be surprised if it grows to include investors, accountants, actuaries, urban planners, and so on in the coming decades. I also expect we'll see more small companies with unique cluster offerings competing against AWS and its increasing prices, especially as hardware grows cheaper by the year.
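By "bespoke analyses" I mean the kind of thing anyone can knock together on a general-purpose machine in an afternoon. The numbers below are made up, but the shape of the work is the point:

    import numpy as np

    # made-up quarterly figures; swap in whatever your industry cares about
    revenue = np.array([120.0, 132.5, 128.0, 141.2, 150.3, 158.9])
    periods = np.arange(len(revenue))

    # fit a straight-line trend and project one period ahead
    slope, intercept = np.polyfit(periods, revenue, 1)
    forecast = slope * len(revenue) + intercept
    print(f"trend {slope:+.2f} per period, next period roughly {forecast:.1f}")

None of that needs a cloud contract; a laptop does it instantly.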