Things are much harder nowadays due to complexity: from CPU cores that are almost impossible to understand, to massive amounts of third-party code, to modern requirements (IoT, ouch!).
Debugging a predictable single-thread, single-core system was also child's play compared to distributed, networked beasts, each running on lots of cores and thousands of threads.
Nineties problems were contained in a small box. Oh, and there was no internet like today, so you needed to order books and magazines, and use BBSes and Usenet. Even then, a lot of it was reinventing the wheel again and again.
Modern problems are sometimes nearly uncontained (think software like web browsers, etc.).
Just jumping in to agree here. I've always thought of it this way:
In the 90s the code I wrote was more difficult. Today coding is much easier (better languages, tooling, etc.). However, the systems I build are many times more complex.
In the 90s we hired the best programmers; today we hire the people who are best at managing ambiguity and complexity.
All of this is very hand-wavy as there are certainly still disciplines where pure programming skill is most important, but those seem to be fewer and fewer every day.
Anyway, I envy you guys; now things tend to be too high-level and you lose that holistic view of the computing machine.
Back then you felt pretty skilled, I am sure; now I often feel anybody could do my job. Unless you are working for the big 4 or similar, many jobs don't give you that excitement.