If by "understand" you mean glue together all of the high-level abstractions of each of those domains, each of which has mountains of tutorials / blog posts / StackOverflow answers, then sure.
This is true, and I don't mean to trivialize the work of today's programmer.
However, and I admit this is subjective, in reviewing or writing this kind of glue or logic-layer code with modern web frameworks, I've never had the sort of WOW response that I get when I see what the programmers of yore were coming up with.
I think the real reason for this might be that clever hacks using current technology are looked on with disdain, whereas yesterday's hacks are seen as impressive ways of making the most of limited hardware and tooling.
To me, that's a sign of the engineering discipline having matured (as well as average computational needs not keeping pace with computational power). My first job out of college was at Google, and I remember being somewhat disappointed the first time I was told to replace some elegant, dense logic with something more readable. After years of having to read other people's prematurely optimized or unnecessarily compact code in large engineering systems in complex problem domains, I'm more than bought into the idea that readability is one of the primary goals of good code.
This doesn't preclude compromising readability for hacks when you need to squeeze performance out of some logic (just look at TensorFlow code), but you'd expect a maturing computational environment and engineering culture to reduce the number of clever hacks needed.
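To make the trade-off concrete, here's a toy illustration (my own example, not from any codebase mentioned above) of the kind of dense-but-clever logic versus the readable version a reviewer might ask for:

```python
def is_power_of_two_clever(n: int) -> bool:
    # Classic bit trick: a power of two has exactly one set bit,
    # so clearing the lowest set bit (n & (n - 1)) leaves zero.
    return n > 0 and (n & (n - 1)) == 0

def is_power_of_two_readable(n: int) -> bool:
    # The same logic spelled out: repeatedly halve n; a power of
    # two divides down to 1 without ever leaving a remainder.
    if n <= 0:
        return False
    while n % 2 == 0:
        n //= 2
    return n == 1
```

The bit-twiddling version is the kind of thing that earns a WOW on first reading, and the loop is the kind of thing a teammate can verify at a glance two years later. Both are correct; which one belongs in the codebase depends on whether that line is hot enough to matter.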
A lot of that comes down to how the software is deployed. At Google, most things are horizontally scaled and machines are cheaper than engineers. When I worked on Windows, it was the opposite: I was told to refactor clean multithreaded code to reduce the thread count by one. The rationale was that Windows scales vertically, and every developer wasting a thread is how Windows gets slower over time.