It's hard to believe that a site with that kind of traffic & resources hasn't even bothered to concatenate and minify its JS & CSS. Most of the JS files still have comments. Bizarre. They should know better.
I like to think they are trying to preserve the spirit of being able to right click->view source and learn a thing or two. Geocities + view source sparked my early interest in coding.
Maybe sourcemaps will help with that in the future.
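For what it's worth, the build step is tiny these days. A minimal sketch using the terser npm package (the file names and output paths here are just placeholders):

    import { readFile, writeFile } from "node:fs/promises";
    import { minify } from "terser";

    // Concatenate by passing several sources at once; terser keeps
    // track of each original file in the emitted source map.
    const sources = {
      "app.js": await readFile("app.js", "utf8"),
      "widgets.js": await readFile("widgets.js", "utf8"),
    };

    const result = await minify(sources, {
      compress: true,
      mangle: true,
      sourceMap: { filename: "bundle.min.js", url: "bundle.min.js.map" },
    });

    // The url option appends //# sourceMappingURL=bundle.min.js.map,
    // which is what lets devtools map the minified bundle back to
    // the readable originals.
    await writeFile("bundle.min.js", result.code ?? "");
    await writeFile("bundle.min.js.map", String(result.map));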
It's old-school publishers who often have very little idea what they are doing and have sclerotic development processes: 9-week sprints, in one case!
One site I worked on never managed to sort out a redirect for the .net version of its domain, and it's been over two years since I flagged this as a high-priority problem.
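The redirect itself is a few lines once someone actually sits down to do it. A minimal Node sketch, with example.net/example.com standing in for the real domains:

    import http from "node:http";

    // Answer every request arriving on the .net domain with a
    // permanent redirect to the canonical .com, keeping the path
    // and query string intact.
    http.createServer((req, res) => {
      res.writeHead(301, { Location: `https://www.example.com${req.url ?? "/"}` });
      res.end();
    }).listen(80);

In practice you'd do this at the DNS/CDN or web-server layer rather than in app code, but the logic is the same.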
The New York Times built the current site in 2006. The Times just announced that they have been working on a new site design (and presumably a completely new architecture) for the past year and should release it soon.
There is a world of difference between doing a cute vanity project that works, say, 99% of the time and buckling down to do the hard work involved in delivering a major publisher's site.
Another example: Google has a lot of smart people, but they can't parse a robots.txt file with a BOM in it.
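For anyone wondering how that fails: a UTF-8 BOM decodes to U+FEFF sitting in front of the first directive, so the line reads "\uFEFFUser-agent" and a naive parser matches nothing. A sketch of the guard a tolerant parser needs:

    // A UTF-8 BOM (bytes EF BB BF) decodes to U+FEFF at index 0;
    // strip it before tokenizing, or the first directive never
    // matches and the whole file gets ignored.
    function stripBom(text: string): string {
      return text.charCodeAt(0) === 0xfeff ? text.slice(1) : text;
    }

    // "\uFEFF" simulates a robots.txt fetched with a BOM.
    console.log(stripBom("\uFEFFUser-agent: *\nDisallow: /private/"));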