I hope you don't get too many downvotes for that. You may have been making a joke (and you did give me a good laugh), but for a huge number of students, it's the truth. It's hard for a student to focus on the underlying theories when they're struggling just to learn the tools used to explore them in today's world. Other disciplines are similar: although Boolean algebra may be the basis for digital electronics, if you don't understand the basics of how to use a breadboard, a multimeter, and a logic analyzer (depending on the class), getting through most digital electronics classes will be nearly impossible.
All of the 'fundamentals' courses - graph theory / data structures & algorithms, compilers / parsers / language processing, number theory, artificial intelligence / machine learning, networking, computer architecture, operating systems, graphics, databases, ... are taught using a programming language. To be able to learn them in today's educational system, where the practical examples are routinely given in the form of code, you must first have a firm grasp of the language employed. I would compare it to attempting to learn history from a book that is half English and half Mandarin when one does not speak Mandarin.
I was actually being mostly serious. I understand that the 'theory' of CS isn't inherently tied to computers, but in practice getting the computer stuff to work is where the vast majority of the time gets spent. Especially when CS100 is in MATLAB, CS200 is in Java, CS300 is in Scheme, etc. That's a large part of why I ended up not studying CS in college: I didn't have 40 hours a week to spend on a 1-credit compiler class while not failing all my other coursework.
It's weird, because I currently work as a developer and I'm definitely deeply reliant on other people who did study CS. But at the same time it seems like a lot of them have sacrificed way more than they're actually getting back. The only thing more cliché than non-technical folks who 'just need a tech person' is deeply technical folks wasting years of their lives building things no one wanted, which would have been obvious to them if only they had a liberal arts background.
> All of the 'fundamentals' courses - graph theory / data structures & algorithms, compilers / parsers / language processing, number theory, artificial intelligence / machine learning, networking, computer architecture, operating systems, graphics, databases, ... are taught using a programming language
You must have been lucky, if you got to write code when you were learning those things. My experience was proofs, proofs, and more proofs. I took an algorithms/data structures course that had no programming assignments whatsoever.
The school I went to emphasized having students practice practical implementations of the theories being taught. I was actually a Computer Engineering major in undergrad, though, so I only took data structures & algorithms, graphics, and networking from that list. I also got Computer Architecture, but taught from an EE perspective rather than a CS perspective (really pretty much the same thing) ... the biggest difference is that we had to design and build a single-board computer for the capstone in that class, and the CS folks didn't (they explored it from a more software-oriented angle).