I think the issue is that until a budding mathematician reaches upper-level undergraduate classes, a foundation must be built, and built quickly, in order to catch him up with the current rigor of mathematics.
It's the same with computer science – eventually you can learn the nitty-gritty, but you don't start out learning the x86 ISA. You start out with some sort of high-level language, be it C or Python. The details can come later.
I see where you are coming from, but I think there is an analogy here to what happens at St John's (where I graduated). When I was there, we often had to pretend not to know things, especially in math, so that we wouldn't take advantage of what we already knew; this came up especially in lab classes. It is a bit like how we pretend to physics students that we live in a Newtonian world, and then introduce relativity afterward. For computer science, I think the article was advocating more of a historical approach than a logical one: it is not hard to draw up a curriculum that starts with logic gates and assembly and progresses to the language du jour, but starting with something like Babbage's computers and progressing to more modern machines naturally brings up more diverse issues, such as some discussion of analogue computers or how a computer no longer needs to be rebuilt in order to be reprogrammed. This would not teach any student how to program, but it would help students gain a computing education.