
Ugh, this is one thing I really hate about academia. There's this weird fixation on trying to make things intuitive or tangible at every step along the way. The result is that you get five hundred very-wrong/sort-of-wrong/okay-ish explanations that differ just enough that we end up marking students wrong on exams anyway, because they happened to follow the wrong tutorial.

Sometimes, things are just hard to understand. Sooner or later students are going to have to face that, so why do we delay the inevitable?



Because there's a lot you can do with basic knowledge of a subject.

Very few people need to know the subatomic behaviour of electromagnetic fields that make electricity work, but all of us need to know that it travels in wires and it can kill you.


I'm talking specifically about the context of teaching someone what electrons are and how they behave at the subatomic level. High-level overview? Sure, use analogies or models that aren't 100% accurate. But if we're talking about students at a university, then it shouldn't really be an issue to get into the nitty-gritty.

I see this as well when it comes to teaching programming to freshman CS students. For some reason, we've strayed away from lower-level languages like C and don't introduce the low-level details until students are quite far into the curriculum. Abstracting away the details just muddies the waters, in my opinion.


As a recent graduate who did both (I switched from a very theory-first uni to a practice-first uni), I find starting with a high-level language much better.

If you're trying to learn "class, method, extends, static, var/Integer/int, interface, abstract, virtual, different exceptions, recursion and loops, etc.", having to also learn about memory doesn't help, and if you try to do basic pointers first it becomes kind of like spell chanting: you just try different combinations of * and & until it compiles. Partly because you're a bit overwhelmed, and partly because it seems like useless knowledge.
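To make that concrete, here's a tiny C sketch of my own (not from any course, just an illustration of why it feels like chanting): the same symbols * and & mean different things depending on where they appear.

    #include <stdio.h>

    int main(void) {
        int x = 42;
        int *p = &x;  /* here '*' declares a pointer and '&' takes x's address */
        *p = 7;       /* here '*' dereferences: write 7 into x through p */
        printf("%d\n", x);  /* prints 7, because x was modified through p */
        return 0;
    }

Until someone points out that the declaration syntax and the expression syntax reuse the same characters, trial and error really is the rational strategy for a beginner.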

The more I worked, the more I came to appreciate subjects like operating systems, algorithms, etc., but at the time they seemed too theoretical, since they were way above my practical knowledge and useless for the projects I was doing. "Why would I need to know how to build a file system/compiler/etc.? Why in the hell would I ever do that?"

The other aspect is that if you start with the theoretical side, after a while you end up worrying you won't be able to code at all by the end. For example, you've been there for two semesters, and while you can talk about low-level subjects, you've barely built a todo list/calculator/chess game. If you start with the more high-level things, by the third semester you can definitely be working part-time.

This is from the POV of doing CS to work as a dev, not to go into academia.



