Tricky Programming Concepts Aren’t (evincarofautumn.blogspot.com)
185 points by evincarofautumn on Sept 22, 2011 | 92 comments



I'm on board with his basic pedagogical point: don't tell people you expect them to struggle.

I'm concerned that he's going a little too far in the other direction, though. It's great to come up with a simple or even tautological explanation for concepts. However, those explanations don't address the nuances of why those concepts challenge people in practice. And telling someone it's easy when they actually are struggling is just as dangerous.

When I was learning C, I absolutely understood that a pointer pointed to something, and I wasn't bogged down by people telling me it was hard. That didn't stop me from dereferencing null pointers, returning addresses to variables on the stack, getting confused when someone added to a pointer, etc. The aptitude that Joel wrote about and Jon derides isn't the aptitude of applying a metaphor to code. It's the aptitude of coalescing a set of program behaviors into a mental abstraction that is more than "a pointer points."
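
For what it's worth, a minimal C++ sketch of those exact failure modes (deliberately broken code of my own, labeled in the comments):

    // Each function illustrates one classic pointer mistake.
    void null_deref() {
        int *p = nullptr;
        int x = *p;          // dereferencing a null pointer: undefined behavior
        (void)x;
    }

    int *dangling() {
        int local = 42;
        return &local;       // returning the address of a variable on the stack
    }

    void arithmetic_surprise() {
        int arr[4] = {1, 2, 3, 4};
        int *q = arr;
        q += 2;              // adding to a pointer moves by elements, not bytes
        int y = *q;          // q points at arr[2], so y == 3
        (void)y;
    }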


I spent hours trying to help a friend in college "get" enough assembly language to not fail a class. I had already done assembly language development professionally and had previously gotten an A+ in the class, so I understood the concepts well, and I explained them from a dozen different directions.

I don't care what the OP says. This guy was simply not capable of understanding the concepts. Maybe at some level he was, and he had some kind of psychological blinders on that prevented him from getting what I was saying, but he really, really didn't get even the most basic concepts. And when it comes down to it, it doesn't matter whether he has some innate ability that he can't tap -- if he can't do it, even with lots of help, then he can't do it.

While I was helping him, others from the lab came over repeatedly to ask me for help, assuming I was the lab assistant. I WAS able to help a half dozen other students understand their program.

It was his third attempt at taking the class. He finally gave up and changed his major, IIRC.


Was it x86? That's a darn shame if so. I'm not sure /I/ could learn that, and I have used several simpler ones.


No, not x86. I honestly can't remember which CPU it was. Something I'd never heard of, nor run across since. Maybe an IBM mini CPU? We would log into the computer via terminals and build and test that way. This was in the late 80s.

I remember that it was a pretty straightforward architecture, though. Something like 68000, which I knew already, but it was even more orthogonal with respect to addressing modes. I think it had 16 32-bit registers, and you could use any as an address to load data from, complete with offset for easy manipulation of data structures.

I do know x86, and while the 8086 was designed by idiots, it's not that hard to get the basics. Segment registers are a PITA, but as long as you're using 32-bit mode and not accessing more than 4GB at a time, you can mostly ignore them. I've had to deal with something similar in another architecture, and you just end up with abstractions that hide the details as much as possible. Same as coding anything else that's nontrivial, really.


My point is that a memory reference is a conceptual reference. There is no difference. Understanding references is not about building abstraction atop your understanding of them; it's about recognising that you can express natural abstractions literally through them.


I'm unclear on what it means to "understand references" in a way that doesn't involve the way they behave in code. Unless you're saying that the concept as it appears in code can be explained entirely with the tautological definition, in which case I would be inclined to disagree.

Of course, I'm also not sure that Joel's suggestion that some people have an aptitude for such understanding is "irresponsible." If you were to say he underestimates what people can pick up with practice, that would be a different story.


I just think he comes too close to saying that it's entirely the onus of the student to understand the concepts, when the other half of the problem is that they're being taught (in my humble but loud opinion) incorrectly.


I can get behind that. I'm a big fan of finding better ways to teach. Thanks for taking the time to respond.


I agree with this completely. It's one thing for the concept itself to be hard to understand. It's another thing for the concept to be simple, but proper execution to be tough, frustrating, and prone to error.


This is a really important distinction; I see a lot of people claim they don't understand something because implementation is hard. Sometimes implementation is just hard. They should note this more often in schools rather than taking an extreme position of self-esteem/whatever.


I'd like some actual empirical evidence that saying "this is tricky" or "this is difficult" actually de-motivates ("turns off") students. It might, but I can also plausibly imagine the reverse. What if you say "oh, this will be easy and fun!" and then they have trouble with it? IMHO, that might send the message that they're just dumb. At least when you say "it's tricky" and they struggle they may not think it's their fault and might be willing to keep trying.

I think you could spin a lot of plausible pedagogical theories, but I'd be more interested in seeing some actual evidence.


I'm sure it depends crucially on the student.

The other thing to remember is that teachers speak to roomfuls of students at a time, not one student at a time. If you've got one student, then sure, you need never explicitly say that a particular concept is tricky. You can determine trickiness by trial-and-error: If the student gets confused at any point, explain again, but otherwise just keep going.

But with a whole roomful of people it's not as if you're going to fool them by avoiding the word "tricky": They're going to know when you cover something that is tricky, because half the audience will be confused, and the other half is then going to have to observe you explaining the tricky thing over and over again to the first half.


I think "this is interesting" might be a better introduction.

Also, in my experience, when people (especially teachers and professors) say "this is tricky", it is often in part or in whole because they are only giving you a partial explanation -- part of the picture.

It's "tricky", because they are telling you to accept certain things prima facie when actually (for me, at least) they'd make a hell of a lot more sense and be easier to absorb with a more complete description/explanation.

Some of the time, this is because they don't really understand the topic themselves. Other times, one or another pedagogical inclination or requirement makes them avoid a deeper discussion. ("I don't have time" or "We don't have time" or "That's beyond the scope".)


> What if you say "oh, this will be easy and fun!"

That is the same problem as saying "Oh, this is hard and tricky." The author didn't suggest telling students things are easy, which would be misleading and harmful; rather, he used the example of art teachers, whose students are not told that things are easy but that they need to practice and put in a lot of effort. Hell, he didn't use the word 'easy' once in his blog entry.

Lecturers and many others seem almost afraid to say "You're going to have to work hard." and then stand by that statement.

I'm currently at university and I've seen exactly what he describes. Lecturers and students say x is hard and tricky, and for most of the other students here that seems to have been demotivating. I think the current schooling system, here in Australia and in the other countries where I've spent time or have friends from, is horribly flawed in many ways, and what this blog entry is about is just one of those flaws.


I wonder if any of that has to be said at all? What if a lecturer did not set this prejudice upfront at all? However helpful emotions are in the learning process, saying "this is tricky" or "this is easy" is not going to help. Instead, play on other emotions. Explain the theory, show real life examples, simple tricks, stuff like that, surprise them. But don't say "this is hard" or "this is easy". Let them decide.


You should read Flow by Mihály Csíkszentmihályi. Flow is defined as completely focused motivation. http://en.wikipedia.org/wiki/Flow_(psychology)

The research states that flow is achieved by balancing skill and challenge. http://en.wikipedia.org/wiki/Flow_(psychology)#Mechanism_of_...

With respect to the original author, it sounds like he hasn't been far enough along in his career to be humbled by a hard problem.


If you tell someone that a simple concept is difficult, they will have trouble understanding it. I have seen this firsthand many, many times.


It's just like the whole "engineering is not for women" trope, which successfully discourages girls from entering technical fields.


I thought so too, but after refreshing my memory, I think it might be similar, but not exactly the same.

http://en.wikipedia.org/wiki/Stereotype_threat


I'm just citing personal experience. I would be very interested to see studies that clearly backed up or refuted my perceptions. The real machinery at work is probably more interesting than I imagine, and definitely more interesting than I make it out to be in the article.


This is the common wisdom in language pedagogy as well; however, I haven't heard any hard evidence either. Would love to.


This seems to be attacking a false dichotomy in the original argument. It isn't that some people can't understand concepts like pointers and recursion; rather, having taught programming to several crops of complete neophytes, I've discovered there are some concepts that some people get right away, and others don't[1]. My response to this isn't to write off the people who don't get it, it's to work harder with them, but that doesn't change the fact that given two people with the same explicit programming background[2], and exposed to the same lectures, a fraction of them will need further work. It's this kind of concept that I call tricky.

[1] I'll fully grant the possibility that I'm teaching them in a sub-optimal way. But very few people have the luxury to learn from the optimal teacher.

[2] It's conceivable that some people have had experiences and training that, while not formally programming, have nevertheless taught them the abstraction and logical reasoning skills that are necessary to understand core programming concepts.


He's not creating a false dichotomy to attack--he's responding to Joel Spolsky's post, and Spolsky creates a very clear dichotomy of smart and not-smart in his post:

"I've seen that the 100% Java schools have started churning out quite a few CS graduates who are simply not smart enough to work as programmers on anything more sophisticated than Yet Another Java Accounting Application, although they did manage to squeak through the newly-dumbed-down coursework...I used to be able to tell the smart kids because they could rip through a recursive algorithm in seconds, or implement linked-list manipulation functions using pointers as fast as they could write on the whiteboard. But with a JavaSchool Grad, I can't tell if they're struggling with these problems because they are undereducated or if they're struggling with these problems because they don't actually have that special part of the brain that they're going to need to do great programming work. Paul Graham calls them Blub Programmers."


I've done a fair amount of one-on-one programming tutoring both for friends and family and for pay. In my mind there's no question that the variation in the speed with which people pick up programming concepts is quite significant.


Valorizing the optimal case is not a safe strategy.


I'd love to see more programming abstractions described in simple, no nonsense terms. I think that, as programmers, we often get so used to our own jargon that we unconsciously create a non-trivial barrier to entry.

For extreme examples, try digging into some of Haskell's features -- I still have no idea what arrows are, but I've been given the impression that they're advanced and hard to understand -- or reading some of the discussions on Lambda the Ultimate. It usually turns out that the concepts are surprisingly simple, but it takes a lot of sifting through academic language to figure it out.


If you learn something like haskell from books instead of blogs, the explanations are usually much much more comprehensive and cogent. Both Learn You a Haskell and Real World Haskell are excellent in that respect.

I find more and more that blogs are great for discussing things you already have deep knowledge of, or of finding leads to resources for learning them (book recommendations, etc.), but there's no comparison to books for really learning something well.

Blogs are just too piecemeal, and the articles are usually written in a few days or a week, versus up to a year or more for books. Hence the quality of pedagogy is just much higher with books than blogs.


I found Learn You a Haskell unexciting; it was more useful for me to hang out with my Haskell-programming acquaintances, read a lot of code, and get my hands dirty using it than it was to just read and play with examples.

Even as someone who writes a technical blog, I'll agree that blogs aren't the best medium for disseminating learning materials. That's what proper online documentation and books are for: specialised topics, in long form. I can write 50 1000-word blog entries and help people on more topics, but fewer people per topic, or I can write one 50,000-word book and reach many people on just the one.


Yeah, blogs are great for discussing edge cases and the like that books don't have room to cover, with people who already know the topic.

And of course, neither books nor blogs nor language references, etc. substitute for actually sitting down and programming something challenging. But they can help reduce the amount of initial time wasted on trial and error, get you using the language idiomatically, and get you up to speed on language-specific best-practices faster.


Not to mention field reports of getting something working -- especially useful when a library's API is not so obvious and the documentation is lacking. E.g., Rails-centric documentation, but you're using Sinatra.

Although the time could arguably be spent more productively improving said documentation, sometimes an informal tone (which may not be appropriate in docs) can help to dissolve frustrations.


In my case I think it is after "a lot of sifting through academic language" that I get a cool idea through my thick skull. Academic language is (or should be) about being precise, not obscure. What I am trying to say is that learning takes time. The simple, no-nonsense terms resonate in your mind after you really grasp the concept. Otherwise it's just a high-level overview that won't allow you to really use the concept.


It is usually assumed that if you are discussing something on Lambda the Ultimate, you already understand the jargon, and that by using it you are speaking a language that everybody (the target audience -- in this case, language lawyers and language enthusiasts, I guess) understands and that is well defined.

Sure, they could use terms that are easier to understand to the uninitiated, but then you risk having to explain your terms every time because people use different ways to refer to the same thing. Jargon evolves precisely to solve that problem, at the cost of making it harder for somebody new to understand.

Obviously when learning these things, you need a clear and easy-to-understand description of the terms, but as others have stated, a book usually does this, while blogs and research papers probably assume you already know what's what.


Let me get this off my chest: pointers are NOT hard. If every 16-year-old with weed can talk about representation and meaning, every CompSci student can grok pointers.

Managing code with pointers is hard, but grasping the concept is easy.

Closures are far more confusing.

I didn't learn regular expressions for a very long time because people said they were hard. That was stupid. It took me a couple of days to get good at them.


Thanks for the excellent introduction to my next topic: An important function of programming education is to teach people which things other people often find difficult. Because intuition is not a reliable guide. And the secret to all effective communication -- which certainly includes programming, as well as writing and teaching -- is to write stuff that your audience can understand.

When the teacher says "this is tricky", consider the possibility that they aren't merely trying to motivate the students. It might just be a simple statement of fact: There's a significant population that doesn't get this at first glance, and you have to know that, regardless of whether you get it yourself or not.


> Thanks for the excellent introduction to my next topic: An important function of programming education is to teach people which things other people often find difficult.

Why? In my mind, managing student psychology is more important than conveying meta-facts. If a student will learn slower if you tell them that other people find something hard, then don't tell them that. As a bonus, the meta-fact might become less true.


> I didn't learn regular expressions for a very long time because people said they were hard.

I learned to use them long before I had any clue what a DFA was or had even heard of an accepting state. It was pretty intuitive to me that you just have to specify these patterns and it would match whatever you told it to.

When I later read more about the theory and how they were constructed in a compiler book, I wouldn't have been able to decipher a word of it if I hadn't already known how they were supposed to work.

I wonder why more books don't show you what they do before showing you how they're made.


"Closures are far more confusing."

IMO (obviously this is subjective) this is not true but it depends upon whether you mean confusing to use or confusing to understand the full implications of a thing under the hood.

Closures are pretty easy to start using for just about any halfway decent programmer if you lay off the theoretical underpinnings and just show them: here do this, and this will happen. Actually understanding how they function is another matter, but to just start using them is fairly simple assuming the language you are using supports them natively.

Pointers, on the other hand, are a fairly simple concept, and unlike closures there is really no depth to the concept. But because most people's brains don't deal well with indirection beyond one level, pointers are very "tricky" when it comes to training your brain to use them correctly, even after you understand the concept.

Closures are easy to use, hard to grok; pointers are easy to grok, hard to use, so which is more confusing depends a lot on what you mean by confusing.
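
To illustrate the "here, do this, and this will happen" approach for closures, a minimal C++11 sketch (assuming, as above, a language with native support):

    #include <iostream>

    int main() {
        int count = 0;
        auto bump = [&count] { ++count; };  // the lambda closes over count by reference
        bump();
        bump();
        std::cout << count << "\n";         // prints 2: both calls touched the same count
    }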


People try so hard to find excuses why some people can't understand some concepts, such as pointers. It's all an attempt to avoid just admitting that some people aren't capable.

I've spent much time tutoring people on various subjects, programming, physics, math. And the fact is there are just some people who can't grasp certain concepts. From my observations I think the culprit is a poor ability to integrate information into a single mental model. I've seen this many times. Someone can understand concept A, B, C in isolation, but when you try to guide them into combining the three to solve a problem they break down.

I think this really goes to the heart of intelligence. More intelligent people can integrate more information at a single time. This is probably related to working memory (there isn't a single place in your head where working memory functions; it's spread out). Tough concepts like pointers exemplify this. You need to understand the memory model, accessing a memory location, using that info to index back into the memory array, and how the syntax of the language supports these actions.

Analogies can help, as they reduce the amount of info you have to hold in your head, but they're not a replacement for understanding what it actually is. I think it's time to admit that there is an intelligence barrier to some concepts that some people don't cross.


I'm sure I have said similar things before when teaching students. I have no problem with these concepts because I've dealt with them for years.

What I mean when I say such things is that I have a mental model of the student's current understanding. That mental model is based on what I have taught them, and the understanding they have demonstrated through questions and assignments. When I try to think about this concept with just those tools, I think it may be difficult to understand.

I always try to build a bridge from their current mental model (or, more accurately, my mental model of their mental model) to the mental model required to understand this new concept. If I have difficulty building that bridge, then I think it is a "tricky concept" for them.


Thank you. I think this is a responsible, reasonable approach. "This is going to be tough" is a reliable signal from instructor to instructed that we need to pay close attention, because this next part won't be easy.


A pointer points. A reference refers. These are the same thing.

Except they aren't the same thing in many programming languages. In particular C++.

To implement a linked list, just open your eyes and think about what the term means: it is a list, and at each place there must be a thing, and the address of the next thing if there is one.

This is one example of a type where you probably won't use a C++ reference (at least not for a mutable linked list).

What the author actually did was swap one analogy for another, although unknowingly. To really get away from analogy is to point out that there are values stored in memory locations (determining the value uses metadata associated with memory locations). You can use those values literally, or you can use them as an index into other memory locations -- and then recurse (to tie both concepts together).
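
Taking the literal reading above at face value, a minimal sketch in C++:

    struct Node {
        int value;    // the thing at this place in the list
        Node *next;   // the address of the next thing, or nullptr if there isn't one
    };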


A pointer is a particular kind of reference (a value type that enables indirection), and C++ references are another (not very different) kind. A reference in C++ amounts to an immutable pointer.

Also, I didn't swap analogies: I traded analogy for metaphor. (A reference in a computer program is a conceptual reference in the real world.)


>A pointer is a particular kind of reference (a value type that enables indirection), and C++ references are another (not very different) kind. A reference in C++ amounts to an immutable pointer.

A pointer is a memory address. That's all it is. It can contain any valid memory address for the system.

A C++ reference is essentially a hard link to another declared object. As far as the compiler is concerned, a reference to an object is the object, and therefore has all the attributes that the original object has.

Pointers are like street addresses. They do not directly let you see anything about the house, but they tell you where the house is.

References are like what people call a certain house. I may call the house I live in "home", but my friend Bob may call it "cube13's house". Bob and I use different terms for it, but we both mean the exact same physical location.
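
The analogy maps directly onto C++; a minimal sketch:

    void analogy() {
        int house = 42;
        int *address = &house;  // a pointer: the street address; reseatable, can be null
        int &home = house;      // a reference: another name for the very same house
        home = 7;               // writes through the alias: house itself is now 7
        (void)address;
    }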


A pointer is a particular kind of reference (a value type that enables indirection), and C++ references are another (not very different) kind. A reference in C++ amounts to an immutable pointer.

I think you're beginning to walk that line where people start getting confused -- "I thought you said they were the same thing. Now you're saying they amount to some modified version of what you said?"

Also, I didn't swap analogies: I traded analogy for metaphor. (A reference in a computer program is a conceptual reference in the real world.)

I'm no grammar buff, but that seems to be walking a fine line. From my most recent CS SAT test:

reference::computer as conceptual reference::real world

In any case, many would argue that metaphor is equally dangerous. Dijkstra has ranted on this very topic, saying it creates dangerous visualizations.

My point is that there is a way to teach the concept relying on neither analogy nor metaphor, but on the more basic concept of how data is stored. I think you'll find different students will react favorably to different techniques.


> A reference in C++ amounts to an immutable pointer.

"Amounts to" is a phrase that lets you say pretty much anything. Eg: A pointer amounts to an int.

If you are going to say that it amounts to an immutable pointer, you should at least add a caveat about how int *const x; would be different from int& x;
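
A minimal sketch of that caveat:

    void caveat() {
        int a = 1, b = 2;
        int *const p = &a;  // const pointer: fixed where it points, but still an
                            // object with its own address, and could have been null
        int &r = a;         // reference: bound at creation, can never be null
        *p = 5;             // fine: the pointee itself isn't const
        // p = &b;          // error: p itself is const
        r = b;              // does NOT reseat r; it assigns b's value through the alias
    }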


> Eg: A pointer amounts to an int.

Is there a reason why this analogy is invalid? I found it to be quite useful in using, understanding and explaining C pointers.


This is a really good example of an explanation that is correct in the basics, but almost completely incorrect and potentially dangerous when taken more in-depth.

Pointers are not ints. First off, pointers are unsigned data types. Also, on 32-bit systems, they're 32-bit unsigned longs, and on 64-bit systems, they're 64-bit unsigned long longs.

Both of those are incredibly important to understand, especially because the first can lead to very weird bugs with unsigned/signed comparisons or addition. I recently ran into a bug that involved that. 50% of the time, the program crashed with an incorrect memory address error. The other half, it worked completely fine. It turned out that we were casting a couple of pointers to signed values before doing some range checking. The cast converted them to hugely negative values, which broke the comparison check and sent us down a completely incorrect code path.

The second is important when dealing with structure and class allocation sizes.
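
A hypothetical reconstruction of that class of bug, assuming a 32-bit address space (the names and values are illustrative, not from the actual incident):

    #include <cstdint>

    void range_check(void *p) {
        // Suppose p == (void *)0x80000000, i.e. the upper half of a 32-bit space.
        std::intptr_t  s = reinterpret_cast<std::intptr_t>(p);   // signed view: hugely negative
        std::uintptr_t u = reinterpret_cast<std::uintptr_t>(p);  // unsigned view: as intended
        bool broken  = (s >= 0x10000000);  // false -- the buggy signed comparison
        bool correct = (u >= 0x10000000);  // true  -- what the check was meant to do
        (void)broken; (void)correct;
    }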


> Pointers are not ints. First off, pointers are unsigned data types. Also, on 32-bit systems, they're 32-bit unsigned longs, and on 64-bit systems, they're 64-bit unsigned long longs.

Just want to point out that "int" is not a fixed defined size in C++ either, it just has a minimum size. You equally cannot depend on the size of int or void*.


This is true as far as the C++ spec is concerned: an int is only guaranteed to be at least 16 bits. On old 16-bit architectures, an int and a void* are both 16 bits. On more modern architectures and compilers (anything on a 32-bit processor or higher), ints are in practice 32-bit signed integers.
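
A minimal sketch of how to check this on a given platform (only the <cstdint> fixed-width types carry guarantees):

    #include <cstdint>
    #include <cstdio>

    int main() {
        // int, long, and void* all vary with the platform's data model
        // (ILP32, LP64, LLP64, ...); std::int64_t is always 64 bits.
        std::printf("int=%zu long=%zu void*=%zu int64_t=%zu\n",
                    sizeof(int), sizeof(long), sizeof(void *), sizeof(std::int64_t));
    }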


"Nullability" would be that caveat. But that's an argument about C++, not an argument about references.


OK, instead of saying "this is tricky" people should say, "this will take some practice." I got pointers right away but it took a lot of practice to use them without error.


The author completely missed the point about why teachers say "this is tricky". It's not about telling students to pay attention, they do that if they want. And it's certainly not about telling people it's so hard they need not bother.

It is to make it OK for students to not understand right away and not feel bad about asking questions.


I was trying to be brief, but that is essentially what I meant to say...perhaps I'll edit to clarify that. I think the effect is still the same, though, regardless of the motivation behind the phrase.


I like that. It's not "you can't", it's "you will have to work".


Experience has shown me that there are people who just won't get these concepts. I don't know if they decided it was hard and gave up or what.

Example: Shortly before a final exam for an undergraduate introductory course on logic, there was a student behind me who said, "I still don't get this De Morgan thing."

This was something that was covered in the first week; it was impossible to do about 90% of the work in the class without it, and it has got to be at least as obvious as pointers and recursion. The student was trying in the class and went to office hours, study groups, etc.

I still don't have a good explanation for how someone can't grasp De Morgan's laws in under an hour, much less after 16 weeks of class.
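
For reference, the laws themselves fit in a few lines, checked exhaustively over both truth values (a minimal C++ sketch):

    #include <cassert>
    #include <initializer_list>

    int main() {
        for (bool a : {false, true})
            for (bool b : {false, true}) {
                assert(!(a && b) == (!a || !b));  // De Morgan's first law
                assert(!(a || b) == (!a && !b));  // De Morgan's second law
            }
    }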


I don't think the difficulty stems from an inherent inability to learn a concept. I think the difficulty occurs largely because of two factors: lack of interest and intimidation.

Have you ever had to read something unfamiliar that you found incredibly boring (think reading a patent application)? You end up reading the same page over and over, but you absorb very little of the information. Your mind just moves through the words but doesn't make the effort to understand what it's reading. This occurs simply because you don't have enough interest in the material at hand.

Seeing how intimidation can factor in is a bit more difficult. You can usually see this by watching someone try to solve something that they think they can't solve. A lot of times people in this situation make a lot of half-hearted attempts but rarely follow through. I think this is similar to how people struggle when they are put in a situation where they need to make a really open-ended decision and get overwhelmed by the options.

Edit: clarity


It's good to see someone bluntly take apart common misconceptions.


Agreed, that was refreshing (and affirming).

As the great teacher Richard Feynman put it: "If you can't explain something to a first-year student, then you haven't really understood it."


Doesn't this underscore the argument against the OP? If Feynman is correct, this means that many CS teachers don't really understand these concepts that they tell their students are "tricky."

So, if numerous professionals are having trouble understanding a concept - or, they believe they understand something when they really don't, which is actually worse - then that seems to me to be empirical evidence arguing that said concept is probably difficult.

And of course you can make the argument that it just happens to be the case that these concepts are easy and society has just happened to promote a bunch of morons into professional positions, but I think that is the more difficult hypothesis to prove.


Feynman is wrong. Understanding something (technical skill) doesn't automatically make you good at explaining it (communication skill). Being poor at explaining something does not necessarily betray incomplete understanding.


Feynman is not wrong. Technical skill does not equal understanding something. Consider an idiot savant who is technically proficient at playing piano but can't string two words together. These definitions of 'understand' may help: 1. Perceive the intended meaning of (words, a language, or speaker): "he could usually make himself understood"; "she understood what he was saying". 2. Perceive the significance, explanation, or cause of (something): "she didn't really understand the situation". http://www.google.com/search?aq=f&q=define%3Aunderstand


Or perhaps these people just repeat "this is tricky" because that's what they were told. These people probably also say that Java is slow and Perl is a write-only language. It's easier than having a real opinion.


Feynman said it in reference to top-flight physicists explaining their research. I can certainly believe that CS teachers would know the words well enough to recite them to their students without having a true understanding of what they were saying.

As to why they don't understand - well, who taught them?


My pleasure. It's good you find it blunt: I tried to be concise so people would bother reading.


tl;dr Programming is not as hard as people make it out to be.


No. Pointers and recursion are easy. Programming is hard, like writing and design, because it is both.


Why can good programmers claim to be both good at writing and design whereas writers, under that taxonomy, are only good at writing? Isn't good writing inherently a type of good design, in terms of presenting an "interface" to the reader and adroitly arranging the elements?


I meant only to equate the core skill set of the three. Programmers aren't necessarily good writers of literature or designers of graphics, but neither are writers necessarily good writers of software, nor artists good designers of it.


Has the author ever tried to teach pointers to anyone who had never heard of them before? I've introduced several people to pointers, and while they aren't "hard" in the sense that it takes months of hard labor to understand them, there's a reason they have the reputation they do.

I personally love pointers, and have a bad habit of trying to do my own memory management with them.


IMHO, the only problem with teaching pointers is that professors try to dress them up. Not understanding a pointer typically means not understanding the basics of the underlying hardware. This touches on one of the common problems today: software has been split away from the hardware that runs it.

I'm certainly not saying that anyone who writes software needs to know the ins and outs of a modern CPU, but basic knowledge of how a computer works (memory, CPU, DMA, mmio, etc...) is what's often missing when talking about low level programming concepts.


Yes, I tutored CS in C++ for a few years, and near-tautological explanations like these seemed to work best to cut through the explanations of the professors. I like to think I've helped create a good programmer or two. In general the profs were trying to be strictly correct, at the cost of explaining things in terms that their students would actually understand.


Since even the most high-level concept, explained correctly, can be absorbed and applied by any person of average intelligence, I will be very interested when the author (or, y'know, anyone) posts such an explanation of motivic cohomology.


Of course, there are plenty of actual tricky programming concepts: closures, tail recursion, coroutines, memory management, concurrency, regular expression matching, caching, algorithmic complexity, naming things...


Another commenter made a great point that knowing how and when to use these things is likely more difficult, but having a good grasp of the concepts is important. Here is a minute of me stabbing at analogies:

Coroutines: A subroutine is a basement which you go down into and come up from via the same door. A coroutine, by contrast, is an adjacent room with several doors and you can leave temporarily even if you're not completely done in there (of course, you must re-enter whichever door you previously left; not a perfect analogy).

Concurrency: doing multiple things at once. In a kitchen, concurrent actions work best when the cooks either don't share resources or have clever schemes for ensuring there is always an alternative activity to perform if resources are in use. This can be as simple as getting on a list for the knife, or blender, or what-have-you, and you'll be called when it's open. (See the sketch at the end of this comment.)

Caching: I dislike going to the grocery store every time I am hungry so I cache things I will need to make shorter trips on average. How much you cache at once is important to determine.

Regular expression matching: This isn't what I'm trying to find, this is a description of what I'm trying to find.
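
The "get on a list for the knife" scheme is essentially a mutex; a minimal C++11 sketch of the kitchen (my mapping of the analogy, nothing more):

    #include <mutex>
    #include <thread>

    std::mutex knife;  // the shared resource from the kitchen analogy

    void cook() {
        std::lock_guard<std::mutex> holding(knife);  // get on the list for the knife
        // ... chop; the other cook waits their turn or finds something else to do ...
    }

    int main() {
        std::thread a(cook), b(cook);  // two cooks working concurrently
        a.join();
        b.join();
    }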


Coroutines are more like reading multiple books, leaving bookmarks behind and returning to them as needed. I think of caching like moving the resources you're working with to be near at hand so it takes less time to get to them.

(When you're painting, for example, you keep pencil, eraser, brush, paints, and palette nearby, because they're in active use and it would take a while to fetch them every time you needed them.)

Your other examples are great though. I worked in a kitchen for a couple of years and it's definitely all about concurrency.


That's what I was grasping for with caching, and I think I got carried away thinking about preemptive caching and conflated an idea with a common use.

Either way, thanks. And, your article is fucking brilliant. I might ape its style soon in a few "Learn Perl" articles of my own.


Great explanations.


Thanks! Maybe the author of the article will agree.


Just last night I was explaining regular expressions to my fiancée who has absolutely no programming experience. I just showed her some fundamentals:

1) '[\d]' means any digit

2) '+' means one or more

3) '[\d]+' means one or more digits

Understanding that is not hard or tricky at all. Using that to build a complicated regular expression is hard. (See the sketch below.)

I think that's the author's point. Understanding the concept of a pointer or recursion is not hard. Actually using those concepts to build something takes practice and hard work.

edit: formatting
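
A minimal runnable sketch of that pattern, assuming std::regex's default ECMAScript grammar:

    #include <iostream>
    #include <regex>
    #include <string>

    int main() {
        std::regex digits("[\\d]+");    // "one or more digits", as above
        std::string input = "order 42, aisle 7";
        auto it  = std::sregex_iterator(input.begin(), input.end(), digits);
        auto end = std::sregex_iterator();
        for (; it != end; ++it)
            std::cout << it->str() << "\n";  // prints 42, then 7
    }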


Honestly, most of the concepts you listed are fairly easy. They mostly apply to the mechanics of writing code, so they just require a bit of thought.

Problem solving is difficult. Recognizing how to apply dynamic programming is difficult. NP-completeness proofs are difficult.


And you've just given me an idea for my next article. :)


lol, I love the guy who commented: "Awesome post. Now do monads please?"


Monads are a lot like pointers in that the dictionary definition is rather simple. It is the implications and proper use that take dedication and practice.


Completely disagree with you. There is nothing more demotivating and infuriating than a professor breezing through challenging material with no acknowledgement that it is difficult.

If I'm learning a semester's worth of new things, and in there 10-20% is going to be discontinuously difficult, I want the prof to acknowledge that fact. I hate feeling like an idiot, and an acknowledgement that a particular topic requires a manner or mode of thinking orthogonal to what I'm used to helps a lot.


You're disagreeing with something I didn't say. A professor should not "breeze through" anything. I just said they shouldn't set up students with preconceptions of difficulty.


Phew... I haven't tried to teach programming to anyone yet, but I think it would be a good idea to ask the student to point to some object, then point to a different object, and then explain to them that they are asking the computer to do the same thing as moving their hand and index finger. Of course, it may not be practical in a big class, and it might help to put the objects in a sequence. Any feedback from having tried this?


Interesting article, and it does do a good job of explaining these basic concepts.

But...

There is a huge difference between understanding something and knowing how to implement that knowledge properly in the real world. The former is a building block for the latter, but just the understanding of what pointers are or recursion is doesn't dictate someone's ability to use them properly or efficiently.


True, but understanding goes a long way toward practical skill.


you are definitely right about that


> This is exactly like the process of essay writing

Indeed. I find that essay writing and programming light up exactly the same parts of my brain.


Actually they are.


A queue is a queue of things.

On an only vaguely related note, this reminds me of something from college. I was helping a friend of mine who was taking an intro to programming class. It was in C++, and he had been given a class outline for a queue, and was required to fill in the implementation. I was horrified to find that the teacher's class declaration used method names "NQ" and "DQ".

I honestly question whether the instructor even understood the meaning of the word 'queue' and how 'enqueue' and 'dequeue' are derived. It was obvious that this cutesy shorthand completely obliterated any hope of my friend ever making the mental connection between the code and what it conceptually represented.

(Incidentally, they also taught C++ streams for I/O. Apparently, their goal was to prevent new programmers from succeeding)
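
For contrast, a minimal outline with the honest names (hypothetical; not the instructor's actual declaration):

    template <typename T>
    class Queue {
    public:
        void enqueue(const T &item);  // add an item at the back of the queue
        T dequeue();                  // remove and return the item at the front
        bool empty() const;
    private:
        // storage left for the student to fill in, as in the assignment
    };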



