Java in College: "You might as well be teaching Chinese to a monkey." (pshaw.wordpress.com)
83 points by bprater on Dec 19, 2008 | 142 comments



I don't believe that programming is something you need to just get. As a TA, I've seen a lot of people struggle on assignments like these. Usually, it's that they're missing certain concepts, but they don't know which ones and they can't debug themselves. If you know only 70% of basic Java, you won't be able to create a working solution. You'll get tripped up on a type mismatch error that just makes you bonkers until you give up, because you don't know what the error message means, or because you tried to create a generic array directly rather than creating an Object array and casting it, or whatnot.
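
A minimal sketch of that second trap (the class and variable names here are made up): Java won't let you write new T[n] inside a generic class, so you end up with the Object-array-and-cast idiom:

    class Box<T> {
        // T[] items = new T[4];                      // won't compile: "generic array creation"
        private final Object[] items = new Object[4]; // the workaround: an Object array...

        @SuppressWarnings("unchecked")
        T get(int i) { return (T) items[i]; }         // ...cast back to T on the way out

        void put(int i, T value) { items[i] = value; }

        public static void main(String[] args) {
            Box<String> b = new Box<String>();
            b.put(0, "hi");
            System.out.println(b.get(0));             // prints: hi
        }
    }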

There are lots of little pitfalls in programming that are just terrible. In C, you can just walk off the end of an array (or the beginning). In Java, you can store an object in an array and then modify that object in both the array copy and the one you have in a regular variable because Java works by reference, not by copy. A lot of the time, you just need someone who is a lot more learned in programming to notice that. It's not that you don't get a concept like object references, it's that if it's not in your mind, you don't notice when your code does it and it takes a long time for these things to really get into your mind to the point that you catch them fast.
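
A minimal sketch of that aliasing pitfall (using java.awt.Point, which has a public x field; the variable names are made up):

    import java.awt.Point;

    class Alias {
        public static void main(String[] args) {
            Point p = new Point(1, 1);
            Point[] arr = { p };       // the array slot holds a reference to p, not a copy
            arr[0].x = 99;             // modify the object "through the array"...
            System.out.println(p.x);   // ...and the regular variable sees 99 too
        }
    }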

And most people give up. You get hit enough times with silly things like these and you just want out, unless you're truly dedicated. I know when I was learning, I hit all the trap doors. It's why I noticed them when other students hit them. Heck, back in the old days of Java, you could easily screw up things like int vs. Integer and not notice it - especially since a lot of professors might not make the distinction: veteran programmers already know that pitfall, and it comes up right at the start, when students are so inundated with syntax that the distinction goes unnoticed (or gets abbreviated as int in notes).
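
For the record, here is a sketch of how the int vs. Integer trap shows up once autoboxing (Java 5) is in play; the class name is made up, and 1000 is chosen to sit outside the small cached range where == would happen to work:

    class Boxed {
        public static void main(String[] args) {
            Integer a = 1000, b = 1000;        // autoboxing creates two distinct objects
            System.out.println(a == b);        // false: == compares object references here
            System.out.println(a.equals(b));   // true: equals() compares the numeric values
            int c = 1000, d = 1000;
            System.out.println(c == d);        // true: primitive == compares values
        }
    }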

I would think schools should hire more TAs to help students through those pitfalls, except that most students like to do assignments at the last minute, when these errors become a source of mind-breaking stress.


To hire sufficiently good TAs, schools would have to pay much more than they do, as they are competing with a lucrative commercial industry. However, academic hierarchies won't tolerate paying elevated salaries to people with lesser paper qualifications just because they are talented. So every year you get the people who want a bit of cash on the side, but aren't yet good enough to do part-time contracting.


I have to wonder if a language with less syntax would be easier to pick up.

Start people who are totally new to programming with something like Scheme or Ruby (surely there are even better ones than these) that requires less arcane punctuation to do things like hello world, then once they get the hang of things, slowly introduce them to more structured languages.


My university started the introductory course (for all majors, courses were common for the first 2 years) with Scheme, and most students seemed to understand it fairly well. We still had the 20% who were extremely fast because of prior programming experience (though learning Scheme was a nice change compared to what they were used to), but the number of students who were completely lost seemed rather small.

So I think Scheme is a very good choice for an introductory course...

Ruby is very nice (I'm mainly programming in it nowadays), but I'm not sure it's such a good choice for beginners; there are a lot of small details that could trip them up, I think (blocks vs. procs vs. lambdas vs. method(:name_of_method), the eigenclass...).


The problem with Scheme is that it's very difficult to learn in a vacuum. You can do all sorts of interesting things with it, but if people can't use it to do real work then they quickly lose interest (I'm not talking about CS types who are interested in computing anyway, but others who would find programming a useful skill in their "real" jobs). From that POV the best teaching language is something like VBA, awful tho' it is, because it gets people from zero to "look what I did!" very quickly. Or MATLAB or some other language that's looked down on by purists.


At the risk of sounding like a broken record, I must say that Python fits the bill.


I confess I know little of Python, so wasn't confident enough to list it.

From what I know, the Python mantra seems to be 'one way to do something', which I suppose would be less confusing for first timers.


Well, I think Python has one obvious preferred way of doing things, but then tends to let you do it any way you want. Which I guess is a part of the whole "Python supports multiple programming paradigms (primarily object oriented, imperative, and functional)" thing. Not that I know that much about Python either, but it is my preferred language.


Because of crippled lambdas, Python never felt to me like it really supported functional programming.


You can experiment with some functional ideas in Python, though -- once you're familiar with basic Python, exploring the stuff you can do using higher-order functions is likely to be quite educational, with the caveat that (in that language) recursion will blow up on you when it gets too deeply nested.

So, while it doesn't really support FP fully, it's probably sufficient for basic FP from a pedagogical standpoint. (I also think it's one of the better candidates for a first programming language overall.)


I have read email list anecdotes of Smalltalkers who introduced their kids to Smalltalk as a first programming language. In the majority of cases, their kids liked Smalltalk, and their reaction to subsequent languages was, "Why are they so complicated? They don't do anything more than the simpler one."

Smalltalk as it exists today has about 8 terminals & nonterminals in its grammar. Python has something like 30. Ruby has something over a hundred, but for an intro to programming, you can just ignore most of the grammar and use a subset which makes it very close to Python in simplicity.

Smalltalk was specifically designed to be usable by children.


I think the problem with those languages is that they're too different for a starting course. Scheme is obviously very powerful, but its syntax is wildly different from most popular programming languages used today. As for Ruby, don't even get me started. There's practically no syntax at all; merely reading it agitates me.

I think, in the beginning, people are most likely to understand concepts like Objects (Circle, Square, etc.) and how they relate to other stuff, since it's so based on how we normally think. Also, since Java is statically typed, I think it makes things appear to contain less "voodoo". Then again, I could be completely wrong; I had been programming in other languages for years before taking my first Java course in college and only attended the class when there was a test.


Ruby has quite a bit of syntax (parens, methods on objects, blocks, etc), whereas scheme has almost none (parens, special forms like lambda, #t, '()).

Of course both have less obtrusive syntax than Java, which requires the famous

  public static void main...
just to get started. That is the kind of stuff that I think confuses true beginners. They look at that stuff and think "holy crap memorize these voodoo incantations" and never really leave that mindset once they get there. Even if they eventually learn what static means, they will always treat it as such.

I think once they get the hang of programming in something, it's easy to build up from there. I just think it causes people to focus on the wrong thing (syntax rather than semantics) when they are first starting out.


One of the things that complicates Scheme for many beginners is the prefix notation for arithmetic and such.

It's just not a concept most Freshmen are acquainted with.


I dunno--I started community college with an HP-48 instead of a TI calculator, and reverse polish notation wasn't a problem at all once the friend who'd convinced me to get one explained how the stack was like the way you do arithmetic on paper.


Also, since Java is statically typed, I think it makes things appear to contain less "voodoo".

No way. It's better to learn other programming concepts without the distraction of having to maintain types. Duck typing makes a lot more sense to new people, in my experience. Most of the popular languages with non programmers (PHP, VB, JavaScript) fit this mold.


Forth, Factor, and Io are languages that come to mind that have "no syntax". Ruby is certainly at the "more syntax" end of the spectrum.

    puts(("foo #{bar} %s %s" % [bat, baz]) * 2.+(0b0010))
There are a huge number of syntax rules in that line.


I count 23 in a PEG language without repetition. Here's an expanded version of your fragment that runs successfully:

    # from http://news.ycombinator.com/item?id=404609
    bar = 3
    bat = 4
    baz = 5
    puts(("foo #{bar} %s %s" % [bat, baz]) * 2.+(0b0010))
And here's a grammar for the subset of Ruby it uses:

    this <- 'A parsing expression grammar for a tiny subset of Ruby.'.
    copyright <- 'Written by Kragen Javier Sitaker 2008; in the public domain.'.

    program  <- _ expr program / .
    noninfix <- inparens / call / constant / quasiquoted / list / name.
    expr     <- infix / noninfix.
    inparens <- '('_ expr ')'_.

    call <- name '('_ commalist ')'_ / constant '.'_ methodname '('_ commalist ')'_.

    constant <- '0b' bits / digits.
    bits     <- bit bits / bit _.
    bit      <- '0' / '1'.
    digits   <- digit digits / digit _.
    digit    <- bit / '2' / '3' / '4' / '5' / '6' / '7' / '8' / '9'.

    quasiquoted   <- '"' qqcontents '"'_.
    qqcontents    <- !'#{' !'"' char qqcontents / interpolation qqcontents / .
    interpolation <- '#{' expr '}'.

    list      <- '['_ commalist ']'_.
    commalist <- expr ','_ commalist / expr / .

    infix <- noninfix '%'_ expr / noninfix '*'_ expr / noninfix '='_ expr.

    _  <- sp _ / .
    sp <- ' ' / '\n' / comment.

    comment      <- '#' commentchars '\n'.
    commentchars <- !'\n' char commentchars / .

    namechar <- 'b' / 'a' / 'r' / 't' / 'z' / 'p' / 'u' / 's'.
    name     <- namechar name / namechar _.

    methodname <- '+'_ / name.
I have tested it (in http://github.com/kragen/peg-bootstrap/tree/master/peg.md) and it seems to work for that fragment, although I haven't examined the parse tree to see if it's right. (Obviously the definition of namechar is wrong, and there's no $end token separating statements, and you can't call methods on things that aren't constants or self, etc.)


Start with digital logic and machine architecture, work your way up.


That is how my school taught me, but I don't think this is really an option for non-majors.


I remember checking out a first-year CS textbook from a library, back when I was in HS. It described one-bit adders and other hardware and then prescribed a lab assignment that required stuff I couldn't buy at a computer store.

Later, I signed up for a CS course at a college, and their textbook was just a glorified Visual Basic tutorial, because at the time, all the companies were looking for Visual Basic programmers.

Now, companies are looking for Java programmers, so the schools have switched over to that. Never mind that programming languages have a long history of turning out to be short-term fads (COBOL and Pascal before Java), and the real skill is in understanding the logic underneath the language.


I don't think syntax is the problem. The problem is getting people to understand that programming is about changing what's going on in memory. In math, when you solve (X + 7) = 15, the value of X does not change, but when you say X = 7, the value of X changes.

IMO, the one thing that would most help students "get coding" is using an inline debugger to step through their code and understand what's happening, because people don't get the idea that software is a dynamic thing once you run it.


Ruby syntax can be a little confusing. E.g., you can usually leave out parentheses, but not always, and sometimes you get warnings but it works anyway.


Well, with any language you have to teach a little bit of convention to go along with the rules, and the convention on when to use parens in Ruby is mostly standardized, judging from the code I see.

But yeah, I'm sure there has to be a better language for learning out there, I just don't personally know it. I originally learned with C++, and I remember the terror that pointers were at first. When I had a class that taught me Scheme, I remember thinking: "hey, I could write this code on paper without even testing it out if I wanted", which was the polar opposite of my C++ experience.


I agree with you. I started my CompSci degree a few years ago without any programming experience. The first intro class was quite challenging as I had never programmed before. It took me a while to get used to the quirks, but I became better after sufficient practice. Like anything else, programmers only get good after they have put in the hours to learn programming and the programming language. It's like music or sports or even academics. Practice makes perfect.


I think that programming is indeed something that you either get or don't.

I had an interesting experience a while ago where I helped a friend with a website. There was a backend involved where he would have to type some numbers into a table (how much of x, y, z... do you need when you are a certain age). I spent hours trying to explain to him why he couldn't just draw a graph on the screen, like he would on a piece of paper, and let the computer find out what the values were. He simply didn't get the basic concepts behind it and thought that the computer should easily be able to see the graph and decide what to do. He thought that it should be just as easy to draw a graph on screen and get the numbers from there as to enter the numbers manually.

It became apparent that he just didn't think sequentially and logically. Notions like if x is larger than y and y is larger than z then x is larger than z were extremely difficult for him to grasp. He isn't stupid by any means, he simply thinks about the world differently. And he would never get even a simple program to run no matter how much he tried.


> regular variable because Java works by reference, not by copy

I haven't read the other replies, so somebody else may have pointed this out already but Java is a by-copy language, like C and C++. Not by-reference like you claim.


In Java, you can store an object in an array and then modify that object in both the array copy and the one you have in a regular variable because Java works by reference, not by copy.

LOL. In what language does array assignment result in a new copy of that data structure rather than a reference to it?

http://en.wikipedia.org/wiki/Array

In computer science, an array is a data structure consisting of a group of elements that are accessed by indexing.

Answer: None.

You'll get tripped up on a type mismatch error

Every real language has types. Java catches most errors at compile time. That's an advantage for students.


In Pascal, Ada, PL/1, Perl, and C, array assignment or passing an array as a parameter results in a new copy of the array, at least under some circumstances. (For example, in Pascal, it has to be a value parameter, not a var parameter, and in C, the array has to be inside a struct.) These are not exactly obscure languages, so it's surprising that you assert there are no languages that work that way. Do you know any languages other than Java?

“Every real language has types,” you say. Except for BLISS, BCPL, almost any assembly language, Forth, B, the Bourne shell, dc, QED, TECO, sed, and old versions of Tcl, to name a few. Again, these are not obscure languages; every piece of software you use is inspired by some of them and implemented in part in others. (There are also languages like Python in which there are only dynamic types, but I agree with your point of view that those are still types.)

And that's why I downmodded your comment: not because it was arrogant, but because it was wrong.


I don't mind downmods on political subjects (too much), which are matters of opinion to a degree. But when I correct somebody's laughable mistakes, please either correct me in writing or leave it alone. Thanks.


I am pretty sure it was your tone.


Java is a whipping boy, and I also have my problems with it, but one of its minor strengths is that it is clean enough for a teaching language (especially for total neophytes). Here is the much lambasted manner of outputting:

    System.out.println("Hello world");

...in fact, this is well organized. "System" is a namespace. "out" is a static variable. "println" is a static function. "Hello world" is a String. These things all exist in an orderly world. A professor could explain a lot of key elements of language design with that one line. What's this:

    print("Hello world")

...it's shorthand for much the same. Easier? Yep. But it's glossing over a lot of details that students need to know.

So when you come at it from a teaching angle, you're maybe making a mistake. Python/Ruby/LISP all have their WTFs that will throw off students, even if they're all better languages than Java.


Java's a horrible language for teaching. If you think this...

    System.out.println("Hello world");
Is easier to understand than this...

    print("Hello world")
Then you've forgotten what it's like to be a beginner.

Java's not even a good example of what OO is, and it has way too much syntax and special keywords (something like 50) for teaching.

Python is a far better language for teaching, Smalltalk would be even better because it's much simpler, has only a handful of keywords, and a much more consistent and simpler syntax.

In any case, students should begin with a dynamic language where they don't have to think about manifest types that they aren't ready to understand anyway. It's also much faster working in a workspace or REPL where you have instant feedback and can experiment without messing around with compilers.


I should also point out that your Java example won't even compile, you need to wrap it in a class and a main function to even attempt to compile and run it. Tell me again how simple this is..

    class World {  
        public static void main(String args[]) {
           System.out.println("Hello World!");
        }
    }
compared to this...

    print("Hello World!")


Not to mention that you have to save it in a file called World.java and it will compile to World.class. If you change the class name on the first line, you also have to change the name of the file. How does that make sense to a beginner?


How does that make sense to anyone?

I still think Java is just a cruel joke on the part of the authors. I don't think they really thought manifest typing was a good idea. I don't think they really thought that Java's OO system was any good.

I think we will be seeing a paper soon that has a title like, "How bad could a programming language be before people stopped using it? A case study on Java."


Then you've forgotten what it's like to be a beginner.

Not just that, but what programming should be like. Hello-world shouldn't need more than two elements.


I think judging a language by "Hello-world" is a mistake. Sure, some languages have a fixed overhead for creating a program. Some have needless syntax fluff that doesn't do all that much, but it's actually a good skill IMHO to be able to identify and see through that fluff to the core elements of the code.

When we pick up a book do we judge it based on the index at the back and the preface and the publishing info at the start? We probably just ignore those bits.

I think in these sorts of classes it'd actually be a good idea to spend several weeks looking at other peoples code and analyzing what it does. Being able to read code properly is a really important skill I think people should master before they even start with writing hello-world themselves.


Like I said earlier, you've forgotten what it's like to be a beginner or you've not had much experience with teaching them. Beginners don't need to be able to see through fluff, they can't, the fluff needs to be gone entirely.

They need to be taught with languages that have no fluff so they learn the basics of thinking logically, algorithmically. They aren't ready to deal with the complexities of a particular language's quirks and learning to please its compiler with voodoo incantations like int, char, string, long, etc. They aren't even ready to understand that there are differences between languages, so their first language needs to be one that teaches concepts that apply to all languages.

They need to learn what functions are, what conditionals are, what loops are, what recursion is, in their simplest forms, and more than anything they need to see immediate success and running programs or they won't maintain interest long enough to get anywhere near the harder optional stuff like types and compiled languages. As soon as types come into play, you've advanced to an intermediate level, way over the head of beginners.

Beginners and manifest/statically typed languages do not mix well. The closest a student should get to Java, is JavaScript, but they'd be much better off learning Python, Scheme, and Smalltalk before getting anywhere near any of C's bastard children.


I disagree completely.

One of the best skills to learn is to see through fluff and understand what is going on. I'd actually say that the best first language to learn would be assembly language. It's insanely simple, but there's a lot of fluff, so you have to learn how to 'read' code in terms of overall structure/flow instead of focusing on each individual instruction.

I'm still not convinced learning in classes is going to produce anything but mediocre programmers though.


Learning to see through the fluff is not a beginner skill. I'll just stop here and say I think you'd make a poor teacher because you show little if any empathy or awareness of what it means to not know something or how important it is for a beginner to have small successes and working programs right from the start.


I agree I would make a bad teacher for a general class. I think programming skill is largely natural talent and isn't something that can be taught. I'd also say that the biggest skill anyone can have is the ability to self learn. Being taught the fundamentals of programming in a class doesn't bode well for future hacker ability IMHO.

If someone has a natural talent for it, it can be nurtured, but trying to teach just anyone how to program is the reason we have so many mediocre programmers in the industry. Some people should give up on programming.

Imagine if anyone could go to art college, get a job as an artist, and be employed at big industries "blending in", never really being rated on their artwork. This is the issue with programming. At the moment, there's this strange belief that anyone can be a great programmer - that it's just typing. Mediocre and bad programmers who may have a piece of paper saying they completed courses are allowed jobs in industry.


I think programming skill is largely natural talent and isn't something that can be taught.

I disagree, sort of; thinking logically is the natural talent, being good at programming is a mere side effect. People who are inherently bad at associating cause and effect, just aren't going to be good at anything that requires logic and rational thinking.

but trying to teach just anyone how to program is the reason we have so many mediocre programmers in the industry.

No, it isn't. People who have no natural talent for programming don't just do it for no reason; they do it because the pay is attractive and this is one of the few fields where one can get a pretty high paying job without any degree at all.

The dot com boom and the absurdly high salaries that were thrown out to anyone that could slap together some HTML is the reason we're currently stuck with so many mediocre programmers. As long as someone can fake it, and land a nice paying job, hordes of mediocre fakers will continue calling themselves programmers.

It's not even that these people can't be taught to program well, they simply have no interest in programming well. They learn just enough to get by, collect a paycheck, and go home. The thing missing isn't ability, it's interest. This isn't something limited to our field either, it happens in any field where $$ is not linked to actual performance or proof of skill; managers are another good example.


I agree - methodical, scientific thought processes are the talent. I disagree about degrees though. When someone says they have a degree in computer science, I'd always be very wary.

If you were to commission a portrait of yourself, would you choose an artist based on his previous works, or based on some certificate saying he completed a course on art at some college? I know which I'd base it on.


I wasn't claiming a degree was necessary, or even an indicator of success in programming, merely that it tends to filter out the people who have no interest in the subject. Sadly, many people who are interested are ruined in college by being taught Java.

Past work is always a better indicator than any recommendation, be it on paper or verbal, but past work is proof of skill, which I already mentioned.


I find that the preface of a book is often highly indicative of the content of the book. The preface of a good book will generally talk about a few abstract themes which guide the exposition. The preface of a poor book will be a sort of random collection of notes, ideas, and guidelines. One textbook I looked at (on EE) started with a paragraph that went something like "Scientists understand the world by creating models. A model reflects reality. A model has components, each one of which corresponds to something in the world." The book was, unsurprisingly, no good.

In a similar way, Java's verbosity in even the simplest examples speaks to a certain kind of language design. Java's design is based on indulgence in the idea that difficult things can be made somehow easier or more reliable if the difficulty is hidden by a lot of simple, but needless verbiage.


"I think in these sorts of classes it'd actually be a good idea to spend several weeks looking at other peoples code and analyzing what it does."

I think that would just result in a lot of students dropping out. As a beginner, it's nice to get the computer to actually do something - even if it's just printing out a stupid message.


"I think that would just result in a lot of students dropping out."

Good. Perhaps there would be fewer mediocre/bad developers around.


I think some people just forget that the machine should be working for the human rather than vice versa.


Okay. Here's my real opinion:

If I were setting up a curriculum, each incoming student would deal with a digital logic trainer. Okay? Wires and transistors. An 18-year-old should learn the fundamentals and not the shortcuts. That's my opinion--maybe I'm wrong. Java is hardly the best teaching language, but it is closer to the bone than a lot of things....


The problem is that you're confusing a concept with its implementation. If it is a basic programming class, there are a whole lot of concepts that should be learned before worrying about the implementation.

Algorithms, variables, program flow, functions. Those are the real fundamentals... things that can even be analyzed in pseudo-code, no need to get into a particular language's idiosyncrasies.


<pedant mode> java.lang.System is a class, not a namespace (although classes in Java are overloaded to function as namespaces as well as modules and data type definitions). println() is not a static method, but an instance method of the java.io.PrintStream class. out is a static variable in the System class, of type PrintStream, which refers to an instance of that class, on which we invoke the method println(). </pedant mode>

...which is a lot to demand of a beginner to understand in his very first program.
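
Spelled out in code, the chain looks like this (a sketch; the class name Unpacked is made up, but System and PrintStream are the real classes involved):

    import java.io.PrintStream;

    class Unpacked {
        public static void main(String[] args) {
            PrintStream stream = System.out;  // read the static field 'out' of class System
            stream.println("Hello world");    // call the instance method println on it
        }
    }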

I'm self-taught myself, and for me, having a humanities background, C/assembly and Scheme was a perfect combination for understanding both how computers actually work and what computation is all about (to the extent that I understand either!).

I think Java is a nice language in many ways, but I think its dominant position as a teaching language is due to market demand more than any inherent suitability for that role.


>glossing over a lot of details that students need to know.

If you're teaching them object oriented programming. Why would you be teaching beginners object oriented programming?

I give the engineering programming classes at Pitt a fair amount of credit. Two semesters -- the first semester covered HTML as a reasonably useful introduction to the idea of structured text files, then Matlab to introduce if statements, loops, and functions in an environment with a friendly console. The second semester was C. Mostly everyone got it -- those who failed out of engineering didn't fail because of programming class.

The classes were taught/conceived by a professor whose primary research interest is engineering education.


In Python you can always do:

  import sys
  sys.stdout.write("Hello world!\n")
if you really need this level of detail as a teacher.


This really has nothing to do with Java, except it happens to be the language he's teaching. A more accurate title would be 'Programming in College: "You might as well be teaching Chinese to a monkey."' as it would be equally applicable to any programming language.


I taught Comp Sci 101/102 as a TA at a state university in the US. Yes, there are students who just don't get it. There was one student of mine who seemed to think as if the machine contained some weird little fussy creature, and if you somehow made nice to it in its unusual language, it would decide to like you and do you a favor and run your program correctly.

Seriously. He couldn't comprehend:

    X = X + 1
To be fair, that might seem paradoxical, but I would break it down for him, going step by step with cards and boxes labeled by the variable names. He just couldn't comprehend. He had a hard time with variables. Heck, I think he had a hard time with anything but social interaction. What was disturbing to me, was the amount of energy he devoted to sucking up to me and trying to scam me by mimicking the "Ah-HA!" moment.
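
For reference, here is that cards-and-boxes walkthrough written out as comments (a sketch; the starting value 3 is arbitrary):

    class Step {
        public static void main(String[] args) {
            int x = 3;               // the box labeled x now holds a card reading 3
            x = x + 1;               // take the card out (3), add 1 to it (4),
                                     // and put the new card back in the box
            System.out.println(x);   // prints 4; the old 3 is gone
        }
    }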

Other disturbing observations: many of my fellow grad students didn't get it either. To them, the C compiler was just magic. No understanding of how things were marshaled on the stack or how operations were implemented in machine language. Surprisingly, they were often about as bad as the local community-college students who got employed by a local corporate body-shop.

At one point, one fellow grad student got exasperated at me in a study session and said, "I don't care about the principles, just give me the answer so I can get an A on the test!" She later told me she was just planning to go into management anyhow. (In a way, that was really disappointing. She was one of the few women in the program, and pretty cute.)


one fellow grad student got exasperated at me in a study session and said, "I don't care about the principles, just give me the answer so I can get an A on the test!"

IMO it's a serious bug in the educational system that people are able to pass by following that approach.


I think the bug is that finding the right answers is considered important. There is no such thing as a "right answer" unless you can prove it's right. If you don't understand the concepts, you can't know that you got the right answer.

(Then, in "real life", you are really on your own. If you are doing something new and original, you won't know if you are right immediately. You have to convince yourself by applying what you've learned. The answers to test questions, in this case, are completely meaningless. I don't think enough people realize that.)


I can see how "X = X + 1" could be confusing to someone who has never been exposed to programming before. All your life you're taught in math classes that "=" means the things on both sides are "equal" at all times.

Maybe replacing the "=" with an array while teaching would be helpful. Then once he understands the concept, try to introduce the "=" syntax.


This is why a certain language used := as the assignment operator and = as the Boolean equality operator. As written, = really doesn't make sense when compared to all of the math people have learned up to the point they are introduced to programming. It's where learning about addressable memory and pointers, as compared to the values stored at those addresses, actually made more sense to me than the optimized syntax (at least when I was 12 or so). Assignment also operates backwards in most languages: the stuff on the right of the equals happens before the stuff on the left. It really should be something that reads as "take the value in the box with address x, add one to it, and store the value back in the box with address x" (basically, show the student what the assembly language looks like).

I really don't have a problem with people not getting x=x+1. I think it shows someone is actually trying to deeply understand what is going on.


This is one of the advantages of using Scheme as an introductory language:

    (define x 1)
    (set! x 2)
    (let ((x 3)) (1+ x))


I think that this is one of those areas that would be greatly improved by starting students with assembly. It's fairly easy to grasp what each statement does in a small instruction set like MIPS since there aren't these sorts of preconceived ideas. Then once students have learned assembly, teachers can explain statements such as "X=X+1" unambiguously with the assembly instructions the students are familiar with.


I disagree about starting students on assembly. Assembly exposes all the implementation details on how we got computers to work efficiently, and exposes none of the theoretical basis for computation. Obviously you need to know both eventually, but to get started, I think it's a better idea to build on the mathematics students already know.

Say you want to print 2+2. In assembly, you have to put a two in two certain registers, then call add, then call a routine to convert the number 4 to the string "4\0", then find a file descriptor, then signal the OS that you want to make a system call (fwrite), ...

That is not a good way of teaching how computation works. The details in this case are arbitrary and irrelevant; there is no reason a computer has to distinguish between the string "4" and the result of adding 2 and 2.


Oops, I meant "arrow" not "array", and didn't notice in time to edit.

Can you tell I'm a programmer, not an archer? (actually, I have practiced archery in the past)


Actually, I did at one point diagram all of the states of X across time, so it was an array. (There was also iteration in a loop involved.) I think this only confused him even more, though.


Perhaps he could be a functional programmer if X = X + 1 doesn't make sense ;)

I could understand why a mathematician would find it odd - of course in practice they would understand it, but they would probably in their head make the leftmost X into X' or somesuch so they could understand what was going on.

When you do a mutation of a variable, there are all sorts of subtle things to understand about the machine underneath that you wouldn't have to if everything were immutable. I guess it's easy to forget that in the day to day.


Maybe the problem is that programming is taught without a foundation on how computers work. Before I really learned how to program, I spent some time studying basic computer architecture, so I knew what was behind the programming language. Without this knowledge, I find it is really hard to understand what programming is.


For me, I only really got it after I had taken all three courses:

    - Machine Language 
    - Compiler Construction
    - Operating Systems
Just after that, a bunch of lightbulbs started to fire off. I think the NAND to Tetris approach is really great in this regard. It's designed to be the place where it all comes together for a Comp Sci person.

http://video.google.com/videoplay?docid=7654043762021156507


I once tried to teach programming to someone who called all of his variables x. He genuinely couldn't understand why the machine couldn't just figure out which x he meant. This particular someone was due to replace us (a consulting firm) at the client. Heh, I wonder if they're still in business.


"Seriously. He couldn't comprehend: X = X + 1"

Isn't that really all there is to comprehend, when you get right down to it?

For all the people who don't "get" programming, I think that's exactly what they don't get.

Unless you're teaching functional programming. Then it would be f(x).


"To them, the C compiler was just magic."

Any sufficiently advanced technology is indistinguishable from magic. - Arthur C. Clarke

it must have been too advanced for them? :)


I can think of a couple other languages that I'd rather have people learn how to program with. It's a horrible first language.

In my "Intro to programming with Java" class a few years back, it took people three or four weeks just to be able to install Java, set up the path in the environment variables on their home computer, and compile a "hello world" program. If you were raised on Windows, and you've never used a command line before, it can be a challenge to figure out.

I suggested to every teacher as well as the dean of the program that if they're committed to Java as a first language, then offer students a pre-configured VMware Virtual Appliance with a Ubuntu install with the programming environment preconfigured and set up for the students. That way, they could download the appliance or hand it out on CD, download VMware player and start programming.

Instead, the teacher was playing sysadmin for Java installations on Windows boxen.

It's a shame that so many schools teach Java as a first language. I think there's probably a lot of people with aptitude and interest in programming who might get into the industry, but who are put off by all the initial overhead associated with Java and sysadmin stuff.

I think he's up against a bug in the curriculum and not just people's limitations.


Recent research has tended to show that most skills are learned through practice. Few people are naturally good and few people are naturally bad. Those who practice, excel.

Programming has the property that those who wish to practice today have almost unlimited opportunities.

This might explain the wildly different levels of skill that the poster observed. Perhaps programming would be best self-taught in a workshop setting so folks at different levels don't get in each others' way.


Diagnosable mental defects aside, I refuse to believe that there exist students who genuinely want to learn Java with a private tutor who understands the student's mental model of the world.

Students range from couldn't-care-less to passionate, not from stupid to smart.


You don't honestly believe that, do you? I'm quite sure that both axes exist (and maybe others too).


I suspect students could be plotted on both axes, and probably several others (amount of background knowledge, for example) as well.

It seems clear that smart/passionate will generally do well and stupid/CCL will generally fail, but that leaves smart/couldn't-care-less (know-it-alls who won't listen to anything in class) and stupid/passionate (who will probably need to put in extra time, but have a good chance).

Everyone will fall somewhere in between, of course - those are just quadrants.


stupid/passionate (who will probably need to put in extra time)

I think this view ignores the correlation between "put in lots of time" and "smart". All of the smart people I know have usually put in an extraordinary amount of time studying whatever they are good at.


Nice, you just demonstrated that caring about the topic is all that matters for a good chance of succeeding (not how smart you are).


Students range from couldn't-care-less to passionate, not from stupid to smart.

Wow, I never thought of it that way, but it really makes a lot of sense.


Err, first sentence was supposed to end "... and still, the students fail".


Several people in the blog comments have cited the “camel has two humps” paper, which still hasn’t made it through peer review. There have been some failures to replicate the study. Perhaps their results were an artifact of the authors’ teaching style, i.e. they were just terrible at teaching, so the students who entered the class without knowing the material failed?

I’m very skeptical of results that claim that some large fraction of the population is incapable of some mental feat like programming. 500 years ago the same was thought of reading, writing, and arithmetic.

(also posted to the blog comments)


"I’m very skeptical of results that claim that some large fraction of the population is incapable of some mental feat like programming. 500 years ago the same was thought of reading, writing, and arithmetic."

Exactly. At birth we are basically no different from someone born 100, 500, 1000 years ago. People in the past were not stupid. In fact, the giants whose shoulders we stand on now are from the past.

This isn't to say the camel doesn't have two humps. I tried pretty hard in college to learn German, but by then it was too late. Kids can learn languages without even trying because they're wired to learn languages. Waiting till college to learn German effectively doomed me to failure.

A similar thing might be happening with programming. Some people don't start thinking about programming until pretty late, sometimes as late as college or beyond. Perhaps some of the people who "just can't get it" could have had a chance if they started sooner.

This would imply that there is no fix at the college level; the solution is to start introducing programming/logic earlier.


There are people who attain native-level proficiency in foreign languages as adults. It is true that children more often do it, but it also seems to take them longer.

How many hours a week did you study German in college? I think children growing up in Germany typically study it about 70 to 90 hours a week if their parents are foreigners, or 110 hours a week otherwise, and it takes them about six or seven years to reach a high level of proficiency.

By contrast, an acquaintance of mine studied Mandarin Chinese only 40 hours a week as an adult (a 22-year-old, I think) for six months, and then spent the next couple of years transcribing and analyzing intercepted Chinese telephone conversations 40 hours a week, thereby achieving a native level of proficiency.

So my alternative hypothesis is that kids learn languages well because they try really hard for tens of thousands of hours, while adults rarely do.


I knew someone would say this ;)

It is true some adults can learn language almost as easily as children, but that's the exception. The main point is that as you get older (for the vast majority of people), learning a new language becomes more difficult. A similar effect may be happening with programming.


I don't think that needing thirty thousand hours of practice to learn something qualifies as learning it "easily", and that's how long children normally take to learn their first language. Adults can often learn new languages to an expert or native level of competency in only ten or twenty thousand hours of practice. What's exceptional is the adult who takes that amount of time to learn a language, challenging themselves and trying to improve the whole time.

(Exceptional adults may be able to do it in less time, and far less time to achieve a lower level of competency. Sidis supposedly could "learn" a new language in a day.)


Hmm, to avoid this being a back-and-forth between us, I attempted to find a reference for my point (which I was taught from multiple professors in college, and backed up by personal experience)... and I couldn't find any.

I could find reference to "the myth that children learn languages easier than adults". I'm shocked that my child psychology and cognitive psychology professors (who both taught a section on language development) would get that wrong, but it's entirely possible.


"I’m very skeptical of results that claim that some large fraction of the population is incapable of some mental feat like programming. 500 years ago the same was thought of reading, writing, and arithmetic."

Yeah, and the reason was that 500 years ago reading and writing were college-level material. Now they start teaching the alphabet at, what, age 5? Then again, most print material is targeted to a pretty low reading level. So even if learning programming early becomes more common, that still doesn't make it easy, because chances are the majority won't be very good at it, just like many aren't good with the written word.


Two illiterates can have a conversation a writer could transcribe. I'm pretty sure I couldn't represent the behavior or thought process of a computer-illiterate in code.


"I couldn't represent the behavior or thought process of a computer-illiterate in code."

That's what the job of a computer scientist is: translating real-world processes into computer programs. Maybe it isn't a one-to-one correspondence like spoken to written English, but if someone tells you they need a program to track employee time, and that employees need to sign in and sign out, you can write that, right?


A human and a computer are equally good at clocking one person into work on time.

A computer is probably much better at tracking the hours of 500 people, using this to estimate the probability of various inconvenient person-mixes (the foreman and backup foreman are both sick!), and using this to generate schedules.

A person is probably better than a computer at deciding whether or not it's better to clock out of work eight minutes early to surprise the wife with flowers.

Computers and people differ pretty wildly in their thought processes and abilities.


You're missing the point. We're not talking about the suitability of computers to a specific task. We're talking about literacy rates.


Never used SAP or Siebel then? ;-)


No, but I've used Taleo and Peoplesoft, so I know where you're coming from. Taleo is so egregiously bad that I want to start a company specifically to wipe them out.


The 1/3 of college students that have great difficulty with learning the fundamentals of programming can certainly learn it -- but it won't come naturally, and it will need to be slower and more personalized than how it is currently being taught. Differentiated instruction is the appropriate educational buzzword.


I think personalization is the key, too.

Why are we teaching programming with a human? At what point are we finally going to have machines that can work 1-on-1 with humans, at their pace, until they grok concepts.

Some people learn more quickly, some don't. Let's not punish one group or the other.

Let's teach interactively, but not with humans!


About a third of his class needs to switch to a more challenging course.

Every university should have a way to "place out" of introductory programming classes, just like you can place out of introductory (natural) language classes if you have previous coursework or are a native speaker. This should be done by demonstrating proficiency in the skills that will be taught in the programming course.


Well, sure, if universities were there to teach things, they'd invent ways to teach you using fewer classes. If universities are there to make you take classes until you get a piece of paper, why would they care?


Except that this is a pretty common practice: at most reputable schools, if you can demonstrate you don't need to take an introductory programming class, you can switch to a more advanced class.


what schools? (I ask because I hope that my school was "reputable" but that was certainly not the case)


At Brown (where I went for undergrad), you are normally tracked into an intro course, but if you can convince a professor of a higher-level course to override the prereq, then you can take whatever course you want.

I have the sense that most universities will do this given enough prodding; you are, of course, paying them tens of thousands of dollars a semester. Sometimes there won't be an official documented procedure, but I would be surprised if you were qualified for a particular course but couldn't talk your way into it.


I've found that students who breeze by in these introductory classes have no desire to skip over them. Easy semesters are hard to come by. College students generally don't have the attitude of "I'm paying so much for this so I'd better get something worthwhile out of it". They are there for the paper, and they don't feel like rocking the boat. They know this and the system knows this. They are normally so inundated with work from other classes that they're forced to take as many shortcuts as possible.


... and charge you for it until your pulled-out pockets can't drop any more coins, or your bank doesn't give you more loans (which is even worse).

In my 3rd-year undergrad, the "Informatics" class was teaching us how to use a mouse and close a window. That, after 2 years of requiring computer-typed project reports, and numerous labs involving computers in numerous ways (labs were the best part of the whole degree; we even got to inject dyes into rat brains and then section and image them). That was (is) the Biology degree at the University of Barcelona.


My first day of programming in college I read the syllabus and told the teacher I'd already done everything on it. They let me switch to the next class.


My "Introduction to Programming" course at uni dealt with Ada (shudder), and I remember the very first things they were trying to teach were how to open, close, and save files under Windows.

I played Quake 3 for the first 2 weeks of that class while I waited for something of actual use.


What's wrong with Ada?


Some people start coding at a young age, and some people graduate high school and decide it is time to give programming a try. Of the latter group, only a subset seriously understands what it involves.

I think CS concepts can be taught to anyone, but they can't be taught BY anyone. If you start from the very beginning and take baby steps, I doubt there is anyone who can't pick up coding.


"A change of context is worth 60 IQ points" (Alan Kay)

So, I think that the first step towards rescuing the last third of the group will be to teach them LOGO… yes, LOGO. Once people “get” for loops by drawing squares and stars and circles etc., making the shift to other programming languages is easy.

If the course requires OOP, then start with Smalltalk and Etoys. They are intended for kids… but you are trying to build fundamental concepts here.

Best of luck.


I can sympathize with the author. I "got" programming and generally found it rather easy (in college, I would look at some random student's code and just point out each bug; used to drive the random student crazy when I did that).

But there was one class I took, Unix Systems Programming (all in C). It was a higher-level course with prerequisites of data structures (by that time in college, taught in C) and compiler writing (undergrad version). We were sitting there as the professor was going over the memory layout of a running Unix process ("the text segment is where the code is stored, the data segment is where all pre-initialized data goes, yada yada"). One student raises her hand. The professor stops and asks if she has a question.

"Yes, where do the comments go?"

I think it took a full thirty seconds for the professor to reboot.

I do have to wonder how that student got through compiler writing.


You know, most ELF executables do have a .comment section, and IIRC Java includes comment text in its *.class files, and comments are included in the HTML and XML DOMs. So it's not a completely insane question.


What I think is somewhat particular about the challenge of teaching programming is that two distinct levels or types of "difficult" quickly develop. I don't see "programming" as difficult in the same way that I don't see "writing papers" as difficult: it's a piece of cake to get words on the screen that make sense. Of course, writing a paper can be quite hard depending on the ideas you are trying to express. Similarly, I think everyone believes programming to be difficult depending on the underlying problem you are trying to solve (for example, no one thinks it's easy to solve NP-complete problems in polynomial time or to write fast games on slow devices).

However, what trips teachers up is that with some students the problem is that they can't even get past the first hump: getting meaningful things on the screen in the first place. It would be as frustrating as having a student in a college-level writing course who always puts the words in his sentences in random order, and expects the reader to "know" what order to read them in. Such a teacher would on the one hand be teaching tone and proper structure to one set of students, and basic rules of semantics to another. I recall many students in my CS classes just rearranging the syntax in their programs hoping the compiler would finally accept it. Clearly these students are having much different problems than the ones who simply get a little confused when they have a hard bug in their programs or find the idea of pointers unintuitive.


Wait, what does he mean about the shift key? I pushed shift, then pushed a, and the a is still lower case.


tRY LAYING OFF THE CAPS LOCK.


you didn't even write it in all lowercase.


I used to use caps-lock for that.

But then I learned enough Chinese to tell my pet monkey to type my comments up for me while waiting for Eclipse to launch.


I snuck into the roster for an intro Object Oriented Programming class, taught in Java, when I started grad school.

I took physics with the engineers in college and this was about as hard conceptually for me.

But it was one of those things where it was hard until I stepped away for a while and then came back to it. Then it wasn't so hard anymore. Just too much info to take in at once I found.

Loops were the hard part. Oh man. Loops. That took forever to "get."


Loops would probably be easier to teach if they taught "goto" first. "Goto" is easy for beginners to grasp, and loops can be taught in terms of it.
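
A sketch of that idea: the goto reading of a loop, written as comments beside an ordinary Java loop (Java itself has no goto statement, so the jumps appear only in the comments; the class name is made up):

    class Countdown {
        public static void main(String[] args) {
            int i = 3;
            // top:
            while (i > 0) {            // i.e. if (!(i > 0)) goto done;
                System.out.println(i);
                i = i - 1;             // i.e. goto top;
            }
            // done:
            System.out.println("liftoff");
        }
    }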


I started thinking of Loops as "repeaters"

"Ok. First I need to make my variables. Ok. There they are. Good. Now I need to figure out what I want to do with them. Ok. That's that. Now I need to repeat this 5 times, so I need a repeater loop..."

That's kind of how it looked. Writing a program to tabulate diving scores was fun!


Recursion for me. I still grapple with it sometimes.


My first programming class (or the first class where I had to use a compiler) was in C (one of those CSCI classes for non-majors), and this guy took us through the whole gamut. Even if I barely passed the class, it was a rush. It was probably such a rush because I knew what I wanted to do but had no idea how to express it, until we finally got to arrays and I realized that a string is just an ordered array of characters. This insight catalyzed another insight, and by the end of the year I was more interested in recursing over meaningless input just to see what it looked like when it came out. That never really solved anything in my textbook, but it taught me more about computer programming than any book could: it showed me that computer programming is the art of counting and comparing for measuring an unknown order of magnitude. I came out of the course with so much more than what my grade reflected (somewhere between a D and a C), but couldn't really care less.

One semester later I ended up in a class called Object Oriented Analysis and Design; [insert brain hemorrhage here] I could not communicate with any of these people (nor my teacher). Once the problem domain was suddenly so tangible, it seemed so much less enticing. This was when I laid my eccentric theory to rest, went with the flow, and started solving my exercises.


Recursion is really fun until you have to use it to solve something under pressure.


Recursion is easy if you're used to it. A recursive procedure just looks like a loop to me.
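
For example, summing 1..n both ways makes the correspondence plain (a hypothetical sketch in C; the function names are made up):

    #include <stdio.h>

    /* the loop version: accumulate while counting up */
    int sum_loop(int n) {
        int total = 0;
        for (int i = 1; i <= n; i++)
            total += i;
        return total;
    }

    /* the recursive version: the same counting, done by the call stack */
    int sum_rec(int n) {
        if (n <= 0)
            return 0;               /* base case = the loop's exit condition */
        return n + sum_rec(n - 1);  /* each call = one loop iteration */
    }

    int main(void) {
        printf("%d %d\n", sum_loop(10), sum_rec(10));   /* both print 55 */
        return 0;
    }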


Why on earth are those three types of student all in the same class?


a) Some students have no idea what they are getting into when they register for a class. (This explains some of the clueless students who can't do any of the assignments.)

b) Some students who register for a first course in their probable major subject register conservatively, especially if they can't prove completion of equivalent courses. (This explains some of the students who easily do more than what is assigned.)

c) Some students have some good preparation (e.g., math) but not all the best possible preparation (e.g., previous programming experience). (This explains some of the students who seem to find the course just right for them.)

Further along in a computer science major sequence, students who neither get it nor like it will move on to some other major. Then the ability spread in any one class may diminish somewhat.


No system exists to identify the slower programmers before they get to class, and most colleges don't have the resources to offer slower learners the more personalized instruction they need.


I remember some non-CS majors in an assembly language course. About 5 or 6 people walked out 30 minutes into the first class--a behavior too seldom seen.


I just finished helping to teach an introductory programming class as well (HTML and JavaScript -- one step below the Java class). I also observed a surprising range in the ability of the students to grasp programming. Hardest class ever all the way to easiest class ever.

Another funny thing I noticed is that many students had the idea that they could get credit for non-functional code. They would treat it like a math class and hope that we would pop open the hood and look to see that they had the right general idea.

I guess I can't blame them too much though. I've been a programmer for so long that I've acquired the unforgiving attitude of the machine. I certainly wish my computer would do what I mean and not what I say.


I remember another student complaining to the teacher: "So you're not giving me credit because I'm missing a for loop?"

I wish I had been the teacher. I would have explained that "missing a for loop" isn't something like, say, having an injured arm. It's more like having an arm that was recently sawed off and is spraying copious amounts of blood all over the place.

You have to be strict in those cases, I think. Programming requires precision, and if a person can't muster it, they shouldn't be fooled into thinking it's something they can major in; their time will just be wasted.


A couple of suggestions…

Start with a visual programming language for the noobs: maybe something for kids like Scratch, and then something where you can make real apps quickly, like VB6. My 5-year-old can do programs in Scratch. She hasn't learned recursion yet, but loops and events are clear to her. She did learn about the shift key in preschool computer class this year, though, so she's probably ahead of some of your students.

Grade people based on how far they get; don't make them go forward with the rest of the class if they don't grasp week one. Let them keep iterating on those concepts until they get it. The tragedy of group education is that you are forced to move on, and it is very hard to ever catch up. I would computerize the whole class and make it self-paced if there is that much variability.

For the non-noobs, the Java 2D stuff is useless; there's no point in learning that API. Make them write programs they can use in their lives. Do a simple data model for their budget and use Ruby on Rails to turn it into a web app.

Pascal, Ada, and Modula-2 suck. I hated programming when I was using those and thought it was useless BS. As a student, I liked learning C and asm because you learned how computers work, and I liked learning high-level languages with easy graphics libs because you could make useful things quickly. I liked Matlab too, because I could use it to check my math problem sets.


I used to start my courses with the following:

"This course has as much chances of turning any of any given person into a programmer as a swimming class can turn you into a fish. Some of you have it, some of you don't. Some of you will find it easy, and some will find it utterly incomprehensible. The fact you are here holds, at least, some, but not much, promise. Live with it."


I've noticed this with higher-level concepts too. There are people who simply can't grasp the concept of a mind that doesn't work like a human mind. There are people who simply can't grasp various levels of meta-thinking - like the distinction between metaethics and object-level moral decisions.

You can keep trying to point in the direction of the meta-level but they just can't look in that direction, like they're in Flatland and you're trying to point "up".

And at least according to them, some of these people are programmers.

It seems there are multiple "gears" that you can have or not have, not just the "programmer's gear". I suspect I'm missing a social gear or two, because there are things I can't see, even after the fact, no matter how often they're pointed out to me.


I'm not so sure about that gear theory. I think you can have new gears installed. When I took some higher-level courses that required new kinds of thinking, I didn't get it. I looked and felt stupid for a while, because no matter how it was explained, I didn't get it.

Then, by continuing to study and think about things, I was eventually able to see it, and it started to integrate itself into my thinking. It's almost like I can feel my brain being rewired.

I think that what's required is enough exposure and determination to reset some neural connections and eventually the idea will integrate itself into you.


Yeah, for me it's always been struggling through. Stepping away and then coming back to it. Then I get it.

Same with piano lessons as a kid. I had Chopsticks down pat after a few weeks. Then it was time to play a triad (a 3-note chord), and I just couldn't do it: pushing three keys at once with one hand.

Spent a whole lesson with my teacher literally just pushing my hand down on the piano while I looked on in horror at how bad I was at it. That sucked.

Then my family went on vacation for a week. At my next lesson I could play them just fine. Something clicked.

But it wouldn't have clicked had my teacher not spent an hour pushing my hand down onto the piano over and over again.


That's how I learned to drive. I couldn't get it; my instructor tried to teach me, but I couldn't do anything right. My instructor was pretty good and knew that this was normal, and the next day, after a good night's sleep, I was starting to get good at driving. After a while of driving with mixed success I stopped for a week, and when I got in the car again I was really good. I got my license but didn't drive for a month, and today I got in a car and, after a few minutes of remembering, I was good at it. It was all easy; you couldn't tell that it was my 30th hour in a car, and I hadn't ever driven THIS car before, so I had to "learn" it and get used to it. The mind is an amazing thing; learning is one of the strangest things we are capable of.


I agree. There are definitely many styles of thinking, and they seem to be developed or discovered independently of the skills they are associated with, i.e. you can learn Java without learning how to think like a programmer. We can call it "talent" and say it's innate, but I think that's a cop-out from the hard problem of trying to find better ways to develop it.

In a sense, it's not unlike personality disorders, in that those who "have it" generally don't know where they got it, and frequently don't recognize it as an "it" that not everyone has, at least until they're confronted with evidence.

I happen to have recently transcribed something from Seymour Papert's Mindstorms, about LOGO, that seems relevant:

"By deliberately learning to imitate mechanical thinking, the learner becomes able to articulate what mechanical thinking is and what it is not. The exercise can lead to greater confidence about the ability to choose a cognitive style that suits the problem. Analysis of "mechanical thinking" and how it is different from other kinds and practice with problem analysis can result in a new degree of intellectual sophistication. By providing a very concrete, down to earth model of a particular style of thinking, work with the computer can make it easier to understand that there is such a thing as a "style of thinking." And giving children the opportunity to choose one style or another provides an opportunity to develop the skill necessary to choose between styles. Thus instead of inducing mechanical thinking, contact with computers could turn out to be the best conceivable antidote to it. And for me what is most important in this is that through these experiences these children would be serving their apprenticeships as epistemologists, that is to say learning to think articulately about thinking."


I think it's a matter of perspective. People's perspectives can be very different. It didn't hit me until I got really smashed one night and my vision became many times more detailed. The world looked beautiful. No doubt there are people who can see that clearly all the time, people who live in the world, as opposed to outside of it, which is where I feel I am. Those people are probably the ones who can "get" drawing, and who might wonder why it is that others don't.

When it comes to programming, however, I bet those same people can confuse the letters on the screen for the program itself, or it might not be clear to them that the semantics of the language are rich and matter more for understanding it than its syntax does, and other fundamental misconceptions like that.

I can see better if I put a lot of focus on it, but it would take practice to make it something I could readily call forth or sustain, and even then I don't know if I could see as clearly as when I'm drunk, or as clearly as someone for whom it is natural. Going the other way, how do you show people what to look at when programming? I can give myself visual tasks like "try to perceive all of the individual blades of grass on the lawn at the same time," but what is an equivalent exercise for someone who finds programming difficult?


Maybe this is missing the point, but have you gotten your vision tested? Maybe you're farsighted or nearsighted. I think the world looks a lot more beautiful when it's in sharp focus.


I'm very nearsighted and use glasses. However, the kind of detail I see when I'm very drunk is on a different level, regardless of whether I'm wearing glasses or not (like looking at things up close). It might seem strange that you can see better than 20/20 (or whatever "perfect" vision is), but this is something independent of the apparatus of the eye. It's deeper, like the cognitive management of attention/focus.

I'm sure part of it is that I don't actually use my eyes much from day to day... which might be the reason I'm nearsighted to begin with. But even that much wasn't apparent to me before. Then again, the same could be said of those who have trouble programming: maybe there is something they don't use a lot, and it atrophies over time?

Let me be clear: when I say I don't use my eyes much from day to day, I mean I spend most of my energy in my head and don't really focus on my sight at all. It doesn't take much ocular prowess or precision to parse some text on a screen.

But who knows, maybe I'm loony. Koo koo.


Aha, cool. Thanks!


It's not surprising at all to see a huge variation in student ability for an introductory class. The variation is just as big in other areas. My wife has been attending a large public university and from helping her edit group project papers I've seen samples of numerous other students' writing. Some of them do pretty well, but others are borderline illiterate (seriously).


A lot of it is about motivation.

I took AP Comp Science A in freshman year of high school. I got a 5 on the exam and an A in the class. I'd be so excited each day as I went to class. This was the peak of my newfound love of programming.

Then in senior year I took AP Comp Science AB. I did just okay in the class and bombed the AP exam.

More telling was how little I cared about the poor grades.


This reminds me of a class in C I took last spring. Some people fell so hopelessly behind that they probably "borrowed" a good bit of code from other people for projects. On the other hand, some people turned in an almost commercial-quality video game for their final project. I got an A, but fell somewhere between those two extremes.


Learning programming in classes like this just seems a terrible idea to me.



