Autodidacticism (raganwald.posterous.com)
149 points by fogus on Feb 7, 2012 | 33 comments



This whole idea of the self-taught programmer is a little disingenuous, because while I fit the stereotypical mold of one (no college degree, no formal computer science education), I was anything but self-taught.

To claim so would be to discredit my friend who explained pointers to me in high school, or my first programming boss who helped me transition from writing HTML to SQL.

While I didn't take a class from Hennessy, Patterson, Gosling, the GoF, or McConnell, they certainly 'taught' me through their books, as did the countless anonymous internet posters in #java, #c, and, as of late, #scala.

The feedback I received from friends who reviewed side projects or the JBoss guys reviewing my patches was invaluable.

It may be fair to say I created my own computer science curriculum, or that I wasn't formally taught, but I'm quite hesitant to take on the label of 'self-taught'. Unless someone grew up in a cave without the internet and emerged a programming wizard, I'd caution others against it too.


I agree that few programmers are autodidacts in a really strong sense, like the way some outsider artists have developed completely independent (and usually a bit crazy) art styles out of contact with historical art practice. Anyone who isn't sitting in a cave reinventing computation from scratch will learn things from other people, so it might just be useful to think of different ways of learning rather than a strict "formally trained" versus "autodidact" dichotomy. I have a CS degree, but before that I "self-taught" myself C mainly via FidoNet's C_ECHO... but of course it wasn't a one-person echo with no other humans on it, so "self" applies only in a certain manner of speaking: I had no formal teachers there, but I did have interaction with people whom I learned from.

It's still a vaguely useful term, but I do think it's important to keep the wide variation along more than one axis in mind. Some people mean something like, "I read the manual and taught myself to code", while other people mean something closer to, "I studied something close to a regular CS curriculum, but outside a traditional university lecture environment".


I have to disagree with your interpretation. To teach implies the lesson has already been discovered and is just being passed on.

If you lived alone in a cave and independently discovered programming while there, I am not sure you are teaching yourself any more than scientists doing groundbreaking research are teaching themselves. We typically file that under research. It is only said to be taught after the discovery.

The dictionary defines self-taught as learning without being formally taught. As far as modern lexicon goes, I have to side with the dictionary.


Based on that description, I'd consider you to be self-taught :) I have a degree but I also consider myself self-taught. What gives? I've encountered many folks (probably not your average HN reader) who are simply unable to learn new things without explicit direction under a teacher. The way I'm defining it, a self-taught person uses multiple sources (books, learning-by-doing, bosses, feedback, etc.) to acquire knowledge.

In some ways, the purpose of a university education is to give you the ability to learn new things. The best example in our field is programming languages. While many of us can code in several languages, we probably learned most of them without the aid of explicit instruction.

It certainly comes down to definitions though :)


Uh no, you definitely qualify as self-taught.

It's just that all programmers are self-taught to a degree. You more than most.


Last week I took Philip Greenspun's intense relational database systems class at MIT. I had built some modest database-backed applications before, but I had only learned the minimum I needed to make my applications work. Being forced by the structure of the class to do readings and exercises that I evidently wasn't interested enough to seek out myself provided the framework for me to learn much more than I ever had on my own.

But what material did we cover? Pretty much http://philip.greenspun.com/sql/ with some additional topics. I had seen that online book before. Why had I never made myself study the entire thing? Why had I never gone through all of the examples on my own? I don't know. I guess I didn't see the value in it. Now I do.


I have had the same experience over a somewhat longer term. I am/was an autodidact for most of my life. I learned programming and software engineering mostly by myself while pursuing an electrical engineering major. I switched to software development as a job and learned a decent bit of algorithms and theory, mostly out of interest and sometimes out of necessity.

After displaying great disdain towards structured education for the first 8 or so years of my career, I eventually decided to go back to school for a masters. Things were considerably harder; I discovered that there were parts of math and theory that I didn't quite "get". Often I had "seen" these during my studies, but hadn't looked at them in any detail because they didn't seem interesting for some reason or other. I survived the masters (and actually did quite well, thanks to some guidance from professors and being well organized).

After another couple of years of working, I decided to do a PhD and proceeded to discover in great detail the list of fundamental things I didn't understand about topics I thought I knew well. It took quite a while to make up that ground.

Overall my experience has been that the purely interest-driven approach I took is extremely valuable. But at the same time it makes one prone to fundamental gaps in understanding compared to a more systematic approach. (Just prone; it may not happen.) I have found it valuable to realize this. Among other things, it encourages me to continue poking at the subjects I care about.


I've a secret for you: in programming there are only people who taught themselves, because there is no other way to do it. I guess this is true for many other things as well.

P.S. For the same reason, many people have a CS degree and can't code.


Disagree. There are plenty of people with CS degrees who can code; what's more dangerous is a programmer with gaping holes in their theoretical knowledge.

A friend of mine who both has years of real world experience and a PhD in CS is fond of saying: "A 'Web Developer' is a person militantly ignorant of computer science who spends most of their career re-implementing elementary computer science."

It's sheer hubris to think that one can disregard decades of CS theory and be a good developer. You might know the syntax of a language or have a great memory for APIs, but if you don't know how to efficiently apply that knowledge, what's the point?


As the old saying goes, theory is nice, in theory, but has problems in practice :)

Leaving aside the scornful sneers at those "web developers" (which read more as envy that "this chump makes more money than me and didn't spend as much as I did on a degree"), there is a strong tendency for people to assume that what they know and can do is special, while what someone else knows and can do will never be up to snuff. And there's a tendency to extend this to group membership. Witness the hate of people with theoretical CS education for people without.

In reality, there are large fields of programming which are well-trod enough at this point, or possess layman-oriented tools of sufficient caliber, that years of systematic formal theoretical CS education simply is not a necessity. Which, to be honest, is a good thing; programming is a useful skill to have in the toolbox, and restricting access to or participation in it by denigrating dabblers or self-taught amateurs is a net harm.

Consider, for example, how many people doing quite successful and worthwhile things today got their start writing BASIC on their home computers a decade or two ago, and then consider how many of them and how many of those successful and worthwhile things we'd have if they'd all listened faithfully to Dijkstra's bile.


It's a glib statement, but it's not coming from a place of envy - more likely cynicism than anything else :) As I said, he has real world experience, he's made his money and continues to do so...

I am both self-taught and have a CS-type degree. I've walked both sides of the street on this issue: at times I've thought I wasted 4 years at uni, and at others I've been glad I sat through the sometimes dry and uninteresting lectures when I get to apply that theory in the "real world."


Don't confuse a formal CS education with a formal theoretical CS education. That is, CS theory is only part of a formal CS curriculum.


Typical CS arrogance. I was a "web developer" for years before I was a CS-studying "programmer" and I spent most of my time fixing the assumptions of CS graduates in order to produce something that actually fit the use case, satisfied the customer's desires, and worked in a browser.

"Web Developers" often have a much more practical sense of how computers and the web actually work whereas many CS graduate "programmers" tend to assume that the computer and browser will behave according to theoretical principles.

Both "web developers" and "programmers" are a crucial part of the web app development process. Just because someone can't write a sorting algorithm does not mean that they can't be of value. Similarly, just because someone can write a sorting algorithm does not mean that said algorithm will actually perform in IE7.


CS grads are a source of as much over-engineering as plebeian programmers are of under-engineering.


There is the CS theory aspect to it, of course. I don't encounter that all that often in my daily development. Things I encounter that definitely took me as long to learn, if not longer (in the context of web development):

- decomposing my code into patterns, abstractions, without going overboard,

- documenting stuff so that I will understand or other people will understand things in 6 months time,

- writing code on a team,

- having a good project workflow to handle source code versioning, releases, packaging,

- knowing how to deploy things to a multitude of servers,

- debugging caching problems, optimizing a DB for performance, understanding runtime performance of javascript,

- dealing with customer expectations,

- dealing with deadlines, unexpected events, etc...

- dealing with legacy source code: hundreds of thousands of lines of crappy bullshit.

Even in the aspects that "could" involve more theory, like optimizing a DB for performance, I find that the theoretical side has pretty much nothing to say. There may be some CS involved at the patterns/abstractions level, like which data structure to choose or how to structure a communication protocol. But to be honest, I don't really have the time or the need, and I just resort to whatever library people think is cool and stable. And that works pretty well.

None of the things above is simple, and pretty much none of them are taught in CS. And that's what makes a good developer, I think, not whether I'm able to bang out a Petri net or a neural network.


There's a hell of a lot of wisdom in the collected software development body of knowledge. It's a shame that a lot of that knowledge seems to have been wiped from recent working memory.

I'm a little bit sensitive to this. This morning I received a 55-page 'requirements document' from my management-appointed senior software engineer (whose background is in forestry). It amounted to little more than a very complex relational database schema and some algorithmic pseudocode. I then spent 4 hours via phone and email trying desperately to explain why those weren't requirements. His forceful rebuff: his approach had worked in the past, so it should work again. I didn't blame him - he just didn't know what he didn't know.

I've worked with some damn good programmers who had little knowledge of computer science or software engineering. They've been awesome at cranking out some (even clean) code, and they're some of the smartest people I've ever met. But, software is hard, the breadth of knowledge is vast, and just being a good programmer doesn't cut it.


People, no matter how much prior training they have, are going to invent their own ways. Some will be hits and some will be misses. In your case, it sounds like a miss, but if your manager has had prior successes, he is inclined to disagree.

Some people believed the Ruby on Rails "magic" was the worst violation of software engineering to ever exist, while others felt it was the greatest advancement in programming to date.

No viewpoint is wrong, because there is no such thing as right.


> No viewpoint is wrong, because there is no such thing as right.

"2 + 2 = 5"

"do mutlti-threaded reads and writes to a shared mutable data structure and you absolutely never have to worry about race conditions, clobbering or corruption"

do you still think all viewpoints are equal and none are right?


Absolutely, assuming you are not trying to take my quote out of context. You can do whatever you feel is best.

Take, for example:

做多线程读取和写入到一个共享的可变数据结构,你绝对不会担心竞争条件,会破坏或腐败 (roughly: "do multi-threaded reads and writes to a shared mutable data structure and you absolutely never have to worry about race conditions, clobbering or corruption")

Maybe you can grok that, though chances are that you cannot. It is still just as valid for someone to believe it is an efficient method of communication though, even if you disagree.

Generally accepted patterns can be useful, but it takes people pushing that envelope for us to evolve.


As many have noted in the past, 2 + 2 does equal 5 for sufficiently large values of 2. Marking it as an absolute falsehood merely indicates an insufficient knowledge of various areas of mathematics.


You really have no choice but to be somewhat autodidactic if you develop on the web, but I like to make the point that Tiger Woods has a coach. Michael Jordan had a coach. Coaching is important, because learning happens best with constant feedback.

Just read Moonwalking With Einstein, which touches on this.

Seems like this guy Dr. Ericsson in Florida is the go-to expert on mastery in general.

As working programmers, I do think it's clearly in our interest to spend time learning how to acquire new skills as well as we can.

Moonwalking With Einstein: http://www.amazon.com/Moonwalking-Einstein-Science-Rememberi...

Dr. Ericsson: http://en.wikipedia.org/wiki/K._Anders_Ericsson


During my "learning" life I have been in both sides; in different fields but definitely in both sides. While I have learn most of my knowledge in the university (2 undergrads, 2 masters and finishing a phd) there have been other topics besides CS, mechanical engineering, statistics,... that were interesting to me. Sports and nutrition being the most important, and I have never had any mentor or taking any course on them.

In the university, education has to be focused on teaching what is needed, not what an individual wants to learn. I did not like databases, but I took 2 DB courses. I did not like compilers, but I took 1 course focused just on that topic. Both topics are useful, and somebody working in CS needs to know their basics. This cannot (and should not) be decided by an individual solely based on his interests or needs. A basic layer of knowledge is decided by experts in the field, who determine what ANY xxxx (put here whatever career you want) must have.

What I felt in the fields where I am an autodidact is basically a huge lack of basic knowledge. How am I supposed to decide whether book A is better than book B for the topic I am trying to learn? Do I need to know something before getting into this other topic? How do I decide the path I need to follow to reach a certain level of knowledge? Those questions are (almost) always solved by somebody guiding you through the process of learning.

There is something that has always shocked me: most autodidacts cannot accept that their education might be partial and that the lack of a base (generalizing here) might affect their adaptation to new projects or their speed at learning new techniques.

Following the structure of a university in careers like CS or engineering should be a must, and new methods of education are stepping into the learning process by helping autodidacts be directed onto a clear path of what is needed.


> How am I supposed to decide whether book A is better than book B for the topic I am trying to learn? Do I need to know something before getting into this other topic? How do I decide the path I need to follow to reach a certain level of knowledge? Those questions are (almost) always solved by somebody guiding you through the process of learning.

This is rapidly becoming a thing of the past:

http://ocw.mit.edu/courses/electrical-engineering-and-comput...

I'm 3 years into a software engineering degree and in all honesty, I'm doing it for the piece of paper. Last semester, I attended about 10% of the classes, studied by myself and still managed to get good grades. I wouldn't recommend it to people who aren't passionate about the subject though.

One thing I hate about the 2 universities I went to (Montreal, Hong Kong) is that many teachers (un)knowingly discriminate against students who don't attend classes. This often takes the form of giving hints about the exam during tutorials, or giving privileged information about which topics will be evaluated and which won't. In Hong Kong, I have even seen teachers give participation grades or require a minimum class attendance in order to pass (this happened in computer engineering/science!).


Precisely my point here (as I mentioned in the last paragraph): knowledge is becoming widely available and new ways of learning are arising.

You are no longer an autodidact; you are studying the topics the college is telling you to follow. Some of them you would never have considered if you were the one choosing; some others you would. But at a higher level (and probably in the future) you will appreciate the starting point that college has given you.

PS: I agree completely with you regarding attending classes, though I would keep homework/exams to help retention of the material.


The major benefit of a "formal" education (or of learning from someone with lots of experience) is that the teacher has a better feel for the overall territory than a novice does. They can help guide you in the direction you need to go.

Learning by yourself should obviously be encouraged, but it's often difficult to know what thing to learn about next in order to ensure that the new knowledge integrates well with what you already know and (pragmatically) helps you solve the problem at hand.

To give a specific example, I'm lucky enough to sit next to our chief infrastructure engineer at work and he has helped me learn a huge amount about Linux over the past three years. He hasn't done this by giving me "formal" instruction, but instead he's pointed me in the right direction when I get stuck with something.


The word "autodidact" hides a lot of complexity behind it that makes it difficult to make any coherent argument for or against it.

For one thing, a teacher cannot shove anything down your throat. It is always the student who learns, and in that sense everyone is an "autodidact". This is not to belittle a teacher, whose role in directing a student (keeping her on her toes, challenging, questioning, probing, explaining, etc.) is valuable. The thing is, a formal teacher is not the only source of those things. Our parents and peers do them too.

You learn, you get good at something ... anything. In today's world, feedback from the community on what you need to get good at and whether you're getting good at it seems adequate to the point that if you know how to make use of that, you can learn stuff efficiently and effectively. You might say more of us are becoming "allodidacts".

The third point is that teachers also continuously learn. Peer learning is how scientific progress happens, for instance. So if you do continuous learning at all, you're likely to do it for a much longer fraction of your life than the period during which you're "taught".


I don't agree with this:

Reality directs your learning.

With or without a mentor you are still being forced to work in a particular direction. Do you read a book for your own benefit, for your employer's, or for both? Who appraises you? Are your colleagues impressed by what you've learnt?

Unless you live on a desert island you probably have constant feedback.


I'm an autodidact.

Isn't it part of being self-taught to find mentors - in books, online, and if you're lucky, in person?


I'm an autodidact too--I was lucky enough to find this big group of mentors who were willing to speak to me several times a week about a variety of subjects. Then, after a few years, they all agreed that I now knew a lot of things and gave me a piece of paper saying so.


Yes - the article argues autodidacts become "students," and must accept that a student can't autodidact. Or they fail.

This is not some strawman I want to knock down. From the article:

"...there is more to being a good student than being good at learning. One of the responsibilities of a good student is to seek out excellent teachers. In the Wikipedia article on Autodidacticism, I find this paragraph:

"Autodidactism is only one facet of learning, and is usually complemented by learning in formal and informal spaces: from classrooms to other social settings. Many autodidacts seek instruction and guidance from experts, friends, teachers, parents, siblings, and community."

He misses the second sentence of his Wikipedia quote. He argues that you must sometimes become the student.

Any excellent student knows he must eventually teach from his corpus of knowledge to cement what he knows, and pit it against new, hungry students.

Even I - the autodidact - knew to communicate what I learned, test it in the "field of battle," wherever I could find that.

For me, it was all before I reached "higher education." So by the time I had professors, classmates, and TA's, I had already developed a highly effective method of autodidactism.


As far as knowing what my goals are, there is nobody better than myself as a teacher. Mentoring or other 1:1 discussions are where a real teacher could help with career guidance, philosophy, et cetera. I also go online and talk with people there, or I could leave the house if I so choose.

As far as selecting the best teaching material, I win. I can pick up a used copy of any book I feel is necessary, and they are cheap. Textbooks with exercises are a must. If I were shut off from the rest of the world, the sheer volume of college textbooks I physically have copies of would keep me busy for years.

You don't learn from the teaching; you learn by doing. If you teach yourself but never use what you have learned, never do 5-10 of the medium-to-hard problems, and never try to put together what you have learned, you will fail.

In the end, we agree. You should NOT be the sole teacher, unless you plan on just living and working for yourself. Perspective is a hard thing to get from yourself.


I don't necessarily disagree with the author, but almost inevitably you cannot be a pure autodidact. If you consult a book or the internet for information, you are learning from someone else (just not necessarily audibly or directly). The real question is: are you somehow at a disadvantage if you avoid academia? Is it better to be practiced in real-world scenarios and real failures than to learn theory? Personally I think a little of both is good, but you do not necessarily need to be "taught" in the traditional sense we think of teaching.


But no matter what you think about formal education, it has one thing going for it: The separation of teacher and student.

It's more than just a separation; it's an inherent power asymmetry.



