Without knowing what your self-learning experience is, this is meaningless. You don't mention it once. Have you been hacking on personal Haskell projects in your spare time for 10 years (let's say the equivalent of 25% of your professional experience?), or have you just read a book or two, or nothing at all?
Personally I think it's pretty easy to make the opposite argument. If you work on open source software in your spare time you have a chance to work with a wider pool of more talented engineers in a more collaborative environment and wider ecosystem of software than exists in any single company. Many programming jobs by comparison involve either menial CRUD-style work, or working for years on end on some specific component that does not convey much general-purpose knowledge.
Now, if you aren't the kind of guy that is motivated to work on your own then certainly a business environment will be invaluable, however I reject the idea that corporate programming somehow inherently teaches you more than you can learn on your own.
Not sure I agree. I could give long and detailed answers to all those questions, despite having only written one "industrial" Haskell application as a team of 1. I've learned everything I know about programming on my own -- reading other people's code, and trying their techniques for myself. While I have written some nice code for money, the bulk of my time in front of a keyboard programming has been on my own.
And I work at an investment bank. So I am not sure that this article has any basis in reality. When I interview people, the candidates that are self-taught seem to have the most experience and interest in the job. This almost always makes them an instant "yes".
But I don't see people like this very often; 90% of the people I interview are just "lifers" who move from job to job in the financial sector every 6 months. Their recruiters make them lie on their resumes, and it comes out in the interviews and makes them look like horrible people. Far from being self-taught, they simply don't know anything about programming. ("Your resume said you designed your own processor to process network traffic faster. What was your role on this project?" "Oh, uh, well, I uh, got some coffee for the dev team.")
On the contrary, banks deal with established business models and rarely tolerate even short-term decreases in the added value that each person brings to the company. They hire people to earn money for the bank, not to experiment, and they pay those people accordingly.
So untrue. I work at the biggest investment bank in the country, and this is just not true. Did I mention that this is not true? Not true. Oh so untrue.
Do banks want to decrease their added value? Nope. Do they have a desire to hire 300,000 employees? Yup. And guess what, there aren't that many good programmers in the world. Problem.
Prop shops might be more careful, but banks hire pretty much anyone and everyone and give them a lot of money. It's a big company thing. Do you have buzzwords on your resume? Congratulations, you're in. And you should see the code that this process produces! (My department doesn't do this, but it's clear that most of the other ones do.)
Could you please elaborate on "Oh, uh, well, I uh, got some coffee for the dev team."? I'm really curious what this kind of person actually says in an interview.
Usually, they are stumbling all over themselves, and completely failing every question they are asked. These become the sorts of interviews people write about on The Daily WTF and other blogs online.
That is, unless they interview at an organization that is much more interested in finding a personality compatible with its culture than in the person's ability. These organizations exist; I worked for one a year or two back. The candidate I was interviewing as part of a panel had not been asked a single coding question to that point, and she was already quite far along in the interview process -- the next step after the panel interview was an offer. No one else really cared; they just wanted someone who would increase headcount while not disrupting the culture and structure of the organization.
So, the people who answer this way, they're just playing a numbers game. They keep going on interviews until they hit an organization like I describe above; at that point, all they need to be hired is to just be agreeable.
Why do you think people love "refactoring editors" like Eclipse so much? With enough auto-inserts, everyone can pretend to be a programmer.
Also, most people have figured out how to trial-and-error their way through tweaking existing code. Blank editor window? Nothing. Modifying a bunch of existing code that renders a web page? Surprisingly able.
I don't think anyone is saying it outright, but lots of people add projects to their resumes just because they were in the vicinity of colleagues working on them.
Try being part of the interview process sometime; you'll have lots of fun :)
I actually had someone say something close to that in an interview for a sysadmin position. The guy claimed he had experience configuring Cisco routers.
For this particular job configuring routers was not a primary task, but because I had a passing knowledge of Cisco routers I asked a couple of questions, since he had brought it up. After 2-3 questions which the candidate couldn't answer, he finally admitted that his experience 'configuring' Cisco routers consisted of plugging the network cables into the ports at the back.
I once interviewed someone whose resume said he had "modified the Unix kernel to support [some device]". I assumed that meant that he'd written a device driver, but when I asked him about it, he explained that he'd bought a third-party device driver and installed it on his local system.
To be a true philomath/autodidact, one does need to do more than just read a few books. You need to be possessed of such determination to learn as much as you can that you realize how little you know and are not satisfied until you've at least made an attempt to best the summit. Being able to hack out a few toy programs and call it a day is not enough. The pursuit of wisdom for a philomath is a life-long endeavor and measured with the same rigor as the process for an academic scholar.
We just live with the connotations of the term, "hack."
Here I was expecting to refute the OP, but instead I am left disappointed. He is not some academic elitist guffawing at the plebeians attempting to build their own ivory towers. Instead he is a corporate lifer making some inane red-herring argument. What he's really suggesting is that you cannot teach yourself anything. I find that notion particularly banal.
I simply never had an external pressure that would throw me into the necessity to know ins and outs of Haskell by heart
This is the real problem then, isn't it.
External pressure is not the only thing that can exert the necessary motivation to learn new things. But the crux of his argument seems to be that you have to have these external motivators, otherwise you'll never be any good at what you are learning.
I suppose the argument could be made that if you don't have external pressures, you won't learn whatever you are studying well enough. That argument seems specious to me. I've heard plenty of stories, consistent with my own more limited experience, about people who are motivated to learn by extrinsic factors; on the whole, they just aren't as good, because the extrinsic motivation ends the instant you can convince someone else you are good enough. From there, you only have to convince yourself you are good enough, and that threshold varies a lot from person to person. Having the intrinsic motivation to go beyond what others expect can only add to one's ability.
Or perhaps I am just making an attempt to validate my own intrinsic motivation to learn stuff in spite of a relative absence of external pressure. I can't help but think this author is attempting to validate the converse of that.
Or maybe I'm missing the point. Maybe if I want a position at a bank being held to this sort of standard, then I don't really have to worry about teaching myself new things and I can survive without a motivation above and beyond getting paid. I don't know that I'd be so sycophantic toward that culture, though, if that is all that is expected of me. I certainly wouldn't go so far as to write an article claiming it is a waste of my time.
One other point:
First of all, banks are always a good indicator of the position’s value in business terms.
How true is this? He seems to take the stance that banks are pretty much infallible when it comes to assigning proper value to individuals; the folklore I am exposed to here and elsewhere suggests this isn't anywhere near true.
It seems rather odd, also, that he asserts that banks will fire people over even short-term decreases in added value. This seems really foolish; value generation varies over time. I don't understand how organizations with such an intricate grasp of value production could miss this. That is, unless they don't have the grasp he claims they do.
Learning AI (or any deep comp.sci for that matter) is not like learning J2EE or Ruby DSLs or whatever the fad du jour in the enterprise software world is — read a few chapters of the latest bestselling “pragmatic” book, write some crappy web site and hey presto you are the expert.
This seems like over-the-top smuggery. While I don't agree with a lot of the author's remarks, I certainly feel his pain and empathize with the idea that noodling or other aimless, low-motivation and low-direction play does not lead to the kind of expertise that will get you hired by the kind of people who use Haskell/F# to make money.
But this kind of remark seems wildly baseless. Is he postulating that if we find a bank using J2EE or Rails that you can get hired based on a few chapters of reading and a web site? This is not consistent with the rest of the article.
What I take away from the article is that he believes banks are very demanding, not that Haskell/F# is very demanding. In which case, if he finds himself at a J2EE bank, he's going to be discussing the kinds of things I recall working with in one of the world's biggest banks: the behaviour of transaction queueing systems, type erasure and its effects on reflection in the presence of certain generics, subtle considerations when using JDBC and Oracle, or whatever is now the kind of thing J2EE people lose sleep over (my knowledge is out of date, the things I mentioned may now be passé).
"Learning AI (or any deep comp.sci for that matter) is not like learning J2EE or Ruby DSLs or whatever the fad du jour in the enterprise software world is — read a few chapters of the latest bestselling “pragmatic” book, write some crappy web site and hey presto you are the expert.
This seems like over-the-top smuggery."
That's probably because he didn't say it.
I did (I don't know if you noticed that he was linking to a separate page, or whether you followed the link), a long time ago, in a blog post. And of course the context was different.
Here is the original post from which that sentence was extracted. Hopefully it makes more sense in context.
I keep getting asked this question and I keep saying the same thing - to three people in the last week, for example, two of whom were working through (or planning to work through) AIMA - so I thought I'd put this down here (and point anyone who asks the same question to this entry in the future).
Learning AI (or any deep comp.sci for that matter) is not like learning J2EE or ruby "dsl"s or whatever the fad du jour in the enterprise software world is - read a few chapters of the latest bestselling "pragmatic" book, write some crappy web site and hey presto you are the expert.
"Real" comp sci doesn't quite work like that.
To really understand a standard 3-layer feed-forward neural network, for example, you need to have a solid grip on:
* vector spaces
* basis vectors and change of basis
* eigenvectors and eigenvalues
* basic matrix operations like inversion
* multi-dimensional performance surfaces
* quadratic functions and finding their global extrema with Newton's method and conjugate gradients
* performance measures and steepest descent
* partial differentiation of vector-valued functions
* numerical stability of algorithms
Without that background you will be able to conceive of "class NeuralNetwork" that "has an" instance of "class Node" etc., but you will not (repeat, NOT) be able to do anything useful with real-world data (writing an XOR classifier doesn't count!).
And a feed-forward neural network is only one type of pattern recognizer (or function approximator). There are many more, each with its own trade-offs, and you have to know the math to make those trade-offs.
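To make that concrete, here is roughly the shape of the math involved (my notation, sketched from memory; any textbook's will differ in detail):

    % One layer of a feed-forward network: an affine map followed by a
    % nonlinearity, where a^{(l)} is the activation vector at layer l.
    a^{(l+1)} = \sigma\left( W^{(l)} a^{(l)} + b^{(l)} \right)

    % Training minimizes a performance surface, e.g. squared error over the data:
    E(W) = \tfrac{1}{2} \sum_i \left\| y_i - f_W(x_i) \right\|^2

    % Steepest descent is the update rule (with learning rate \eta):
    W \leftarrow W - \eta \, \nabla_W E(W)

    % Backpropagation is just the chain rule (partial differentiation of
    % vector-valued functions) applied layer by layer to compute \nabla_W E.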
That being said, the best book to start with is AIMA, as long as you (1) learn the required math in parallel and (2) do all the exercises at the end of every chapter (I can't emphasize this enough). Reading through != "working through".
If one is willing to work hard, there are very few fields as fascinating as the various branches of AI.
Once you start down the rabbit hole, however, it may be very hard to continue writing all that heavy-lifting enterprise software without hearing the giant sucking sound of your life going down the drain as you write yet another JSP page for the latest leasing system. ;-)
Be warned! :-D.
Anyway, the really hard part of "learning AI" is not getting the books or working through them systematically (which is hard enough), but that is for another blog entry.
Post Script - Some book recommendations for beginners.
1. Linear Algebra by Gilbert Strang
2. Calculus by Gilbert Strang
3. Artificial Intelligence A Modern Approach By Russell and Norvig
4. Machine Learning By Tom Mitchell
5. Neural Networks for Pattern Recognition by Chris Bishop
6. Paradigms of Artificial Intelligence Programming by Peter Norvig (ok, this one is not quite a cutting-edge AI book, but imo it is the best book on programming ever written, and if you call yourself a programmer you should have it on your bookshelf)
Hopefully that wasn't too much "over the top smuggery" :-). (Not that I care if it is, I write stuff on my blog to clear my head, not be popular).
This was written as an answer to my ex-colleagues from the world of enterprise software who asked for advice on "How to move into AI dev and do interesting projects like you do", and the "leasing system", "heavy lifting", and "read some book and become the (local) expert" references make sense to the people (from ThoughtWorks) who asked me the question.
I am not sure how anything I said fits into this Java-bank-interview vs Haskell-bank-interview discussion.
Fwiw I am entirely self-taught and was addressing other self-learners in that blog post, so I am not sure how the author uses my opinion to buttress his thesis that "self-learning is overrated". If anything, I believe it is underrated.
jrockway said it best above,
" I've learned everything I know about programming on my own -- reading other people's code, and trying their techniques for myself. "
All I can say is that when I quote other people to support my argument, people assume I support the words I am quoting unless I explicitly argue with them. He(?) quoted them; I'm ok with questioning them in the context of the rest of his post.
I disagree with the premise that reading a few chapters of a book and writing a crappy web site makes you an expert on anything, AI, DSLs, Ruby, whatever. Those are the words I took issue with. If you write for clarity rather than popularity (as I do), then I withdraw the suggestion of smuggery and suggest instead that those specific words are simply wrong. If you want to stand by them, that's your privilege, we can agree to disagree.
I agree with your statement "I am not sure how anything I said fits into this Java-bank-interview vs Haskell-bank-interview discussion." I'm not sure either.
Going by the title of the post, I initially assumed the author was making a case for learning in a school/university-based environment, as opposed to autodidacticism ("self-learning"). This turned out to be wrong; rather, he is basically saying that any amount of studying (theory) does not make up for experience (practice).
The real title should be: "Theory without practice is overrated".
It's not even that. His assertion seems to be that practice outside of a sufficiently business-like environment doesn't count, because you won't be compelled to focus all of your time and energy on just those things that bring a positive change in the rate of value production right now. It's a dangerous game to play, but a desirable one moment-to-moment in an environment that severely dislikes disruptive ideas -- exactly the thing that learning for something other than instant gratification is bound to produce.
The person he's quoting says the opposite: practice without theory won't get you far in any comp.sci-heavy field. That is true, but it doesn't have anything to do with being self-taught.
I taught myself. Then I went to school to get the paperwork to aid in getting a job. School was a -joke-. I got a lot more out of teaching myself, in far shorter time. I ended up able to actually get a project done, where the other people in my class were barely able to complete the assignments with help from the teacher.
The point is not that schooling sucks, but that it isn't guaranteed, either.
As others have noted, what made me successful wasn't that I avoided the classroom. It was that my way of teaching myself was to actually program things. It gives the experience you need to lock in the knowledge you're gaining.
This is basically what the article said. The author programmed a lot of Java at work, and read a blog post on Haskell at home. Then he found a job interview for Java easier than one for Haskell.
I've walked people out of the office over things like this. Generally, people like this will shotgun every language and technology about which they've ever read a blog entry. It usually gets them past the phone screen (which I try to spread out on my team so everyone learns to do them) and in the door.
When I see them claiming experience with any of the "cool" technologies, I usually throw them a couple of questions related to Haskell[1], OCaml, Erlang, Lisp, 6502 assembly, etc. If they flub that area without acknowledging that it was a toy project, I know they are probably lying somewhere else too, and I'll be a lot more aggressive zeroing in on the design and implementation of their most recent project. If they're soft, where I might normally give a little room for "nervous at interviews," they're pretty much done.
However, if they nail it, I know we're going to be able to work well together. Especially so if they get excited that someone is asking them about something...that...they'll...never...get...to...work...with...in...our...boring...IT...department.
sigh
[1] This one is almost too easy. "I hear these 'monad' things are really tough. What are they about?"
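For what it's worth, the kind of answer I'm hoping for fits on a whiteboard. A minimal sketch using Maybe (any of the standard monads would do just as well):

    -- Maybe is a monad: (>>=) sequences computations that can fail,
    -- short-circuiting on Nothing so the caller doesn't have to check
    -- for failure at every step.
    safeDiv :: Int -> Int -> Maybe Int
    safeDiv _ 0 = Nothing
    safeDiv x y = Just (x `div` y)

    -- Chain two divisions; if either fails, the whole result is Nothing.
    example :: Maybe Int
    example = safeDiv 100 5 >>= \a -> safeDiv a 2   -- Just 10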
It's certainly true, in most cases, that noodling around on your own is no substitute for professional experience. But where are the entry-level Haskell jobs that would get someone that experience? Only route in is academia?
Having a package or two on Hackage (or even better, in the Haskell Platform) will demonstrate effort and aptitude at learning, and give the opportunity to review your work.
If you want to become strong enough to leverage yourself into a Haskell job, stop writing in anything but Haskell for non-production-work tasks. All your shell scripts--Haskell. All your recreational coding--Haskell. Picture yourself in a Shao Lin temple of Haskell.
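To illustrate, even a throwaway one-off is perfectly doable as Haskell (a made-up example; the file name and task are hypothetical):

    -- wordcount.hs: a disposable "shell script" in Haskell.
    -- Run with: runghc wordcount.hs < somefile.txt
    import Data.List (sortBy)
    import Data.Ord (Down (..), comparing)
    import qualified Data.Map.Strict as Map

    -- Print the ten most frequent words on stdin, one per line.
    main :: IO ()
    main = interact $ \input ->
      let counts = Map.fromListWith (+) [(w, 1 :: Int) | w <- words input]
          top    = take 10 (sortBy (comparing (Down . snd)) (Map.toList counts))
      in  unlines [w ++ "\t" ++ show n | (w, n) <- top]

Once that habit sticks, the standard library stops feeling foreign, which is most of the battle in an interview.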
Also, think about the flip side of your question. Haskell is a pain in the ass for the first few months. If you haven't climbed a substantial part of the learning curve by the time you interview, you might discover that you hate it[1]. Part of the interview process is not just seeing if you can do the job, but that you'll stick around for at least a year and not quit because you don't like the environment.
I normally wouldn't worry about someone trying to make a transition from Java to C#, but I would worry about someone trying Java -> Haskell without having broken through the "difficult" parts.
[1] There's a lot to hate. Imagine, on the flip side, if you had to program in RPG II. You'd want out as soon as you realized you made a mistake. Admittedly, RPG of any flavor doesn't sound sexy, but you get the idea.
Right. A certain degree of self-learning is necessary in order to reach the point where you are firmly ensconced in that professional engineering enterprise gig. Not to mention the fact that as that enterprise shifts to future, non-faddish frameworks, you are going to have to spend a certain amount of self-motivated time learning on your own to keep your skillset up to date, even if your current gig sends you to conferences and otherwise invests in you.
It sounds like the author just played around with Haskell for fun. You run up against the types of issues discussed in this post when you don't have the option of avoiding them. If you are working on a project where the goal is to create a piece of software that functions in a particular way, you will inevitably run up against some road blocks that require you to gain a deeper understanding of how your tools work. If the goal is to play with the language and learn for the sake of learning, it is far too easy to change course whenever you approach a difficult problem.
I would argue that experience in a corporate environment is often overrated, especially when working on a long-term project that lasts several years using the same programming language and frameworks. Sure, you're ending up knowing all the nuts and bolts of the system, but you haven't gained much experience with a broad range of different technologies.
>> No amount of self-learning can come near years of work in a mission-critical environment when company’s revenue is at stake.
So true!!
It doesn't matter how much you read about or experiment with distributed computing (or any advanced field), how many simulations you run, or how many papers you read on the subject.
The only reason I'd sacrifice personal freedom to work for a big company is not money but resources/environment. They have data, millions of users, hundreds of computers and the kind of challenge that will simply push you and make you better at what you do.
It kinda sucks in a way because unlike ECE or any "regular engineering", you don't need money/resources to do cool things in software. Unfortunately, I'm starting to realize that's not entirely true. As soon as you start playing with intense computing, you need $$$
That's really the only reason to work for a big company if you are _only_ passionate about software --> Data, resources, work environment.
Poppycock. You don't need to work 10 years in some investment-bank programming dungeon to know that stuff. You can "self-learn" computer science, and you can answer those questions without much problem. I was more than familiar enough to answer them before my junior year in college, and I was close to getting a handle on concurrency, at least enough for the superficial questions that get asked in an interview, before I entered college. If you follow programming languages, the type inference and covariance, etc., are not a big deal. In fact, I wouldn't think working for an investment bank is how you go about learning that.
Different learning methods make sense for different things and different people. I think for many things in programming, collaboration works best. It's certainly more fun to work on solving a problem with someone, and helping someone reach an understanding of something as you reach it yourself makes you understand it more completely. Just stating things in a way someone else can understand increases comprehension.
I'm not sure that I'd choose to use the terms "investment bank" and "respectable company" in the same context.
Also, when people use the word "real" in a gratuitous way - such as "real" industry, "real" work, "real" programmer, etc - it always makes me suspicious that they don't know what they're talking about and are attempting to needlessly inject gravitas into their statements.
It's not that self-learning can't be good. It's just that it has no intrinsic trial-by-fire, so there's no way to tell the good from the bad, outright.
If you took a class, then you got a grade. If you were in a job, then you had the potential to get fired. If you're self-learning, there's no way to get punished for not learning well. There's no way to measure what you've done.
I personally know some successful self-taught programmers. But I only know they're good because I know them well. If I had to pick a stranger I'd want a reference or a GPA.
Not to be a jerk, but it sounds like the author has problems with his ability to self-learn and is projecting that upon the universe. Turns out, maybe it doesn't work for you, bummer.
Re-title it, "My Self-learning was overrated (by me.)"