Most of the self-taught engineers I've met struggle with algorithm questions, even relatively straightforward ones. Most also don't have a clear idea of concepts like algorithmic complexity, or how to gauge how efficient an implementation will be. This is mainly because CS students are forced to learn these things, whereas self-taught programmers would need to be particularly motivated to sit down and study them, because there are no immediate practical uses for this.
The great debate, of course, is: how important are those skills? Many self-taught programmers I know reject the idea that these CS fundamentals are important. And to their point, many self-taught programmers that I've met/hired/worked with were productive on day one because they have real-world knowledge and experience. Case in point: the founding engineer of the company I work at is a college dropout, he's more than 15 years younger than me, and he's excellent. As well, I'm a self-taught programmer but came from an EE background, so I had two CS courses under my belt before I decided to pursue programming after graduating.
> Many self-taught programmers I know reject the idea that these CS fundamentals are important.
Besides some CS math courses I took while doing an unrelated engineering degree, I'm self-taught. And I've always felt incredibly limited as a consequence. E.g. given the popularity of Rust, I'm really interested in type systems to limit aliasing. But the papers make my eyes glaze over. Self-teaching is difficult, because even the stuff written with an eye towards approachability assumes mastery I don't have of a couple of years worth of discrete math.
As both a Rust user and a CS grad, I can confirm both that papers still make my eyes glaze over and that thankfully you don't need to understand the type system to use Rust effectively. :)
Personally I'm thankful for my CS background, but you also need experience using various tools to round out your knowledge. To put it another way: real-world experience teaches you how to solve problems, while academic experience teaches you that hard problems can be solved.
I'd agree 100% with the algorithm point, but I'll add data structures to the list. Beyond low-level memory - think shift registers - I had no concept of linked lists, pointers, stack vs. heap, etc.
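To make the data-structures gap concrete, here's a minimal sketch (my own illustration in Python, with made-up names, not anything from the course) of the first thing such a course covers: a singly linked list, where each node holds a value and a reference to the next node:

```python
# Minimal singly linked list: each node holds a value plus a reference
# ("pointer") to the next node; the nodes themselves live on the heap.
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def prepend(head, value):
    """O(1): allocate a new node and point it at the old head."""
    return Node(value, head)

def to_list(head):
    """O(n): follow the chain of references until it runs out."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

head = None
for v in (3, 2, 1):
    head = prepend(head, v)
print(to_list(head))  # [1, 2, 3]
```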
My solution was simple. I contacted the CS department from my undergrad (Rose-Hulman - small school, I knew the department head) and asked for the syllabus for the data structures course. The prof was great and went so far as to send me all the homeworks, last year's tests, and recommendations for books. It was above and beyond and reinforced my appreciation.
Also self-taught with an ME degree (also from Rose-Hulman. Hi Keith!).
I'm an engineer at Twitter. Due to the interview process, I have yet to encounter anyone (myself included) who doesn't have a pretty sound understanding of algorithms and data structures. The biggest difference I've noticed between self-taught folks and CS majors is comfort level with bit-twiddling. All the high-level stuff I know I've had to learn to do my job. All the algorithms and data structures stuff I've had to learn to pass interviews. Moving bits and bytes around is hard for me because it's so rare for me to encounter it in my day-to-day work.
As for how I learned all this stuff? Read Steve Yegge's Google interview post[1], buy the Algorithms book[2], learn everything Steve suggests you learn.
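To give a flavor of the bit-twiddling gap, here are a couple of classic tricks (my own examples, not from the Yegge post) that rarely come up in day-to-day high-level work:

```python
def is_power_of_two(n: int) -> bool:
    # A power of two has exactly one bit set; n & (n - 1) clears the
    # lowest set bit, so the result is zero only for powers of two.
    return n > 0 and (n & (n - 1)) == 0

def count_set_bits(n: int) -> int:
    # Kernighan's trick: each iteration clears one set bit.
    count = 0
    while n:
        n &= n - 1
        count += 1
    return count

assert is_power_of_two(64) and not is_power_of_two(96)
assert count_set_bits(0b1011) == 3
```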
Not too long ago I had to bang on some code that was running too slowly to be practical. Thanks to knowing a bit about algorithms and complexity, I was able to knock it down from cubic time to just-over-linear time, reducing it from about 9.5 seconds of a 10-second response time to "basically none" of a 0.5-second response time.
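The post doesn't say what the actual fix was, but a common shape for this kind of speedup is replacing a nested scan with a hash-based lookup - a hypothetical sketch:

```python
# Hypothetical illustration of the kind of rewrite described above,
# not the actual code: find which needles appear in the haystack.
def find_matches_slow(needles, haystack):
    # O(n * m): `x in list` rescans the whole haystack for every needle.
    return [n for n in needles if n in haystack]

def find_matches_fast(needles, haystack):
    # O(n + m): build a set once; each membership test is O(1) on average.
    seen = set(haystack)
    return [n for n in needles if n in seen]
```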
"because there are no immediate practical uses for this"
It depends on how hard a problem you're trying to solve. If you're writing the MVP of some CRUD app that supports a handful of users, you don't care about efficient algorithms. But if you ever hope to scale up to support massive numbers of users without using up massive amounts of extraneous servers (think of Google's or Facebook's data centers, and what their power and cooling cost), then knowing how to create efficient algorithms is of tremendous practical importance - it could mean the difference between profit and loss.
Another example on a small scale: If you want to write a game that drives a virtual reality display in real time with realistically rendered imagery, knowing how to write efficient algorithms is also going to be very important.
> But if you ever hope to scale up to support massive numbers of users without using up massive amounts of extraneous servers
When the time comes that a self-taught engineer is scaling to these massive numbers, there should be money in the budget to hire people with expertise in efficient algorithms.
"there should be money in the budget to hire people with expertise in efficient algorithms"
1. Wouldn't you rather be the developer who built the scalable software that drives your company toward its IPO than the developer whose skills suddenly became useless just as the company was getting really profitable? Why should they even keep the first developer around after they hire someone else to clean up their mess?
2. Does your company really want to have to re-write all its critical code just as that code is starting to make money for them? It's OK for an MVP to be lacking features; features can always be added later. But if its fundamental architecture isn't efficient and scalable, you're going to run into a brick wall at the most inconvenient time.
The grandfather post suggested it was a founder who was self-taught. If I am the founder and I am self-taught, I have to build what I can into the early version, and when I have enough money, hire someone to build it better. That day might be a few months from now, or years, or it may never come at all.
I can't imagine that someone could be an effective programmer without at least a basic understanding of fundamentals like algorithmic complexity and data structures. They just touch every aspect of programming.
Being able to answer those questions represents one or two freshman/sophomore-level courses. It's signaling, more than anything intrinsically representative of skill.
This is going to be a highly biased answer, and will probably be downvoted. I don't think there is any difference between a "good" programmer who was self-taught and one who has a degree in CS/SE. The reason I put "good" in quotes is that, at least in my field, there are a lot of terrible programmers/coders/developers/etc. from all backgrounds, self-taught or degreed (I work in a local market for .NET). I think a person's background is a terrible starting point for determining their competence.

I personally am self-taught, having started "coding" when I was about 8. I have a colleague who didn't start coding until he was in college, and I think he and I are about the same skill level. We have different strengths: he is more architecturally minded, while I am more of a "hacker" (I can figure any technical problem out, though it may not be pretty).

In the course of my career, I have met many self-taught coders and university-trained ones, and to me it makes almost no difference. What it comes down to is passion, and that can be extremely hard to quantify. Having been doing this for almost two decades, I find you just have to feel the person out. If you are passionate, you should be able to tell whether they are as well.
Self-taught person here. This is surely a minefield of downvotes, but I've noticed several differences in CS students:
1. While CS students have a broader array of programming knowledge, they have a narrower perspective about it. By this I mean they have difficulty distinguishing their own programming experience and knowledge from that of the world as a whole. They assume everyone learned Java in CSE101, so they can't fathom how someone could program Ruby without being able to explain it in Java-esque CSE101 terminology.
2. CS students participate in open source more. Self-taughters usually code in order to complete a direct objective (e.g. make a site, sell some widgets); they weren't exposed to the culture and peer recognition around programming that comes from years of academia. CS students are often encouraged or required to participate in open source projects. This sometimes leads to the false, insular belief among the educated that all "good" programmers work on open source projects. This pops up on HN every now and then.
3. CS students like libraries more. My guesses for why: (1) the academic exercise of writing your own libraries; (2) libraries are a generally more academic approach to programming, where you write code "by the book" and account for many what-if possibilities; (3) it's 11:00pm and you need to submit your code in an hour - you don't care about maintainability - so you shove a library in there to eke it out as fast as you can. Self-taughters don't have such academic exercises or deadlines in their learning, and they'll usually be maintaining their code for a long time, which in my experience discourages libraries.
These are my observations from the coders I've worked with. Of course, there are exceptions - I know there are many self-taughters who use libraries and work on open source - just not as many as CS students.
I'd disagree on this one. From my admittedly anecdotal experience, all the people I know who are heavily self-taught (myself included) have participated more in open source. Most of these people are driven by a passion for software, rather than a need to sell something.
I actually find both 2 and 3 to be more prevalent in self-taught programmers, especially those who know enough "officially" taught people to have an inferiority complex (i.e. why re-invent the wheel when surely someone else has already re-invented it better?).
Also, I find that most CS students miss the forest for the trees, and focus on the technical rather than the immediate reality (Big-O < shipping features) at a small company.
Overall, none of those qualities are bad, but both can contribute to the success or failure of a company.
Moral: caveat emptor. Culture and diversity are important.
If you pull in a dependency on Xlib because you wanted a matrix math function from a graphics library, then it does.
If you're building on a platform that emphasizes many small libraries, and you pull in a few well-reputed ones, only to find that your dependency graph now includes 15 small buggy libraries hosted on github and abandoned by their developers, then it does.
If you introduce a dependency on a Windows-only library in your game's physics engine, and then want to port the game to Linux, then it does.
It's possible to be sloppy with libraries, but I don't think academically grounded programmers are any more prone to being sloppy.
If you are writing an enterprise CRUD or workflow app that likely won't be looked at again until a systems upgrade comes in 3 years, a sprawl of abandoned or semi-abandoned libraries can be a nightmare to update.
Enterprise IT management includes trying to coalesce around well-known corporate standards for that reason. It's much easier to maintain and upgrade a portfolio of projects if they all use the same libraries than if each project uses a different third-party library. Sometimes these standards mean sucking it up and using core language features or existing library features to bridge the gap to where you want to be.
Most universities don't have a software engineering program, just a CS program. Where an SE program does exist, it's typically heavier on the software aspects than the mathematical side of things.
I'll prefix this with the necessary disclaimer that there are exceptions on both sides, and by generalising I'm necessarily going to get it wrong for some specific cases. One of the best devs I know dropped out of CS in the first few months, contributed to open-source projects (particularly OS X stuff), got recruited by Apple, and has been with them for some 6-7 years since, on a product you all know.
Caveat over, onto the generalization: From having done a few dozen interviews now, the self-taught applicants are more likely to have holes in their knowledge that they don't know about. That is, CS grads have (hopefully) gone through a broad and formal program so they have a more complete mental map of the knowledge space and their own weaknesses. The self-taughts don't know what it is they don't know.
This gap can be overcome, but the pattern I've often seen is that the knowledge gap and disadvantage are compounded by the first jobs the self-taught tend to end up in. Rejected from graduate programs at larger firms where they'd be exposed to a wider range of talent, they often end up in smaller shops, working a niche where they're not challenged to keep growing (just as often due to the nature of the business: smaller shops tend to be contract businesses, and contract work doesn't provide much opportunity to go deep on problems in the way that will stretch and grow you). And because they don't know what they don't know, they don't realise the gap between their skillset and that of their contemporaries who went the formal CS track.
Frequently I've interviewed developers who've done this for 5-10 years and decided they wanted to try working in a larger company. And they're hard interviews. I hate it when an earnest applicant is an obvious 'no' in the first few minutes, and it's because they've specialised into a low-level dead-end, and they didn't realise it.
Having said that, I think we're at a point where it's easier for a self-teaching programmer to overcome this. Even compared to when I went through uni in the mid-2000s, it's ridiculously easy to get the material that would cover an undergrad's CS education. There's no need for a self-taught person to be caught short in an interview on not knowing what a linked list is, or the basics of algorithmic complexity. I could probably put together a list of 10-20 Wikipedia articles and Khan videos that, if you were to read and watch until you understood, would cover you for the vast bulk of CS-related questions you're likely to hit in an interview.
So by "gap" you mean unable to answer some trivial questions during an interview that don't pertain to the job at all?
While I agree self-taught developers may lack knowledge in certain areas, they compensate by being able to learn quickly and efficiently for the job at hand. School-taught developers, though not always, can lack that discipline and/or efficiency. At least in my experience, so take what I said with a grain of salt.
> they compensate by being able to learn quickly and efficiently for the job at hand.
We ALL need to be able to learn quickly or we simply don't last. If you've been doing this for a decade or more, you're guaranteed to have that skill in abundance, regardless of your origins.
What the OP was getting at is that you get exposed to a host of different ideas and paradigms in a proper CS program: Assembly, Lisp/Ada/Scheme/etc., C/C++/Java, graphics, machine learning, computer architecture, etc.
I learned how to code when I was ten, but I would have never exposed myself to any of those things if I didn't focus on CS in college. What does that add? Exactly what the OP said. I know what I don't know. And it's a lot.
> We ALL need to be able to learn quickly or we simply don't last.
I totally agree with that statement. In my response, I meant that self-taught devs have a leg up, so to speak, in the beginning. In the end, we all need to learn quickly and adapt regardless of how we started.
> Assembly, Lisp/Ada/Scheme/etc., C/C++/Java, graphics, machine learning, computer architecture, etc.
From my own experience as a self-taught developer, I have learned, though maybe never used professionally, almost all of those topics to some degree, because I had a keen interest in understanding software and computing in general. I find that self-taught devs, though not a majority, are like that - they want to know and understand because it's a passion of theirs. I'm not saying that university-trained students lack that passion either. I believe at the end of the day it really doesn't matter whether you are self-taught or not; only the passion you bring to learning and understanding matters.
Do basic CS concepts pertain to the job? Well that depends on your job I guess. I consider them the tools of the trade that will outlast all the languages and frameworks that will come and go in a career. And you can intuitively learn them by self-teaching, but it's less certain, and just as importantly, you may not learn them in the same vocabulary as others. CS gives you a shared vocabulary for talking about these concepts with others.
Given your other comment, I suspect you're speaking from an observation bias that leads you to a judgement about the general case that doesn't necessarily hold. You witness those who've made it as devs, but that doesn't tell you anything about those who didn't.
The important question is: of those who self-learn, how many of them end up as decent developers, versus those who go through a formal CS education? Or to re-frame: what's the better course for the majority of people who want to become developers? I don't know of any numbers on this, so I can only go off what I've seen, and what I've seen is that for the average person considering dev, a CS education is the best option.
If you read my other comment you also read the part where I said I have been around a lot of terrible developers (worked with, for, under, around, consulted for, etc). These people have "made it" as you said, with long careers in software development, but they are far from competent. Which leads us to your next point:
> how many of them end up as decent developers
I don't honestly think where or how you start can be used as any sort of determination of how "decent" you are. What matters to me is passion (I've said this a few times already in other comments). When I am hiring and interviewing developers to bring onto the team, I barely look at their resumes and I definitely don't look at their education history. What's more important to me is the initial call, and then the face-to-face. I don't even bother asking any real technical questions. What I'm looking for is a spark; something that tells me this person loves to build and learn. Self-taught or university-taught makes no difference at the end of the day, and should never be used as a basis to judge whether a person is a "decent" developer.
> There's no need for a self-taught person to be caught short in an interview on not knowing what a linked list is, or the basics of algorithmic complexity.
> I could probably put together a list of 10-20 Wikipedia articles and Khan videos that, if you were to read and watch until you understood, would cover you for the vast bulk of CS-related questions you're likely to hit in an interview.
I think a large proportion of CS students are self-taught anyway. I taught myself most of my coding skills despite being a CS student. The degree taught me the fundamentals; I then built on that myself.
What sort of stuff did you learn in your CS course that has proven valuable in your career and that you may not have learned otherwise? I ask because I don't want to fall behind due to not going to university.
The most valuable thing to me in the CS program was really being exposed to things I didn't know existed. It may not be as big an issue today with places like HN and Reddit to push you in unexpected directions, but I really benefited from being introduced to stuff I wouldn't have even thought to look at. Once the introduction was made, I was mostly self-taught because I'd dig into it out of pure interest, but that initial introduction was vital.
I agree. I only did a CS minor (10 courses), but even just doing that exposed me to things that I'd never really been exposed to, like operating systems and numerical computing.
I was self-taught and went to community college to sort of "validate" that I have CS skills (to help get a job). I think there are a lot of things you do in school that you wouldn't otherwise learn or explore. Things that come to mind are binary, assembly language, how sorting algorithms work, and Big-O notation. Those topics are kind of dry, and not something you're likely to try to learn about unless it's required.
For me those skills aren't incredibly useful in everyday work (I mostly do web development), but I think having that low level understanding does help from time to time, in a hard to measure way. I feel like it sort of gives you a better intuition on certain things, like evaluating new technologies, why you're having performance issues, etc.
Also watching a teacher code, and listening to other people describe their thought process was interesting. I didn't do a lot of pair programming, but if you go to a university where you do that, it would be useful.
That was what a degree was for in the old days - it was to teach you how to learn. There are other paths nowadays, I suppose, but a degree is a tick from an institution (whose expertise is learning) certifying that you can learn something - and presumably, if you've done that, you can learn other stuff.
I'm not great at learning things on my own and benefit greatly from having an instructor, but I feel like I need to completely understand something before I can say I've actually "learned" it. So getting a CS education was huge for me in terms of learning how and why things work the way they do, which really helped my confidence that I know what I'm doing.
I've also learned about things such as linked structures, which I probably wouldn't have bothered to look into if I had taught myself, and which are actually some of the crucial things companies look for when they hire.
Another thing I've noticed personally is just a faster progression. I have a friend who taught himself how to make iOS apps, and it took him 3 years to get to the point that I did after taking one online course on iTunes U for it. Having a structured course really makes a huge impact by providing a logical progression and teaching you all the little things that take a long time to learn on your own.
Like others have said, I find self-taught programmers (I'm a tutor, so I meet a lot of them) have an attitude of, "why should I care about time complexity?" They are motivated to get things done quickly, and will do it to the best of their ability, but it may not be the best solution, since that would require deeper theoretical knowledge.
On the other hand, programmers coming from academia are expected to "just know" a lot of the practical tools that are used these days. I did C++, Java, MATLAB, and machine learning-type stuff in school, but learned Git, Rails, Python, Hadoop, etc etc. after graduating. Recruiters, etc. expected I'd just be ready for that stuff right off the bat.
I was self-taught, then did a CS degree, so I have a foot in both camps. Some of the things I learnt in the degree that I never would have learnt by myself (others have mentioned some of these): algorithms, relational theory, O-notation, symbolic logic, stats, CSP. Then there were things I learnt myself, because I wanted to, that I never learnt at uni: assembler, 2D/3D graphics, C++. Then there are things I've taught myself since uni: compiler theory, database optimisation, functional programming (I actually did this at uni but couldn't see the point at the time - the hardware was rubbish back then), web stuff.
So, broadly, the difference in my case: uni gives you the theoretical foundations that you probably wouldn't learn yourself, while teaching yourself programming gives you the hands-on stuff you'd never learn in depth at a uni. Two sides of the same coin, I suppose. It would be rare for a self-taught person to learn the theory you'd learn at uni, and I don't think you'd ever get that depth without some uni training, imho.
The other thing that comes to mind is if you're smart and self taught - why wouldn't you do a degree? The only good answer I can think of is that you're so friggin awesome you're churning out code that everyone says is awesome and google or apple has hired you already, there are perhaps 10 people in the world like this - everyone else do your degree :-)
Smart people who love coding end up pretty good at programming whatever their background. That being said, my (extremely generalized) observation has been that people with more formal training are able to spot issues earlier than people with less formal training, but end up with more false positives (issues that aren't issues). I've found CS majors also have a better nose for "code smell" from having seen very well-constructed code and dealt with graded assignments for years.
Within the enterprise IT world I've worked in, the self-taught programmers tend to be very bimodal: likely to be truly excellent or unspeakably terrible, whereas programmers with a CS credential tend to be more normally distributed.
I do think that the credentialed programmers I've met are a bit more technically flexible than self taught programmers. A CS degree requires learning different languages, coding styles, and technologies whereas self taught coders are more likely to have come up through one language/technology stack and may never leave it. That's a massive generalization though and I can think of many counter examples.
Note: I was a CS undergrad and mostly deal with very large IT departments rather than startups or consumer application development.
There are a lot of interesting views here on how everyone has been exposed to self-taught devs and CS grads, and from where I live and what I have seen, it's the opposite. I have seen horrible code from both sides, from indie app devs to professional algorithm guys who are smart but shouldn't be coding. It's hard to judge from a person's background without doing much work to review it, but on the east coast I have met computer science engineers who somehow don't know what an IP address is and couldn't throw together an HTML page, but still get a junior C++ job.

I have been obsessively reading and googling my whole life, always searching for industry standards, best practices, how low-level code works, and what's going on in the background - if the best were doing it and using it, then I wanted to know. I am now a dev working on just about every area of a large website and currently finishing up my first year in college... so far I can say college isn't enough. True passion and interest in your field will show in your work. Just watch some DEF CON vids and you'll see some amazing programmers and hackers from both sides.
The differences within groups far outweigh the differences between them.
I'm a 30yo self-taught programmer. I've worked with many CS grads over the years.
There are pretty big gaps in my knowledge of what would be considered fundamentals, some more important than others. I think the type of software you write largely determines how important these gaps are, and being aware of the gaps in my knowledge has been more important than actually having the knowledge itself. If I need to implement something that requires some knowledge I haven't yet picked up, I learn what I need as I need it. Sorting algorithms are an example of something any CS grad knows far more about than I do, because understanding all the alternative ways of sorting is not something I've ever needed.
An important distinction is the difference between knowing something and knowing the language to describe that thing. I understand the time complexity of algorithms, but I didn't learn Big O notation until recently.
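For instance (my own sketch, not the commenter's code): plenty of self-taught devs know that binary search on sorted data beats a linear scan long before they learn to call one O(log n) and the other O(n):

```python
import bisect

def contains_linear(items, x):
    # O(n): may have to look at every element.
    return any(item == x for item in items)

def contains_binary(sorted_items, x):
    # O(log n): halves the search space each step; requires sorted input.
    i = bisect.bisect_left(sorted_items, x)
    return i < len(sorted_items) and sorted_items[i] == x
```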
I don't know if this observation is true in general, or more common for the CS programs where I live, but I've noticed that CS students and recent grads are terrible programmers. That is, their ability to produce well-written, maintainable code is horrible. This can probably be said of all new programmers, even if they are self-taught, but I think the difference is that many of the CS students I've worked with have the illusion that their CS knowledge is what is most important when they write software, so they are more ignorant of their ignorance.
Learning all the fundamentals of CS in the classroom is no doubt a huge benefit, but many of the other things that make a good programmer come down to experience in a lot of cases, and the classroom is not going to be a substitute.
This is a very subjective answer. Personally, I studied computer engineering, which is essentially the hard classes from computer science combined with the hard classes from electrical engineering. I suck at algorithms. I have a hard time with abstract theories. I contribute to open source - not too much, but I do. I started in .NET and now try to stick to JavaScript and Node, or Rails. I have held the titles of architect, director, and team lead. My favorite title, though, is "engineer".
The answer to this question is very simple. It's not school vs self-taught. It's PASSION and I fucking love programming :)
Self-taught programmers tend to be more practical, while CS graduates (and worse, Doctorates) value theory over experience.
Some real life examples:
- The algorithm is tight enough that the choice of programming language doesn't matter. Also phrased at least once as "the constants in big-O notation just don't matter".
- No, you can't install a configuration management tool on the web server; the web server must be completely isolated to protect our network.
- If I'm using a programming language with so much power, why would I ever want my configuration files to have any less power?
- If only there was a programming language where the (type system / memory management / homoiconicity) was more powerful, we could solve every problem automatically.
Of course, on the other side of the fence, there's these pearls of wisdom:
- Of course C++ is memory safe, if you follow these few hundred best practices...
- (Ruby / Python / Perl) is fast enough to solve every problem.
- If I can use Javascript on the server and the browser, why would I want to use anything else?
- Why would I ever care what the Big-O complexity is for this algorithm?
It makes for an interesting comparison and contrast. And since I come from the self-taught angle, a couple of the second group of quotes are my own.
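On the "constants in big-O just don't matter" quote, here's a toy sketch (mine, hypothetical) of two functions in the same O(n) class with very different constant factors:

```python
import timeit

xs = list(range(100_000))

def touch_once(xs):
    s = 0
    for x in xs:
        s += x
    return s

def touch_ten_times(xs):
    # Still O(n), but roughly 10x the work per element.
    s = 0
    for x in xs:
        for _ in range(10):
            s += x
    return s

print(timeit.timeit(lambda: touch_once(xs), number=10))
print(timeit.timeit(lambda: touch_ten_times(xs), number=10))  # ~10x slower
```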
I'm self-taught (from the time I was 9) and now, at almost 30, nearly done with my CS degree. I worked "in the industry" before going back to school, and the biggest standout for me is that the majority of CS students are terrible at figuring things out on their own. I've noticed that many of them, regardless of how well they understand the coursework, give up and start asking others for help when faced with a programming problem they don't understand, whereas most of the self-taught coders I've seen are much quicker to pick up a manual or start reading documentation before reaching out to others. This is probably an advantage in some areas as much as it may be a disadvantage in others. Another thing I've noticed is that many CS students are terrible at producing clean-looking or consistent code, which is probably a detriment once they reach "the industry".
I worked with a guy who was self-taught. When he left the company, his stuff became serious technical debt. If you're working with someone who is self-taught, do yourself a favor and teach them some good practices.
Obviously YMMV, but if I imagine myself without a CS degree, I'd probably:
- Be comfortable programming in Java/C#
- Have learned Scala and be trying to learn FP
- Have learned about big-O notation
But wouldn't know:
- Why you should NEVER use float/double to represent money (see the sketch after this list)
- How databases work
- That the NoSQL conceptual model predates the relational model
- Dependent types
- Category theory
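On the float/money item, a quick demonstration (a hypothetical sketch, not from the original list): binary floats can't represent 0.1 exactly, so errors accumulate, while decimal arithmetic stays exact:

```python
from decimal import Decimal

print(sum([0.1] * 10) == 1.0)                       # False: sums to 0.9999999999999999
print(sum([Decimal("0.1")] * 10) == Decimal("1"))   # True: exact decimal arithmetic
```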
Overall, a good CS degree exposes you to things you don't know you don't know (a big pitfall in self study)
I really respect the self taught devs, because they have to work harder for it.
The biggest difference is that self-teaching implies more variability in skill outcomes. Lower lows, higher highs.
Look for people who can learn the structure of a problem when there's no guidebook (or textbook) available. If they can do this, then they will likely succeed in tackling the unknown.
This 'autodidactic tenacity' can be learned inside or outside of school.
Knowing how to independently learn means you will invest in yourself for the rest of your life. Attending classes and doing homework do not, themselves, teach you how to do that.
Here's my generalization: CS students aspire to be good architects; the self-taught aspire to be good programmers. The caveat: experience and aptitude combine to blur the groups.
I'm a current CS student, finishing this year. I work with both people who are self taught and people who have degrees. I think pretty highly of those I work with who are self taught. All in all the differences are minimal when getting the job done.
CS emphasizes theory and mathematics. Programming and coding is usually discussed in the introductory courses, but from there students are expected to teach themselves to keep up.
I just want to make one point here: I don't know of any self-taught programmer in the world who has made a significant contribution to programming, computer science, or computing.
Case in point:
1) Linus Torvalds: creator of the Linux kernel.
2) Richard Stallman: GCC, Emacs, and GNU.
3) DHH: Ruby on Rails.
4) Matz: Ruby.
There are several other examples: Donald Knuth, Dennis Ritchie, Ken Thompson, and many others...
Stallman has a bachelor's degree in physics, and he did graduate work in physics. I thought he got a master's, but his bio on his site does not mention one. No CS degree. (I'm not counting honorary doctorates, which he has, and which are why he writes "Dr." in front of his name - yes, an honorary doctorate allows for that.)
Knuth has a bachelor's, master's, and PhD in mathematics. No CS degree.
Given enough time as a programmer, we're all self-taught, depending on what you mean by that. My degrees are in mathematics. However, I mostly learn now by asking what kind of API, constraints, types, and conventions will give me the desired result. I'll hack at it and use StackOverflow or watch a video when I need some insight or syntax/implementation examples.
These are the classes I took to get my Computer Science degree. I spent about 45 hours in lecture for each of these classes plus somewhere between 20 - 50 hours per class (depending on the class) doing labs/homework/studying/etc.
Java
Lisp
Data Structures
Algorithms
Operating Systems
Information Systems (How databases work)
Software Engineering
Assembly Language
Artificial Intelligence
Networking
Embedded Systems
Calculus I & II
Multivariate Calculus
Linear Algebra
Discrete Mathematics
Probability and Statistics
Anyone could self-teach all of these things to the same level that I learned them. It would probably take a similar time commitment. Honestly, if someone was dedicated enough to spend 60 - 100 hours learning each of those topics on their own time, they are probably a better computer scientist than I.
Although, I wouldn't say my CS degree made me a "programmer".
What I didn't learn in college:
How to write maintainable code
How to write testable code
How to deploy code
How to manage servers
How to document code
If you're building web pages, there probably isn't any difference between a self-taught developer and someone with a CS degree. The person with the CS degree most likely also taught themselves all their web development skills.
However, if you're building new programming languages or compilers or robots or missile guidance systems or self-driving cars or spaceship navigation systems, you'll probably start to see some differences. From reading the comments on this thread, I'm not convinced that the self-identified self-taughters are aware of all that a CS degree encompasses (heck, no two CS degrees are necessarily alike after the first three semesters!).
A 22-year-old with a CS degree is probably not going to be as good a web developer as a 22-year-old with 4 years of experience working as a web developer instead of a CS degree. But a 22-year-old with a CS degree and 2 years of internships working as a web developer...
I am self-taught. My degree is actually in finance. At my current position, there is another guy slightly older than me who does have a CS degree, and the difference between him and me is not a lot. I may be slightly biased, but he even admits that most of the development we do he has been self-taught, or picked up from previous jobs.
I don't think this is just a CS thing either. In any profession, you have those who graduate, learn for their first 6 months to a year out of school, and then stop self-development. They literally repeat the same experience over and over for the rest of their career. In the end, anyone who is very successful will be "self-taught"; school only provides a foundation to build on. It's what you do after that matters. If you compared two developers, each with 5 years of experience, one with a CS degree and one without, the one with the CS degree will not always be the better developer.
The question about CS students is: were they self-taught before they got their degree? Most who had no interest in computing before they got their degree seem to remain that way - a CS degree was seen as a way to earn easy money.
Most people who were self-taught and then took a CS degree did so as part of their self-teaching.
In my experience, self-taught engineers are typically better with frameworks and more productive, but less detail-oriented, with a weaker grasp of theory.
CS schools naturally teach you to be slow-paced, detail-oriented, and methodical, which is sometimes good and sometimes bad.