Hacker News
Teach Yourself Computer Science (teachyourselfcs.com)
1308 points by kruse-tim on March 13, 2017 | 237 comments



This is a really good list. I love the simplicity. I also agree that it is both worthwhile and very interesting to learn the fundamentals of CS.

That said, I think it is a mistake to assume that lots of Type 2 developers wander around in a perpetual state of under-achievement. Most of these people are indeed a different class of developer (I think the word engineer is positively abused), but many of them really have almost no professional requirement to understand fundamentals. Any more than they need to understand particle physics.

These developers are a class of systems integrators and they produce a lot of usable systems, at a quality level that represents appropriate trade-offs to the business case they are employed to address.

Yes, many will say this is a less elevated pursuit. It has its own challenges and mindset. It lives at a particular level of abstraction, and its very existence assumes stability of that layer of abstraction. The fact that this sometimes breaks down is beside the point.

The reality is that most developers probably do Type 2 work, though very many have, or aspire to have, a Type 1 level of knowledge and insight. However, I think it's unfair to portray a contented Type 2 developer as lacking some essential quality.


Max Howell, creator of Homebrew, expressed this quite nicely: "Google: 90% of our engineers use the software you wrote (Homebrew), but you can’t invert a binary tree on a whiteboard so fuck off."

This is not to say that computer science (CS) skills aren't important -- they are! But far too many companies base their hiring process around "are you a recent graduate from a top-tier CS school" rather than "can you do the work that needs to be done here?"

Being a software engineer is about way more than just knowing what algorithm to apply to solve a clearly-stated problem. You need to know how to build automated test suites, how to make software which is easy to deploy and maintain, and how to deal with the real world, where Things Go Wrong All The Time.

Having worked as a consultant for many years, with a large number of teams, I can say with some certainty that I would rather hire a person that had mediocre-at-best CS skills, but was otherwise a rock-solid software engineer.

Because the amount of time they will spend implementing graph-traversal algorithms is asymptotically close to zero.

They will instead spend pretty much all of their time needing to write code that other people can maintain.

Why would I spend the majority of the interview process focused on the former, rather than the latter?

Admittedly, this has given me quite an edge as a manager when it comes to hiring.

Because my hiring process focuses on "can this person do real-world work in a real-world context", I usually manage to find rather excellent engineers that other companies overlook.


> Being a software engineer is about way more than just knowing what algorithm to apply to solve a clearly-stated problem.

Actually I think this is exactly what a software engineer is. Knowing which data structure to use in which situation, because of its implementation details, is important! Yes, it's not the only aspect of the job, but it's hugely important.

It can be as simple as Arrays vs. LinkedLists: Arrays have constant-time access, while LinkedLists have high indexing costs but cheap constant-time insertion, which matters when you don't know the content length up front. This can have huge effects on the performance of an application.

This is important to applications, and no, you don't have to implement the data structure yourself, but knowing which to use when, and why, can save you time later hunting down performance problems.
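To make that concrete, here's a minimal sketch in Java (the class name, sizes and crude timing are mine, purely for illustration):

    import java.util.ArrayList;
    import java.util.LinkedList;
    import java.util.List;

    public class AccessCostDemo {
        public static void main(String[] args) {
            int n = 50_000;
            List<Integer> array = new ArrayList<>();
            List<Integer> linked = new LinkedList<>();
            for (int i = 0; i < n; i++) { array.add(i); linked.add(i); }

            // Indexed access: O(1) on the array, O(n) on the linked list,
            // because each get(i) has to walk the chain from one end.
            long t0 = System.nanoTime();
            long sum = 0;
            for (int i = 0; i < n; i++) sum += array.get(i);
            long t1 = System.nanoTime();
            for (int i = 0; i < n; i++) sum += linked.get(i);
            long t2 = System.nanoTime();

            System.out.printf("array: %d ms, linked list: %d ms (sum=%d)%n",
                    (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, sum);
        }
    }

The second loop is quadratic overall even though it looks identical to the first - exactly the kind of thing that surfaces much later as a mysterious performance problem.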


I was interning at a pretty successful software company in the Netherlands. I never heard anyone there discuss whether to use Arrays or LinkedLists.

The Netherlands has a binary education system with two types of universities: those that prepare people for industry jobs and those that prepare people for research. I went to both; the former has only a small focus on algorithms, complexity and data structures, while the latter has specific courses on those topics.

I think in practice you can often reach good enough performance without worrying about which algorithm or datastructure to use or what the complexity of your code is. There are many exceptions where those kind of skills are actually required, but it's easy to forget that software is a huge market and that a large portion of that market is people developing fairly uncomplicated applications.

You can create an incredible amount of value in the software industry without knowing much of computer science, but you are dependent on people who actually have that knowledge to create your tools, dependent libraries, used services, etc.


Sounds familiar. I had only the former kind of education. I often wonder whether I should pursue a master's degree in something CS related. It's not that I really need it for my work. So, my options are to either do a CS master's and catch up on a lot of math and CS fundamentals, do a master's in a totally different field such as business administration (useful since I have my own company), or do nothing at all :). Any advice from your perspective?


I loved my time studying for a master's degree, and after graduating I started a PhD in Japan, which has been an amazing experience. If you want to learn many new things, meet interesting people from all over the world, and don't mind going back to taking classes, I would definitely recommend it. Don't worry too much about the math and CS fundamentals. The stuff you need to know to do a master's in CS is quite useful in practice, so it's valuable to learn anyway. Some specialisations in CS require more math than others, so keep that in mind.

If you don't know whether to do CS or business administration, you might be able to do something in between, e.g. business administration with an IT specialisation.

(I got master's degrees in Computer Science and Business Information Technology from the University of Twente and am currently a PhD student in Empowerment Informatics at the University of Tsukuba in Japan.)


You're making an assumption that "performance problems" are important. Sometimes they are. Often they simply aren't. In my experience, outside of certain components of a game engine, scientific/numerical computing, very large (e.g. Google-scale) systems, and a certain class of real-time systems, the difference in performance between one data structure (and the algorithms that operate on it) and another simply isn't worth worrying about.

It's far more important that the code be simple, readable, and integrate easily with other teams' work.


Sometimes you have to wonder how much we are suffering "death by a thousand cuts" though. This is still quite noticeable on mobile and in certain applications.


You left out a really big, significant sector: finance. Minimal latency is king (at least in trading). This often means little or no heap allocation (and if there is any, it's often done up-front), plus user-space network stacks to avoid user-to-kernel context switches. Web devs think that sub-10ms response times are good. In finance, often anything over 10 nanosecs (for example, when responding to an IOI - indication of interest) is trash.

Yeah, sure, in some domains latency isn't terribly important, but there certainly are a number of domains where it is.
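For anyone wondering what "allocation done up-front" looks like, here's a toy sketch in Java (the Order type and pool size are invented for illustration; real trading systems are far more involved):

    import java.util.ArrayDeque;

    // Toy object pool: every Order is allocated once at startup, so the
    // hot path only recycles objects - no allocator calls, no GC pressure.
    final class OrderPool {
        static final class Order { long price; long quantity; }

        private final ArrayDeque<Order> free = new ArrayDeque<>();

        OrderPool(int size) {
            for (int i = 0; i < size; i++) free.push(new Order());
        }

        Order acquire() { return free.pop(); }  // O(1), no allocation
        void release(Order o) { free.push(o); } // hand back for reuse

        public static void main(String[] args) {
            OrderPool pool = new OrderPool(1_000);
            Order o = pool.acquire();
            o.price = 101; o.quantity = 5;
            pool.release(o); // recycled, never becomes garbage
        }
    }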


Yes, we are in agreement (except for what I perceive as derision toward "web devs" - which, by the way, I am not and didn't even mention, so I'm not sure why you brought them up). I also left out some very resource-constrained domains in general; a terrible omission on my part. I'm sure there are others as well that I didn't call out specifically. Not having experienced every single kind of programming problem, I rather think it would be silly and pretentious of me to try to enumerate them all.


> except for what I perceive as a derision toward "web devs", which by the way I am not and didn't even mention, so I'm not sure why you even mentioned it

I didn't read it as derision. I read it as an indication of a different focus between two areas of development. Microseconds don't matter in most cases. Milliseconds matter more often. Something optimized for microsecond-ish timing will make a tradeoff somewhere else (like code readability, generality, etc.).


often anything over 10 nanosecs is trash

Are you sure you don't mean 10 microseconds?


He probably means 10 microseconds, even if he's talking about HFT. The state of the art for FPGAs seems to be around 700 nanoseconds. [0]

[0] http://stackoverflow.com/questions/17256040/how-fast-is-stat...


>It can be as simple as Arrays vs. LinkedLists: Arrays have constant-time access, while LinkedLists have high indexing costs but cheap constant-time insertion, which matters when you don't know the content length up front.

But then in the real world, arrays can still be faster than lists even when you're inserting a lot...

https://www.youtube.com/watch?v=YQs6IC-vgmo


Yes, if you can afford to allocate the memory upfront, Arrays are always faster :)

My point was more that understanding the semantics of the two types is important to how the program will run.
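A rough sketch of that point (sizes arbitrary, same illustrative caveats as above): even for mid-list insertion, where the linked list is supposed to shine, the pre-sized array tends to win, because finding the insertion point in a list means chasing pointers with poor cache locality, while the array shifts its elements in one contiguous block:

    import java.util.ArrayList;
    import java.util.LinkedList;
    import java.util.List;

    public class MidInsertDemo {
        public static void main(String[] args) {
            int n = 20_000;
            List<Integer> array = new ArrayList<>(n); // capacity reserved up front
            List<Integer> linked = new LinkedList<>();

            long t0 = System.nanoTime();
            for (int i = 0; i < n; i++) array.add(array.size() / 2, i);  // shifts contiguously
            long t1 = System.nanoTime();
            for (int i = 0; i < n; i++) linked.add(linked.size() / 2, i); // walks node by node
            long t2 = System.nanoTime();

            System.out.printf("array: %d ms, linked list: %d ms%n",
                    (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
        }
    }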


> Because my hiring process focuses on "can this person do real-world work in a real-world context", I usually manage to find rather excellent engineers that other companies overlook.

Ah, if only we could know how well the people do at the jobs they are hired to do.

9 times out of 10 that requires gut feel, subjectivity, biases, etc. Inverting binary trees is measurable and objective.


Use a realistic work sample test. It's marginally more time consuming than a whiteboard interview, but it's fairer and more effective. Most candidates would vastly prefer to commit and document a small patch than solve brainteasers in front of an audience. You'll learn a lot more about their practical skills.


Oh 100%.

I give a 60-minute programming exercise in a language of choice as the first part of the interview.

I do that part first because it separates the doers from talkers quickly. Measurable and objective.


Certainly measurable, but what are you really measuring? If I gave one of my junior devs a serious problem to research, consider and solve, and they came back to me 60 minutes later, I would tell them to go jump in a lake... and I'm certainly not looking over their shoulder while they do it. I always say I do my best development in the shower, after thinking about a problem for a while. Unless you plan on jumping in there with me, you'll miss a lot of where my actual problem-solving happens. Short syntax questions to screen and longer-form, 2-day-ish take-home problems for the win... with practically every other approach you're measuring things that just aren't germane to how good software is developed.


When you have a large volume of applicants and new hires (like Google does) then you can actually start to measure the correlation of the things you can assess in the interview vs long term job performance. Then you can make your decisions about who to hire based on this evidence rather than gut feel, subjectivity, biases or even CS trivia. It's much harder for a small company to do that, so at best they can end up copying the methods of the large ones and hope that their situation is similar enough.


Yeah, I got to see that first hand at my last company. It's very difficult to judge someone's capabilities without some kind of technical test.


> Because the amount of time they will spend implementing graph-traversal algorithms is asymptotically close to zero.

Yes. Definitely.

Depending on the nature of the work I could see expanding this statement to include writing any part of the software. In other words, "the time they will spend writing code is asymptotically close to zero". This, BTW, is also the reason I don't care for discussions about text editor efficiency. In certain domains these can be rounding errors in the accounting of the totality of the work surrounding the production of solid, reliable and well documented software and the tools required to test, deploy, support, configure and sometimes port it.

I can't remember the last time I asked a candidate how they would detect a palindrome or traverse a tree. I could not care less. There are books and google for that. I want to know how they think, how they organize projects, what they prioritize, how they approach new problems, etc. I also want to know they understand how business works. I prefer people with entrepreneurial experience, even if they failed...particularly if they failed.


Writing code or typing in code? If the former, I believe developers spend an amount of time significantly different from zero.


It isn't about the time spent typing. It's about that time in relation to the totality of the job. The fact that coding time isn't zero means nothing.

Depending on project type a team can easily spend 10x more time doing things other than actually typing code. In fact, they should spend 10x more time doing those things, for example, architecture, design, documentation, testing, etc.


Not frequently talked about - there's a fucking limit to how much one can learn and experience in a lifetime. I'd rather have someone that appreciates aesthetics, design, and people create a frontend for me than someone who can write a compiler from scratch. If you try to look for someone who can do both, you will be looking a very long time (and probably can't afford them).

There is no such thing in this day and age as a "best possible" developer for all situations. Going through this list will definitely let you understand the underlying principles and the implementations of the code you write, and the systems it's running on - I'm not downplaying the value of learning here. It does frequently feel like it's a high-horse thing.


In my experience so far, there seems to be a line which demarcates the strong performers from the weak ones: an awareness - a basic, bare-bones understanding, if you will - of relevant theory.

A sysadmin or someone in devops, for example - obviously there's minimal benefit for them in understanding the ECDSA algorithm, and to an extent you don't even need to know how SSH key exchange works. But being aware of these things - the roles of the private key and the public key, what purpose they serve, and a generic high-level overview of their place in SSH - that is information someone in that position would benefit from.

For a systems/app developer, yeah, you probably don't need to understand the theory behind how GC works in Java/Python/Go or the intricacies of DNS records, but when you start seeing heap size errors, weird pause times, or network connectivity issues, simply being aware of all of those things means that you know where to start looking.

I'm 100% with you - there's a limit, and as with all things, you have to prioritize. But every time I see this discussion come up, there's always this push back: "street skills over book smarts!" And there's always this undercurrent of "the book smarts matter, but ugh, why do people care so much" - and never any talk of a balance: knowing what book smarts you might one day need to supplement those street skills.
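To the GC point above: even a snapshot as small as this (just the standard java.lang.Runtime API; a sketch for a first look, not a diagnosis tool) gives you somewhere to start before reaching for a profiler:

    public class HeapCheck {
        public static void main(String[] args) {
            // Snapshot of JVM heap usage via the standard Runtime API.
            Runtime rt = Runtime.getRuntime();
            long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
            long maxMb = rt.maxMemory() / (1024 * 1024);
            System.out.printf("heap: %d MB used of %d MB max%n", usedMb, maxMb);
            // If used keeps climbing toward max across snapshots, the next
            // stop is allocation rates and GC logs - not a rewrite.
        }
    }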


I also agree with everything you've said - I guess the key point here is to learn what is necessary for what you do, and do a lot of learning on the peripheries to reduce blind spots. How much is "sufficient" is a point many can argue over, depending on the type of work.


A phrase that might sum this up: "Know what you don't know."

People that know what they don't know are far more valuable than those that think they know everything. The first class can educate themselves when necessary to fill a gap. The second class is lost. They'll often spin in ignorance for an eternity, never finding a solution to the problem, probably without ever once asking for help from someone nearby who is more knowledgeable.


And that's why I like to spend my time on hacker news!

I enjoy having a pretty big network of people to pose problems to and get new areas of research from when I come up against a hard problem.


From my experience, a strong CS background is the difference between a software engineer and a prolific coder. With equal experience, the engineer's results tend to be better optimized and easier to maintain. Of course there are exceptions to the rule; a naturally gifted coder can overcome some of these limitations by using patterns and algorithms learned from experience. But without the math, you're limiting your potential and going to spend a lifetime reinventing the wheel.


Honestly, I've had mixed experiences. I've worked with both great CS students that failed in real-world efforts and poor to mediocre CS students that excelled in real-world efforts. I've also worked with professional developers with no official CS training that were some of the best engineers I've ever worked with (one had a philosophy degree and another a physics degree).

I think there are two fundamental indicators of success in software development/engineering: 1. an eagerness and desire for continued learning, and 2. passion for the quality of the product. Sure, there may be other factors, but without these two, someone is probably not going to be a net positive contribution to a project.


Yeah, except all of those people who can write compilers are also the people your front end devs rely on to implement their pretty UIs. Without the mountain of abstractions they sit upon nothing works.

UIs can be ugly and still get the job done. If your compiler or interpreter doesn't work, you're hosed. If TCP doesn't work, you're hosed. If your OS can't properly schedule processes... You get it.

The people who make things work are demonstrably more important than those who make things pretty and pleasing to us. I wouldn't want to give up either, but if I had to, the UI can shove off. Computers were changing the world long before we even had UIs.


Pretty UIs existed in the 80s and 90s. TCP worked twenty years ago. OSs could schedule processes twenty years ago. Platforms had pretty decent standard libraries of common algorithms and datastructures ten or twenty years ago.

If they don't work, we're hosed - but they do work. They have worked for decades. So what are you saying all these people are doing presently?

Without the mountain of abstractions they sit upon nothing works.

Because Linux Audio sure is better with a mountain of abstractions, so is Wayland, so are the several unstable competing high level filesystems, and cramming applications in the browser is such a delightful experience to develop and use compared to fast, local, platform-integrated, simpler native applications. Because pulling down 30 dependencies like 'pad string' from github to get an HTTP listener running for a Ruby based IRC chatbot is such an improvement. Because everyone needs Bash on Ubuntu running on translated syscalls on the NT kernel and Microsoft SQL Server running on an OS abstraction layer so SQL Server Operating System provides 'robust memory management, thread scheduling, and IO services' on Linux because what - Linux can't do that already? And Docker definitely isn't just 'installing your application and all dependencies in a folder where the libraries never get patched' and a security nightmare in progress. These things are important, necessary, and without them sending text and taking digital pictures wouldn't be possible.

And hundreds of thousands of password hashes were just leaked because MongoDB installs insecurely by default, and hundreds of thousands of web pages through Cloudflare had memory leak data in them because a compiler layer had a bounds checking error. Problems that were solved ages ago by compiler writers and type system theorists and still haven't made it everywhere in production because $reasons.

Is that what they're doing? Writing scheduling engines on scheduling engines, writing GCs for more and more niche languages? Half-finishing more and more filesystems that can't deliver? Writing more and more enterprise broken-XML parsers? If I had to, that lot can shove off. Leave the rudimentary UI and the CLI and the text based protocols (and TLS).


>If they don't work, we're hosed - but they do work. They have worked for decades. So what are you saying all these people are doing presently?

Do you really think no one is currently working on compilers, thread schedulers, standard libraries, etc? Do you think your javascript runs faster today than it did five years ago solely due to hardware improvements?

I'm not even sure how to interpret the rest of your argument. Your middle paragraph seems to agree with me in spirit (I'm not a fan of hugely abstracted 'magical' systems either, UI or no UI.) That said, if you really think we wrote all of the lower level, technically complex code we'll ever need 20 years ago... I don't even know what to say.


Do you really think no one is currently working on compilers, thread schedulers, standard libraries, etc?

No. Do you really think all the work on compilers, thread schedulers, standard libraries, etc. is forward progress? Is reimplementing list sorting for a standard library for Python and for Lua and for Perl and for Java and for Clojure and for .Net and for Go and for Ruby and for JS and for Rust and for ... really the kind of progress you mean when you say "without this, nothing works"? Is this not sideways movement instead of forwards movement?

Are there really improvements in thread schedulers in the last 10 years, that justify "without this progress, nothing works"?

Do you think your javascript runs faster today than it did five years ago solely due to hardware improvements?

Pages are more bloated, load slower, call more dependencies, have more fluff, have worse UIs. Javascript engine improvements haven't brought me better quality of life, they've turned fast desktop apps into slow web apps. Fast desktop CRUD into slow web CRUD.

That said, if you really think we wrote all of the lower level, technically complex code we'll ever need 20 years ago... I don't even know what to say.

I didn't think that until it came out in my comment, and 20 years is a bit of a stretch, but 10 years ago - OK, why not? What exactly is so much better in the low-level world that nothing modern would work without it? ~10 years ago you could have Java and C# and Python and Perl and Lua runtimes, ZFS storage, Linux or BSD schedulers and memory management, Varnish cache, memcached, LLVM/Clang, ACLs, SELinux, virtualization, IPv6, TLS 1.1, MP3s, DVDs, 3D graphics, Mozilla, and on and on.

When you say "without the last ten years of low level code, nothing works", what exactly doesn't work? Drivers for new hardware. No CUDA? No JS frameworks? No Roslyn? Less capable compilers and JITs, no Go?

Right right, some things run a bit slower, and others would have to be written in different languages. But what ... can't you do? Actual things you can do now that would be impossible? Go back to a CLI only and there's a lot you can't do. Drop hardware video decoding and you can't usefully play video. Drop all the GC/RAD languages completely and there's a lot of apps you can't economically write at all. Go far enough back and there is no storage with checksumming or storage which can do software RAID or storage with multi-TB support, no independent processes - but this doesn't go that far back, not at all.

Sure there's newer, faster hardware, and there are benefits. Sure things are being worked on, but there's a huge amount of towering tech stacks now abandoned, duplication and triplication of the same effort, and general sideways churn and reinvention of wheels that isn't forward progress.

I guess I'm saying that C -> high level GC'd languages was a hell of a change. Two high level languages -> twenty high level languages isn't. It's more human-hours but less progress. Mainframe TUI programs to localizable high res desktop apps was a hell of a change. Desktop CRUD to Web CRUD isn't. No web server to a web server to PHP and maybe to RoR was a hell of a change. One dynamic web framework to a hundred wasn't.


I work on compilers full time.


And?


> Not frequently talked about - there's a fucking limit to how much one can learn and experience in a lifetime.

I agree with this sentiment, and what's even more frustrating, as an old (and, since I'm over 35, rapidly aging) developer, is the incredibly sorry state of affairs in which even the Type 1 kind of engineers mentioned in the article have to put in off-work hours learning the newest buzzword language/framework/rehash of the day to be able to keep doing what they love doing.

The other thing is the frustration that those of us who actually love to learn love to learn more than just technology. One of my friends actually chose not to give in to the hype, refusing to trade the time he spends improving as a jazz musician, graphic artist and whatever else takes his fancy for learning the latest tech-du-jour. My own imposter syndrome keeps me from taking that kind of risk. This is sad.


This is so true. I love technology, but once I leave work I want to learn more about woodworking, baking, languages, history, gardening, euro board games, hiking/camping, fitness, landscaping, etc. etc. There are so many worthwhile things to learn and do outside of programming that if I don't learn it on the job I'm probably not going to find time for it outside the job.


Isn't having to learn a new language/framework every month a symptom of not being a top-class engineer? If you're a web developer code monkey, yes, that might be important for career development, but the fundamentals are mostly unchanged/rehashed from 80s research.


  > someone that appreciates aesthetics, design, and people
  > [can] create a frontend
  > someone who can write a compiler from scratch
Hey, that sounds like me. At least, I've done those things.

I've also found it doesn't matter; no one cares. This combination isn't valued.

The only thing that's valued (per revealed, not stated, preference) is your skill at solving homework problems on a whiteboard with an interviewer breathing down your neck and hoping to prove you're an idiot.


I'm a Systems Engineer (in the DevOps/automation space) but I want to move towards development more and more. I never did attend university and my maths is poor. I'm in agreement with you that there are a lot of people running around doing perfectly fine at integrating into the industry and solving problems.

Recently I've been getting more and more eager to move into development, however. Now perhaps it's just a personality thing, but I'm very keen to learn CS basics and get a good grasp of maths whilst I make this transition. I think this is important for me simply because I want to be able to utilise CS level maths to solve certain CS problems. As I never went to university, I'm losing out on some efficient solutions.

All this being said, I could stumble around without this knowledge and do just fine ($200k a year, in fact) as a contractor. I just personally feel the CS level knowledge is important to become good at what I want to do.


You just described my current position. I'm reading as many books as I can. I want to contribute to the linux kernel.


What books have you found so far? Willing to share a link to them (free or otherwise)? :)


Can I ask if you contract for multiple companies or just one? What vertical - banking, government? That seems like a pretty solid income. Cheers.


Those figures are in AUD. If I worked a full working year, I would essentially earn AUD$850 * 242 (number of estimated working days in the year), or $205,700 pre-tax. That's about USD$154,000 based on xe.com and just rounding a bit.

I work in any fields that need me, excluding gambling or anything immoral or unethical.

EDIT: forgot to say that I work for multiple companies, yeah. I move around once every 3-6 months, essentially.


I agree with almost everything you say, though I disagree with your second paragraph some. I don't know that I would characterize many Type 2 devs as underachievers, per se, but I think the website we're discussing rightfully points out that there is a real difference in skill level (on the whole) that makes the competitive outlook for Type 2 much more challenging than Type 1. There are many more Type 2 than Type 1 developers. Without something to otherwise set themselves apart from the crowd, those Type 2 developers are precisely the ones you can shop the globe to replace for lowest dollar and not come out the worse for having done so. Type 1, not so easy to shop for price if that's who you really need (most don't).

I say this as someone that is very much in that Type 2 category and only encounters the Type 2 developer on a regular basis. I deal in ERP systems, a subject which generally doesn't draw many Type 1s to either creating them or running them for their companies. That's not to say there aren't smart people involved in this area, nor are they underachievers. Still, by and large many of my colleagues have a certain very practical, but shallow, understanding of technology generally and simply aren't equipped for more challenging, real engineering scenarios.

I disagree with the website's suggestion that becoming more Type 1 is necessarily the only way to bolster your competitiveness as a programmer. I do pretty well compared to many of my peers in my area of expertise not because I strive to be an engineer (which I end up doing anyway), but rather because I have gained substantial knowledge of accounting, warehouse operations, logistics support, etc. that allows me to use my modest technical skills (by comparison to the Type 1s) to deliver not just any program for my clients, but often the right one. Sure, it may not be as efficient as it could be, or as clean, etc., but it matches the actual need better than many of my peers can achieve. The value of that domain knowledge also means I don't compete directly with the Type 2s that "just program" on a global scale. It's helped with age bias, too.

Anyway, I still look up to many of my Type 1 friends and (occasional) colleagues for their often times more accomplished technical skills.


As an Indian guy, I've met my fair share of the type of developers you described. They don't take work home, don't have any particular passion for the job, and are biding time before they can go do an MBA and move on to investment banking or consulting with the Big 4.

Just as an aside, your comment was among the most complete I've ever seen. It addressed almost every caveat that I could have thought up!


Another Indian guy here. Can we limit the broad paint brushes here and realize that the circle of developers you've been exposed to is solely your own experience? You seem to be leaning on an assumption that other "Indian" developers are like this. If you are located in India, you could say the same thing about "all" professions, right? And whose fault is it that most of the developers you come across seem to you like foot-draggers? Maybe you could work at a better company than you are in, where you'd actually see great developers? It just grates on my senses when someone paints a whole section of people with one huge ill-informed paint brush. I am sure you have your reasons, but also realize that you haven't met enough people to make that assumption, or interacted with people in other great places to work - there is a vast amount of great Indian engineering talent working in a lot of countries. An ill-informed opinion only degrades the quality of information on this forum. And seriously, unless you are a Nobel prize winner yourself, there is no point in judging others.


>"And seriously unless you are a Nobel prize winner yourself there is no point in judging others."

I didn't see that the OP was judging anyone, just describing a type they perceived as interested in development only as a stepping stone to something else.

In fact you yourself seem to be completely judgmental and also condescending telling someone:

>" ... you haven't met a huge amount of people to make that assumption, or have interacted with people in other great places to work"

You seem to have no issue passing your own judgement however by telling someone that their opinion is "degrading this forum." That's a pretty rotten thing to say.


I think vikascoder makes a really good point, actually. The fact that the poster was Indian was entirely irrelevant except in the context of stereotyping all the "other" Indian programmers. I found the post offensive on behalf of anyone who is tired of lazily getting lumped into any group. Despite not being an Indian programmer.


Apologies. I'm not judging anyone here. My implication was more "different strokes for different folks". Some people like to take work home, others prefer to have a drink in the evenings. Some manage to combine both :-) I'm just suggesting that priorities differ.

But for the sake of argument, while I don't deny that India has some star developers, I think it's incorrect to suggest that there isn't a huge section that's in it for reasons _other_ than a pure love of computers. Many do it because they need the money, or need a stopgap before they go on to do other things, etc. I stand by what I said.

Finally, I'm not sure why you're implying that maybe _I'm_ a foot-dragger and that I need to work on improving the company I keep. I thought my comment was fairly mild and inoffensive.


> They don't take work home,

You're not paid for doing that. Please stop.


Some people do that and get promoted, and in the end get paid well. You can't stop that; there are hustlers in every industry and always have been.

It's only a problem if the employer starts treating this as the norm, and expects people to sacrifice their personal time. Then it's time to find a new employer.


> Some people do that and get promoted, and in the end get paid well.

The fastest way to get promoted when you are young or low on the ladder is to leave your company, not to climb up.

Your future companies won't know and won't care what time you put in.


You're right that for most people the fastest route to advancement requires changing companies, and you're right that future companies are unlikely to care (much) or know (much). But you're wrong in your conclusion.

The (good) reason to take work home is not to suck up. That's crappy strategy. The good reason to take work home is that it's practice. The sooner you put in the hours, the sooner you can get-and-keep the better jobs. Obviously there's a limit to personal endurance, and so just pushing yourself to grind out 80 low-quality hours per week is insanity. But taking work home, selected with the intention of improving your own skills, is often good advice for getting ahead. Hustle correlates with prosperity.

You're right, though, that if _I_ take work home, that you, user5994461, are thereby disadvantaged in the market. Thus I assume that your imperious demand that others not pursue career success is a cynical ploy to improve your position.


Indeed, you working more is lowering the bar for employers, you're helping them to pay less and abuse their workers. (Let's ignore for a minute that I don't work for these employers).

Since career success is linked first and foremost to negotiating offers and following a career plan, I demand that you negotiate better and follow a more aggressive growth plan! It will give you more returns for less effort than long hours.

I am on the high end of the spectrum, I actually need others to pursue their career success better in a cynical ploy to improve my position. ;)


You're paid under the assumption that you're generating a certain amount of value. That could be achieved, or appear to be achieved, by taking work home.


well, some people are paid for that.


This was so eloquently put, I almost completely saw past how condescending it feels :)

Disclosure: I am probably a Type 1-aspiring Type 2 developer


Something tells me that in this era the majority of us are not CS/theory-first developers. And that's a good thing. It's the natural thing. And it means we have true inspiration to learn CS when the time naturally comes, rather than artificially enforced training before you have anything to do with all the theory, which also makes it harder to learn.


That's a pretty broad assumption about CS grads. That it was regimented and scheduled does not preclude students being passionate about learning it. And in many cases it's the best possible environment to learn.


I agree. I'm simply saying for many it makes a lot of sense to learn the theory as it actually becomes relevant to what you're already doing.

The more important thing though is that it's a line of thinking that enables people and gives them confidence to get going earlier, rather than wait until they've checked a bunch of checkboxes and have been cosigned by an institution. I think all the fears and ego-issues around not knowing enough before you get started are a bad thing, an inhibitor.

That said, I now wish I had done those 4 years. It would be amazing to be able to take 4 years off just for school at 31, rather than having to struggle to find time to learn in the middle of projects. But it just was never gonna be any other way for me than diving in. And after all, a mindset of continual learning is what it's all about.

I think diving in and building something--regardless if you start out early with a CS degree--should be the primary focus.

If you jump straight to a CS degree before even building a simple website or simple whatever, it's a mistake. Ideally, first develop a minor level of proficiency and achieve some wins where you start feeling yourself for your growing skillset. If you jump into a CS degree without at least that, you're making a huge bet on something you know nothing about, that you might not even enjoy.


I think the hypothetical you describe at the end is very unlikely. Assuredly most CS majors had an interest in computers that manifested in writing a simple app in C or building a web app or an iOS app prior to college.


I think 99% of developers fall into the "Type 2" as you describe.


I'm not familiar with these designations type 1 and type 2. Could you elaborate?


This is a reference to the section "Why learn computer science?" in the article.


I use the terms of the article itself.


I actually didn't read the section "Why Learn Computer Science" because my own bias saw this as a rhetorical question :)

Now I feel silly. Thanks.


The article is blocked for me in work for some reason.

Can you post the explanation of these terms?


>"There are 2 types of software engineer: those who understand computer science well enough to do challenging, innovative work, and those who just get by because they’re familiar with a few high level tools.

Both call themselves software engineers, and both tend to earn similar salaries in their early careers. But Type 1 engineers grow into more fulfilling and well-remunerated work over time, whether that's valuable commercial work or breakthrough open-source projects, technical leadership or high-quality individual contributions.

Type 1 engineers find ways to learn computer science in depth, whether through conventional means or by relentlessly learning throughout their careers. Type 2 engineers typically stay at the surface, learning specific tools and technologies rather than their underlying foundations, only picking up new skills when the winds of technical fashion change.

Currently, the number of people entering the industry is rapidly increasing, while the number of CS grads is essentially static. This oversupply of Type 2 engineers is starting to reduce their employment opportunities and keep them out of the industry’s more fulfilling work. Whether you’re striving to become a Type 1 engineer or simply looking for more job security, learning computer science is the only reliable path."


For those who need the structure of a formal course, or who want a CS degree for career reasons, the University of London's International Programme is a great option - it's very flexible, so easy to combine with full-time work, and costs around $2500 per year for 3 years. I'm around 2/3 of the way through, and find it helps force me to learn things I know I need to know, but might not make the time for otherwise.

http://www.londoninternational.ac.uk/courses/undergraduate/g...

The Creative Computing degree has a slightly more art/graphics emphasis, but is still rigorous: http://www.londoninternational.ac.uk/courses/undergraduate/g...


Is this a BS that would be recognizable? I gotta ask. I also gotta ask: is it rigorous? Not just time-wise. I want a program that will be stimulating (even if brutal).

For clarity: when I say recognizable, I mean we aren't talking about some for-profit online uni like DeVry. I'm not exactly looking for Stanford level, just something that has a respectable reputation for actually teaching.

edit: I really don't know if I'm asking this question in a way that is non-aggressive, so I apologize in advance. I'm very interested in this (I never heard of it before) and I'm just wondering how much you like it.


There's also: https://www.cs.ox.ac.uk/softeng/

A lot more expensive (£25,000 all-in, part-time over four years), a reasonably well-known university, it's an MSc, flexible curriculum, no undergrad degree needed.


What am I missing? £25,000 all-in and no undergrad degree needed? How is it possible to be this cheap with no undergrad needed?


You have to pass the coursework to get the degree


Sure, and 4 hours a day is quite a commitment but for 25K this seems like a fair tradeoff to earn a CS degree from a respected institution.


> a reasonably well-known university

I know it's partially tongue-in-cheek, but how is comp sci at Oxford? I thought all the big names in engineering/compsci were MIT, Caltech, and so on. Please correct me if I'm wrong.


The Software Engineering department, via which the course is offered, has a strong faculty from functional programming and formal methods perspective. There are some interesting names here: https://www.cs.ox.ac.uk/people/faculty.html

However, from a prestige point of view, the parent university is sufficiently lofty that it's hard to go wrong; similarly, Judge is a questionable business school, but telling people you went to Cambridge will open any doors that need opening.


Very interesting names indeed.

Some notable ones:

- Tim Berners-Lee

- C. A. R. Hoare


Tim Berners-Lee? Wow, I am happily corrected.


Idk about Oxford but Cambridge is top notch. In high-assurance security, the CHERI work is one of the best:

https://www.cl.cam.ac.uk/research/security/ctsrd/cheri/

A quick Google of verification work (decent test of skill) shows Oxford doing CBMC for C language (significant), a compiler for security protocols, automatic verification of firmware for Intel, work on CSP for concurrency, and contributions to Cadence Jasper for ASIC verification. So, they're not doing the best work but they're not sitting on their hands either.


> a reasonably well-known university

lol! That's sarcasm, right? :P


It's all about scale, people. Harvard would be 'well known'. Everyone knows Harvard the world over, with little exception.

Harvey Mudd? One of the best schools in the country, probably one of the top 100 in the world. I wouldn't say it's well known. However, it's reasonably known within the community of grads it puts out: engineering, science, computer science. Hence... reasonably well-known.

Like a litmus test of sorts, maybe.

They're not linguistically congruent. I swear!


> Hence...reasonably well-known.

It's nearly 1,000 years old. It's extremely well known, regardless of the field or context.

It has 680 years on your country, young pup :)


It's a British statement free from all sarcasm.


"Reasonably well-known university" XD



I'm aware. I live in London. I just found the idea of calling Oxford, an almost thousand-year-old university, "reasonably well known" absolutely hilarious!


I admit two things here.

1) Deep ignorance of British cultural and educational institutions. I honestly didn't equate this with "University of London is world class, you idiot" type thinking. I don't know why; I think it comes from the fact that some universities wanting to seem more legit, at least in the US, will attempt to use a well-known university's naming scheme. Like [American University](https://en.wikipedia.org/wiki/American_University), which has a solid reputation as a learning institution, and [American Public University](https://en.wikipedia.org/wiki/American_Public_University_Sys...), which, for all I can tell, attempts to ride on the name of American University but is not exactly known for its quality of education.

2) Given that, I felt uncomfortable stating anything too affirmatively, even after the research and poking around I was doing.


And in the top 500 in World Rankings too!


Are you asking if a degree conferred by the University of London is more recognisable than one from a place like DeVry? Yes - colleges of the University of London are frequently in the top ten of all universities in the world.


The degree you get is the same one you would get offline from the University of London - in fact, you can transfer to the final year of the offline degree if you're willing to pay the (much higher) tuition costs. That suggests that the curriculum and marking standards are the same; in fact, I believe the exams themselves are actually the same as the in-person exams.

http://www.qmul.ac.uk/international/international-students/e...


I think they're asking more like what's the difference between this degree (which is remote) and an "actual" degree in UoL where you attend normally and physically as a student. That's also my question.


However, LSoE is not the same as Goldsmiths, which is the college mentioned here. London colleges are generally good but there is quite a lot of variance - between "great" and "the very best".


I'll take great. I'm okay with that.

Even 3500 USD is a steal for something like that.

I wonder if the cost is so low because they don't get any grants/subsidies from anything other than the UK government, so this is pretty much pure profit from an administrative perspective?


I think the cost is so low because they are providing the absolute minimum in terms of a degree - curriculum, and marking of assessments and exams. In fact, you even pay for them separately: you can pay some amount ($800?) in September to register for the year on 4 courses (which is full time, equivalent to 30 semester units in the US), and get access to the whole curriculum. Then you can decide by February the following year which exams you want to take, and only pay for the ones you want to do that year, at around $300 per exam. You are mostly self-taught, though there are some online forums with teaching staff available to answer questions.


I'm kind of skeptical about the total cost. I've been looking at the fees section of the course's website (http://www.londoninternational.ac.uk/courses/undergraduate/g...) and it looks like it's upwards of $7000 per annum for the BSc, or maybe I'm missing something.


I believe that's the total cost of the degree. If I'm reading this right, when converted it comes to 6921 USD (or ~7K), which works out to less than 2500 a year (which is what I was working from). I figure, if you count in some unexpected additional expenses somewhere, 2500 is reasonable. Still a steal. From what it sounds like, and upon doing more research, it's an all-around solid uni. I do wonder if it's too good to be true myself; haven't found evidence to the contrary yet. I WOULD LOVE some input from anyone who actually studied this, hopefully from the USA, though any input is good.


No, it really is around $7000 for the whole thing! It used to be more like $7500, but the British pound dropped after Brexit :) I live in the SF Bay Area, and I am doing the BS in Computing and Information Systems. My daughter is doing the Creative Computing BS while working full-time as a software engineer in SF - she actually got a transfer place at UCLA to study math a few years ago, but decided she'd rather learn while working, and this lets her do that. Happy to answer any questions, here or by PM.


Wow then it is actually extremely interesting.

My main question is about time, if you don't mind answering. The website estimates about 250 hours per course, and at about 4 courses per year for the first two years, that would amount to about 4 hours of study per day (weekends included) per season (about 9 months). How accurate is this? In short, how much time do you devote to the program?

Also, how are exams conducted? Does one have to be physically present?


I would say that if you're doing the University of London BS full-time (4 courses), it's completely equivalent in terms of time required to full-time study in the US. So I think that's about the right order of magnitude. There are exam centers all over the place; even Wyoming has two! You only have to go there for the exams in May. There is a link below with this year's exam schedule for the CS exams, just to give an idea of the timings.

http://www.londoninternational.ac.uk/sites/default/files/doc...

http://www.londoninternational.ac.uk/current-students-exams-...


You can study from 3 to 8 years depending on how much free time you have. There are only three deadlines each year - two for coursework assignments and one for exams. It takes me at most 5 days of study per course before each coursework and before an exam, so 15 days of study per course. You can take up to 4 full or up to 6 half courses per year.


The University of London International Programme (it used to be called the External Programme) has been around for more than 100 years; these are some of the alumni (not in CS though!): http://www.londoninternational.ac.uk/our-global-reputation/o...


Extremely interesting. How much time do you need to devote for a successful completion in 3 years per your experience? Is there any way to advance faster?

Edit: The website estimates about 250 hours for each course; at four courses per year, that's 1000 hours per nine months, so the estimate comes to at least 4 hours of study per day (weekends included) for the duration of the course (about 9 months). How accurate is this?

Also, do you have to be physically present for the exams?


You can study from 3 to 8 years depending on how much free time you have. There are only three deadlines each year - two for coursework assignments and one for exams. It takes me at most 5 days of study per course before each coursework and before an exam, so 15 days of study per course. You can take up to 4 full or up to 6 half courses per year.

And yes, you have to be physically present at an exam, but not necessary in London - there are examination centres all over the world.


I think that's probably correct. 4 courses per year is equivalent to 30 semester units in the US (2 semesters of 15 units each). You can't do it in less than 3 years, due to the university rules which won't let you take more than 4 courses per year.


Current BSc Computing and Information Systems student here. The program structure, flexibility and exam structure are really great. But there have been some truly horrific quality-control and student-support issues, so I can't recommend this program to others. I would switch to another program in a moment, but I am already in the last year.


There is also a similar offering from Georgia Tech: https://www.omscs.gatech.edu/home


Just curious -- where do you live? UK? US? Other?



btw, this is the link to the various course guides (it has the table of contents and the first one or two chapters for each). Some of the material is a little outdated (e.g. Computer Organization and Architecture doesn't cover flash memory), but I feel like it's pretty easy to self-study the little bit of more recent technology that they might not have included yet.

Final year includes "artificial intelligence", "natural language processing" and "neural networks". I haven't got there yet, so I can't comment on the material covered. Also, it's worth noting that UK degrees follow a different pattern from the US: each year the material gets increasingly hard, and also gets weighted higher in terms of the final grade. So an A in a final-year course is worth much more than an A in a first-year course, which sort of makes sense to me.

http://www.londoninternational.ac.uk/sample-study-materials-...


Yeah, I did my exams last year at Daly City library, and this year I'll be going to San Jose State university to do them.


California


This is a solid list, but it's a shame that no computer graphics resources are even mentioned. Although the reason for the omission is given in the FAQ, I'd argue that computer graphics basics (images, basic rasterization, color spaces, etc.) are as fundamental as networking or databases. A link to Computer Graphics: Principles and Practice (https://www.amazon.com/Computer-Graphics-Principles-Practice...) would have been nice.

I understand that most graphics resources out there focus on real-time 3D rendering for games or on writing raytracers, which I agree are currently industry-specific topics. Your average developer isn't going to write a vector graphics library as part of their day job, but the browser abstracts computer graphics in the same way it abstracts networking or compilers, so if the goal is to understand the underlying principles of the software platforms you'll be working on every day, I think computer graphics is a strange, biased omission.


Hi! It was hard to draw the line. And then, it was an omission to not even make a suggestion in the FAQ. Now fixed, thanks :)


Heh, heh, draw the line.

...Sorry.


best pun of today :P


How does that book compare to Peter Shirley's Fundamentals of Computer Graphics? [1]

I'm interested in CG but don't really know where to start.

[1] https://www.amazon.com/Fundamentals-Computer-Graphics-Fourth...


cs184.org


The video links there are broken. Any idea if they have the videos available anywhere?


I don't think so :( UC Berkeley were forced to take down lecture videos as a result of some slight violation of the ADA. There are a couple links about that further down this page.


The videos will be taken down starting March 15th: http://news.berkeley.edu/2017/02/24/faq-on-legacy-public-cou...

So you still have a day to run youtube-dl on whatever you want from their channel. Recordings from a 2014 CS 184 course can be found here: https://www.youtube.com/playlist?list=PL-XXv-cvA_iBifi0GQVF1...


Does anyone know if there is any project that will archive these? This is a shame. I'm not even clear why the DOJ got involved here.


Those seem like a different set of videos though than the ones from the linked page about this year’s course.


I don't know what ADA is in this context (misspelling of NDA maybe?), but hopefully this means that the next time they run the course -- in 2018 -- they will avoid those violations and be able to distribute video recordings.


Americans with Disabilities Act.

UC Berkeley was hit with a lawsuit saying that their videos were inaccessible because they had no subtitles. So they're just taking them all down to avoid it.

Some people just don't want others to have nice things.


The stuff costs money, too. Money that might have gone on other things even more important. Just one class of disability combined with sidewalks might cost LA over a billion dollars:

http://www.scpr.org/news/2015/04/01/50727/l-a-to-pay-1-4-bil...


>Some people just don't want others to have nice things.

I don't know you or your situation, but it's easy to think that while not really being how this works. ADA lawsuits are filed for a variety of reasons and at least some are intended solely to be the stick that gets a company to do the right thing for a group of people who have protection under the law. Often, these lawsuits are accompanied by timelines for implementation that are pretty reasonable (months or longer). I can't speak to this particular one, but that was an experience at my company. Some companies/organizations choose to provide the benefit in one way or another, and others choose to behave in the manner of UC-Berkeley (and, presumably, they know why they are choosing to take down videos instead of captioning). It isn't fair to single out people w/ that empty platitude. Sucks that the content is being taken down, but I bet UC-Berkeley had a choice.


ADA is the Americans with Disabilities Act, which requires equal access for those with hearing, visual or manual disabilities. The videos in question most likely lacked captions for the hearing impaired, which would have been prohibitively expensive for Berkeley to add to the thousands of videos they offered. Therefore, to comply with the letter of the law, Berkeley removed access to the videos.


It's the acronym for the American Disabilities Act or something. I get ADA-related results every time I look for updates on what people are doing with the Ada programming language. It gets more unfortunate when one realizes how heavily words like safety and formal policies are used by web sites dedicated to both. I almost got fooled a few times, thinking I was reading a result about the language when it was some ADA safety initiative. (rolls eyes) You have to work the filters carefully.

I will say you don't want to screw around with ADA requirements. I've seen plenty of news reports of companies taking hits in court over it. It's best for business owners to see what they need to do and do it whenever they can afford to.


It's the Americans with Disabilities Act: https://en.wikipedia.org/wiki/Americans_with_Disabilities_Ac...


Great list that covers the basics. For anyone interested in expanding on it, I would suggest the following:

(the descriptions are taken from the corresponding courses I took in college, which I found super helpful)

Programming paradigms: Examination of the basic principles of the major programming language paradigms. Focus on declarative paradigms such as functional and logic programming. Data types, control expressions, loops, types of references, lazy evaluation, different interpretation principles, information hiding.

A textbook on Haskell and Prolog would be recommended.
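
(Lazy evaluation, mentioned above, is easy to get a feel for even outside Haskell; Python generators give the flavor. A tiny sketch, example my own:)

    def naturals():
        # An "infinite list": nothing is computed until a value is demanded.
        n = 0
        while True:
            yield n
            n += 1

    squares = (n * n for n in naturals())       # still nothing computed
    print([next(squares) for _ in range(5)])    # [0, 1, 4, 9, 16]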

Computability: An introduction to abstract models of sequential computation, including finite automata, regular expressions, context-free grammars, and Turing machines. Formal languages, including regular, context-free, and recursive languages, methods for classifying languages according to these types, and relationships among these classes.

Introduction to the Theory of Computation by Michael Sipser
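
A finite automaton from that course is small enough to sketch in a few lines. Here's a toy DFA in Python (representation and names are my own) that accepts binary strings containing an even number of 1s:

    # States: 'even'/'odd' count of 1s seen so far; accept on 'even'.
    DELTA = {('even', '0'): 'even', ('even', '1'): 'odd',
             ('odd', '0'): 'odd', ('odd', '1'): 'even'}

    def accepts(s, start='even', accepting=('even',)):
        state = start
        for ch in s:
            state = DELTA[(state, ch)]  # one transition per input symbol
        return state in accepting

    print(accepts('1010'))  # True: two 1s
    print(accepts('1110'))  # False: three 1s

The Sipser book builds from machines like this all the way up to Turing machines and undecidability.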

Explorations in Information Security: A broad survey of topics in information security and privacy, with the purpose of cultivating an appropriate mindset for approaching security and privacy issues. Topics will be motivated by recreational puzzles. Legal and ethical considerations will be introduced as necessary.

Someone already mentioned computer graphics, which I excluded. I personally had the most fun in college in my graphics courses. They were hard but super rewarding!


I wish there were a way to include the "what could have been" or "what could be" into the "what is" of an education in Computer Science.

Distributed systems, databases, networking and architecture all have pasts full of much better solutions that were never adopted because of patents, cost, or some such reason, and that grow fainter with every passing day.

If courses like these included a "history" component in parallel, I think I'd be a more well-rounded graduate.


Personally I think one of the problems with self-learning is gaps in knowledge.

As part of a formal education you get to learn what you like, as well as what you do not like much.

My advice to self-learners is: never engage in "cargo-cult programming". This means: do not touch or reuse code that you do not understand. Force yourself to understand. If you lack the time, write it down and follow up later.


Great advice to everyone, not just self-learners


True.

But when you are just taught to use libraries and technologies without having some exposure to the principles behind their inner workings, it's easier to fall into the cargo cult.


I can't speak for every student at every university, but my university is in the top 50 of csrankings.org, and yet even in my junior-level courses there are loads of cargo-cult programming students. They think they understand things like compilers, operating systems, and so much more, and they delude themselves into those beliefs based on things like "I'm in CS; of course I understand $THING" or "I'm at $UNIVERSITY and we've learned so much, so I definitely understand $THING". In reality, if you mention "ELF", most of them would probably think you're talking about a special type of newline, and the only thing they think "IR" could possibly stand for is "infrared".

It gets even worse when you get on the topic of "Computer Architecture" - so many people think they can talk to their computer on a deep level, without realizing that what they're learning in class (MIPS) is actually not what their computer understands!


Well, you can ask over 50% of software engineers to explain in one phrase what engineering is, or what software is.

Many of them cannot give you a correct, accurate definition.


spot on, man. takes a lot of discipline to remain indifferent to the status-signaling side of learning


The best way I found was to go through the CMU BSc requirements (or another university's), look up the public course pages, and pair the lecture notes (and occasionally lecture vids) with the TAoCP series: look up the same topics, but get a thorough drilling in each by trying the problems in the book. Before I started doing this I kept forgetting material a few weeks after covering it, like when I had to rewatch a lecture on floating point to remember what the bias was.
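
(For the curious: the bias for IEEE 754 single precision is 127. A quick Python sketch of pulling a float apart to see it, field layout per the standard, function name my own:)

    import struct

    def decompose(f):
        # Reinterpret the float's 32 bits as an unsigned integer.
        bits = struct.unpack('>I', struct.pack('>f', f))[0]
        sign = bits >> 31
        stored_exp = (bits >> 23) & 0xFF  # the biased exponent field
        mantissa = bits & 0x7FFFFF
        return sign, stored_exp - 127, mantissa  # subtract the bias

    print(decompose(1.0))  # (0, 0, 0): field stores 127, true exponent 0
    print(decompose(8.0))  # (0, 3, 0): 8 = 1.0 * 2**3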

If you look up 15-213 and get the book CS:APP that accompanies the course, it will more than prepare you to understand the MMIX fascicle (or the original MIX if you want): http://www.cs.cmu.edu/afs/cs/academic/class/15213-f16/www/sc...


Ozan and Myles also teach this stuff in-person in SF: http://bradfieldcs.com/.

I just finished their databases course and it was excellent.


The blog is good reading; amongst interesting technical insight and commentary you have wonderfully humorous passages such as:

>"Everyone “has been meaning to” learn Rust or Go or Clojure over a weekend, not C. There isn’t even a cute C animal in C’s non-logo on a C decal not stuck to your laptop."

Brilliant.


Can't say enough good things about Myles. I had a similarly good experience with him via Bradfield. Myles did a Clojure course for my colleagues and it went remarkably well, especially given that none of us had really done much work in a functional language before.


Me too - I took their Computer Architecture course and really enjoyed it. Would recommend.


I took their Databases course and am currently taking the APIs one. Highly recommended!


How much is the tuition for their courses?


You can find it on their site - bradfieldcs.com


$1,800 per course.


About the part on databases:

   "but we suggest just writing a simple relational database management system from scratch"
This part is the one I'm most interested in, but also the hardest to act on.

As explained there, it's very hard to get information about databases (it's all hunting for material here and there). So, how does one do this? How do you build a "basic RDBMS"?

Probably looking at SQLite will be the default answer, but that is not ideal. It's hard to see what the thinking process was from a finished, materialized piece of code.
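
One way to start, before diving into a real engine: implement the relational operators over in-memory rows first, then add storage, indexing and transactions one layer at a time. A toy sketch in Python (all names invented for illustration) of that first layer:

    # A "table" is just a list of row dicts; operators are plain functions.
    users = [{'id': 1, 'name': 'ada'}, {'id': 2, 'name': 'ken'}]
    orders = [{'user_id': 1, 'item': 'book'}, {'user_id': 1, 'item': 'pen'}]

    def select(table, pred):  # sigma: keep rows matching a predicate
        return [row for row in table if pred(row)]

    def project(table, cols):  # pi: keep only some columns
        return [{c: row[c] for c in cols} for row in table]

    def join(left, right, lcol, rcol):  # naive nested-loop equijoin
        return [{**l, **r} for l in left for r in right if l[lcol] == r[rcol]]

    # SELECT name FROM users WHERE id = 1
    print(project(select(users, lambda r: r['id'] == 1), ['name']))
    # [{'name': 'ada'}]

    # SELECT name, item FROM users JOIN orders ON users.id = orders.user_id
    print(project(join(users, orders, 'id', 'user_id'), ['name', 'item']))
    # [{'name': 'ada', 'item': 'book'}, {'name': 'ada', 'item': 'pen'}]

From there, a real project puts a SQL parser in front, fixed-size disk pages underneath, and B-tree indexes in place of the nested loops.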



The AWK Programming Language by the creators of the language is a great book to work through and includes a section on building a simple relational database system in AWK.

This book is a great introduction to databases: http://infolab.stanford.edu/~ullman/dscb.html


I love this. It has a Back to Basics, no BS approach to CS that appeals to me. I agree with all the recommendations. A couple of tiny comments:

- I know that learning C is not, strictly speaking, part of computer science, but it is a nice counterpart to SICP, ties in with other topics (such as computer architecture and OS), and definitely belongs in this curriculum. The authors of this site have themselves defended C in another blog post. As pg would say, all you need is Lisp and C.

- IMO a better option for learning databases is Jennifer Widom's MOOC: http://cs.stanford.edu/people/widom/DB-mooc.html


Seconding.

Widom's Database course is extremely well done, and very amenable to picking and choosing a relevant subset for your needs.

Working through K&R C or something similar should be a prerequisite for the OS course and Skiena's lectures.


Isn't that DB MOOC more about using DBs rather than implementing them?


It's more about relational algebra rather than implementation (at least when I took it)


Yes, but what we need is more material about actual implementation -- how ACID is implemented (e.g. in C++), how query parsing/optimization/execution is realized, etc.
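
For a taste of how the D in ACID is typically implemented, here's a rough write-ahead-log sketch in Python (file name and record format are my own). The invariant: a change is appended to the log and forced to disk before it is applied anywhere else, so a crash can always be repaired by replaying the log:

    import json, os

    LOG = 'wal.log'  # hypothetical log file

    def log_write(key, value):
        with open(LOG, 'a') as f:
            f.write(json.dumps({'key': key, 'value': value}) + '\n')
            f.flush()
            os.fsync(f.fileno())  # without fsync, a crash may lose the record
        # Only now is it safe to update the actual data pages.

    def recover():
        # After a crash, rebuild consistent state by replaying the log.
        state = {}
        if os.path.exists(LOG):
            with open(LOG) as f:
                for line in f:
                    rec = json.loads(line)
                    state[rec['key']] = rec['value']
        return state

    log_write('balance:alice', 90)
    log_write('balance:bob', 110)
    print(recover())  # {'balance:alice': 90, 'balance:bob': 110}

Real engines layer commit/abort records, checkpoints, and page-level redo/undo on top, but the flush-before-apply discipline is the heart of it.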


FYI... The video content for SICP seems to be going away in a couple of days... https://www.youtube.com/playlist?list=PL3E89002AA9B9879E


Great. A pinheaded Dept. of Justice decision has asserted that UC Berkeley is in violation of the Americans with Disabilities Act "because, in its view, not all of the free course and lecture content UC Berkeley makes available on certain online platforms is fully accessible to individuals with hearing, visual or manual disabilities." Therefore, rather than having to revamp all their freely available YouTube and iTunesU lectures at great expense, the University is simply removing them.


Thanks for the heads up. I'm grabbing these all to a HDD now for review later.


Could I bug you for these since you're grabbing them?


Not to worry: https://archive.org/search.php?query=subject%3A%22UC+Berkele...

Some folks organized and grabbed it all. Even the manual-download iTunesU stuff. Should be available in the near future.


:/

I was tutored by BH in middle school - he taught me recursion and basic programming through Logo. My family was a little close with him for a while; we used to do a New Year's brunch that he'd come to, but I haven't seen him in person for nearly a decade.


Has anyone mirrored it?


I strongly feel that Erlang really, really needs more visibility in the world. It is an important language for distributed systems, but the language itself is startlingly spare, using recursion and pattern-matching in lieu of conditionals. There are two resources that I like: a 3-week FutureLearn course [1] and Learn You Some Erlang [2].

It is my belief that the Erlang "process" is a true object, as opposed to a Ruby/Java/C++ etc. object, which is, ultimately, a thin, easily-torn veneer over global spaghetti.

WhatsApp's $19B acquisition, with a 57-person team running a large, world-wide messaging system on Erlang, should also be considered a resounding endorsement.

Last but not least, I personally have come to see the overall trend toward statelessness as a coping mechanism for bad technology.

(If I could change my name to ErlangJosh, and if it sounded good, I would.)

1. https://www.futurelearn.com/courses/functional-programming-e...

2. http://learnyousomeerlang.com/


What about languages like Smalltalk and Lisp? ... Well, Lisp at least gets some acknowledgment, but Smalltalk can be compared to Erlang in a lot of respects.


The book that I recommend for networking, and that has been recommended to me by every full-time NetEng I've ever asked, is Interconnections.

https://www.amazon.com/Interconnections-Bridges-Switches-Int...

Yes, the material is a bit dated. Yes, it won't give you the ins and outs of everything you need to know. What it will give you is the why, and from there you can figure out everything else you need to know.


for networking, my _personal_ favorite is 'network algorithmics' by george-varghese. basically, i find it to combine topics from a variety of cs disciplines including (but not limited to) computer architecture, algorithms, data-structures and to some extent, compilers as well.


That's usually the second book recommendation I would get.


:) indeed.

the trouble with books like these is that, you get sucked into it/them, and can spend literally months on end going through them, working the exercises, reading reference papers etc. etc. it is megafun ;)

to me at least, the books might _appear_ to be dated, but the vantage point they offer to inspect the networking landscape is invaluable. in some ways, as john-barth says in 'chimera':

"the real magic is to understand which words work, and when, and for what; the trick is to learn the trick. ... and those words are made from the letters of our alphabet; a couple-dozen squiggles we can draw with a pen. this is the key! and the treasure, too, if we can only get our hands on it! it's as if - as if the key to the treasure is the treasure!"


They recommend putting 100-200 hours into each topic. That works out to 8-10 hours a week for three years. Sounds feasible even with a day job, actually.
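
(Roughly: nine topics at ~150 hours each is ~1,350 hours; at 9 hours a week that's 150 weeks, or just under three years.)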


I kept two textbooks after graduating college -- one was Structure and Interpretation of Computer Programs. I also remember going back through Harvey's videos online whenever I missed lecture... are those being taken down too? https://news.ycombinator.com/item?id=13768856


Maybe OT, but what was the other book?


Rudin, Principles of Mathematical Analysis: https://notendur.hi.is/vae11/%C3%9Eekking/principles_of_math...


Thanks. Spivak is staring at me from my bookshelf, waiting for me to do the exercises, but Rudin seems like a worthy goal.


I know of the people behind this, who founded a CS school. I've heard good things, but I hadn't seen this before!


Are all those CS61 lectures from Berkeley shortly going to disappear from youtube?


youtube-dl can take a playlist as an argument.

That playlist is about 3GB of videos downloaded at max quality, in case you're curious.


Thanks for the tip. I'd not heard of this before, and it looks really useful, and not just for this. Until now I've been doing a lot of legwork flitting between Chrome and VLC.


They've been there for three years coming June. Not sure why they would disappear now?

If it's legal to do so, you could use a YouTube video downloading platform to grab them? :)

EDIT: Didn't know about the whole legal case thing. Thanks for letting me know (and downvoting me... lovely.)

I've snatched a copy of it all.


Berkeley lost a lawsuit saying their courses were not in compliance with the ADA. They're taking a lot down in response. The people over at /r/datahoarder were working on getting them archived on Archive.org, though.


There's still quite a bit of work to be done; help would be useful. Particularly with semi-manual downloading of material on iTunes (not all of it is on YT). See http://archiveteam.org/index.php?title=UC_Berkeley_Course_Ca... and https://www.reddit.com/r/DataHoarder/comments/5z2499/many_of...


Archive Team is archiving everything.


That is fantastic. I never heard of that site before! http://www.archiveteam.org/index.php?title=UC_Berkeley_Cours...


Berkeley's statement on why they are disappearing starting 3/15: http://news.berkeley.edu/2017/03/01/course-capture/


They are going to disappear; that being said, you can find all the course material here: cs61a.org (except lectures :-()

Took the course years ago (having never programmed before) with John DeNero; phenomenal learning experience that made me fall in love with CS.


Don't take it personally. It's the information in your post that's been downvoted to signal that it's incorrect.


Yeah OK, that's fair enough. Thanks :)


Why not watch the original SICP lecture videos?


This could do without the "Why learn computer science?" section entirely.


It might seem obvious to you, but the bootcamp crowd really underestimates the impact of computer science on their day-to-day work. I know because I used to be among them.

At least in my case, I was indoctrinated by my bootcamp to put a standard CS undergrad degree into the same bucket as the rest of our broken education system. So at my first job I focused on keeping up with the trends, trying to master web development by becoming hyperproductive with my day-to-day tools. I was trying to emulate the most visible engineers I saw at conferences, figuring that to shape the trends I'd have to be on top of them. Very naive of me, but then again, I didn't have much exposure to the world beyond web development, and you don't know what you don't know.

I wish I could have seen this post years ago! Would've saved me a ton of time.


Can you give an example of a time when having more knowledge in CS would have helped? I'm finding anecdotes extremely difficult to come by.


I think the problem with asking for anecdotes is that people don't necessarily separate their decision-making-due-to-CS knowledge from decisions they make due to experience. But if you don't have CS knowledge there are many types of projects you'll probably never be assigned or might not even try for, so you won't have the chance to use-or-not-use it.


I can totally see that, but it clashes a bit with self-learning and research towards figuring out whatever the problem or the domain space is. Surely, a self-taught web developer wouldn't want to take a job building a compiler for a DSL if they didn't have that skillset, but maybe they know or can learn enough about compilers to be able to track down a crazy bug?

I've always gone towards projects which may need a lot of research on my part, and I've had plenty of trusting peers and managers with hard CS educations who believed I could do it.

If I wanted to change problem domains to something much more grounded in CS (say operating system schedulers, robotics or microcontroller programming), I'd read these books.

I'm trying really hard to see what the value is of learning this pattern or that pattern, and what sorts of worlds it can open for me, but so far (for me) it's usually been roads I don't want to go down professionally. Maybe my imagination itself is stunted by my lack of formal education, I don't know.


Couple of things about the distributed systems course.

1. Maarten van Steen -- one of the authors -- recorded screencasts in 2012 (see https://www.distributed-systems.net/index.php/books/distribu... ).

2. Maarten van Steen released an updated version of the distributed systems book this year.

Full disclosure: I followed Maarten van Steen's lectures back in the day :)


You know, as a former academic, I was reading this list and I immediately knew: This is written by someone in academia (and sure enough, you find out at the bottom it is).

I don't have a problem with this list per se. For all I know, it may be a good list and the designation of Type 1 and Type 2 engineers may be accurate.

But I wish I could read a post from a Type 1 engineer in industry that mirrored what academics often write. I can hardly find one. Why the disconnect? If the academics are so right, why is it mostly academics who preach this? There are more Type 1 engineers than academics, I'm sure.

Take my story: Was pursuing a PhD in physics/engineering and dropped out. Heavy on mathematics. And programming was always a hobby/passion. Went into industry in my discipline (not programming). Then decided to change careers into software.

Going in, I had the impostor syndrome. I had read quite a bit of CLRS in grad school on my own, but remembered little. So I took a bunch of Coursera courses to review all the basic algorithms, graph theory, etc.

My goal was that this was the bare minimum to survive, and I would work for a while and figure out what to focus on next (architecture? networking, OS? databases?).

Well, I've been working a bunch of years now, and there is no "next thing". Even the algorithms courses I took, while a lot of fun and interesting, play little role beyond what most Type 2 engineers will know!

That's just the reality: most software jobs do not require you to know much beyond the basic data structures (hashes, sets, lists, etc.) and the complexity of their storage/operations. I looked for ways to use all the extra stuff I had learned (in essentially introductory algorithms courses), and did not find opportunities. I'm facing the inverse problem: I'm someone who knows some of this (or wants to), and I'm having trouble finding a job where this knowledge actually leads to more robust systems.

And it's hard to find the jobs where these things matter, and it is rare that they pay more. Difficulty and complexity do not equate to higher pay. Market rules do. Trust me, I know. I was doing more challenging work before I became a software engineer, but I get paid more now because there were few challenging jobs.

I know people say it often, but I'll say it too: Communication and negotiation skills are more valuable than the topics on the page. Why spend your nights on diminishing returns when you can get pretty far with just the basics of negotiation? Most engineers are overeducated in terms of what they need to know when it comes to technical skills. But other important skills? We're very undereducated. Why work hard to be even more overeducated, while ignoring the deficiencies?


Having a strong background in math and algorithms is very useful because it expands the range of problems you can solve. However, knowing these things doesn't mean you will face problems that require them.

I've had jobs where I've used fancy math all the time. I've also had jobs where there was no fancy math required at all.


Thank you for sharing; it's a very useful perspective for me as a CS undergrad.


^ I've posted a decently long question, but my 2c is that it's useful to know those things -- as a former CS undergrad :)


Sorry, I have a number of questions on this.

> You know, as a former academic, I was reading this list and I immediately knew: This is written by someone in academia (and sure enough, you find out at the bottom it is).

What defines academic here? The author is neither a professor nor a PhD in CS, as far as I can tell. I would argue this list is broader than one an academic would necessarily write. I'd also argue this article emphasizes "vocabulary" over the ability to apply and connect this knowledge.

>Well, I've been working a bunch of years now, and there is no "next thing". Even the algorithms courses I took, while a lot of fun and interesting, play little role beyond what most Type 2 engineers will know! That's just the reality: Most software jobs do not require you to know much beyond the basic data structures (hash, sets, lists, etc) and the complexity of their storage/operations. I looked for ways to use all the extra stuff I had learned (in essentially introductory algorithms courses), and did not find opportunities. I'm facing the inverse problem: Someone who knows some of this (or wants to), and having trouble finding a job where this knowledge actually leads to more robust systems.

I believe I'm going to have to disagree (and I really mean this to have a conversation rather than an argument)

Summary: Why?

I agree that many jobs are Type 2 jobs. But I'd argue that a lot of the new technology-driving things being made are not made by Type 2 engineers. Many of the new "wow" things coming forward in tech require a quite strong Type 1 background -- similarly if you're trying to work in finance. E.g. the Alexa device alone requires a lot of HW, OS, FS, NLP, AI, and networking work.

> And it's hard to find the jobs where these things matter, and it is rare that they are paid more. Difficulty and complexity does not equate to higher pay. Market rules do. Trust me, I know. I was doing more challenging work before I became a software engineer, but I get paid more now because there were few challenging jobs.

I both disagree and agree. Most jobs look pretty Type 2 to me as well. But even at a Google/FB/Amazon/Microsoft there are opportunities for Type 1 work. At Apple, for example, there are the Core OS and Networking teams. At Google there are quite a few OS, networking, SDN, algo, distributed sys, file store, etc. teams.

And in terms of core skills, I would argue there are people who get paid /quite a bit/ more for their abilities. E.g. if you look at the very top-tier devs at (for argument's sake) Citadel or MemSQL, their pay tends to be quite a bit better than at most other places. Similarly, PhDs in $specific-topic are frequently very highly valued. E.g. Airbnb is paying quite a premium for PhD economists right now.

I would also argue you need to have something in mind for these skills to be useful, and that won't necessarily spring to mind on its own.

E.g. understanding filesystems is very useful when setting up booting for new embedded devices (e.g. at Cisco).


>What defines academic here? The author is not a professor nor a phd in CS as far as I can tell.

True - I did not realize that. But what I mean by academic is someone who is not in industry and mostly has teaching or research experience.

>I would argue this list is more broad than an academic would write necessarily.

Pretty much everything listed is part of the CS curriculum - either as a required course or as an elective.

>But I argue a lot of the new technology-driving things that are made, are not made by Type 2 engineers as well. I would argue many of the new "wow" things that are coming forward in tech require a quite strong Type 1 background - similar if you're trying to work in finance. E.g. Alexa the device alone requires a lot of hw, os, fs, nlp, ai, networking work.

No disagreement here. But how easy is it to get a job in these areas, even if you have a Type 1 background?

It's the same story in other disciplines, like my own background in physics/electrical engineering. When looking for a job that utilized the skills I had learned, I found very few. Most listed a minimum of a PhD (even if the job really didn't need that much knowledge), which I didn't have, as I dropped out of my PhD program. What's more, once I was "inside" with a job, I found out most people in jobs that required a PhD were doing work that required even less technical skill than mine did (and my job required knowledge of one somewhat advanced topic, but no heavy mathematics).

Yes, I accept the jobs you write about exist. But it is hard to get them even when you have the skills. The supply vastly exceeds the demand. So why spend time honing these skills (beyond personal curiosity)?

>But even at a google/fb/amazon/microsoft there are opportunities for type 1 work. At Apple for example there is the Core OS and Networking Teams. At Google there are quite a few OS, networking, SDN, algo, distributed sys, file store, ...etc teams.

Same is true at the large company I work in. And again, the majority of software jobs at my company are Type 2.

>And in terms of core skills I would argue there are people who get paid /quite a bit/ more for their abilities.

My argument is not that these people do not exist. My argument is that when you look at the typical person who has Type 1 knowledge, they are not paid that much more (I suspect this is true even when restricting to jobs that need that knowledge).

Look at web development. I know some people really get paid a lot for it. But most of the people I know in it who do network programming (and not trivial network programming) get paid less than I do. And my salary is not high. The web is a particularly notable example. I routinely bump into software developers doing web work that requires a lot more ingenuity than my boring business logic work. Few get paid more than me.


Which demographic does this compilation of computer science books target? Considering the number of votes this post got on HN, it must be of some importance. I already have plenty of resources gathered up over time (right now I'm working through the first one), but really, who is it aimed at?

From the threads [Slashdot, HN] I read yesterday, which related to me quite well (I'm a 26-year-old trying to enter the software development field), all seemed to conclude that "you're a dinosaur if you're coding past 30", which drew a really grim picture of where my future is heading. To me, there is a vast amount of knowledge and learning here, and I don't know if I can get through it within a 4-year span. There are numerous books, theories and fields to work in.

How can someone around 30 stop coding and move to a more management-type position when there is so much to learn in this space?


> you're a dinosaur if you're coding past 30

Don't let your mind get poisoned by this absurdity


SICP is a great resource for learning the functional programming paradigm, but is it a suitable resource for CS beginners?

There are only a few universities that still use SICP or its variants in their introductory CS modules.

I think a book on the imperative or OOP paradigm might be better and more relevant in today's context.


This list is short on programming language theory; here's a rigorous book I enjoyed, if you're interested in the formal definitions of such things as abstract syntax trees or types.

Robert Harper's Practical Foundations for Programming Languages (free draft copy; also grab the notation guide): http://www.cs.cmu.edu/~rwh/pfpl.html

There are videos for this book (and for category theory) at OPLSS: https://www.cs.uoregon.edu/research/summerschool/summer16/cu...


Do you have suggestions for PL reading lists? I nearly never see PL mentioned on them, and have been looking for alternatives for some time as PFPL offers one approach + style to many of the problems they address, but I would love a broader survey.

Thanks!


If you understand PFPL, you can probably just attend, or look up previous years' tutorial tracks, lecture videos and research papers from, the Programming Languages Mentoring Workshop (PLMW), which encourages students to take up PL theory, or the Principles of Programming Languages conference (POPL 2017), or PLDI: http://conf.researchr.org/home/pldi-2017 If they are paywalled, there is of course the Sci-Hub proxy.

This year's OPLSS looks interesting as well: https://www.cs.uoregon.edu/research/summerschool/summer17/to... It will use this book as an introductory text: http://www.seas.upenn.edu/~bcpierce/tapl/index.html

I mainly follow Dan Licata's and Matt Fredrikson's personal pages for presentations/lectures, as my specific interest is verification and type theory. http://dlicata.web.wesleyan.edu/index.html


A good list of topics. But there are many others. A CS undergrad now has many more options than the four or five available courses that I had when I was at Carnegie Mellon.

Next item: "Print yourself a computer science diploma" ;)


What did you have when you were at Carnegie Mellon, if I may ask?


First, understand that there was no Computer Science degree back then - it was called "Applied Math". I think the courses available to undergrads were:

1. programming [A]

2. algorithms

3. operating systems

4. compiler design

If you were lucky like me, and a graduating student bestowed upon you a coveted key to the graduate terminal room, then you could hack on those excellent DEC VT-100 terminals beside James Gosling, and buy bottles of Coke from what was perhaps the first vending machine connected to the internet.

[A] Interesting in that it was a school-wide class that you were required to take and to pass. Programming was done in Pascal on a mini-computer-based IDE.


Ah interesting :)

You may be interested to know that (1) is still required, even for CFA people


And of that I approve


That's a good list of books, but it's a terrible place to start. I'm pro-reading, pro-book, but I mean, you can't stay motivated to learn computer science knowing that you SHOULD read all those books.


I loved this list so much that I created a public Trello board that summarizes it, and helps you track your progress if you're following the curriculum. https://github.com/ousmanedev/teach-yourself-computer-scienc...


Which book or video is best is always hard to say, but it's always interesting to see such collections, having studied CS myself and worked in the field for 20 years. I'd appreciate proper citations, though; the books' authors are missing. Also, a short rationale for why each title was selected would be helpful. Thanks for putting this together!


> Computer Architecture

I found it fascinating to learn that computing rests on memory being manipulated by a processor. It takes only 10 minutes to grasp the basic concepts. Meanwhile, the majority of the popular technology press still treats processors as mystical machines disconnected from memory that mysteriously make your game run faster.
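
The whole idea really does fit in a few lines: a processor is a loop that fetches an instruction from memory, decodes it, and executes it against registers and memory. A toy sketch in Python (the instruction set is invented for illustration):

    # Memory holds both the program and the data; the CPU is just a loop.
    memory = {0: ('LOAD', 0, 100),   # r0 = mem[100]
              1: ('LOAD', 1, 101),   # r1 = mem[101]
              2: ('ADD', 0, 1),      # r0 = r0 + r1
              3: ('STORE', 0, 102),  # mem[102] = r0
              4: ('HALT',),
              100: 2, 101: 3}

    regs, pc = [0, 0], 0
    while True:
        op, *args = memory[pc]  # fetch and decode
        pc += 1
        if op == 'LOAD':
            regs[args[0]] = memory[args[1]]
        elif op == 'ADD':
            regs[args[0]] += regs[args[1]]
        elif op == 'STORE':
            memory[args[1]] = regs[args[0]]
        elif op == 'HALT':
            break

    print(memory[102])  # 5

Everything else -- caches, pipelines, virtual memory -- is an optimization or a protection wrapped around that loop.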


As a physicist breaking into software engineering, but lacking education in CS, this list is extremely welcome to me.


Getting this message: "The webpage at https://teachyourselfcs.com/ might be temporarily down or it may have moved permanently to a new web address. ERR_TUNNEL_CONNECTION_FAILED"


Don't forget the Scientific Method (it is Computer Science, after all)

https://blog.makersacademy.com/scientific-method-in-programm...


Having bombed the programming interview for a Google cloud position, and like a kid seemingly suffering from Stockholm syndrome, I think arbitrarily abstract and difficult interviews are overall a good thing -- getting through one is like getting through basic training: it shows you have the grit to persevere. And in the case of programming, it at least shows you understand data structures well enough to answer the canned questions. I know the initial reasoning behind it all was to find people who could think analytically but also be competent in the role. The downside, I think, is that the process biases against people who might be valuable members of a team but don't care about implementing a linked list, even though they understand the pros and cons of one, having used such a data structure (or a similar one) in production applications.


Two subjects that I think are missing yet important, in the practical rather than the theoretical context:

10. Software Engineering

11. The Art of Shipping

Shipping is a crucial skill to learn; if you don't know how to ship, then learning everything else is really moot!


Appreciate the short list and not hundreds of links to resources.


Wow, awesome list. First time I have seen someone explain why they gave preference to particular things over others in such a lucid and simple way...


This list is awesome. As a student, I really doubt a computer science degree is worth the money spent... Degrees are expensive in most places nowadays.


I will have to respectfully disagree :)


Does anyone know of a different decent networking book that's available on Safari?


iTunes U has great computer science classes offered by major universities.


I am a self-taught programmer with 10 years' experience. Would this help me?


Yes.


if you learn CS, I think you should not be a programmer for your next step. You can be better than now.


I somehow did not get what you intended to convey. Can you please elaborate? Would going back to algos, OS, etc. help me grow as a hard-core programmer?


Very strange that there's nothing on computer security.


Great website! Thanks for sharing!


THANK YOU.


Really good compilation! I agree with almost everything on the list.


There are almost a dozen blogs and websites promising to teach you CS -- a great initiative, but most are either saying the same things or missing a few key points here and there. Everyone seems to tell it as if they know it best.


Do you have any specific criticisms of this site?



