For the sake of registering a counterpoint: I disagree entirely that things like kernel development or assembly (and let’s throw in architecture, computability theory, and all that jazz for good measure) are even remotely useful in software engineering. You’ll forget most of it and personally I don’t think it will even meaningfully alter your performance over the long term.
Knowledge that is acquired but not routinely recalled or applied will atrophy.
Sometimes you can make the argument that it’s worth your time to satisfy your own intellectual curiosity and I can understand that. Where people misstep is in thinking all knowledge is created equal.
I used to rationalize forays into theoretical material as holistically improving my capability as a thinker. In hindsight, it’s obvious that was bullshit. There are much more efficient ways of turning yourself into a good thinker that are more directly relevant to how things work in the real world.
The other thing I realized (and this is more specific to me) is that if I were to give myself the luxury of diving into knowledge for its own sake, I would choose a topic in the natural sciences, like physics or astronomy. Computers are interesting, but the theory surrounding them doesn't do much to explain the nature of our reality, which I personally find much more fascinating.
If I could go back and redo my education, I would try my best to focus on a combination of:
(1) The most pragmatic courses in CS. IMO, the most useful ones beyond the intro courses were data structures and algos, distributed systems (project-driven), OS design (writing a simple OS), basic prob/stat, and intro ML (you do not and never will need deep anything, unless you decide to specialize). You could cover all of that in about a semester and a half tops.
(2) Projects out the wazoo. Real ones. Ideally motivated by a real problem and birthed into the world with all the messiness that entails, and iterated upon until they create real value for someone. You'll learn a stupid amount along the way.
(3) Through some combination of courses, reading, and projects:
- scripting/automation
- API design (easy)
- modern web dev (a project plus lots of Googling; learn to accelerate your learning by relying on others)
- mobile app design (same approach as web dev)
- PaaS via AWS or GCP (or bespoke)
- basic security
- message queues
- orchestration (at least Docker; maybe Kubernetes)
- proxying (uses of Nginx) and UNIX/Linux networking fundamentals
- metrics and analytics (with an emphasis on learning the value of instrumenting a system/product/business and using the feedback to improve it)
- databases (Postgres at least; become super proficient at SQL)
- basic UI/UX design principles
- software engineering best practices (from simple things like KISS, coupling, and testing, all the way up to reliability, availability, maintainability, and scalability, and good decision-making, particularly knowing how to strike a sensible balance between time, cost, and quality)
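To make the "become super proficient at SQL" point concrete, here is a minimal sketch of the bread-and-butter query shape behind most metrics dashboards (aggregate, group, order). It uses Python's stdlib sqlite3 purely as a stand-in for Postgres; the table and data are made up for illustration.

```python
import sqlite3

# In-memory toy database standing in for Postgres.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 30.0), ("bob", 15.0), ("alice", 20.0)],
)

# Revenue per customer, highest first -- the core query shape you should
# be able to write without thinking.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS revenue "
    "FROM orders GROUP BY customer ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('alice', 50.0), ('bob', 15.0)]
```

The same GROUP BY/ORDER BY pattern, plus joins and window functions, covers a surprising fraction of real analytics work.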
I’m missing a lot, but in short you should know every technology function required in a modern company at least at a basic level. Some people call this "full stack".
If you want a lasting career in tech and you don’t plan to specialize, then this is the way to go. The merits of being a specialist vs a generalist are debated all over the place. Thiel will tell you to relentlessly focus on one thing and ‘vertically integrate’. Scott Adams will tell you to get very good at two or more things and then combine them, since becoming the best at any one thing is extremely hard.
If it’s not obvious, I chose to be a generalist. If I had to explain why, it would be because: (a) I don’t like the risk of committing to one thing (“blockchain engineer” seems like a dubious track, for example); (b) I get bored easily; (c) specialization often, though not always, seems to lead to myopia, which is cancer in any enterprise. This is hard to explain, but you’ll know it if you ever see it: everyone operating in their own silos, incapable of cross-disciplinary thinking, lacking empathy for the nature of what other people do, pervasive groupthink, arrogance. And (d) if you’re not good enough to be a top-tier specialist (I'm not), then the way to maximize the value you can create, and get paid for, is to be an exceedingly useful generalist who can think effectively across organizational concerns and boundaries.
(4) What Charlie Munger calls “remedial worldly wisdom”.
The most appalling failure of our education system is that it produces people who can take a test but can’t think independently, let alone innovate.
Some of us software engineers get to thinking we’re hot shit. We're not. For one simple reason: what we do is almost always deterministic. Someone has done it before and written it down so that you can do it too. At worst, you have to tweak something a bit to make it work for your situation.
In the real world, nondeterminism drives novel value. In other words, everything wrapped around the lines of code you write is what's important. That means you're going to be hard-pressed to make a dent in anything if all you can do is write code.
Thinking well is a broad subject and you’re going to have to tackle it on multiple fronts, probably for the rest of your life. The most important piece by far is behavioral psychology; do whatever you possibly can to grasp it. Additionally: systems thinking, philosophy, basic accounting, very basic economics (as soon as they say “Solow Model,” run away; ideally well before that), and some history. Poor Charlie's Almanack is a good starting point for much of this. It'll help you appreciate why this is important.
You should also know how to apply math to solve any problem you run into that falls short of involving calculus or advanced prob/stat. In a perfect world, you would know how to apply calculus as well, but the opportunities to do so are so few and far between that you likely won’t remember it beyond basic differentiation/integration in the long run (or at least I didn't since I have a poor memory, but that may not apply to you).
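As an illustration of the kind of pre-calculus math that actually comes up, here is a tiny sketch of using basic probability to cost out a retry policy. All the numbers are made-up assumptions for the example.

```python
# Assumed inputs (illustrative only).
p_success = 0.9          # probability a single attempt succeeds
cost_per_attempt = 1.0   # cost units per attempt

# For independent retries until success, the attempt count follows a
# geometric distribution, whose expected value is 1/p.
expected_attempts = 1 / p_success
expected_cost = expected_attempts * cost_per_attempt
print(round(expected_cost, 3))  # 1.111
```

Nothing beyond an expected-value formula, but it turns a gut-feel engineering decision into a number you can compare against alternatives.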
(5) Read The Lean Startup. And expand out. Be careful since there’s a lot of garbage in the business genre. Others I can recommend: The Phoenix Project, Lean Analytics, the first part of The Startup Owner’s Manual (the latter two parts only if you ever get past the first stage of building a company). Even if you never choose to work on a startup, it’s the same kind of thinking that will enable you to generate outsized value in any organization. Good decision-making offers at least an order of magnitude better value per unit time than writing code. You will get in the door by writing code. You will get up the ladder by making good decisions.
When you read books, get paper copies and write in them: underline, take notes in the margins, drop in some Post-Its to mark really good sections, etc. If a book really resonates with you, go further and write up notes on it afterward. Even just underlining a book is ridiculously useful: underlining alone lets future you skim what you understood to be the most important parts of a 300-page book in roughly 10 minutes.
All of the above may seem like a lot. And honestly, it would all fit in easily if I could swap out the less useful required parts of my CS degree. But that won't be viable until universities offer that option and companies stop thinking a complete CS degree is the qualification they should be targeting. Until that happens, the onus is on you to not let your "schooling interfere with [your] education."
>For the sake of registering a counterpoint: I disagree entirely that things like kernel development or assembly (and let’s throw in architecture, computability theory, and all that jazz for good measure) are even remotely useful in software engineering. You’ll forget most of it and personally I don’t think it will even meaningfully alter your performance over the long term.
I said I want to take them because they're fun to me personally, not because they're useful. I think that's the same reason many people go deep into a field: they find it fun to work on problems in that field. Sure, we're not hot shit, and it might not teach us many useful skills. But somehow, satisfaction in my field of study and work goes a long way for me. It keeps me up late at night working on things that matter, instead of smoking weed every night, wondering about my life choices, and thinking about dropping out. I've been there, done that; I know how it feels to be in a noble place but dead inside. I'd choose to be creative and inspired to work any day of the year.
>Some of us software engineers get to thinking we’re hot shit. We're not. For one simple reason: what we do is almost always deterministic. Someone has done it before and written it down so that you can do it too. At worst, you have to tweak something a bit to make it work for your situation.
>In the real world, nondeterminism drives novel value. In other words, everything wrapped around the lines of code you write is what's important. That means you're going to be hard-pressed to make a dent in anything if all you can do is write code.
I totally agree. Personally, I think of myself as closer to a creative person than a procedural person. I think creativity is very important, perhaps as much as competency. That's why inspiration is great, and that's why studying something I find fun matters.