How I learned to program (danluu.com)
322 points by deafcalculus on Dec 26, 2017 | 100 comments



His 'ineffective fumbling' and 'high school' seem to mirror my experiences. Using a bunch of mismatched books and computers with no real goals, no mentor, and no real results.

However, it's amazing when you read about <insert famous programmer>. They grew up with Apple IIs and Commodore 64s, but they mastered them. Or at least that's the narrative from the authors of the articles that talk about them.

They didn't just write silly BASIC programs like the rest of us; they started using assembler and learned the machine like the back of their hand. They learned how to optimize the living crap out of their code. Maybe their schools just had better books in their libraries, or they subscribed to the good magazines of the day. Or they were just that much better at programming, smarter, and more focused: one of the above, or all three.


The timeline seems to match my experience, but the fumbling does not; I started reading books and magazines before I had access to computers in the early 80s. Those were mostly BASIC but some asm (especially for the C64 and Spectrum, where the BASIC code had fair amounts of DATA which was asm). When I had access to computers after that, I wrote monsters of BASIC programs. I remember writing a music composition program for the MSX which, after I got it all typed in on a PC (I only had part-time access to an MSX and partly a PC), turned out to be too large for the MSX memory. It was too slow too, so I started replacing parts with Z80 asm typed in hex. I still remember most opcodes in hex: it was much faster and more efficient to type than assembler.

I guess I was lucky to have access to the books: I went to a (monthly?) Philips 'old crap' sale. If I remember correctly, it was at the bottom of the PSV stadium, and they had old electronics and mags and books for sale for cheap. It was so lovely; we took a car full of old circuit boards to pull components off, books, printed papers, etc. home and spent the next month working on them (this was usually during summer vacation, as it was close to my grandparents).


Sounds like heaven


I grew up with a Commodore 64 and then an Amiga 500. What I remember is that I had few books and a lot of time to study them. Moreover, these were much simpler computers than what we have today. Kids could understand a lot of the hardware and the assembler by themselves. On the other hand, it was harder to learn about algorithms and data structures. I didn't have access to adequate instruction before college.


Instead of switching to Amiga, I decided to defend my C=64 against the Amiga kids on the block by being amazing at it. That gave me an advantage in my early programming career because it taught me the discipline of persistence. I knew that bugs were solvable and existing code wasn't daunting.


I didn't really get past the 'ineffective fumbling', alas. Got good at BASIC programming on the ZX Spectrum, and read that (I think) Matthew Smith learned Assembly by reading 'Programming the Z80' by Zaks. So I got that for Christmas one year, and got totally drowned by it; had three concerted attempts to understand it, and while I had some idea, it wasn't enough to get going. Decided I was an idiot, and pretty much gave up.

Here I am, 30+ years later, spending a couple of years trying to learn programming properly after having been made redundant from 2 jobs... I don't think I'm an idiot; I think it's a combination of others being smarter than me and more attuned to the task at hand, but there's also an element of luck to it. I think if some of the cards fall slightly differently and the things you try happen to work (mostly skill, but with an element of luck), you can get drawn into it, gain some resilience, and make progress. If you stumble too much before you make real progress, you can think you're stupid and give up.

I have a similar parallel experience with playing the guitar (which is how I've mostly earned my living in the intervening years): an element of picking technique which I didn't manage to work out in the 80s when I was learning, and again put down to my own inability. Now, with better resources and research in the current era, I found out what the 'missing part' of the jigsaw was, and realise that if I had found it back then (I was close, but no cigar) things could have been drastically different. Fortunately it happened at a playing level which was already beyond 'give up, you're useless'!


maybe it's vision more than anything.

you've got a thing you want to make, but you get stuck along the way. it's too slow, or doesn't fit in memory, so you have to figure out how to optimize what you've got before you can keep building.


I think it depends on getting access to enough mentor guidance and/or stumbling over the right initial materials and tools (simple enough compiler/interpreter, some books/magazines, ...). In the WWW era this is in some ways getting easier than it used to be.

Kids don’t necessarily know where to look, and it’s easy to get very stuck or discouraged if the learning curve is too steep to get something simple working within a reasonable amount of time.


I learned electronics before computers (my father was an electronics engineer at the time) and I was more or less taught Turing completeness (intuitively) at the hardware level. So the idea that anything is possible and there is no such thing as magic when it comes to computers (modern ones included) made me keep trying, because I knew it had to be possible at the gate level, so it was just a matter of trying. When I still had an MSX, my friends were getting Amigas with much better capabilities. I knew that, basically, my computer could do the same: by that time I was well aware of math and algorithms, so I was able to code up 'real 3D', but it was so slow that my poor system had to stand there all night to calculate a rotating cube and put it into SCREEN 5 buffers. Now that is all silly stuff; however, it did make me remember throughout my whole career that there is no magic. That is something which some people who only learn the high level do not seem to have. They actually seem to believe that the provided abstraction (by libraries or language) is the limit of working the machine.


I learned programming on Trash 80s in an underfunded Mississippi High School (well, I learned some BASIC earlier). I never got interested enough to learn assembler until college, but it was enough to set me on my way.


I’ve wondered about this. I was keenly interested in programming as a kid, but I was never into hardware or operating systems. I wanted to write text based d&d style games, as well as simulate outcomes from dice games like risk (or know the odds of one character defeating a monster).

Those are real programs, but looking back, I actually do think this might contain a seed of why I never did become as strong a programmer as my peers from those days. I floated away from programming, got into writing and music, floated back in. Some of my peers from those days couldn't take their eyes off computers; it was a deep fascination.

There are lots of good ways to be a programmer. But the real experts do seem to have a deeper fascination with the inner core: the hardware, machine code, the actual design of programming languages (debates about Ruby vs Python or similar almost always go beyond my depth; I just like how Ruby looks on the page. I just like how quickly I can get my answer in Python).


Eh. I wrote a quake-like engine at 17. But that cuts out the 4 years of fumbling that it took to get to that point.

There's probably a lot of fumbling common to all computer users in their early days. But what separates people is how far they can go in a short time.


> We were exposed to some kind of lego-related programming, uhhh, thing in school, but none of us had any idea how to do anything beyond what was in the instructions. While it was fun, it was no more educational than a video game and had a similar impact.

I wonder how many of these "teach your kid STEM" toys have this result. I learned to program when my mom got me a vtech precomputer 1000 for Christmas when I was a kid. But becoming a "developer" took many years of effort. Hopefully more kids will be set on the path like I was.


I've always felt that the "Scratch" programming that kids are doing these days may actually turn them off programming. While it's easier to get the kids started, I think the kids quickly start to feel just how mundane and boring programming is when they are just dragging and dropping pre-set elements on their screen all day. The ones that are meant to program would flourish under a computer with just a black and white Terminal app at their disposal. But give them Scratch and I think they will be thinking of other careers.


> The ones that are meant to program would flourish under a computer with just a black and white Terminal app at their disposal.

I don't know about "meant to". I avoided writing anything but plain HTML and CSS for years because I thought I wasn't a programming type person (wasn't smart in the right way, didn't like math, hit brick walls with BASIC as a teenager, etc.). The terminal did nothing for me and still isn't my favorite place to be, but I do really like writing little web apps that solve problems and work. Programming, like most things, is available to anybody who wants to do the work to understand it and can put in the hours to get competent.

I see the risk in Scratch turning some kids off and would never suggest just providing anybody with one option for how to learn a skill. But I don't know about this "meant to program" idea. I'm more in favor of fostering the curiosity that can fuel real learning by any means that work.


HTML/CSS is the right kind of programming to introduce to kids. It teaches a lot of the cognitive skills that you need as a programmer (attention to detail, what you see is _not_ what you get, ability to abstract a problem); that combo teaches a lot of programming concepts (encapsulation, separation of concerns); and it is, last but not least, REAL and eminently useful. A lot of that lego-scratch-type programming is completely useless once you take it out of that sandbox. HTML/CSS will be there at the heat death of the universe.

HTML/CSS does not teach things like algorithms and data-structures, but it doesn't need to. It's complex enough that kids with an affinity for programming will find it stimulating, but not so complex that dumbo Timmy over there can't do at least something productive with it. And it straddles that line between nerdy-programming and design, so you can engage the creative kids as well.

Family member's been teaching HTML/CSS to 14-15 year olds, with good results. Like all class topics, the engagement distribution is a bell curve: 10% are lost like a puppy at sea, 80% are following along with varying degrees of success, and 10% are absolutely doing amazing things and taking it way beyond the class contents.

One difference with the other type of kids' programming isn't that some robot scoots around the gym, but that at the end these kids are making goofy '90s websites, with bonkers color schemes and bouncing images. Parents and administration love the former. The latter only really clicks for parents with some familiarity with the domain.


I think this is great advice for teaching, but I'd stress that JavaScript should be included as well.

HTML and CSS are great for learning because every change you make adjusts the webpage instantly. This is the web's biggest teaching advantage: insanely fast iteration time. To folks learning, this speed feels like a superpower. (And when they try iOS/Android programming, they will miss it.)

Where JavaScript comes in is that it helps tie that fast iteration to a place where variables and functions can exist. Functions are especially hard to grasp, so starting on those pretty early matters a lot. Simply put: if you don't understand functions, then all the HTML/CSS in the world won't save you!


> HTML/CSS is the right kind of programming to introduce to kids. It teaches a lot of the cognitive skills that you need as a programmer (attention to detail, what you see is _not_ what you get, ability to abstract a problem);

I strongly disagree. Playing around with markup languages only trains a person to expect a computer to map a particular input to an output. That's far from programming, let alone real-world programming practice. Writing software is much more than getting a computer to output something in a one-shot process. Making these sorts of claims does a disservice to anyone interested in programming, because it paints a rosy picture in which programming is trivial and free from any intellectual challenge, which goes directly down the crapper once the first crash or bug needs to be solved.


The parent comment said HTML/CSS is the right kind of programming to introduce to kids. Your disagreement is based around it not being the final end-all be-all representation of the complete set of features and challenges of programming. You're optimizing for accuracy and completeness, whereas the parent was optimizing for appeal and ease of learning.

As someone who's spent many months teaching people to code from scratch, it seems to me you're ignoring just how tough it can be for people (especially kids) to learn challenging new information. Most will quit when they hit a wall.

In my opinion, baby steps are much wiser when the goal is to teach.


I agree, they're not a full programming language, they lack the aspects you describe. But they are a simple introduction to programming, work on a lot of levels, and can engage a broad range of students.

I would disagree with one thing. Writing HTML/CSS most certainly is an intellectual challenge, especially for beginners, and does engage parts of the programmer's brain.


This fall, I helped teach programming to my son's grade 8 class. They had done Scratch before, and wanted to learn "real" programming, so we taught them a subset of Python. It turned out really well. With basically just for-loops and if-statements they were able to write quite a bit. I liked that the concepts they learned were the actual fundamentals I use every day at work when I program in Python.

I wrote it up here: https://henrikwarne.com/2017/12/17/programming-for-grade-8/
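
For anyone curious, here's a minimal sketch (my own illustration, not taken from the linked write-up) of the kind of program that subset of Python allows:

    # Guess-the-number style exercise using only the basics mentioned above:
    # variables, a for-loop, and if-statements. No imports needed.
    secret = 7
    for guess in range(1, 11):
        if guess == secret:
            print("Found it:", guess)
        elif guess < secret:
            print(guess, "is too low")
        else:
            print(guess, "is too high")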


That is remarkable! This is the type of teacher I wish I had growing up!


Yeah, this "meant to" attitude seems a little bit self-aggrandizing. I feel like that's a very convenient way to say, "programmers only club, no _others_ allowed." My favorite thing has been teaching Scratch in middle school and realizing how non-obvious programming aptitude is. Some real "nerd" types don't have any actual skill, and others who are little socialites or behavior problems take to it with ease. Makes you realize how not special us programmers are.


Sometimes you just need one joke from your elders. Here is my story:

https://www.quora.com/Are-there-any-lies-that-your-parents-t...


I still think QBasic is one of the better ways to get people interested in programming. It starts off very simple, but you can really go a long way with it. Plus it's got some pretty powerful, yet easy to access, graphics and music capabilities built right into it. In my first programming class in high school, we went from Hello World to creating an animated Christmas card, with MIDI Christmas carols accompanying it, in the first semester.

QBasic might not have been the greatest language, but it had a pretty sweet IDE, fantastic live documentation, and was really just fun to play with. I feel like there ought to be a packaged-up Python distribution with the same kind of attributes, but I've never seen anything that really aced that spot the same way.


Same! I had a logic gates robot game for my Apple II, Lego Mindstorms, and FIRST robotics through high school. Which really helped, because I didn't come close to the grades needed to get into CS or engineering.

I didn't Hello World until my 20s, and it took years to really grok the programming world, but I think my lifelong exposure to logic and programming of sorts made all those concepts accessible outside of a classroom environment. I.e., I could understand what a Python snippet was doing because I knew what a loop and a conditional were.


This. A logic class could do the same thing if you could get a good, modern logic class. Internalizing for loops, if-else, while, boolean, etc is a universal benefit that would help anyone break into programming.

Sadly, I've taken some really bad logic classes. Usually this is because they use terrible toy languages that are artificially difficult to work with. They should just do those classes in Javascript or python or something, after they cover the old Aristotelian logic bit. (Socrates is a man, all men are mortal...)

You could rig up a test suite to brute force truth tables and verify when you create correct logical statements. It would be a blast.

Instead you spend all year having to derive known logical theorems from scratch. Because reasons?

Example: !(x and y) == (!x or !y). I had a class where, in the toy language, it took a whole hour to derive that theorem. Instead, how about we have to draw out a truth table or actually use that statement? You can do that in half the time and get more from it. I've had to use that in programming all the time. You can flip if !(!x.isEmpty and !y.isEmpty) into if (y.isEmpty or x.isEmpty)... if I typed that right. Comes up all the time refactoring or simplifying logic.
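
As a concrete illustration (my own sketch, not from any class mentioned here), brute-forcing the truth table to confirm De Morgan's law takes only a few lines of Python:

    from itertools import product

    # Check !(x and y) == (!x or !y) for every combination of truth values.
    for x, y in product([False, True], repeat=2):
        left = not (x and y)
        right = (not x) or (not y)
        print(x, y, left, right)
        assert left == right  # holds for all inputs, i.e. De Morgan's law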


A logic class teaches logic; worrying about whether people get interested in programming because of it is at best irrelevant and at worst detrimental to the subject (such as allowing the students to brute-force it all, missing the point completely). That's like expecting a calculus class to foster programming interest because they use MATLAB.

Maybe I'm missing something, but what do for-loops have to do with logic in a way that they might be part of the class?


As Leibniz put it, it's a goal of logic to be able to say "well, let's calculate" when you have a dispute. Truth tables are about the simplest way to calculate logic, but they don't scale. I think a good intro course could be made starting there and working up to fancier methods, explaining logic and computing together. My article https://codewords.recurse.com/issues/four/the-language-of-ch... aimed in that general direction, though it's for people who can already program, because I wasn't ready to tackle anything more ambitious.


My problem isn't with the idea of using truth tables to solve problems or using languages to assist in the class, but with the bizarre expectation that logic classes that make you actually use the subject matter and prove stuff in a (supposedly) scalable way are somehow not modern (whatever that means), and that the "right" way to teach such logic classes is teaching the basics of control flow.

You can go in the direction of explaining logic and computing together, and from what I skimmed of your article it does seem to be a cool way to tackle the problem, but that's not what someone who takes a logic class should expect from it; they should expect it from a "topics of logic in computing" or similar class.


My two cents is that teaching logic, without bogging down in mathematics, is like teaching how to practically use chisels, saws, and sandpaper. It gives you the tools for woodworking, but it alone does not instill a love for woodworking.

You absolutely need something more. Maybe the teaching of logic as a means to solving a larger problem, whose result is beautiful, exciting, engaging. This is why I love robotics for children. How do we make a hand-built toy robot do the things we want it to do using logic?


If you are teaching youngsters, you probably don't want anything more. Show them how to use the tools, and present the higher order concepts as how others have used them. But for young adults and adults, yes we need a map of the bigger picture.


See http://www.cs.toronto.edu/~hehner/aPToP/ for a textbook treating programming and proving in an integrated way with a unified formalism. I think generally how subjects get divided up into courses is too fossilized, especially pre-college, so I might have too little respect for the conventional meaning of a "logic class".


Sounds like Prolog or miniKanren would suit this: built-in search is great as long as you don't accidentally make it exponential time complexity.


Eh, I get that it is limited in an "Okay, but how do I step out of this toy environment and actually do stuff" kind of way, but I think it teaches people how to use their logical thinking without overloading them with information. I think that almost gets taken for granted, but I did an escape room recently and it really showed me just how bad most people are at generally putting two and two together and solving problems.


> vtech precomputer 1000

I had no idea that was even a thing, even though I'm definitely of your generation, though probably not in the same general area, situation, etc.

You should have had an Amiga, C64, or C128 or... I'm happy that you made it, though!


I misread "lego" as "logo" when reading the post, because that was my experience. Before having a computer at home, I took an extracurricular BASIC class which in hindsight I learned almost nothing from, but it whetted my appetite. The intermediate class had us draw stuff with Logo, which I felt was stupid, and it turned me off of programming for a while. Probably my fault; now Logo looks like an interesting language.


I just finished creating a LOGO class, and I feel it's best for elementary-school-aged children. If you took LOGO over the age of, say, 12, it probably seemed childish and useless. But for younger kids, I think they can get interested in computing with LOGO.

You can do some cool stuff with LOGO like create state machines, implement sorting algorithms, and create simple games.


Logo is a great tool to use to learn visually about basic shapes and their properties (circles, polygons, conic sections, spirals, various sorts of roulettes, ...), logic, graphs of data and functions, fractals, cellular automata, synthetic geometry including projective geometry, transformation geometry and symmetry, basic number theory, complex arithmetic, polar coordinates, trigonometry, the logarithm/exponential functions, matrix algebra, linear regression and other statistical tools (Kalman filters!), optimization, basic probability, dynamical systems, lattices, group theory, graph theory, vector calculus, mechanics, optics, electric circuits, special relativity, 3d vector graphics, searching and sorting, trees, mazes, robot planning, game AI, ....

Or really anything you want, though after several years of experience students might find other tools more convenient for some types of modeling.


>If you took LOGO over the age of say 12, it probably seemed childish and useless.

Yes. However, at a fundamental level, LOGO is actually an advanced programming language.


My first programming exposure was with Logo, and it caught on... Still programming with fervor now at university.


Looking back on my life (I was born a year after Tavish Armstrong), I just feel so disappointed in how it turned out. I had similar dreams and hopes, but they didn't work out in the end.

If only I had told my younger self to actually go through the Macromedia Flash book I borrowed once in high school, I would have had my first glimpse of computing. But during my early days, I never had anyone around who was remotely interested in computers. I wasn't great at socializing either, which otherwise might have led me to people who were into machines. There was an AP CS course at my university, but I never took it, only because I thought it was too hard, and took AP Calc instead.

Without any concept of computing or the power of programming, how can a kid get into a life of software development? In all the random success stories I've read about popular programmers these days, their younger days began with someone giving them a gift or some 'assembly' language computer, and they started from there.

I wish the pursuit of programming had caught me much earlier in life than right now; I went into the wrong major and always thought about programming, programming. Having friends and family who aren't into computers didn't help either, at least until last year. So much life was wasted.


I used to spend 14 hours sitting at my computer installing weird operating systems and learning to code and hack games when I was 10. I still do basically the same thing at 30, but it's cloud systems instead of games, and I will probably do this the rest of my life. If this isn't something that you are naturally driven to, why force yourself into it? Find what you love and put your best effort into it. If what you love is just hanging out, then be happy with that.


Don't get too caught up with success stories about "epic programmers"; there's more than one path to what you want. Stay focused and you'll get closer every day.


It's never too late! And the cool thing about programming is that you can pursue it at a high level as a hobby. Nobody can prevent you from developing cool programs in your free time.


Programming is a massively useful skill in most non-programming professions too (assuming you do office work). You can automate so many tedious tasks, analyse data so much better when you can code something. Even if you don't become a full time programmer, you can easily end up spending 10-20% of your time adding value in your job by coding on the side.


How old are you? If you are 25 or younger, it's not too late. I started self-studying CS after many aimless years. Then I got my first junior position at 48k, and after 3 years of raises and job changes I'm at 100k+.


I'm 27. That's too late?


Not at all. But you need to specialise and master a chosen sub-field. Like, I consider myself a programming languages guy because I love type theory. Others love exploring functional programming, OOP, Rust, or whatever. Whatever you like, just get into it and explore, and try to build and ship cool stuff if you can.

Everyone has periods of low productivity, you just need to roll with the punches and keep learning and building whenever you can.


I got my first real programming job at 38. It's never too late if it's what you want to do.

In my case, I learned how to program when I was much younger but it was never actually my job - or at least, not the primary component of my job. I kind of backed into it, writing more complicated scripts which helped automate what I was getting paid to do, until I eventually just got a "real" programming job.

The big thing I think I have in common with the OP: I always try to automate every task possible. This can allow you to add value for your organization above and beyond whatever primary task they hired you to complete, and any company worth its salt will recognize that contribution and try to find ways to help you contribute more elsewhere too.


I got my first real programming job at age 40, and had only seriously started programming a few years before that.


No, I don't think so. I was just applying the number from my own experience when I thought it was too late. There are many paths to success!


Too late for what? If you enjoy programming, just keep programming.


Why does (your) salary matter in this discussion?


I guess I (mis?)interpreted the above poster's reasoning for not pursuing programming "this late" as being the money aspect, because otherwise it makes no sense to me. Most anyone at any time could learn to program if they don't care whether it will make them money. I.e., I could only imagine someone not pursuing it because they thought it was too late for it to be a career for them.


  "but that’s so obvious that it’s not even worth stating"
Intelligent people I know say this or something similar. What is obvious to a smart person about something they are skilled at is not obvious to the rest of us!


Thanks for mentioning this. I don't know whether or not this is related to intelligence, but it does seem to be a fairly common pattern where someone will underestimate the value of writing something down. I did notice this a couple of times in danluu's article. A couple of comments:

1) There's a lot of value in writing down something about one's experiences, even if the result is "obvious" to anyone else who has gone through the experience. There are likely a lot of people who HAVEN'T gone through that experience who would benefit from reading about it.

2) I worked on a project, mostly independently, for a year or so and produced something useful. To me, it seemed fairly pedestrian: some basic data structures, some ordinary algorithms, nothing particularly novel. A mentor suggested I write a conference proposal about it. I resisted, using the "it's obvious" argument. He came back and said "You've worked on this for a year, and no one else in the industry has worked on it. By definition you're the expert. It's worth a conference talk." Indeed it was, and it launched a significant portion of my career as a conference presenter.

I think this is largely about the difficulty of thinking about what others think when one is too close to a topic. After working on something intensively for a time, one develops a familiarity with and an intuition about that area. When working with that understanding, things may seem "obvious" that are totally non-obvious to someone who hasn't worked in the area. This is less about absolute intelligence per se than it is about expertise, and so it's a variation of Kay's aphorism "A change in perspective is worth 80 IQ points."


God, I absolutely hated this kind of "this should be obvious" statement made by professors in lectures.

It absolutely poisons the learning atmosphere. And after 20 minutes of lecturing on a difficult topic, they ask if anyone has any questions and wonder why no one raises their hand or says anything.


FWIW, at least in mathematics "obvious" doesn't mean "you're an idiot if you don't instantly understand this." It means, "this isn't deep. If you're thinking about really deep, complicated things beyond what we've been talking about, you're missing the point."

I have on multiple occasions sat down and worked with another mathematician on a problem for an hour+, only for us to find the solution and both conclude that the answer was "obvious."


As a professor in a non-math subject, I agree with this and think it generalizes. I often say "this is obvious" as a way to communicate to students "you're overthinking this like woah."


Heh. The worst I heard of was a professor (in a math class, IIRC) saying "this is obvious". A student questioned whether it was obvious. The professor looked at the board, left the room for 45 minutes (!), returned and said, "Yes, it's obvious."

Dude, if you had to think about it for 45 minutes, it's not obvious.


>Maybe the moral of the story should be “leave bad roles more quickly and stay in good roles longer”, but that’s so obvious that it’s not even worth stating

Is that not inherently obvious? Good things are... good, and bad things are not?


Well, it wasn't obvious to me in the early stages of my career. But maybe the problem was that I hadn't seen enough situations to have a decent grasp on what was "good" and "bad".


Agreed. Not a perfect comparison, but this type of thinking reminds me of the history of Fermat's Little Theorem. "I would show you, but I don't want to ramble on too long" and of course it was almost 100 years before a solution was published.

Assuming something is obvious is dangerous.


Related to the topic: it took me almost 3 years to understand what a variable is and does.

Looking back I can't understand why it took me so long. It's so obvious.

Sometimes it's really hard to understand why some things are not understood.


Maybe they're smart, but not intelligent


> hear a lot of stories like that, and I’m happy to listen because I like stories, but I don’t know that there’s anything actionable here. Avoid managers who prefer doling out punishments to helping their employees? Obvious but not actionable.

Actually, the advice to avoid managers who are not helping employees is eminently actionable. It's just that the actions are either tedious or radical.

Tedious steps to take, before you are in the situation: find other current and former team members on LinkedIn and do backchannel references on the position.

Radical steps to take, after you are in the situation: quit your job and find another one.


Re tedious steps: I had multiple job offers and reached out to people on LinkedIn who used to work on the teams I had offers from, and got some great, honest info that helped me make a decision.


What kind of info did you look for?


I just asked questions like: what's the team like, would you recommend working there, etc.


"I ended up at a little chip company called Centaur. I was hesitant about taking the job because the interview was the easiest interview I had at any company4, which made me wonder if they had a low hiring bar, and therefore relatively weak engineers. It turns out that, on average, that’s the best group of people I’ve ever worked with. I didn’t realize it at the time, but this would later teach me that companies that claim to have brilliant engineers because they have super hard interviews are full of it, and that the interview difficulty one-upmanship a lot of companies promote is more of a prestige play than anything else."


As an aside, I think Dan Luu's writings are some of the best on the interweb: articulate, methodical, concise.


Yep! I also like the fact that it's just plaintext with no interspersed meme/pictures. Just straight information.


1990s internet style of blog posts or give me death


I really enjoy his writing too, but I wish it was concise. The article was more of a career summary, not what the title suggests.


on the contrary, I think he is a braggart who has a tendency to show off his alpha nerd credentials.


We had dinner once at his suggestion. He wanted to pick my brains about where I work.

He speaks a lot faster than I do, which is not something I'm used to. We exchanged Flandersian fusillades of words (including some about the relative speed of talking in different places, languages and scenarios).

He's legit. We'd struggle as a programming pair, though. Both too keen to hold the lock on the talkbit.


I worked with Dan for several years. He's one of the nicer people I've met in the industry and off-the-hook smart.


> In conclusion, life is messy and I don’t have any advice.

That _is_ the advice :P Seriously, most advice is far too subjective to be likely applicable to yourself (because life is so messy).

With one exception: "leave bad roles more quickly and stay in good roles longer". As the author mentions, it seems far too obvious, but it's not... because "bad" is actually subjective, so "not working well enough" is perhaps more accurate.

Some "successful people" have never had a "not working well enough" part of their career because they are crazy lucky, and others have simply been perceptive enough to recognise it and move on. Some label this as "luck", but luck implies some objective "good"; it's about chance, the chance of landing in a well-fitting scenario.

Rolling the dice can be painful, and knowing when it's good is not always obvious. Perhaps knowing to look out for this, even though you don't know what it will look like for you, is the only general advice?


Anybody know how to tackle that third meta skill, "solving hard problems"?


Some tools that can help:

* break the hard problem into smaller problems (e.g. "make a car that drives itself" would really benefit from considering smaller components)

* reframe the problem. Sometimes changing the perspective on a problem lets you leverage tools or techniques from related fields. For instance, there isn't much good literature on techniques for understanding obfuscated programs; much of it gets bogged down in low-level details and has a very short shelf life. But there is one field that has a lot of good literature on understanding and transforming programs: compilers. Realizing that was transformative in my own research.

* go bottom-up: are there special cases that are easier to solve than the whole? Can you generalize from there?

* go top-down: pretend that you could use some off-the-shelf components that would automagically solve parts of the problem. See if you can compose them into something that looks like a solution, then see how to implement them.


Great response! Reframing the problem is an especially important and well-known technique in applied mathematics and physics. And the others can arguably be generalized into "solve the easiest part first". If the easiest part is at the bottom, start at the bottom; if it's at the top, start at the top. Keep solving the easiest remaining part until there's nothing left to solve. This inherently requires breaking the problem into achievable subproblems.


Come up with the worst, most ridiculous and outrageous approaches possible for solving the problem and turn those "bad" ideas into "good" ones.

Beyond the straightforward approaches of chunking the problem to sub-problems or what have you, which you will apply anyway, sometimes a little brainstorming can surprisingly help you with that "spark."


If you haven't read it, this is a short book that explains a methodology that is applicable to a range of problems outside of the author's expertise in mathematics. https://en.wikipedia.org/wiki/How_to_Solve_It


I walk away for a while. This helps me in 2 ways:

1) it helps answer if the problem is worth solving in the bigger picture

2) it allows me to approach the problem differently (a new approach, compared to banging my head against walls).


First, keep solving problems. Mathematical brainteasers got me started on problem solving when I was in grade school. I really like Martin Gardner's books [1] and those by Raymond Smullyan [2]. These kinds of problems develop the flexibility of thought that helps find creative solutions.

Actual, specific approaches to tackling tough problems are taught by the famous Hungarian mathematician George Polya in his classic book How to Solve It [3].

Discrete Mathematics is a field that covers a number of areas, but especially in counting problems (from combinatorics) and graph theory there are a number of results that are not hard to grasp but lead to beautiful solutions for real problems. There are many powerful theorems and principles in discrete math that will unlock seemingly impossible problems. Unfortunately, the books for this subject are mostly written for math majors who are interested not only in the application of these results but in how to prove them, and consequently the books may not appeal to everyone. Perhaps something like Schaum's Outline of Discrete Mathematics [4] would suitably cover how to apply some of the important theorems without bogging down in the proofs.

Finally, I think there is value in doing small interesting programming projects. Two older books, available used, with interesting projects are Etudes for Programmers by Wetherell [5] and Software Tools in Pascal by Kernighan and Plauger [6]. Etudes has my favorite exercise for trying out new programming languages: building an interpreter for the simple TRAC programming language. Software Tools has a number of programs in Pascal that do interesting things; try implementing them in your programming language. They cover a range of difficulties, and the book has a nice discussion for each that explains why the programs are structured the way they are.

A more advanced book, Structure and Interpretation of Computer Programs, available as a downloadable PDF [7], is a classic for those wanting to become better programmers.

[1] https://www.amazon.com/Martin-Gardner/e/B000AP8X8G/ref=sr_tc...

[2] https://www.amazon.com/Raymond-M.-Smullyan/e/B000AQ1NF0/ref=...

[3] https://www.amazon.com/How-Solve-Mathematical-Princeton-Scie...

[4] https://www.amazon.com/Schaums-Outline-Discrete-Mathematics-...

[5] https://www.amazon.com/Etudes-Programmers-Charles-Wetherell/...

[6] https://www.amazon.com/Software-Tools-Pascal-Brian-Kernighan...

[7] http://web.mit.edu/alexmv/6.037/sicp.pdf


Thank you for this list, greatly appreciated.


My recollection is that Dan was working at Microsoft during "Third real job (2015-2017)." (He has a couple of posts in this timeframe where he refers vaguely to his employer. "Hiring Lemons" is one.)

I would be interested to know what teams Dan worked on while he was at Microsoft. He says that team #1 couldn't build their source code on most days, and that while team #2 was better, they still often had broken builds in origin/master.


I love the section he wrote on Microsoft. He is clearly a master of the backhanded compliment:

> However, I can already tell that this experience broadened my horizons. Two examples I got out of the experience are a better understanding of how sales worked and I have a better idea of the range of processes that exist across different teams.

> To the first point, the company produced some decent products backed by a world class sales team. Watching the sales team work was a revelation. The sales people would regularly go in and create sales even when the product wasn’t the best thing for customers. I’d always known that sales was at least as important as engineering, but seeing this in action was different from knowing in the abstract.

> To the second point, I once wrote a blog post on build uptime where I looked at uptime data that seemed surprisingly bad to me and said

> at every place I’ve worked, two 9s of reliability (99% uptime) on the build would be considered bad. That would mean that the build is failing for over three and a half days a year, or seven hours per month. Even three 9s (99.9% uptime) is about forty-five minutes of downtime a month. That’s kinda ok if there isn’t a hard system in place to prevent people from checking in bad code, but it’s quite bad for a place that’s serious about having working builds.

> A number of people responded with comments like “lololol that guy has been pretty lucky in the team’s he’s worked on”. I didn’t think much of those comments until this job. At my first team on this company, we couldn’t build on most days for most of the time I was there! My second team was better, but I would regularly get broken builds when I fetched and merged from origin/master. I had no idea that companies that considered themselves serious about development could have practices like this and it’s good to have this new information.


https://github.com/bitfunnel/bitfunnel

This was his second MSFT team, as I recall - I don't know that he ever publicly identified the first.


Pretty anecdotal, but it seems to strike a chord with a few people in here, like anecdotes tend to do... I like the HTML/CSS approach, and then tack on some JavaScript. No stupid assembler to keep track of. I've taught about 300 kids web design, and around 150 programming (rough estimate), and that's what stuck the best with the kids. Java, not so much. From JavaScript the sky's the limit. Personally I tried BASIC, but I really learned programming by going the PHP route, and onwards to Perl and Java. But the first thing I learned was HTML/CSS. If I could choose again, I'd go JavaScript first, and perhaps Python - another lovely language.


> Personally I tried BASIC...

JavaScript is the new BASIC, in the sense that it is universally available and a great first language to pick up.

It is easy to forget that self-taught programming is motivated by that intrinsic discovery factor, and JS feeds the dopamine loop!


> My class had a couple of kids who already knew how to program and were making good money doing programming competitions on topcoder

> High school (1996 - 2000)

This doesn't seem right. What sorts of competitive programming sites were around that early?


    whois topcoder.com | grep Creation
    Creation Date: 1998-12-30T05:00:00Z
archive.org sample: http://web.archive.org/web/20010401182116/http://www.topcode...

So plausible, but extremely early adopter. More likely would be craigslist, IMO.


https://i.imgur.com/vBhtqtE.png [2]

next step - CSS for dummies, because this is awful[1]

[1] https://i.imgur.com/vBhtqtE.png

[2] https://i.imgur.com/vBhtqtE.png


What is your github account? I am curious to see some of your code.


Ironically, my high school expelled me for truancy. My guidance counselor, in what I think was a ham-fisted attempt to scare me straight, told me that I'd never get into college without As and Bs, so I took her word for it. Why study, or take the SATs, if my existing marks would keep me out of good 4-year colleges anyway? I had two years of terrible grades behind me, I was quickly moving up the ranks in a food service job anyway, and I wasn't planning on going into the military, so what was the point?

During school hours, I'd just sit in the computer lab learning Photoshop 2.0, HTML, and the new cutting edge thing called CSS. I'd have loved to take one of the BASIC classes because I had already worked with it a bit on my TRS-80, but they were only offered to kids who had taken AP Algebra 2. One pleasant spring day I was coming back onto campus after smoking a butt across the street, and the vice principal told me I wasn't allowed back onto school property because I was no longer a student. After a few months of just working, I went back and finished high school in a part-time night school program while working full time. Oddly, I've always thought of this as a somewhat pleasant time in my life despite my negative feelings about high school in general.

A year or two after graduation, around 2000, I took a few classes in a UNIX undergrad certificate course at Northeastern. Not only did I ace the classes, but I ended up teaching my classmates more than the instructor did in the Linux class. (That was way less cool than it sounds, though; he recommended everybody drop the course and ask for a full refund because he was under-qualified and knew it.) Also, the second-level C class teacher (a hardcore Windows guy) emailed me asking questions about developing in a Linux environment for a few months after the class ended.

It was a confidence boost which SHOULD have pushed me towards college, but I still assumed that no 4-year college would have me, and the thought of essentially trudging through two more years of high school in the form of community college was... unpalatable. I stopped taking classes before I finished the certificate because they didn't offer me anything I couldn't learn on my own (even if I did enjoy learning little unexpected things like C shell scripting and advanced awk), and it wasn't going to lead to a real degree. I was much more interested in working in bars and living in shitty apartments with a million roommates with only discount pizza and malt liquor for sustenance. (Wait, maybe I did go to college...)

I was lucky enough to get a $10/hr job as a university library IT assistant where the systems administrator gave me a bit of flexibility with what I worked on, a bit of leeway for solving those problems, all of the tech books I wanted, and all of the scrap hardware I could muster up. I ended up learning how to use Perl, first as a systems language and then for CGI programming, honed my shell scripting skills, built a few slightly more involved internal web applications using PHP, learned how bigger networks worked, good IT practices, and lots of other cool stuff. From there I bounced around to a few higher-level support type jobs and then got my first regular software development job after that. Since then, I've tackled problems with increasing complexity and managed a few projects.

Though the more 'vocational' path I took to coding instilled some good practices and a rock-solid work ethic, I feel like there's a bit of whiteboard swag that people who took a math-first, comp-sci-heavy path to coding have that I lack. As the level of complexity with my work projects increases, I sometimes struggle because I picked up a lot of the math and comp-sci knowledge magpie style without having the solid theoretical foundation to tie it all together. I thought learning a programming language without a real-world problem to solve, and no real deadlines was a slog; doing that with things like discrete math is much worse. (though I do think it's super neat.)

It's pretty funny that the primary difference between me and him is that I believed a guidance counselor who told me it was impossible. Approaching 40, I realize how full of shit she was.


"My guidance counselor […] told me that I'd never get into college without As and Bs

[…]

I went back and finished high school in a part-time night school program while working full time.

[…]

I believed a guidance counselor who told me it was impossible. Approaching 40, I realize how full of shit she was."

So, what were your grades at that night school program?


Pass/Fail


You may think this website is old-fashioned, but it uses FLEXBOX when it should be using table layouts.



