Hacker News
Alan Kay's answer to ‘what are some forgotten books programmers should read?’ (quora.com)
1181 points by enkiv2 on Aug 9, 2019 | 260 comments



These reading lists are lovely and promise a Better You, but I'll propose the following challenge:

* If you're at home, turn around and look at the bookshelf you've already accumulated. Did you really read all those books you so looked forward to when you first bought them? Or do you remember all the best bits from your favorite ones? Be honest now... the unbroken spine on your Godel Escher Bach suggests otherwise. Read what you have, before stressing on Kay's or others' lists!

* If you're at work, have you read all the wiki/docs/etc created by your team and neighboring teams? Do you understand the full architecture and implementation of the system you work with every day? Go read that, level up and become the most knowledgeable person on your team.


Sadly, 99% of the code that's written nowadays could be thrown away outright. I propose another challenge: learn the stuff that's actually meaningful. E.g. how to code in lisp. Really learn it. You will never code lisp in your job. Promised. But in almost every task you will find that you reuse what you learned.

Or learn the architecture of git. How many people write bullshit around git because they don't understand that git can already do 100x of what their shell script or UI tool or plugin can do....

Don't waste your time with what is hip and instead learn how things really work. Real things. Don't learn the latest javascript framework, but learn the difference between class based inheritance and prototypal inheritance. Don't learn the hippest UI IDE, learn vim or emacs. Really learn it, e.g. how to solve almost all problems in vim without any plugins.
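
To make the inheritance point concrete, here is a rough sketch (TypeScript, names made up purely for illustration): in the class-based style an object's behaviour is fixed by its class when it is constructed, while in the prototype-based style an object simply delegates to another live object at runtime.

    // Class-based: behaviour comes from the class, fixed at construction time.
    class Animal {
      constructor(public name: string) {}
      speak(): string { return `${this.name} makes a sound`; }
    }
    class Dog extends Animal {
      speak(): string { return `${this.name} barks`; }
    }
    console.log(new Dog("Rex").speak()); // "Rex barks"

    // Prototype-based: an object delegates to another object at runtime.
    const animal = {
      name: "generic",
      speak(): string { return `${this.name} makes a sound`; },
    };
    const rex = Object.create(animal); // rex's prototype IS the animal object
    rex.name = "Rex";
    console.log(rex.speak()); // "Rex makes a sound", found via the prototype chain

    // The prototype is live: change it and every delegating object changes with it.
    animal.speak = function (this: { name: string }) { return `${this.name} barks`; };
    console.log(rex.speak()); // "Rex barks"

Grasp that difference once and the object model of whatever framework is hip this year becomes much less mysterious.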

And one last point about "read all the content your team created": not sure how big your teams are, but in many companies that's impossible. If you spent 12 hours a day reading, 7 days a week, you couldn't get through all the content created in that week. But you know that the cool_library that replaces the AwesomeLibrary is just as bad at solving the problem, and neither actually even knows what the problem is.


>Don't waste your time with what is hip

Wouldn't that apply to git? You can probably get 90-95% of the benefit of git using mercurial with a fraction of the effort in learning. Life is too short to spend learning the internals of one version control system.

This idea of "Don't waste time on X focus on more valuable Y" has no real end. Don't waste your non-work time on learning computing related stuff and instead spend time on social connections and physical fitness first. Pays off way more than emacs, git, vim, lisp, or programming paradigms. See how easy it is to make such pronouncements?


> You can probably get 90-95% of the benefit of git using mercurial with a fraction of the effort in learning.

You absolutely cannot.

Even if we pretend that Mercurial is simply outright better than Git, there's a lot of value in learning Git specifically, as it's what most development teams use. Employers value proficiency in Git. If you mention your Mercurial proficiency in an interview, it's likely to be scored up as cute but irrelevant.

If you always work alone, sure, Mercurial might work great for you, but the real value of these version-control systems is in enabling teams to work effectively. Most teams these days use Git, so you need to know Git.

(Of course, if your team does use Mercurial, you'd better become proficient in Mercurial.)

> Life is too short to spend learning the internals of one version control system.

I agree that learning the intimate internals of Git's codebase isn't something that's likely to pay off in the day job, but short of that, Git's 'useful skill-ceiling' is pretty high.

> Don't waste your non-work time on learning computing related stuff and instead spend time on social connections and physical fitness first.

Depends on your working situation. If your work offers no opportunity to learn new skills, and you don't want your CV to get stale, you have little choice.


>there's a lot of value in learning Git specifically, as it's what most development teams use. Employers value proficiency in Git. If you mention your Mercurial proficiency in an interview, it's likely to be scored up as cute but irrelevant.

Given that the person I was responding to was arguing against spending time on things just because employers look for them, and was advocating things almost no workplace uses, I find your comment strange. Perhaps you should be responding to the comment I'm responding to?

>Depends on your working situation. If your work offers no opportunity to learn new skills, and you don't want your CV to get stale, you have little choice.

You do have a choice, and you made your choice. In the US, in this era, SW professionals are near the top when it comes to freedom to change jobs and change locations. When I know several people who make half of what I do, are relatively unskilled, and have to work much longer hours than I do, who made a very clear choice in favor of physical fitness and social relations, I am not going to claim I don't have a choice.

Having said that, this is all orthogonal to my point, which was how easy it is to make (reasonable) lists of things one should focus on that are almost mutually exclusive.


After programming for 50 years and having a successful career, I’m not really going to need serious git skills. I still write code, all the time, but it’s for fun and personal projects.

Yes, Mercurial is easier to use than git, and really Fossil is even better for me, someone who works solo. However, git is what I use. I know that if I am going to help someone with source code control (like my daughter, who is studying CS), git is what they will need help with. Personal use of git is, to me, the only way to ensure that I can give pertinent help.


> Wouldn't that apply to git? You can probably get 90-95% of the benefit of git using mercurial with a fraction of the effort in learning. Life is too short to spend learning the internals of one version control system.

I haven't used mercurial much, but it seems like you would have to understand the same amount about the core data structures to do something in mercurial as with git [0]. The really simple activities you do day to day don't require any deep knowledge. And when you want to do more complicated things you have to know what you are doing.

It's similar to writing simple C++ programs vs complex ones. You may not need to understand smart pointers to write small programs, but understanding smart pointers, or at least the concepts behind them, is critical to writing large programs.

[0]: There are minor differences between the two that probably make git harder to pick up, among them the staging area and a rather obtuse UI. However, the market effects of git are hard to ignore.


I'd disagree a bit here: mercurial does a much better job of abstracting over the underlying data structures.

Git is a DAG navigation tool that can diff text, and the API shows it.

Mercurial is a version control system that happens to abstract over a DAG. The API, similarly, shows it.
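
To make the first claim concrete, here is a toy sketch of the shape of Git's object model (illustration only, not Git's actual code or data structures): a commit is little more than a pointer to a tree snapshot plus pointers to parent commits, and a command like log is essentially a walk over that graph.

    // Toy model of the commit DAG Git exposes. Illustration only, not real Git internals.
    interface Commit {
      id: string;        // in real Git this would be the commit's SHA-1
      parents: string[]; // zero for a root commit, one normally, two or more for merges
      tree: string;      // pointer to the snapshot of the working tree
      message: string;
    }

    type Repo = Map<string, Commit>;

    // "git log <start>" is, roughly, a reverse walk over the parent pointers.
    function log(repo: Repo, startId: string): Commit[] {
      const seen = new Set<string>();
      const out: Commit[] = [];
      const pending = [startId];
      while (pending.length > 0) {
        const id = pending.pop()!;
        if (seen.has(id)) continue;
        seen.add(id);
        const commit = repo.get(id);
        if (!commit) continue;
        out.push(commit);
        pending.push(...commit.parents);
      }
      return out;
    }

Branches and tags are basically named pointers into that graph; the text diffing happens a level down, between tree and blob objects, and the porcelain commands barely hide any of this.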


> Git is a DAG navigation tool that can diff text, and the API shows it.

That seems like a relatively accurate characterization.

> Mercurial is a version control system that happens to abstract over a DAG. The API, similarly, shows it.

I am unfamiliar with the details of mercurial. Could you explain or link to something detailing that point?


Social connections and physical health are also multipliers, just like vim or git or lisp. If you have been asocial and unhealthy in the past and are now able to change, you will also experience this increase in output, e.g. because you can suddenly ask others for help and can focus for longer periods of time. Maybe you will even end up in fewer situations where you use git and vim and programming languages, because others also value your ability to focus and your connections more than your ability to produce code. Then it paid off even more to focus on these multipliers. Great.

That doesn't mean either of the other multipliers are not multipliers though. They are just more topic-focussed instead of generally applicable.

And no, mercurial can't do more than 5% of what git can. That's just a fact. Just as you probably wouldn't waste time explaining to a flat earther why the earth is round, I won't explain this to you. If you are smart (and I assume so from your comment) you will go out and fact check that by yourself, starting from the assumption that you just might be mistaken and that this random dude on the net might've been correct.

Hope I'm right in that assumption. Then you'll have an awesome time ahead of you, with a lot of WOWs. Enjoy it.

And if not, there's not much lost. Some humans will fly to Mars even if some others believe the Earth is a disk.


One thing that life taught me is that you can be the best dev in the company, but your boss is still going to give the promotion to one of the buddies he likes.

Personal connections and social engineering are 10 times as important as your actual skills.

It's unfortunate, looking at the current state of the world, but that's simply how it is. And I understand some may not agree with it, but please look around and tell me I'm wrong...


While it's nice you're agreeing with me, I did not suggest it as a way to improve your career, and my advice stands even if it has no chance of helping your career.


>If you are smart (and I assume so from your comment) you will go out and fact check that

Exactly how do I fact check this? Most Google searches give two types of results: either pro-git or "meh, they're mostly the same." Virtually every pro-git page gets basic facts about mercurial wrong and criticizes the mercurial of a decade or longer ago (0). The very few exceptions cover use cases that really do fall into the 5% category.

And I assume you meant 95% and not 5% that you wrote. If you really did mean the latter I suspect you know little of mercurial.

(0) Branches are a classic example. Although I do think git has slightly better branch handling, pretty much every page criticizing mercurial branches exposes its ignorance of branching in mercurial.


I'd argue they got it right: when 95% of projects use Git, then why learn anything else? I think the last time I used Mercurial was almost a decade ago at this point, and I don't know of any major project that uses it (outside of Mozilla, which has most of its newer projects on GitHub, and OpenJDK, which maintains a GitHub mirror).

After Googling, it's pretty dire actually: https://www.mercurial-scm.org/wiki/ProjectsUsingMercurial Many of these are dead links or links to repos that haven't been updated in years.


>when 95% of projects use Git, then why learn anything else?

I remember my University days: "I don't know anyone else who doesn't use Windows. Why do you use Linux?"


That is ignoring my point with a deflection. This isn't 1999, with Linux as a (relative) newcomer on the upswing; Mercurial's adoption rate has clearly dropped over time. I did miss the fact that Facebook uses Mercurial internally, but their public-facing repos are all on GitHub (including, ironically, their re-implementation of Mercurial, Mononoke). Open Source software doesn't exist in a vacuum; it needs a community to support it and use it.


> You will never code lisp in your job. Promised.

I believed this too, but then Clojure happened. It's not my ideal Lisp, but I can get paid to write it at a mainstream company.


Ditto, Clojure gets work done, and there are hardly any Fortune 100 companies left without someone using it on the job. It's become part of my job.


What is Clojure's wheelhouse -- what is it used for?


All sorts, but I believe it was born out of frustration at unnecessary complexity arising from standard enterprise development.

Hickey's choice of hosting it on the JVM was ingenious; it gives access to one of the largest, most battle-tested ecosystems in modern-day enterprise engineering.

It’s much more versatile these days with the likes of ClojureScript, it’s being used in anything from real-time multimedia projects to SPA JavaScript work.


Basically anything other than operating systems. I've been working on an optimizing compiler for spreadsheets, and an event sourcing system with a WebSocket gateway in my current work.


Anything that Java is used for


Great to read that. I also know one other person who writes Clojure professionally. Gives closure to some of us who hoped people would get this opportunity at some point.

There are also rock stars, but that doesn't mean we should give advice as if anyone could become a rock star. Those who will become rock stars will become one anyway, even if you tell them they won't.


According to the TIOBE index, Clojure is not a thing outside the HN bubble, is it?


I code in lisp for work every day! This is a bad promise, but I agree lisp is amazing. I've recently been learning to love macros, and reading Let Over Lambda is fantastic!


While I agree in principle, this is a nice way to quickly become unemployed.

We live in a hype-oriented industry, where momentum and programmer enthusiasm (free man-hours of work), not technical merits, predict whether a given technology will thrive or not.


> Don't learn the hippest UI IDE, learn vim or emacs.

In some circles, these are equally as 'hip.' Idea: learn the tools you need to get the job done. Study theory to supplement your skills.


> Don't learn the hippest UI IDE, learn vim or emacs.

This jars. Why not recommend really learning Cocoa and AppleScript or VB for Applications? Or really dig into Smalltalk's code browser? vi and emacs have a long track record, and I still use both because I sank the time in to learn them years ago, but I wouldn't recommend them as fundamentally changing someone's view of computing.


Well, Emacs may not have changed my view of computing, but it definitely changed how I _use_ computers...


>learn the meaningful stuff, you will never code in your job

Implying the code that can be thrown away is the code they get paid for... hmmmm


> unbroken spine on your Godel Escher Bach

I've read most of it, some parts several times, over the years.

But my spine is unbroken. Here's how:

Open about 20 pages and run your finger firmly down the gutter. Flip 20 pages and repeat until you get to the center of the book. Next, start at the back and repeat, moving towards the center. Then repeat the whole exercise.

While you are reading, occasionally run your finger down the gutter. If you do this, you'll never break a spine, and those massive paperbacks will lie open on the table.

An iPhone makes a good book weight, almost as good as Levenger's.


this comment is kinda the best possible case of "extremely literal reading of the OP's comment"


You must be my son from the future...takes everything literally.


Thanks for the advice. Never heard that before. I'll try it out!


To the contrary, some of the best books have so much knowledge in them that you instantly level-up many notches with a single chapter.

For instance, reading The Pragmatic Programmer and JavaScript: The Good Parts early on changed my professional life irrevocably, both in terms of programming and career.

I am sure there are various other books like those that truly are impactful, and it doesn't hurt to look up recommendations for those from people who are experts / polymaths.

The case against wikis (with the exception of design docs) and blog posts is that they are erratic, restricted, unedited, stale, and low on pedagogical focus. Some of the books are meticulously structured-- there's no comparison.


"low on pedagogical focus"

Although I have some duds, the hundreds of pages of wiki contributions I've made are nearly all designed to be what I wished existed when I was learning said details. Sadly, they only help to a certain degree. The individual really needs to want to learn to get any real lasting benefit.


>what I wished existed when I was learning said details.

This isn't what 'pedagogical focus' means though. Just writing down the stuff that helped you out very rarely covers what is necessary to help people out in general.

A pedagogically focused approach would also cover the assumptions you had going in and provide clear references to expected prerequisites. It should also cover the "why" for a specific approach if there are multiple approaches, which is very frequently left out of internal documentation wikis.


May I ask you what you wrote? I’m interested.


Internal team wiki covering my industry which is power systems (transmission level stuff).


I hate these books. Basically it's just synonyms for well-known concepts, like blue in language B is called gog and orange in language A is called koom. It extends to more complex concepts as well, like this is how you do an if statement in Python vs. C++. JavaScript: The Good Parts is this type of book. Literally nothing new is learned, save how some concept gets renamed as a "good part" of JavaScript. Details that you can learn any time.

What I want from a book is to learn new concepts. For example, that functions and data are the same thing. Or that recursion and a loop + stack are isomorphic. Or that unit tests only verify your program correct for a single test case, while a proof-based system like type checking can verify your program correct over an entire domain.
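
To illustrate the second of those, a small sketch (TypeScript, hypothetical names): the recursive version leans on the call stack to remember pending work, and the loop version just makes that stack explicit.

    interface TreeNode { value: number; children: TreeNode[]; }

    // Recursive: the runtime's call stack remembers which nodes still need visiting.
    function sumRecursive(node: TreeNode): number {
      let total = node.value;
      for (const child of node.children) total += sumRecursive(child);
      return total;
    }

    // Iterative: the same pending work, kept on an explicit stack.
    function sumIterative(root: TreeNode): number {
      let total = 0;
      const stack: TreeNode[] = [root];
      while (stack.length > 0) {
        const node = stack.pop()!;
        total += node.value;
        stack.push(...node.children);
      }
      return total;
    }

Same computation, two spellings; a good book points out the correspondence instead of just renaming the syntax.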


If you want innovative mind bending concepts read a book about Lisp. Paul Graham's "On Lisp" is great - and I've heard good things about "Let over Lambda".

Lisp is such an open mental playground when it comes to programming.

Vs the languages which consist of the exact same ideas with a slightly different syntax.


It's on my bucket list along with Smalltalk. I've done some SICP and it's definitely mind-opening.

Right now I'm on a different route that I think is much harder: Haskell, type checking, category theory, Idris and dependent typing.


I went that route, and found it less bang for the buck.

Still interesting, and a lot to learn, but it took significantly more effort to see any kind of reward. And the total reward is definitely less than that of lisp (from what I encountered).


The purpose of a personal library is not necessarily to keep all the books you've already read within arm's reach. I personally prefer to keep books I have never read but may one day be interested in reading, so when I approach the bookshelves, there's the potential of a surprise, the thrill of exploring/discovering/etc.


That was a comment I read some time ago from Umberto Eco. He has a very big library and he occasionally gets asked by visitors to his home whether he has read all or most of it. His answer was along your lines: of course he had not read most of it at all, because what was the point of keeping a library mostly consisting of books you had already read? Your library should be the exact opposite to be useful, i.e. mostly consisting of books you have not yet read. Unfortunately I forgot where I read that.


Nice, thanks. Sadly he passed away not too long ago. An image search for "umberto eco library" gives some impressive pictures.


Ouch. This hurts. This is my problem exactly. I have two bookshelves full of programming books that I’ve never completed, and yet each morning I spend at least 30 minutes on Amazon reading the TOC of whatever new programming books they’re recommending to me. Imagine if I instead spent that time reading the books I already have. Just simple math shows that would be 180+ hours of learning per year. But my biggest problem is simply deciding where to begin. Which book do I read? As silly as it sounds, that is very often exactly what paralyzes me. I struggle to just pick one and commit to it.


Start with the old ones. For example, the Mythical Man Month. Still relevant and every other book references it.


the unbroken spine on your Godel Escher Bach suggests otherwise. Read what you have, before stressing on Kay's or others' lists!

Read what you like. You're describing someone who's clearly been 'stressing lists' so much they've got books they're not interested in and are never going to read. If you've made the mistake of accumulating a bunch of books as props, well, it happens - there's no need to compound this further by guilt-tripping yourself to read them before reading something you might actually want to read.


I'm describing the problem of already owning books you'd like (love!) to read, but don't make time for them.

I'm also describing that most of us (I posit) already have a great collection of great books, which aren't fully tapped. Honestly, even re-reading your college textbooks will up your level significantly. You don't usually need to seek out the "best of the best" book, unless you're really going deep or want to learn a specific topic in a very particular way.


If you say so, fair enough. The image you've picked seems like a perfect encapsulation of a different problem. I want to help the person with the smooth-spined G.E.B whose third monitor is not quite at the right height. Go right ahead and slip it under there.


Even ignoring the fact that people are more than just cogs in their company's grinder, I don't quite get why you seem to believe it is obviously more important to memorise the method signature of my interns' implementation of leftpad() than what a good book has to offer.

To be entirely honest, the list is actually too narrowly focussed to support my larger point: namely, that lifting your view to the horizon, by sampling from the best that fields as far away from yours as possible have to offer, is about feeding the soul and not just the machine.

Anyway, I've only read about half the books on my bookshelf (but including GEB) and that's actually how I like it. After all, books you haven't read are worth far more to you than those you already know.


> have you read all the wiki/docs/etc created by your team and neighboring teams?

Well, in my defense, it's pretty terribly written...


If you can convince your lead to let you set aside some time to update it / write a new one, it's a good opportunity to learn AND earn some brownie points


My personal experience is that no discipline is better at pointing out my weak spots than trying to write instructions or otherwise document something.


Which is why some of my coworkers simply won't do it.


Honestly it's more about not forgetting them.

The Amazon wishlist has honestly helped me a lot here, I can also annotate and sort it in case I come across a book again.

Plus it helps people buy gifts for me on holidays really easily.

Sorry for turning this into an Amazon ad but it's a great feature


If I'm understanding your comment right, you could use Calibre to do this just as easily, possibly with more information. There's no requirement to actually have a file associated with a record in Calibre, so you can add your comments into the metadata (or have the record point to a file with comments and notes instead of/alongside the ebook). I use a Calibre database to keep track of my bookshelves; the metadata includes ISBN, my review, my rating, average online rating from various sites, where I got the book, similar and associated books, etc. It should be relatively simple to get a single-click plugin going that brings up a search for the ($ISBN|$ASIN|$TITLE+$AUTHOR) of the currently selected record. The only thing missing here is public access (the buying gifts part)
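
The search part of that hypothetical plugin really is the easy bit; here is a sketch of just that piece (the field names are mine, not Calibre's metadata schema, and the Calibre glue itself isn't shown):

    // Build a search URL from whatever identifiers a record has.
    // Purely illustrative; these field names are made up, not Calibre's schema.
    interface BookRecord {
      isbn?: string;
      asin?: string;
      title: string;
      author: string;
    }

    function searchUrl(book: BookRecord): string {
      const query = book.isbn ?? book.asin ?? `${book.title} ${book.author}`;
      return `https://www.amazon.com/s?k=${encodeURIComponent(query)}`;
    }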


While I myself would like to do just this at some point, this is a lot of work to learn a new tool vs using a website that someone is already familiar with. And you get the added bonus of it being in the cloud and available on your phone (especially great for trips to used bookstores).


Reply since I can't edit:

Note that I'm not endorsing using this for "flipping" books, quite the opposite. I've been screwed over so many times in my search for old out-of-print technical books, that it's a bit of a sore spot.


> Be honest now... the unbroken spine on your Godel Escher Bach suggests otherwise.

I find it amusing that this book is chosen as an example of one people don't really read, and not, say, Penrose's The Road To Reality.


I remember begging my mom to buy me GEB at a Barnes and Noble when I was about 17, about twenty years ago. The book was expensive, especially for a Mexican family, but the book was hyped up in Slashdot, so I really really wanted to read it. I remember reading a few chapters and then couldn't muster the motivation to continue.

I still feel guilty about it. The book sits in my bookshelf, spine unbroken. My mom recently passed away. I'm sorry, mom.


Libraries: I ordered it thru the library maybe ten years ago after geeks talked it up on a similar site. Gave up after a few chapters, had no clue what it was about.


I'm thinking of prioritizing "I Am a Strange Loop" over GEB. According to Hofstadter, it is a better book [1]. Also, shorter :-).

1: https://en.wikipedia.org/wiki/I_Am_a_Strange_Loop


GEB was written when he was a young man and his ideas hadn't fully percolated into the more concrete form they take in I Am a Strange Loop. I can't find a source at the moment (it might actually be in IaaSL itself) but I seem to remember him saying IaaSL is the book he wanted to write when he wrote GEB.

Both are excellent reads, but I'd definitely agree IaaSL is far more approachable and a better use of your time.


Strongly disagree.

I Am a Strange Loop spends an entire chapter of Hofstadter informing us how he is better than everyone else because he hears Bach better than everyone else. And this attitude fills the book; I found it very obnoxious and am not sure why no one else mentions it.

There are at least two places in the book where he talks about an interesting point, then realizes it is a rehash of something from a previous book. And aside from the self-congratulations (did you know being a vegetarian makes you more of a person?), anything interesting in this book was done better in a previous book.

I would recommend The Mind's I instead. It's a collection of stories and essays from most of his influences, see what Turing, Lucas, et al. actually said.


He also spends a chapter trashing Searle, in a way that seems a little unnecessarily vindictive.

I still enjoyed it. It's not for everyone, but I found that his meanderings served to break up the seriousness of the subject in a way that made for an easy read, much easier than GEB.


if you read one book of hofstadter's, make it "metamagical themas", a truly engaging collection of his columns for "scientific american". I also liked "le ton beau de marot", on the art of translation, but that's less of a general appeal book.

I have read geb, and it had some nice ideas, but i can't say I was motivated to reread it; it felt overly self indulgent somewhat in the manner of a friend telling you about how their rpg campaign played out without realising that half the interest lay in having been there.


Hey there. I've read the first half of GEB at least four times. It's only the last 50% I've never read.


The content is fractally self-similar, so you're covered.


Well it does have a 25 year headstart.

And a cooler cover!


Or Infinite Jest.

Disclosure: My (unfinished) copy of GEB was given to a friend as a gift, and my (unfinished) copy of IJ taunts me as I type.


I have both books sitting next to each other on my bookshelf. In everyone's defense...both are extremely dense reading material.


I read both GEB and TRTR, and I haven't read any of the books on Kay's list ;)


I read about 30% of GEB. Now it makes quite a good mouse pad.


Had it been a reading list from anyone else I may not have looked, as I feel as fatigued as your statement suggests you are (and you are very much talking about my bookshelf as well)...

But there is a special place in my heart for certain folks...Alan Kay is one of em. Time to move some of that dust on my shelf around.


I don't think there's anything wrong with buying books that you intend to read in the future. I do this all the time, and when I want something new to read I pick something off of the shelf. There are worse things to buy than books.


>If you're at home, turn around and look at the bookshelf you've already accumulated.

So accurate as to be creepy. There's a shelf full of unread technical books right behind me. I will save this link for later.


It's like the gym in January. People like the idea of getting fit. They don't actually like doing it though.

The act of figuring out what's next and buying the materials feels good without any commitment.


> unbroken spine on your Godel Escher Bach suggests otherwise

This hits close to home!


fail - I read quite a bit of it.. one or two pages at a time!


> Be honest now... the unbroken spine on your Godel Escher Bach suggests otherwise.

Ouch. That hurt. I literally turned around and saw this book in pristine condition.


strong agree: a commitment to working through one's library can yield great things.

also you can unload poor books that are taking up space now!


There’s a used book store near where I work that has a whole section of programming books. I stop by and pick up old computer books for $10 from time to time, and I always mean to get around to actually reading them - more than once I’ve found that I bought the same book twice, having forgotten that I already own it…


Is this in the Bay Area by chance? I’ve been dying to find a place like this.


Dallas, TX, sorry. Half Price Books, if you're ever in the area. Most of the computer books sell for a fraction of what they cost new.


I have GEB in the bookshelf behind me and have read it all the way through, same for The Mind's I, two of the books in the list are there as well. The one book behind me that I haven't finished is the dragon compilers book by Aho, Sethi & Ullman.

Have also read Lisp 1.5 but don't have a physical copy.


Is the dragon book worth reading? Most compilers nowadays are completely different from what's discussed there.

Like, SSA form is barely covered in that book.


Can I throw in a snark? If you buy some number of books because your coworker told you to, then actually spend some time reading them?


I also think some science fiction should sit on everyone's bookshelf.


Not just books, you can apply this thinking to just about any purchase.


GEB is a scam book that causes undergraduates to annoy their mathematical logic professors with meaningless questions


Somewhat unrelated but here is Bret Victor's reading list for anyone that's interested: https://gist.github.com/nickloewen/10565777

Alan Kay is a big fan of his.

The depressing thing about reading lists is that it's hard to go through all of them. Many of the books listed (SICP) take a long time to wade through, read, and program the examples. They are not "light reading".


SICP also requires a fair bit of mathematical maturity, which threw me off when I tried to go through it.


That's not quite true. I remember going through SICP easily before I learned post-high-school math (which included relearning some basic algebra) when I went on my own self-guided CS course. The lectures on YouTube are particularly accessible to newbies with basic math and programming knowledge, too.


That doesn’t make it not quite true. You may be gifted at the subject. I was not and couldn’t get through SICP. You need a mind for mathematics or the requisite education in mathematics before doing SICP.


Did you do the problem sets? The book makes a lot of references to math theorems and number theory that the average layman wouldn't understand.


It doesn't. There are a few math-heavy exercises in the first chapter, and the math is generally explained as it goes along, too. Except for the first chapter, it's pretty math-free.


Only at the beginning. I found the first couple of chapters around algebra to be a turn off. But once I got past that, I found it a lot more engaging.

(Not that there's anything wrong with those chapters. If someone happens to like them, great. It's your time. Read the book the way that works best for you.)


Very true. The calculus references halfway through the course put many people off.


Thanks for sharing this—I had forgotten about it!

That gist is copied from this page on Victor's site: http://worrydream.com/#!/Links

It was discussed on HN in 2014: https://news.ycombinator.com/item?id=7578795


You don’t have to read 100% of every single book. Even just going through the first few chapters of SICP (or other similarly dense, more theoretical text) would most likely be beneficial to those who have never worked through the material before.


Also, by reading the classics/greats, you get the style of their thinking—something missed in a summary of the book, or someone else's re-presentation of it. In training, what you are taught is the important thing; but education is in being exposed to ways of thinking characteristic of a particular field, and the styles of thinking of particular teachers/writers.

The constant recommendation of successful scientists is to "Go to the masters, not the commentators." It is the master who, by definition, has the right style, and often the commentators give the results without the essence—style!

— Richard Hamming, Methods of Mathematics Applied to Calculus, Probability, and Statistics, Prologue


Thanks for posting this. I had not seen it before.


I have read the "Lisp 1.5 Programmer's Manual", "The Mythical Man-Month" and "The Art of the Metaobject Protocol" by Kiczales. These are all definitely timeless classics.

These quotes about Gregor Kiczales and AspectJ https://en.wikipedia.org/wiki/AspectJ are a good intro to the MOP:

"In Lisp, if you want to do aspect-oriented programming, you just do a bunch of macros and you're there. In Java, you have to get Gregor Kiczales to go out and start a new company, taking months and years and try to get that to work. Lisp still has the advantage there, it's just a question of people wanting that." -- Peter Norvig

"I am reminded of Gregor Kiczales at ILC 2003 displaying some AspectJ to a silent crowd, pausing, then plaintively adding, "When I show that to Java programmers they stand up and cheer." -- Kenny Tilton


Agreed, and I also think "annotation oriented programming" has become the bane of Java programmers everywhere.

Seems simple when you first slap an annotation onto a class or method. But God help you when it stops working and you actually have to debug what it's doing. Figuring out what code is actually executing and what it's doing seems nigh impossible.

(Which is all to say, Lisp is still far superior, because there is a straightforward process for figuring out what kind of code a macro will generate, or to run a macro and look at its output. Macros can become complex and convoluted, but that's still nothing to the mess that annotations in modern Java frameworks create.)


You can debug an aspect too eh? Throw a breakpoint before a method call, step into it, "oh I'm in a proxy now" step step "oh I'm in this forgotten point cut I made 6 months ago, okay now I know." I have no exp with Lisp so not arguing just observing the one side of exp I have.


But you inadvertently reinforce the point I am trying to make.

There is no way to reason about the code without actually running it. You can not read the code and understand what it's going to do.


Does anyone else get scared that too much breadth will stifle them? CS is a massive field these days and you could easily sink all your time into a small area without fully mastering it. I have had coworkers who could speak at length about different Linux distros, networking, web dev, DBs etc. but then were not great at the meat and potatoes of the job.


The books recommended by Kay here are very much about DEPTH, not BREADTH.

The majority of programming books are just ephemera and arcana and details that will be irrelevant in a year, or next month when the new version of the framework comes out.

Kay points to books, like the original Lisp Programming Manual, that will help you understand deep core concepts about computing itself, that will remain applicable no matter what framework or library you need to use tomorrow.

Take an Alan Kay, a McCarthy, Norvig, Abelson, Sussman, Armstrong, Steele, etc. from their prime and drop them into a software company where they have zero familiarity with the programming languages or tools currently being used, and within days or weeks they will be the most productive developer at that company by far. They will come up with simple, elegant, high performance and correct solutions to problems none of the other developers would have even considered.

Those are the kinds of thinkers you want to emulate, if you really want to write excellent software solving real problems in the shortest amount of time.


It doesn't even have to be "from their prime" (and I don't like to encourage that ageism concept, now that we have hiring managers fresh out of school). I'm comfortable asserting that, at any age, they will probably be high-value, having enough background to have insights and see connections that others cannot yet, now with a lot of general wisdom besides. See Danny Hillis's writeup on Feynman at Thinking Machines. You'd just need to offer them a compelling work situation, with problems that interest them.


> You'd just need to offer them a compelling work situation, with problems that interest them.

If you want to attract those kinds of people and have them do their best work, then part of the "compelling work situation" is to let them pursue their own problems.

Alan Kay:

” I don’t run CDG, I visit it. [Xerox PARC founder Robert] Taylor didn’t want to hire anyone who needed to be managed. That’s not the way it works. I have people on my list who are already moving in great directions, according to their own inner visions. I didn’t have to explain to these people what they would be working on, because they already are. Bret Victor has already hired four people that I didn’t know about. I wanted people to fund, not manage.”

https://www.fastcompany.com/3046437/5-steps-to-recreate-xero...


Heh, true, I just added "from their prime" because I know not all of them are still alive so would not add much to a development team in their current state.


You never know, something like "The Hand of Alan Turing" might be a fairly decent draw for hiring


I'm totally picturing The Hand treated like a relic in some Italian church, along these lines:

https://www.atlasobscura.com/places/st-anthonys-tongue


> to write excellent software solving real problems in the shortest amount of time.

This should be on the wall of every developer's workplace.


> ephemera and arcana and details that will be irrelevant in a year

Personally, I like this one.


That sounds like the JS developer mentality, where you rewrite everything in a different framework every year, so figuring stuff out in depth is a waste of time and you just duct-tape everything to get it working.

Ephemera and arcana I learned a decade ago still serve me well today. Even when it's outdated, people mostly either reinvent stuff, so I can recognise new things as variations on existing ones, or build on previous solutions, so I know the details and circumstances that led to certain developments, which lets me understand new things better as well (puts them in context).


>Ephemera and arcana I learned a decade ago still serve me well today

I think the usual back-and-forth about developer interviews may have this thought as an underlying assumption; specifically, that people who can implement a red-black tree on the whiteboard at the drop of a hat must be filled to the brim with useful but potentially obscure knowledge. (Obscure to those who haven't majored in CS or a related field.)

It would be interesting to submit an ask hn "What is your so there I was doing x, y, z when being able to answer obscure interview question seventeen saved the day" story. (That's not well worded but you get the idea).


No it sounds like bad development mentality in any language.


And they all would have failed to get past the phone screen from the recruiter wanting to know how many years of experience they have had with "the XML" or "the Spring".


Can you give a concrete example of some topics it covers in breadth and how it would help me to read that book? FWIW I mostly program in ruby, go, elixir, and I program web apps.


I disagree with the premise itself. Breadth of knowledge is not obtained because you will use every single piece of it, rather because you will be able to make connections between disparate topics. When you have breadth of knowledge you know which reference to pick up for help with your new exciting problem. (I am assuming it is a given that breadth of knowledge does not imply only superficial knowledge).

Lastly, having breadth of knowledge means you have learned how to learn efficiently. This is a significant force multiplier.


I would even go so far as to disagree with one of your sibling comments in saying that this reading list is important because it isn’t about “arcana”. Arcana is what usually gets you the last 10%* of the way to a working solution, even if your high level approach is beautiful and pure.

An old-style hacker reads man pages, RFCs, specs, and programming books not to immediately know how to solve problems at hand, but to know what solutions are possible for future problems. It isn’t useful to know that the HTTP spec doesn’t specify a max header length or that the default setting on Apache is to only accept 8kb of headers, until it is suddenly extremely useful to know both things at once :)

* some would say “the other 90%”


> reads man pages

Good luck with that, especially, but not only, with git man pages.

I tend to avoid reading these "just to know" and only reach for them when I need something specific. They are mostly torture. They could have been more useful to read, but they presently aren't.


The git man pages are famously inaccessible if you don't already know the concepts and jargon of git. The following auto-generated satire is both hilarious and sad if you have read the actual git man pages:

https://git-man-page-generator.lokaltog.net/


To compare, a real small part from "man git checkout":

"git checkout --detach [<branch>]

git checkout [--detach] <commit>

Prepare to work on top of <commit>, by detaching HEAD at it (see "DETACHED HEAD" section), and updating the index and the files in the working tree. Local modifications to the files in the working tree are kept, so that the resulting working tree will be the state recorded in the commit plus the local modifications.

When the <commit> argument is a branch name, the --detach option can be used to detach HEAD at the tip of the branch (git checkout <branch> would check out that branch without detaching HEAD).

Omitting <branch> detaches HEAD at the tip of the current branch.

git checkout [<tree-ish>] [--] <pathspec>…

Overwrite paths in the working tree by replacing with the contents in the index or in the <tree-ish> (most often a commit). When a <tree-ish> is given, the paths that match the <pathspec> are updated both in the index and in the working tree.

The index may contain unmerged entries because of a previous failed merge. By default, if you try to check out such an entry from the index, the checkout operation will fail and nothing will be checked out. Using -f will ignore these unmerged entries. The contents from a specific side of the merge can be checked out of the index by using --ours or --theirs. With -m, changes made to the working tree file can be discarded to re-create the original conflicted merge result."

Well yes of course.


I read through the bash man page in a reading group with some co-workers. Good times.


It's a legitimate concern. I spent 4 months of evenings and weekends (could have been one or two, to be honest) prepping for the Google Cloud Architect exam. I have used NONE of this knowledge (past what I already knew - load balancing and spinning up VMs/containers... I think I used Dataprep instead of Excel once too, for shits and giggles).

I am looking to spend another 4 months learning ML, which I likely won't apply in any way.

That's a year down the drain with ONE cloud provider and ONE way of doing ML. It's easy to completely waste your life like this WITHOUT getting better at your job.

Carry-over is far more limited than people make it out to be, unless you REALLY know a lot, but those people are rare.


100% agree.

Being 35 and having worked in IT for 15 years now and seeing the rapid acceleration into DevOps/Cloud/nix/Programming/Stacks I fear for my future. I want to learn a ton of stuff, but the vast amount of stuff needed to learn in order for me to move up in my salary bracket is stifling. AWS/Azure/+ the former I mentioned, then Python, YAML, Cloud networking. I'm good at some stuff, but the industry is just moving so ultra fast now it's hard to keep up.

I've been contemplating getting out of IT altogether because I'm not fully confident in career growth at this point unless I murder myself with study and ignore my family.

I've been a MS SysAdmin for 15 years, moving into nix devops (the new way of sysadminning) isn't easy.


"unless I murder myself with study and ignore my family"

I think this is completely true as well. To move past senior developer, you fall into one of these camps:

1. Very bright.

2. Spent a LOT of time studying or messing around with the right tech on your own.

3. Sell your soul, i.e. ignore (or not have) family/kids.

A lot of people from camp #1/2 don't understand that for most, #3 is the only option (in the short term). There is also the very real tradeoff of not going with #3 and risking declining job prospects/salary.

I think this is doubly painful for devs, because they are generally used to quick career progression / salary bumps, and then it stalls hard at senior dev.


Why do you need to progress past senior developer? Senior developer jobs provide interesting, fulfilling work, and excellent salaries (probably 90th percentile).

What kind of job would you be aiming for?


There is strong cultural pressure (at least in the US) to constantly progress in your career. This is measured primarily by title and compensation.

I've been a senior developer at the same company for 13 years. I feel that most of the time, I am progressing in my knowledge and experience, so, in my mind, I am making progress. It just doesn't seem that way on my LinkedIn profile.


What prevents title changes, if only to please the LinkedIn profile? Junior - Senior is a change that occurs within the first 5-10 years. Isn't there a way to discuss title changes with your manager, just to show you haven't been doing the same thing?

Deputy Developer? Elder Developer? Doyen of Development? Development nestor of company X? Director of Engineering at Sub-sub-sub-sub-department that happens to be just your team? Level 20 Wizard? Does it even matter if it sounds good on LinkedIn?


> Why do you need to progress past senior developer?

Age-ism, of course. You can't be a 50yo senior developer.


I'm a little confused, because I wasn't aware that there was career progression beyond Senior Developer. I mean, you can go and lead a team or something if you want I guess (in fact, I'm doing that at the moment), but most older developers I know have been and done that, and settled back in highly-paid, highly-respected, and much easier individual contributor / architectural roles. Looks like a good life to me!

The best developer I've ever had the pleasure to work with was a 50-year old senior developer. He cut his teeth doing a lot of C/C++ stuff back in the day, but was also (pretty successfully) leading the company's adoption of Angular. If you have a sharp mind, and you don't get stuck in your ways, then people will be begging for you to be their 50-year old senior developer.


I know more than a few places that have Principal Developer positions. This is basically for senior devs who have tons of domain knowledge that companies don't want to lose.


Um, I know lots of 50 year-old (and older) senior developers.

I'm not ignoring the fact that ageism is a real thing (it most definitely is), and it is more difficult for many older programmers to "keep up", but that doesn't mean that no one is doing it.


It was half-sarcasm. I'm a 48yo senior dev.


Sure you can.

It's called consulting.


Our engineering job ladder does not recognize any differences in core software engineering skills after Senior. It is entirely about social and organizational skills.

Running contentious meetings and herding directors are difficult skills to even begin to practice on your own time, but navigating family life is probably as close as you can get.


Maybe this is a good place to articulate what's been on my mind lately regarding growth. I don't think there's much local opportunity for developers beyond starting their own businesses past the senior level. Instead you have to be willing to move. Individual companies will not be able to challenge you beyond their business needs.

The other thing I wanted to talk about is how I solve technical problems. The first thing I do is get a representation of what the problem is in my mind enough to where I can see a clear path forward. This leans on my ability to take a 10,000 foot survey of a problem space. My current role deals with microservice architecture. Microservices is right in my wheelhouse due to my better-than-average sysadmin skills.

But I end up having to learn a lot within a short amount of time. So in order to cut down on what I call the "sheer mass of information needed for mastery" any time I look at a new tool or tech, I make a beeline for the "architecture" or "concepts" page. This is where I work out exactly which concepts and which level of the architecture I'm working at.

I then use the problem statement and vision above to hone in on a perfect implementation. Then I look at the actual state of the system and work to bring it more in line with the perfect one.

I recently was tasked with getting one of our guys unstuck. He was having a tricky issue with aquasec that he'd been beating his head against for a week. It took me five minutes to understand the problem, then I went to my desk and spent twenty on obtaining a reproduction. I didn't want to redo his work, so I then asked him what happened when he did X and Y. From his answers I had a clear path to being able to demonstrate that it was aquasec throwing a false positive on a npm library, and was already in talks with our devops team about next steps. It took 30 minutes for me to move his issue forward.

I feel like this manner of solving issues with techs that you don't necessarily have full understanding of could revolutionize the industry. But I can't really grasp how to teach it. It looks like magic to people when I show it to them, they think it relies on years and years of experience. I mean, it kind of does, but I was able to avoid ever getting fully stuck on problems even as a teenager.

But I don't see a lot of coders stretching out the problem space and treating each barrier in turn, diving a little bit into complex techs along the way. Instead they just kind of muddle around with what they know; believing you need a perfect understanding of a tech before you can solve problems effectively with it seems to be the norm. And we have this tech landscape where years of experience in particular technologies become the primary determinant of how most employers judge candidates.

I think the increasing march of devops and other techs that purport to unite the whole world into one walled, splendid garden will eventually bifurcate the tech world into supermen who know everything, and the underclass who can only work in one garden. If that's not how things already are?

Maybe a secondary school for advanced coding or bringing guilds back.


As an MCSE from the late 90s, I can tell you with oldfart-confidence that you can rely on Microsoft maintaining a landscape where you can be an MS-only sysadmin for decades to come. Sure, the salaries might not be stratospheric, but corporate AD & GPO work will be never-ending. There are lots of businesses running on Windows laptops and desktops and that ain't changing.


yeah, I would agree with this. You may not have the servers locally, but there is still a lot of admin stuff to do, and Azure doesn't configure itself.


"I've been contemplating getting out of IT altogether because I'm not fully confident in career growth at this point unless I murder myself with study and ignore my family."

That's a real problem. I had a few years when I worked on pretty cutting edge stuff so most of my learning was on the job and could be applied quickly. In my current job there is a lot of repetition and stagnation so you have to spend a lot of time outside the job learning stuff which you then never apply. This gets really old after a while.


You seem to need to study programming before being able to do real DevOps. nix is kinda advanced in this sense, even if it has a small language. In my experience, very few ops are able coders.


I agree with this.

Interestingly, earlier in my career, most of the developers I worked with were also capable at system administration, and could easily fill that role if needed.

The sysadmins during that time were quite capable, but had no desire, or an admitted lack of ability for a development role.

It is an interesting phenomenon to observe.


Given their background I think they meant nix as Linux/Unix and not Nix the programming language/package manager.

Despite its clear fit with the current hotness of immutable systems and functional programming, I don't think I've seen anyone in for-profit industry on a Nix/NixOps stack.


Yeah I meant Linux/Unix. The asterisk turned everything italic so I just left it out.


I feel a similar way, and I'm a little bit older with a fairly strong desktop/web/mobile dev background. My focus was heavy on Microsoft technology before moving 50/50 to Linux.

I look at industry as a series of different waves happening where I just need to get on one so that I can find the next. A wave is going to cross the startup world before it moves into established business. If I time it right then I can hang in there for a while. It doesn't help with my anxiety, but I've done this before so I know I can do it again. I suspect I'd find the same pattern in a closer examination of your background.

There are very few people learning all of these technologies because time is required to learn the basics, as well as put it into practice within industry. That's deep learning. A lot of people are lucky to work in places that will put unproven technology into production. The number of places willing to take this risk is increasing because cool technology is a requirement for attracting top talent. If you are in a more risk-averse organization it may seem scary to move to a faster moving, less risk-averse one, but you might find they actually have more leeway for mistakes and learning.

I watch for particular patterns around how a technology is hyped and who is applying the technology. Thoughtworks publish their perspective (https://www.thoughtworks.com/radar) and I try to read this, plus other analyses to understand what direction things are moving in. You have to take a longitudinal view. It's not good enough to just compile your research and make a decision. You need to know how a technology has moved over time. Once you start doing this you start to develop some "spidey" senses when you see something at the top of HN - and you don't have a ramp-up cost each time you have to make a switch.

Other than that, focus on a few fundamentals, including one programming language reasonably well. I'd strongly suggest learning Python and having an incremental two-year development plan so you don't half-ass it. A lot of the Linux world is built on stable skills. Rather than focusing on AWS or Azure networking, learn networking and TCP/IP beyond a cursory level. There are a lot of folks building cloud infrastructure badly, slowly etc. because they don't understand these fundamentals. The references are old and boring, like TCP/IP Illustrated Vol. 1.


> I've been a MS SysAdmin for 15 years, moving into nix devops (the new way of sysadminning) isn't easy.

nix devops is productizing/consumerizing the old way of nix sysadmining (at least for shops that did things the right way)


I would suggest attempting to learn the general principles behind the particular ML/cloud/whatever approach you have studied. That type of breadth of knowledge is the useful kind. It lets you gain further knowledge much more efficiently. As I mentioned above, to a great extent breadth of knowledge is about learning how to be an efficient learner.


I would say the most important knowledge is the kind that allows you to ask "why is this still not done?" or to state "well, they did it, but it won't work well in situation X", and then either find the products that do exactly that, or know that in a few years there will be an industry providing exactly that kind of service.

Saying that you can't move between cloud providers because you need to learn something new probably implies you don't understand the fundamentals constraining the design of e.g. databases provided by cloud.


It's usually the lack of breadth that hurts you.

IT is a fast-moving field. Solving business problems takes understanding which good ways to do that are currently available, what limitations and dangers each approach brings, how it would interact with other things, etc.

You still need deep knowledge, but the most efficient knowledge is that of key principles, general approaches and ideas. A specific technology does not matter much, and can be mastered quickly enough, if you already are acquainted with the principles behind it. E.g. learn about FRP, and you will see how React, Elm, and Excel all work along the same lines.
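
To make the FRP point concrete, here is a toy sketch in Python (my own illustration, not how React, Elm, or Excel actually implement it): a derived value that recomputes automatically whenever its inputs change, which is the one principle all three share.

  class Cell:
      # A plain value that notifies its dependents when it changes.
      def __init__(self, value=None):
          self._value = value
          self._dependents = []

      def set(self, value):
          self._value = value
          for dep in self._dependents:
              dep.recompute()

      def get(self):
          return self._value

  class Derived(Cell):
      # A value computed from other cells, kept up to date automatically.
      def __init__(self, fn, *inputs):
          super().__init__()
          self._fn = fn
          self._inputs = inputs
          for cell in inputs:
              cell._dependents.append(self)
          self.recompute()

      def recompute(self):
          self._value = self._fn(*(c.get() for c in self._inputs))
          for dep in self._dependents:   # propagate through chains of derived cells
              dep.recompute()

  a, b = Cell(1), Cell(2)
  total = Derived(lambda x, y: x + y, a, b)   # like "=A1+B1" in a spreadsheet
  print(total.get())   # 3
  a.set(10)
  print(total.get())   # 12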

What you end up with after some time is like a normal distribution: deeper knowledge around some area and various levels of acquaintance with a wide range of other things.

There are still areas where you can polish the mastery of one narrow thing to utter perfection: making pizza or coffee, sports like running or weightlifting, etc. You can keep practicing them to counterbalance the feeling you get from viewing the sheer, constantly moving IT landscape.


> There are still areas where you can polish the mastery of one narrow thing to utter perfection: making pizza or coffee, sports like running or weightlifting, etc. You can keep practicing them to counterbalance the feeling you get from viewing the sheer, constantly moving IT landscape.

Hear, hear! When talking to junior devs I always point out that they should take into account the Lindy effect and invest some of their learning time into things that have been around for at least a century, so they can get enjoyment out of their accomplishments for the rest of their lives.

You don't want to die without making at least ONE perfect pizza.


What would be a good example of a slow-moving field?


Management. Kay's list includes The Mythical Man Month, which, except for the technical details, could have been written yesterday.


It is very waterfall-y, as someone I know said in a talk, and therefore you probably need to be a little careful about what "rules" you take away from it. That said, the "mythical man month" discussion itself, as well as the idea of a 9x difference in effort between a program and a programming system product, is worth the price of admission.

The latter arguably runs somewhat counter to a lot of MVP, etc. approaches but recognizing the difference is still useful.


I've heard an observation that Agile as practiced in the real world is often basically "tiny waterfalls."


Agile™, as I've encountered it, is collectivized micromanagement.


Ahhh...

You just put into words what’s been bothering me about the way my team works. The “process” people—while well-intentioned—seem to think that by breaking inherently complex tasks up in just the right way, they can make the complexity go away.

Well, no. If that’s really the idea, why are you paying me so much?


Exactly and it's almost perfectly orthogonal to engineering.


There's probably some truth to that. That's what a sprint basically is, right?


If you run your sprints right then no, nothing's set in stone. If something comes up in the middle of the sprint that takes priority, you address it. You don't mindlessly stick to the schedule set (that would be a "tiny waterfall.")


Fair enough. Although waterfalls usually weren't really set in stone either in my experience. Which had both good and bad points. (It's good to be adaptable but changing requirements all the time is also a good way to make a project late.)


This sounds is/ought-y? What is the prevalence of running sprints right in the real world? "As practiced" was the important part of my question/observation.


I feel that if there were a millennia-old book or tablet from the early days of mankind regarding management, much of its advice would still apply.


Physics.


Here's my approach. Do your best to be aware of as many tools/techniques as possible, know where to look for information on the ones that look/become promising or useful, and build deep knowledge on the ones you wind up regularly using.

It's very useful to have enough exposure to things to be able to tell generally what's going on and to know what you need to look up.


Here is the interesting thing about reading books: I once thought that reading a book and remembering everything in the text was the goal. However, over time I've learned there is value in simply having read the book and understood the material, because then you know where to find the information should you need it in the future.


Often reading good books is less helpful than it could be because the books answer questions you don't yet have.

Reading and practicing is key.


Yes -- which is why you'd do better to read these books than 95% of the links on HN. These are books about how objects work and how EVAL is defined. These are the meat-and-potatoes of computer science. There's no "Linux distros, networking, web dev, dbs etc" here.


I believe any team needs a combination of both types, i.e. people with breadth and people with depth.

Some people are just good at seeing the view from the top and how all of it will fit together in the most optimal way possible.

I am saying this because I can see abstract patterns from the top and what's wrong with an entire project. But the entire IT industry is looking for people with depth. So sometimes I wonder if there is a need for people like me in IT, but then again, when I see a vast system I can immediately discern what could go wrong and where the bottlenecks might be, or come up with a better solution. This gives me a little bit of hope.


I have a feeling that in CS you need a decade of experience, a wide range of experience, before you can even get on the path of mastering something.


Reconcile that value of experience... with commonplace startup hiring age biases, and dotcom interview processes biased towards someone with CS101 fresh in mind and who perhaps just spent months practicing leetcode whiteboarding. :)


You can focus on breadth or depth depending on what kind of programmer (generalist or specialist) you want to be. Matt Klein (of Envoy) had a good thread about that on Twitter [1].

He focuses on depth whereas I tend to focus on breadth. Of course I have areas and technologies I know better because I have worked in / with them (computer vision, filesystem / DB replication algorithms, Lua, etc) but I try to expand my areas of knowledge so I can understand whole systems rather than become an expert at a specific thing. For instance, right now, I'm doing mostly Web front-end stuff, which is the part of the Web stack I know the least.

[1] https://twitter.com/mattklein123/status/1130206792175169536


Not really; if you're broad then you get muscle memory for "any old shit". I'll happily blunder into any language or tech stack, because the implementation patterns overlap so much after you've juggled a few things.

> oh its one of these sort of things like xyz I did four years ago.

Of course the specifics are different but the muscle memory can get you through. I'd say the worst bit about it is that you often feel impostor syndrome pretty bad because in many cases you're never 100% sure about stuff like you might be in a speciality.


Too much of anything is not good.

I do tend to focus more on breadth, but I do go deeper on some subjects. However, I never give up breadth for the sake of going _Jon Skeet on C#_ level of depth. That, for me, is a waste of time[0]. I don't need to know a language or tool quite that deep. I can be very productive with a certain depth without touching bottom.

[0] Note: I'm not saying Skeet is wasting his time.


The other thing you need to be careful of when focusing solely on going deep on one topic is that that topic can fall by the wayside.

I've known people who were, in one example, arguably the world expert in performance on a long ago computer architecture. There came a point where no one cared any longer and I'm not sure to what degree he successfully moved on to other pursuits.

Another example is Y2K. A lot of consultants ended up defined as being Y2K guys and they didn't necessarily successfully transition to something new.

Not arguing that going deep is necessarily wrong but, if you do, you need to keep your eye on emerging areas that could benefit from your existing skills.


Most people lack both breadth and depth. So don't worry about going too broad... you probably won't get there.


Define the meat and potatoes of their job.

They could just be branching into /r/iamverysmart territory and parroting omgubuntu.


But the first 10-100 hours of learning and using a new language feel so good.


Nothing beats T-shaped skills :) https://en.wikipedia.org/wiki/T-shaped_skills


Depth cannot be gained without breadth & vice versa. Knowledge isn't tree-shaped but web-shaped.


It has been widely confirmed that going for a wide breadth only makes you good at architectural type jobs where you orchestrate multiple separate silos of knowledge coming together, without actually knowing the details too deeply but instead simply trusting that they will interface correctly.

If you try to have a wide breadth and a deep knowledge simultaneously, you will pay a heavy price. You will not have much of a family, you will not have other hobbies, you will not know much beyond CS, you will not even have time to use the vast majority of your knowledge to its full depth. There simply isn’t enough time. You will likely die as you lived, at a keyboard with your head weighing down on keys spamming them infinitely in a code editor, or slumped in a chair or bed with a technical book resting on your chest. Very few people will notice your passing, and the world will be no different for whatever knowledge you gained.

What we should strive for instead is “Just-in-time knowledge”, where the goal is to quickly become an expert on a topic you know nothing about right when you need to be. Many people first learn to do this when they get into debates on the internet, and then extend it into their professional careers.


> quickly become an expert on a topic you know nothing about right when you need to be

This simply doesn't happen for anyone.


I would agree that work-life or study-life balance is important, but your comment is way over the top. One can definitely "have it all, just not at the same time"^1. "All" it requires is spending time and thought on each piece and having a support group for when you feel you are too tired.

1: Admittedly, the lottery of birth can make that more difficult.


A PDF version of the first recommended book Lisp 1.5 Programmer's Manual:

http://www.softwarepreservation.org/projects/LISP/book/LISP%...


Thanks for posting this link. I'd never heard of softwarepreservation.org before and now I can see I'm not going to get a lot done today.


Of course you will get a lot done, just none of what you intended to do! :)


There's also a bunch of stuff at oldlinux.org though (obviously) all Linux-related. "A Heavily Commented Linux Kernel Source Code" is also available in English now (scroll down to the top of the change log).


Thanks for sharing! Now I've something to read in the evenings


My own humble suggestions - although the books are hardly forgotten. But I think people focus a lot on technical / engineering books, and very little on design / user experience / human behavior, which arguably contribute much more to the overall impressions end users have of programmers' work.

First, the greatest book of all time, The Autobiography of Benjamin Franklin - an amazingly introspective and insightful look into how to live an examined life and improve oneself.

And then if you want to learn lower-case "design thinking", my top 10 books:

* Design for Everyday Things - duh. I re-read chunks of it all the time.

* Tufte - hard to pick one, I might actually be iconoclastic and go with Visual Explanations which I think has more to offer programmers over pure data visualization. Again, just grab one every day, flip through 3-4 pages, rinse, repeat.

* User Story Mapping - Extremely memorable book - it gives you a pretty clear field guide on prioritization, empathy, communication ... just a great book.

* Badass by Kathy Sierra - I flip through this book again and again. It is gospel truth about what motivates humans.

* The Field Guide to Human-Centered Design - IDEO's most practical book. (Close second: Designing Interactions.)

* Universal Methods of Design - another deeply practical book, lots of good tips and examples.

* Universal Principles of Design - Sister book to the Universal Methods. Again, straightforward, flip to any page and get an idea when you're brainstorming.

* Thinking in Systems - I recommend you skim this book through, but come back to it a lot, it grows with you.

* Inspired by Marty Cagan - again, love nuts and bolts process books.

* Don't Make Me Think! - still a classic, still see these mistakes being made all the time in modern app dev.


> Design for Everyday Things

Do you mean "The Design of Everyday Things" by Don Norman? If so, I agree that it is a great book.


Joe Armstrong gave a related talk, 'A Guide for the Perplexed' [0], last year. He mentioned several 'forgotten great ideas' such as Linda Tuple Spaces, flow-based programming, Xanadu and Unix pipes.

[0] https://youtu.be/rmueBVrLKcY?t=1949


Hah.. My BSc thesis (>15 yrs ago) was around Linda Tuple Spaces... Good times.


Probably not on Alan Kay's list, but Leo Brodie's Thinking Forth was mind blowing and game changing!

http://thinking-forth.sourceforge.net/


He did use a stack machine at one point - B5000 https://queue.acm.org/detail.cfm?id=1039523


What is a good Forth implementation to follow along with this book?


Take the book as a specification, then make your own Forth implementation.

Edit: Starting Forth also by Brodie would probably be a better “spec” for implementing a Forth.
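
If you want a feel for how little machinery a Forth needs before diving into Brodie, here is a rough sketch in Python (my own toy, nowhere near a real Forth): a data stack plus a dictionary of words, with ":" ... ";" compiling new definitions.

  def execute(tokens, stack, words):
      tokens = iter(tokens)
      for tok in tokens:
          if tok == ":":                       # compile mode: define a new word
              name = next(tokens)
              body = []
              for t in tokens:
                  if t == ";":
                      break
                  body.append(t)
              words[name] = body
          elif tok in words:
              defn = words[tok]
              if callable(defn):
                  defn(stack)                  # primitive word
              else:
                  execute(defn, stack, words)  # user-defined word
          else:
              stack.append(int(tok))           # anything else is a number: push it

  def forth(source):
      stack = []
      words = {
          "+":   lambda s: s.append(s.pop() + s.pop()),
          "*":   lambda s: s.append(s.pop() * s.pop()),
          "dup": lambda s: s.append(s[-1]),
          ".":   lambda s: print(s.pop()),
      }
      execute(source.split(), stack, words)
      return stack

  forth(": square dup * ; 7 square .")         # prints 49

Real Forths compile words into threaded code rather than re-walking token lists like this, but the stack-plus-dictionary shape is the same.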


Cool, "Starting Forth" is available for free on forth.com!

Makes me curious about SwiftForth [1]. I saw a bunch of other commercially available forth implementations in the past ([2] ?), but since Forth looks so niche I never got motivated enough to try Forth more seriously.

1: https://www.forth.com/swiftforth/

2: https://8th-dev.com


Coming from Alan Kay, I definitely need to check these out. If I'd read this post a year ago I would have known almost none of the authors, but after reading "Hackers: Heroes of the Computer Revolution" earlier, these are all names that stood out right away.

Both Minsky and McCarthy seem like almost mythical figures in the book, and I don't think I could ever hold a candle to them, but the next best thing is probably to understand their thinking. I think it's a bit easy for us to get caught up in Medium blog posts detailing a small segment of a new framework, when what we really ought to do to grow is go back to the basics and understand them in depth.


The Levy "Hackers" book is what told me there were concentrations of other kids who liked to figure out and build things as much as I did.

I eventually found my way to Minsky's class. It was nominally on one of his past books, but the class sessions often seemed to be him talking about whatever he was thinking about that day, as he worked on his next book.

Minsky's "ten-year grad students" and unofficials were also great. Two of them were especially personable, and would wander around the lab, and strike up impromptu technical conversations with random other enthusiastic students. Which seemed unusual among grad students of my dotcom era, and maybe it was more old-school greatness, like the Levy book.


Not a great day for Minsky's reputation. He's been named in connection with the Jeffrey Epstein sex trafficking allegations: https://www.thedailybeast.com/jeffrey-epstein-unsealed-docum...


I hope he's exonerated promptly. Best wishes to his family and friends in the meantime.


I hope the truth comes out, risk of it getting buried along with Epstein.


Another great book by Minsky is Perceptrons. It's really short to boot, IIRC. It is an especially good book for those interested in ML. In it Minsky shows that single-layer neural nets cannot compute certain functions. I remember that he has a caveat somewhere that he hasn't thought much about multi-layer neural nets and that they may still be interesting. So great was the impact of this that hardly anyone worked on neural nets for something like a decade afterwards. But it's a brilliant book presented in a really simple style. It's a piece of history, I think.


I read it fairly recently, and I agree that it is both great and short.

Regarding multi-layer neural nets Minsky says they're uninteresting as they could be declared with enough complexity to basically reimplement any existing logic circuit. What made multi-layer neural nets interesting again was a multi-layer training algorithm.

There's another interesting part: shortly after showing that single-layer neural nets can't implement the XOR function, Minsky shows that all that's required for a single-layer net to implement XOR is to add another column to the training set with specific values, effectively encoding the hidden layer back into the training set.
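
You can see that trick for yourself with a quick sketch in Python (mine, not from the book): on plain x1, x2 a single threshold unit can never separate XOR, but add a third column carrying x1 AND x2 and the ordinary perceptron learning rule finds weights that work.

  import numpy as np

  # XOR truth table, augmented with an extra column holding x1 AND x2.
  X = np.array([[0, 0, 0],
                [0, 1, 0],
                [1, 0, 0],
                [1, 1, 1]])
  y = np.array([0, 1, 1, 0])

  w, b = np.zeros(3), 0.0
  for _ in range(25):              # perceptron learning rule; the augmented data
      for xi, yi in zip(X, y):     # is linearly separable, so this converges
          err = yi - int(w @ xi + b > 0)
          w, b = w + err * xi, b + err

  print([int(w @ xi + b > 0) for xi in X])   # -> [0, 1, 1, 0]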


Also Marvin Minsky's hilarious paper: Jokes and their Relation to the Cognitive Unconscious. More fun than a barrel of an infinite number of monkeys.

https://web.media.mit.edu/~minsky/papers/jokes.cognitive.txt

Abstract: Freud's theory of jokes explains how they overcome the mental "censors" that make it hard for us to think "forbidden" thoughts. But his theory did not work so well for humorous nonsense as for other comical subjects. In this essay I argue that the different forms of humor can be seen as much more similar, once we recognize the importance of knowledge about knowledge and, particularly, aspects of thinking concerned with recognizing and suppressing bugs -- ineffective or destructive thought processes. When seen in this light, much humor that at first seems pointless, or mysterious, becomes more understandable.

A gentleman entered a pastry-cook's shop and ordered a cake; but he soon brought it back and asked for a glass of liqueur instead. He drank it and began to leave without having paid. The proprietor detained him. "You've not paid for the liqueur." "But I gave you the cake in exchange for it." "You didn't pay for that either." "But I hadn't eaten it". --- from Freud (1905).

"Yields truth when appended to its own quotation" yields truth when appended to its own quotation. --W. V. Quine

A man at the dinner table dipped his hands in the mayonnaise and then ran them through his hair. When his neighbor looked astonished, the man apologized: "I'm so sorry. I thought it was spinach."

[Note 11] Spinach. A reader mentioned that she heard this joke about broccoli, not mayonnaise. This is funnier, because it transfers a plausible mistake into an implausible context. In Freud's version the mistake is already too silly: one could mistake spinach for broccoli, but not for mayonnaise. I suspect that Freud transposed the wrong absurdity when he determined to tell it himself later on. Indeed, he (p.139) seems particularly annoyed at this joke -- and well he might be if, indeed, he himself damaged it by spoiling the elegance of the frame-shift. I would not mention this were it not for the established tradition of advancing psychiatry by analyzing Freud's own writings.


> The way to grow from this book is to deeply learn what they did and how they did it, and then try to rewrite page 13 in a number of ways. How nicely can this be written in “a lisp” using recursion. How nicely can this be written without recursion? (In both cases, look ahead in the book to see that Lisp 1.5 had gotten to the idea of EXPRs and FEXPRs (functions which don’t eval their arguments before the call — thus they can be used to replace all the “special forms” — do a Lisp made from FEXPRs and get the rest by definition, etc.).

Found it ironic that in a comment about Lisp, Kay forgot to balance his parens. :)


He knew someone would comment on it and use a smiley emoji.



Not a superset of OP -- only five titles under "Computing".



My adviser suggested a fantastic book to me for bedside reading - "Algorithmics: The Spirit of Computing" by David Harel (https://www.amazon.com/Algorithmics-Spirit-Computing-David-H...).


I discovered this book by looking for more information on statecharts vs state machines. Harel's writing also got me interested in topology:

"Topological features are a lot more fundamental than geometric ones, in that topology is a more basic branch of mathematics than geometry in terms of symmetries and mappings. One thing being inside another is more basic than it being smaller or larger than the other, or than one being a rectangle and the other a circle. Being connected to something is more basic than being green or yellow or being drawn with a thick line or with a thin line." [1]

1: http://lambda-the-ultimate.org/node/2342


The Lisp 1.5 Programmer's Manual listed by Kay is available for download from the Computer Museum archives [1].

More Lisp fun [2].

[1] http://www.softwarepreservation.org/projects/LISP/book/LISP%...

[2] http://www.softwarepreservation.org/projects/LISP


I love that Joe Armstrong's PhD thesis is on this list. I have it on my reading list for junior engineers. It's incredibly accessible and the mental model is very powerful.


Here's a link for those who are interested in reading it: https://web.archive.org/web/20041204143417/http://www.sics.s...

This was archived from http://www.sics.se/~joe/thesis/armstrong_thesis_2003.pdf which now returns a 404 Error.



Would you mind sharing that list? :)


Yeah, how accessible is a 295-page PhD thesis, half of which is Erlang-specific, for junior engineers? Unless you want them to quit their jobs, or complain to their friends what a crap manager they have.


Relax. Nobody said it was a mandatory reading list. There are no tests. It's a suggested reading list that I pull from depending on context. Joe's thesis is incredibly accessible.

At least two of my items were on Alan Kay's list. The other being "The Mythical Man Month", especially the edition with the "No Silver Bullet" article, which a depressing number of people in our industry seem to not have read. So I don't feel like I'm too far off the mark.


I've been meaning to read The Mythical Man-Month and found it's freely available online: https://archive.org/details/mythicalmanmonth00fred


This is a great companion to "The Goal" and "The Phoenix Project," both of which use fictional narratives to illustrate best engineering practices in the context of saving a (fictional) business.

https://www.amazon.com/Goal-Process-Ongoing-Improvement/dp/0...

https://www.amazon.com/Phoenix-Project-DevOps-Helping-Busine...


I'd also like to point out that the majority of software engineers nowadays lack the mathematical background, so it's probably worth including theoretical books like Abstract Algebra by Dummit and Foote on the 'must read' list.



> How could you combine this with Val Shorre’s “Meta II” programmatic parser to make a really extensible language?

At VPRI (Kay's research institute) they did this: COLA. (See also STEPS.)

http://www.vpri.org/

https://en.wikipedia.org/wiki/COLA_(software_architecture)

http://www.vpri.org/pdf/tr2012001_steps.pdf "STEPS Toward the Reinvention of Programming, 2012 Final Report"


Not a book, but I encourage people to read Douglas Engelbart's paper on Augmenting Human Intellect: http://dougengelbart.org/content/view/138

Alan Kay is a big fan of Engelbart and I'm surprised it wasn't listed in his answer. Also, for anyone who's interested, a Windows clone of NLS is available here: http://www.ndma.com/resources/ndm8543.htm

Minus the "journal", many of the multi-user capabilities, and the "compiler compiler" programming system of NLS. Still, it's interesting to play with.


Grab Programming Pearls, read it cover to cover (and try the exercises, please), collect all the references, read it all.


I can see the n-gate.com entry now:

Alan Kay recommends some books. Hacker News spends most of the thread recommending books they like, instead.

Anyway, this is a nice list filled with works I've not read, so I'll make certain to give them some attention next time I'm at a book store and ask them to order some things, since they only carry magazines and other drivel in-store.


> It starts with a version of John’s first papers about Lisp, and develops the ideas in a few pages of examples to culminate on page 13 with Lisp eval and apply defined in itself. There are many other thought provoking ideas and examples throughout the rest of the book.

Sounds like the MIT SICP course from the 1980s using Scheme!
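
The core of that page-13 exercise is tiny. Here is a rough sketch in Python rather than "a lisp" (my own toy with only a handful of forms, nothing like the real thing): eval and apply defined in terms of each other over s-expressions written as nested lists.

  def l_eval(expr, env):
      if isinstance(expr, str):                      # a variable: look it up
          return env[expr]
      if not isinstance(expr, list):                 # numbers are self-evaluating
          return expr
      op, *args = expr
      if op == "quote":
          return args[0]
      if op == "if":
          test, then, alt = args
          return l_eval(then if l_eval(test, env) else alt, env)
      if op == "lambda":                             # (lambda (params) body)
          params, body = args
          return ("closure", params, body, env)
      # otherwise: evaluate operator and operands, then hand off to apply
      return l_apply(l_eval(op, env), [l_eval(a, env) for a in args])

  def l_apply(fn, args):
      if callable(fn):                               # primitive
          return fn(*args)
      _, params, body, env = fn                      # user-defined closure
      return l_eval(body, {**env, **dict(zip(params, args))})

  env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
  print(l_eval([["lambda", ["x"], ["+", ["*", "x", "x"], 1]], 6], env))   # -> 37

The FEXPR variant Kay mentions would amount to handing the operand expressions to the called function unevaluated and letting it decide what to do with them.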


An old comment on McCarthy's LISP 1.5 manual: "This is the source of the opinion that LISP is hard."


Threaded Interpretive Languages: Their Design and Implementation

https://www.amazon.com/Threaded-Interpretive-Languages-Desig...


Probably controversial, but Eelco Dolstra's PhD thesis was eye-opening for me.


Agreed. IMO the most significant work in system administration in the past 20 years.

https://nixos.org/~eelco/pubs/phd-thesis.pdf


Glad to see him mention Peter Landin (https://en.m.wikipedia.org/wiki/Peter_Landin). As a user and developer of functional programming languages, https://www.cs.cmu.edu/~crary/819-f09/Landin66.pdf is my favorite paper of all time.


Found a great set of slides about early practice vs theory disputes (and their resolution via the mathematical study of PLs, programs, and their semantics) while ~looking for pdfs of one of the books Kay mentions: Advances in Programming and Non-Numerical Computation~ I mean searching the internet:

https://www.cs.ox.ac.uk/strachey100/slides/2-JS.pdf


Computation: Finite and Infinite Machines is one of my favorite books. It's fascinating to see computation built up from the theory of neural networks. Marvin Minsky is just such a deep thinker and writer.

For those who are interested I made the neural net system into a simulator:

https://justinmeiners.github.io/neural-nets-sim/


Arrggh, I don't think it's published anymore. And used ones are over $100 on amazon.


I actually got it in my mail today. $30 plus shipping. Keep an eye on price fluctuations.


Yeah, it's out of print. Find it online or get it from a library.



I wouldn't recommend the Lisp 1.5 manual other than as a historical reference alongside learning Lisp as it exists today.

If that is too new, then, for pete's sake, at least study 1986 Lisp; no need to go back to 1962.

Books from the mid-to-late '80s, like Wilensky's Common LISPcraft, are decently useful.

Just like I wouldn't tell someone to read the 1978 first edition of the Kernighan and Ritchie C book.


It is not exactly a "book", but I have found HAKMEM to be an invaluable source of knowledge.


  Assembly Language Step-by-Step
  second edition
  Programming with DOS and Linux
  Copyright © 2000 by Jeff Duntemann
  Rev. ed. of: Assembly language, © 1992
  ISBN 0-471-37523-3


The Mythical Man-Month is a classic, not just for CS.


People don't read books anymore anyway. It's all audiobooks and videos now.

However, people like to claim that what I described above counts as reading a book.


Now that I've been at this game for several decades, I've probably got just about enough experience to really get some value out of Lisp.


Computation: Finite and Infinite Machines was my first introduction to computing. It is the only book I still have.


Ted Nelson "Dream Machines"


Why on Earth is Alan Kay on Quora?


He is very accessible, concerned about preserving knowledge from a previous generation of computing, and not worried about his brand or self-promotion. So he goes where interesting discussions are happening - he has been on HN a number of times (user https://news.ycombinator.com/user?id=alankay1, AMA - https://news.ycombinator.com/item?id=11939851)

I met him at a conference 10 years ago - I basically blew off the whole conference to sit at his feet in the lobby while he told stories and shared pearls. There were 3-4 of us, probably 30-40 years younger than him, and we were all nobodies, but he didn't care. One of my favorite experiences ever.


> I basically blew off the whole conference to sit at his feet in the lobby

This implies that he also blew off the whole conference. I wonder why he did that.

> There were 3-4 of us, probably 30-40 years younger than him, and we were all nobodies

I would hope that this isn’t the reason. Being famous and “holding court” like this is probably addictive.


What conference was this? Curious.


I think he's a fan of collective intelligence. Maybe he feels Quora is a good implementation of this concept.


Why on Earth is Alan Kay on Hacker News?


Why on Earth not?


It feels like a worse and more ridiculous version of Yahoo Answers. Half the answerers don't know what they're talking about and half are marketing departments.


Software Creativity 1st Edition (1995) by Robert L. Glass.



