Perhaps the question we should be asking is why talent doesn't seem to increase with age as much as we'd like. Let's assume for a minute that the age bias is important and useful. Why is that?
More importantly, what could we do to facilitate an increase in creativity, productivity, or 'talent' over time? The fact that we don't has a huge impact on the kinds of things society achieves.
I don't think it's correct to just say "brains get less flexible" and give up. Most Nobel Prize winners (even in theoretical physics) tend to do their prize-winning work after the supposed cutoff age (I believe the average is 40+). I suspect it has to do with the fact that scientists frequently have to shift the problems they're looking at as they go from grad student to postdoc, and from grant to grant.
If I had to guess, I would say that it's not biological age, but time spent in a particular field that suggests how big a breakthrough you can make, because you get stuck thinking in a particular way. If young people have an audacity that lets them tackle new problems, we might focus on how to preserve that audacity as we get older by making it easier to switch fields.
> The question we should be asking is why talent doesn't seem to increase with age as much as we'd like.
This question has been answered many times before. People are re-solving the exact same problems with a new framework, platform, or language every two to five years, sometimes less. The "solved problems" never seem to stay solved.
Alan Kay had a good point when he said the last 25 years (now 35 years) have been more about pop culture than actual engineering[1].
I don't buy any of that. Look, there's a known phenomenon here. It's called "youth culture," brought to you by cigarette advertisers in the 1960s, and it's now being continued and expanded by the present class of online media companies. As "Generation Like" on PBS Frontline outlines well, this is the continuation of a belief system with no demographic support whatsoever.
There's no empirical evidence for it. Yeah, there are a couple of half-hearted graphs of Einstein-level breakthroughs. But is that really the model here? Some guy builds a masterwork, a once-every-100-years theory; it's a bit much to expect him to top himself.
If we look at other fields, like painting, composing, or writing, the results are more mixed. Software reminds me more of composing, really. Composers last forever.
Folks, it takes a long time to get good at this. 30 years before the mast and I'm still gaining speed.
You say talent; I say a lack of tolerance for bullshit and asshole-driven development (ADD). ADD is the boss who waits until deadlines are nearly here to drop work on you. (He said he does it because he thinks last-second pressure creates better work. I think it creates unnecessary stress and sloppy work. I quit that job with a quickness.) Or perhaps it's just the willingness to stand up and say, "Your bad planning is not my problem." Of course there are real emergencies occasionally, but the vast majority of last-second panics are completely preventable by competent people. Long hours shouldn't be a regular requirement.
I have a theory. Competence increases with age. However, there's a fine line between older people not doing as many stupid things, versus older people not taking as many risks.
The perceived "talent" or "smartness" of younger people in the valley is simply the result of them taking more risks, which inevitably sometimes pay off. And risk-payoffs in the valley are disproportionately large at the high end, which skews the balance toward taking more risks.
For my theory to be true, Silicon Valley startups would have to be riddled with god-awful, marginally working code in the average case.
>However, there's a fine line between older people not doing as many stupid things, versus older people not taking as many risks.
Yes, I can back this point up. I feel it.
I used to crank out demos in crazy short periods of time. Now I take longer, but when I'm done it's code with really solid engineering that I wouldn't be ashamed to continue using.
Because that "crank it out quick demo" always gets used. And you always end up suffering with it for months (or years).
And despite feeling like I'm taking "longer", I still tend to get things done faster than most developers.
In my opinion, it's just the free time to put in the hours of deliberate practice. When you're younger you have far more time and flexibility to learn and experiment. In addition, that time is spent on things that are seen as new/better/modern, and comes with a certain disdain for doing things the way your predecessors did. This mix gives the young person the flexibility to cut corners and invent/reinvent, and by all appearances look 'smarter'.
I am now 30 with a wife and 2 kids - I simply cannot put in the same number of hours of playing and practicing that I did when I was 15-25. As much as I try to put in the hours, I need to face the fact that I need to focus my time on them now. I also need to continue to earn a good living to give them everything I want for them (education, freedom). This means I am going to keep my 150k job doing things which may not be so innovative, but I get to leave work at 5pm.

Whether that is an incorrect assumption I don't know, but from my perspective it feels true.
>The question we should be asking is why talent doesn't seem to increase with age as much as we'd like.
Wrong question, I think. Talent does increase with age, at least if the developers in question actually care enough to stay current. [1] I don't think it gets better by orders of magnitude, though -- the mythical 10x-20x programmer starts out as a 5x+ programmer. I know I was already at least that much better than most of my coworkers when I got my first day job.
What does happen is that developers are less willing to work crazy amounts of overtime just to impress management, and it's overtime bias that's the real problem.
I just blew away a team of two developers (both much younger) who had been working on a project for 9+ man weeks, catching up and passing them in less than two calendar weeks of development (on an implementation of the same code on another platform). And I did it in 30-40 hours per week. At a "traditional" company that rates performance by the number of hours each developer sits at their desks, I would have been the worst performer, and it's only the fact that I was doing almost exactly the same thing as the other team that made it clear I was getting more work done in less than half the work hours.
The corollary is that, as a young programmer, I cost the same as an idiot who couldn't put two lines of code together without creating three bugs, but I was already at least 5x as productive as that idiot (well, infinitely more productive than some developers I knew back then...). But now I have a track record and I make a lot more money; if I'm as good as 3-5 typical programmers, 3-5x the average starting salary is a bargain (considering reduced overhead, management, and increased chance of project completion because of reduced complexity).
But most companies don't recognize that, so they'd rather hire and manage 3-5 young developers for "cheap" who are willing to work lots of overtime. If they get at least one who's good, they may get more than their money's worth. Otherwise they're likely to end up with a project that's late, has tons of preventable security issues, and that is harder to extend than if they'd hired the expert.
That's what's broken. As to how to fix it: Would be wonderful if there were some verifiable way to rate programming skill, but until someone comes up with a system that can't be gamed, I don't see how.
[1] I've known many developers who effectively "give up" on learning new things at some point, and from that point on their skills rot (and/or they become managers). The ones who keep learning don't seem to hit a "skill wall", at least into their 50s, where my sample size drops precipitously.
I think this more or less nails it. I'm not a programmer, but I have hired a few in my day. The best hire I ever made was straight out of high school, and this kid blew away developers of any age by 3-5x, easily. He's now 15 years into his career (long since moved on - I can't afford him) and working for Google.
He's going to be great when he's 60 years old, because he's great at what he does, passionate, and lives for learning new technology.
The only difference is that yes, he will not be working those crazy hours.
You do have one mistake in your post, though. While yes, a single developer who is 5x more productive than another programmer is worth more than 5x as much, for a small team that may not matter as much as you think. As a business owner, I would want to hire you, but I also need another developer who can "keep up" so I have business continuity should you quit, get hit by a bus, or something. This is worth a considerable amount, and I would not be willing to hire you vs. hiring 3 lesser developers to get the same job done. It would be too much of a risk.
>This is worth a considerable amount and I would not be willing to hire you vs. hiring 3 lesser developers to get the same job done. It would be too much of a risk.
This is a fair point, but consider that having the expert work on it to start with, even if the expert leaves at some point, would mean you have a far more maintainable code base than if you'd started with three junior developers.
Probably the best answer is a compromise: Get the expert to put down the "bones" of the project with the understanding that they would commit to training a (less expensive) project maintainer (or a team of them). EDIT: Also have the expert screen your maintainer(s) so you don't end up with idiots. :)
The value of having good code to start with can't be overstated. There are times when I've come in and told people that they should start from scratch rather than try to maintain the pile of garbage code created by junior developers.