I'm not sure I'm buying that. The JVM? Still rocking it 20 years later. Memory allocation patterns? Still there. The network stack? Well, it doesn't seem to have changed a lot.
The older guys seem like they had the time to properly learn the Unix network tools, the JVM debugging tools, the memory inspection tools. I've known older devs for whom I have the utmost respect, because I felt like they could just debug the shit out of anything happening on a computer, with tools I don't even know of but that have been around for decades.
And here I am, needing to google how to use tcpdump or jstack, and scrambling to correctly understand the results.
I agree that new tech keeps stacking up, but I feel like it's pretty damn hard to catch up on the old tech that is in fact still very relevant and important, because it's no longer taught, no longer a meetup topic, no longer hype. And the new stuff is really easy to learn when you realize that it's 90% a rehash of old concepts. (Observables are all the hype in JavaScript? Well, great, I learnt that pattern 15 years ago...)
Admittedly, I have no clue if management realizes that!
But the impact of this knowledge, from what I've observed, is really huge on productivity, and it's especially a boon when production is on fire or generally when tricky stuff happens.
So, respect for the elders, and please come teach at conferences and meetups; we need more wisdom and less hype!
Yeah, you hear all these horror stories about how developers have a "shelf life."
In the meantime, though, every single older developer I've met has been extremely knowledgeable and frankly much better at the craft than myself.
Look at any hobby -- say, guitar playing. The difference between someone who's been doing it for 4 years and someone who's been doing it 20 years is ridiculous. They're so much better. I feel software dev isn't that different.
The only problem is if you're older you do need to follow new technologies. Older devs I've talked to who were pushed out of the field are the types who worked on, say, mainframe technology well into the mid-aughts and never thought about learning anything new.
As long as you get off the sinking ship of technologies that are obviously waning (and you should have at least a decade of warning in advance, I think), I don't think it's honestly that hard to stay current. You don't have to follow every fad, but make sure people will continue to use your main programming language.
Also, startups discriminate on age because they need people who can work until 2 AM and drink their Kool-Aid. But there are a lot of stable, non-startup jobs out there. People who have never left the scene probably think of it as death, but there are actually some really nice enterprise shops out there with great developers.
One thing I've noticed is that there seem to be a lot more older developers and engineers at hardware-oriented companies.
I currently work at Wing Aviation, the Alphabet drone delivery company. We were originally located in the Google X building, which is the old Mayfield Mall in Mountain View. I used to shop at that mall in the 1970s!
X is full of hardware startups, and when I started there I noticed one thing right away: for the first time in a while, I was not the oldest person in the building.
Now that we've moved to our own smaller building, I may be the oldest again, but not by a wide margin. There is a lot of gray hair in our teams.
That's probably not a coincidence. The cost of fixing a hardware/firmware defect is much higher than that of a software defect. I'd definitely want to put my best engineers on the product that can't easily be fixed once it's shipped rather than on software that can always be patched.
> I'm 67 and have been programming for 50 years.
Also, that's really impressive. You've probably forgotten more than I've ever learned!
Related experience in the opposite direction: I'm a relatively new software engineer that works on a 13 year old product. It's fun to see a particular commit and think about what I was doing in middle school, high school, or college at the time it was committed.
I mean, I feel like more programmers were working closer to the metal back in the day.
In the 80s you needed to understand computer hardware much better and there was a bigger fraction of C++ developers. Now all those people have migrated to hardware (or have kept doing it) because that's where their skills are needed.
My experience has been that many younger engineers don't want to work closer to the metal as much. They want to work on the bleeding edge, sexier technologies.
I would enjoy working close to the metal in some capacity, it's just not really what the majority of modern tech businesses require, and moving in that direction is likely not worth the sort of career reset it would require.
I've noticed that most dev work is "full-stack developer". This usually means that you'll be bouncing between back-end (python, rails, node, go, w/e), front-end (react, css), and possible devops.
I'm not sure how satisfying this is. You probably don't get much chance to dig deep and solve tough problems, or craft as you would in a hardware-oriented job.
I've never worked in hardware though, so I'm quite possibly completely wrong about their day-in-the-life. Just rampant speculation.
> As long as you get off the sinking ship of technologies that are obviously waning (and you should have at least a decade of warning in advance, I think)
You also don't need to get in on the ground floor. Every technology boom creates next year's legacy apps; there will be React work for years after the next tool gets popular, to take one example. The article touches on it, but you can make a good living out of consulting work on older technologies.
Another angle is moving into non-tech domain specialization. If you become an eCommerce consultant, your value proposition isn't knowing React/Shopify/Demandware etc., it's knowing the eCommerce ecosystem and domain. I know devs who specialize in agribusiness, automotive software, construction, energy infrastructure and such. They build solutions to problems in those spaces that happen to be software. But they get hired based on their time in those industries, not their current tech stack.
> every single older developer I've met has been extremely knowledgeable
To be fair, this observation could be massively influenced by survivorship bias - more talented devs tend to be the ones still programming in older age.
Indeed. My previous job was very Java-focused but wanted a more senior man on the team. So management hired a developer with 20 years of experience in C and Fortran.
Now in my country, when you hire someone, a probation period of two months is very common. In that period, you can be fired and/or leave without cause.
Unfortunately, he was not able to pick up object-oriented programming in those two months and left before the probation period expired.
That argument applies in the other direction as well: this is a relatively young industry that has seen big industry-wide disruptions at least once a decade since it became its own industry. Companies with young median and mean ages may have been more likely to survive (or evolve out of) some of those big early disruptions, but that may not be a long-term equilibrium for the industry.
Keep in mind, too, how much the 80s, 90s, and 00s median/mean ages were affected by "flash flood" millionaires who could retire early simply because of how much money was thrown around in the various "software revolution" and Dot Com booms.
I seriously doubt the number of instant millionaires was significant enough to affect the total numbers. It's not like 25% of programmers became rich enough to retire; and if they did, I'm doing everything wrong.
It was always a matter of luck. If there were ways to predict such a lottery ahead of time, I think the industry would look quite different today.
As with flash floods of rivers, the coverage of such events was complex. It probably affected some companies a lot more than others.
Anecdotally, it used to be an aphorism at Microsoft that if you hadn't made your first million by 30 (or was it 25?) you were doing something wrong. Certainly the demographics at Microsoft showed several clear waves of early retirements from stock booms and bonuses, and for a ~forty year old company the median age is still staggeringly young today, even accounting for industry ageism.
If it wasn't clear, I don't expect those flash floods to happen again; they definitely seem to have been flukes of luck. But I think it shouldn't be ignored that they had an impact on industry demographics.
At the same time though the industry has an age distribution that reflects the arc of the industry more than the selection of the workers.
I entered the field in the mid-90s and saw the industry explode at the end of that decade. So as all of those nascent developers looked around, they noticed how few older developers there were and said "boy, this really is a young person's profession!". I remember all of the "before you turn 30, get a plan B" articles back then, with everyone fearfully looking into management or dubious project management paths.
And as that cadre aged, of course, most new workers were younger, so proportionally the older ones dropped from a majority to a minority. But there are a massive number of very gainfully employed, successful older developers, and it has certainly been normalized far more than it was back then.
The guitar playing is a good analogy here. Say you've played classical for years and dabbled in rock and some jazz. Like any musician you've played around a little but probably focus on one style. Your best friend asks you to play flamenco at their wedding.
Sure, you won't be as good as someone who's played flamenco all their life. But if flamenco had been invented 5 years ago, you're sure going to pick it up fast and probably be better than someone who has only been playing it for the past few years.
Skills translate. Transfer learning is, unsurprisingly, a real thing. The funny thing is that a guitar player will pick up a brand new instrument and quickly catch up to and surpass someone who has only been playing that instrument for a few years. Even though each instrument is a different "language", per se, there are common patterns in the language tree of music. I don't think any programmer worth their salt would disagree that this is also true for programming.
No matter the language you use, there are common patterns. Someone who has been programming for years generally picks up a new language quickly (because of this). Certain languages will make you a better programmer in general, too (low-level languages help you understand what's going on behind the curtain).
So should a programmer with 30 years of experience in C and 1 year in Rust be paid more than the programmer with 5 years of Rust experience? Absolutely. For those 30 years they were learning to program, to debug, to solve problems. The skills translate. And I say this as someone under 30: I've seen the wizards solve problems in languages they've never used, because they just understand programming.
>No matter the language you use, there are common patterns. Someone who has been programming for years generally picks up a new language quickly (because of this). So should a programmer with 30 years of experience in C and 1 year in Rust be paid more than the programmer with 5 years of Rust experience? Absolutely. Those 30 years they were learning to program, to debug, to solve problems. The skills translate.
Tell this to HR/recruiting departments.
I'm a highly experienced C developer with some Python and JavaScript experience on the side, yet I can't get any jobs using those, as HR/recruiters just filter my resume out every time due to "insufficient Python/JS experience".
HR has no idea how programming skills and experience translate across languages; they're just trained to filter people out based on buzzwords.
On the other side, everybody is looking for people already versed in the languages they need right NOW, and nobody is going to take a chance on someone proficient in other languages in the hope that they'll master the new ones soon enough. Too risky for business.
I think that's much more true of large companies than small ones. At a small company, the engineers are more likely to be involved in the hiring process, and (unless they're very inexperienced themselves) they'll have a better sense for how skills translate.
For example, on my team, we're writing most of our new code in Kotlin. But even though the language has now been around for the better part of a decade, we regard prior Kotlin experience as only a very slight nice-to-have when we're evaluating someone. If you've programmed in Java or Scala or C# or C++ or pretty much any other statically-typed language, you'll pick it up quickly enough that the time to get productive in Kotlin will be dwarfed by the time to get familiar with our code base.
What we do usually filter out, though, is monoglots. If you have 10 years of experience and have only ever done, say, Ruby on Rails development, you will probably not have the breadth of engineering perspective to succeed on our team. But if you've done RoR as well as something else that's dramatically different, that's fine. We would probably even filter out a Kotlin monoglot in the unlikely event we ever came across one.
You would definitely pass a resume screen at my (large-ish) company. We don't screen people out for lacking specific experience in our tech stack. Don't get me wrong, we like experience in our tech stack, but with 30 years of C, we wouldn't care what else is on your resume.
I don't think many employers are interested in training anyone at any age anymore. Margins are often too lean to dedicate productive resources to training. When switching jobs, I always emphasize the aspects of previous jobs and accomplishments that would be most relevant to the employer (eg, delivering on time, under budget, reduction in reported defects). Having projects in GitHub or a portfolio also speaks volumes.
I like this analogy for a separate reason. Many musicians can relate to hitting a ceiling in terms of ability. You might have X years playing any instrument, but depending on talent, you may have plateaued for some period of time. It is highly likely for anyone that has stayed in one role (in a mature industry) that has not actively sought extra-curricular technology to experience this.
But plateaus are local. Usually they happen because you don't know how to progress anymore. Once you figure that out, you jump ahead again. I think with age it's easier to become complacent and just say "this is enough", and there's nothing wrong with that. But I think it's different from hitting a real ceiling. You're not at your peak.
Also, most languages are not sinking ships. Certain uses of them are, of course. But even languages this site hates, such as C, C++, C#, and Java, have very long lives ahead of them for certain uses. Obviously, C++ vs Rails... well, you're going to be out of a job. But if you were using C++ for the sorts of things you would have used Rails for in 2009, you were already a decade or so behind the curve.
I doubt it. C++ has come a long way in the past decade and still has a lot of momentum.
As for rails? Admittedly I'm in an anti-ruby bubble... but the Ruby developers I do know definitely don't use Rails anymore. Ruby seems to have matured into basically just Chef/Vagrant in my bubble and any Rails apps are being deprecated in favor of the bubble-biased languages.
Rails developer here. It's still quite easy to get a Rails job; maybe not for Google, but easy. There's much less hype but also less competition over jobs (many young developers come with Python/Nodejs backgrounds nowadays, many senior Rubyists left for greener fields).
The shelf life is due to a business decision, where the knowledge the older person has is deemed insufficient to justify the compensation they've grown used to.
Sure, you love working with them and it's better for engineering, but the business wants college grads they can pay less, even if they don't do the same quality work.
Importantly, this doesn't have to be logical or good for the business. It just has to make sense in the context of quarterly targets.
More significantly, you can probably do things that no amount of entry-level engineers could do. The question is usually whether your (prospective) employer benefits from any of those things, whether they realise it or not.
As a former software guy, life long computer hacker, who moved to desktop support for years, I ran into new people hired as programmers who had to be coached to hit Ctrl-Alt-Del properly to log into their newly deployed laptop.
I'm working on hiring back into software roles now... I figure if the bar is that low, I can rock it out if I can just convince anyone that I can still code.
I keep hearing this but sometimes you just need hands on keyboard to push things out fast. Yes I can do a passable job at front end, middleware, databases, and I'm pretty adept at cloud infrastructure and devops. Does that mean I'm as valuable as 5 people? I can only do one of those at the same time.
It depends entirely on how much autonomy and motivation you have. I have to do all of that for my own modest product, but it's only viable because I have complete autonomy to decide what the most efficient approach would be... and I enjoy it.
This isn't typically how it plays out when you start a new job for someone else. Which means they won't reap the rewards of your experience until enough trust has been built to give you the autonomy to make it happen (this may never happen). So it takes time, and not everyone wants to take that time or can see the benefit, and arguably in the startup space there may not be enough runway for it either.
It’s not about autonomy. I work for a small company where I have a reasonable amount of control of how I implement anything. But, I can only do one thing at a time. At a certain point, you need more people to get anything done in a reasonable amount of time.
This may be true, but your company always wants the most business value, not the best/most elegant/most maintainable code. If you can deliver your boss's business interests at even 1.5x, I think you will have a long career.
The myth of developers not being able to keep up as they age seems to be mostly propagated by two types of people: 1) managers who want to keep wages low, and 2) young developers who try too hard to prove themselves and don't want to listen to or work with older developers. [0]
[0] Note, I'm not saying all young developers or all managers. Only a small subset of each.
> As long as you get off the sinking ship of technologies that are obviously waning
But then you would have to compete with young people starting out in the same technologies. And they are more naive, more easily manipulated and exploited; corporations love that and prefer it to 20 years of experience in a field they can't even understand could be important to all the new technologies.
> In the meantime, though, every single older developer I've met has been extremely knowledgeable and frankly much better at the craft than myself.
While this is true for you, there is a lot of selection bias and survivorship bias involved.
With these older developers, you also need to consider the comrades they were working with 20 years ago. For every older dev you work with today, there are 10 former buddies of theirs who decided long ago they didn't need to advance beyond VB6, or IBM mainframe assembler, or RPG/3.
Or they hit the stock lottery jackpot and have retired, or moved into management, or marketing, or started a sandwich shop.
But I do think there's a bit of a generational shift that the stereotype of the outdated engineer hasn't caught up with. When I started out in the mid-90s, anyone more than a little older than me didn't have a CS degree and had come up in a world without source control, continuous integration, automated testing, iterative development processes, open source libraries, etc. So in some ways they were from a different planet than the next generation. Although the tech du jour still changes quickly, programmers in their 40s these days have a comparatively small gap when it comes to education and development process.
I am at a startup just now, and they are complaining about frontend devs because:
- they don't hang around
- they don't want to touch the old frontend written in Dojo, just the React stuff.
I'll do whatever (I'm officially an 80% backend dev who can do enough frontend when needed). The Dojo way of writing things actually seems a lot nicer than React, but it ain't cool.
>> In the meantime, though, every single older developer I've met has been extremely knowledgeable and frankly much better at the craft than myself.
This may seem strange from where you sit, but trust me: if you stay in the field that length of time, you will probably become that guy. You don't need to set out to be him. The only thing it requires is time and a refusal to stagnate.
I think a lot of the ageism has more to do with cost than skill. Older developers ask for more pay, negotiate more, and are less willing to put in uncompensated overtime.
The way to get past this is to make sure you develop skills that are exotic and high-end and hard to find so you can't easily be replaced by a cheaper college grad.
I went "full software" back at the age of 36. I did this because I saw the writing on the wall for systems/devops and being a half-developer, essentially from writing scripts to full OOP.
I put a ton of time and thought, and effort into the change. It wasn't easy to complete the transition and get hired, but I settled on C#. Chosen because C# and Java are similar enough (my only formal education in programming was in Java), and I could move between the two, if I had to. I also saw MS driving the future of .Net really hard, even after about 10 years before it looked like there was no future and MS was going back to unmanaged code. I also wanted as stable of a development platform as possible, an attempt at avoiding churn with all the kiddos in the web space. I'm still working fulltime as a C# dev and enjoy it. But if anything ever happens, I'll jump ship to the Java space. Both the city I live in and less-populated state that I'm from have copious amount of C# and Java jobs available, so I'm not on something so cutting edge that I have to live somewhere specific. It's my opinion that the best balance for most software projects was nailed with Java (and then it's descendent, C#). There's a reason it's so popular and they're not bad reasons.
My dream job is to come full circle and be "the IT guy" at a small company, handling software and systems as a one-man operation. That can be done in the MS space, as less expertise is necessary in general on the systems side to manage the entire stack. Even less so with Azure.
Will I be pushed out? I'm sure. I'm expecting less of a shelf life than the old hats had, because so many more people are raising their children to be ready for this field. Also, outsourcing/visa migration has been fully implemented to diminish the industry for workers since about 2000. The industry that was supposed to be the next buoy of the American middle class has been successfully undermined by American hypercapitalism, greatly reducing prospects and the future even for my generation, the golden generation of children that grew up on a Commodore: a special crop of astute technologists raised when we had personal computers at home, but before iOS collapsed the barrier to entry and nullified the mystery and effort required to get to entertainment.
OA> Herein is the source of the problem. The more irrelevant experience a candidate has, the more lopsided the utility/value equation becomes…
No, the source of the problem is the idea that 10 years of “C++ experience” is “irrelevant” to a project using Rails, as if development experience is somehow locked to the language you happened to do the development in.
Being a valuable senior developer is about 10% what technology you know and about 90% how good you are at working in a team to solve a business problem with a machine, and those fundamental skills haven’t changed much since 1960.
As you point out, the more experience you have, the more you realize that all these “different” technologies are mostly the same wine in different bottles. Or, as I like to say, “it’s all just software”.
I worry the real problem is not that managers are looking for someone who can do the job, but someone they can exploit.
The gleeful exuberance over 'new' things is something you can use against a younger programmer.
The older one who knows the vintages doesn't get as excited, because it's really not that exciting. You haven't discovered Shangri La. We've kinda already done 90% of this before, just maybe not all at the same time.
But there's good exciting and bad exciting, and it takes a while to learn the difference (I work with a bunch of people who apparently have not).
What does bad excitement look like? War rooms, for one. But they deliver the same dopamine hit as Stockholm syndrome, and so the battle-scarred have bonded, while the rest of us are doing everything we can to stay out of war rooms (I haven't been in one in well over a year).
I've started calling these people BASE jumpers, because they're adrenaline junkies. There are not so many older BASE jumpers (I hope it's not because they're all dead). I think you learn to value other things besides adrenaline. Like teamwork.
Maybe the problem is the big tech producers that introduce new technologies over and over, that don't raise productivity but create churn, force software rewrites, etc.
Tech history, looked at from a 40-year perspective, really seems like reinventing wheels over and over. I'm not sure there is another industry with so much churn and yet so much rehashing of old ideas.
"It used to be the case that people were admonished to "not re-invent the wheel". We now live in an age that spends a lot of time "reinventing the flat tire!""
That's just the startup world. There are a million chill software jobs out there. If you get sick of it, save up a bunch of money and move to Dallas or Atlanta or any other large-ish city and find a quality enterprise shop.
I think a lot of problems some older developers have is that the culture has shifted. Maybe I'm off the mark, but I feel like older devs come from a world where programming was all about programming and it was less a formal career than it is now.
To succeed in programming, you have to work well in a business. You can't be Linus Torvalds, walking around with a big ego, unless you're so important it's hard to get rid of you.
You have to work with people well, and some of the older people I've worked with seem to understand this less.
Again, not an insult to older devs. Plenty of them are wonderful. And I could be completely wrong. Just an observation.
I agree with your premise but not your reasons. Personally, I'm a programmer; I know enough about business to know I don't have good insight into it.
Everyone I talk to doesn't want a programmer in the older sense. They want people to integrate large external packages. They don't want to test; they want to prop something up and see if it makes money. They don't want to talk about architecture and features and make plans; they just want to respond on a dime to what the market is saying.
Without any value judgement: I personally don't have a lot to offer in that environment, and to varying degrees I think the hiring managers understand that (sadly, quite a number don't).
Well, that's the thing. Most jobs are 80% stuff you don't want to do and 20% stuff you do. You're not being paid to do your hobby, you're paid to make money for the business and do what your bosses say.
There are many places though that understand writing good code, even if it's boring, is important, and any decent programmer can go fish until they find a workplace like that.
Having a solid older developer around is important even if most of the work is boring. You're there to help the younger folk from making costly mistakes.
I recently started working on a project that the company had outsourced to a WordPress shop, despite it not being a WordPress site.
Fast forward a few years: no one fully trusts the software, so everything it does gets checked/verified by a human, and they're actually hiring people because the workload is too much.
So they brought it in-house, which is where I come in.
And let me tell you, this codebase is horrific. To give you an idea: they needed a progress bar and ended up writing a finite state machine (FSM) with the transitions between nodes being HTTP redirects. Now, the developers who worked on this wouldn't know what an FSM is if it bit them in the ass, but that's what they built. And the data for detecting whether the background processing was still running was duplicated in 3 or 4 places, which means you would get states in this FSM that disagreed with each other about whether or not processing was finished. These pages would literally "argue" with each other over it by redirecting back and forth until the browser threw a too-many-redirects error.
I came across some code yesterday in which the developers had built up a string of multiple INSERT statements. They then split on the string "INSERT INTO". That split removes the "INSERT INTO" from each piece. They then looped over the resulting array, prepended "INSERT INTO" back onto the SQL, and called the DB one statement at a time. I don't think these guys realized you can separate MySQL statements with semicolons.
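For anyone having trouble picturing it, here's a minimal sketch of the anti-pattern being described, with a hypothetical `db.query` helper standing in for whatever MySQL client the shop actually used:

```typescript
// Hypothetical stand-in for the shop's MySQL client wrapper.
declare const db: { query: (sql: string) => Promise<void> };

async function theirWay(): Promise<void> {
  const batch =
    "INSERT INTO jobs (id) VALUES (1)" +
    "INSERT INTO jobs (id) VALUES (2)";
  // Splitting on the delimiter also strips it from every piece...
  const pieces = batch.split("INSERT INTO").filter((p) => p.length > 0);
  for (const piece of pieces) {
    // ...so it gets glued back on, one DB round-trip per statement.
    await db.query("INSERT INTO" + piece);
  }
}

async function saneWay(): Promise<void> {
  // MySQL separates statements with semicolons (the client must have
  // multi-statement support enabled to send them in one call).
  await db.query(
    "INSERT INTO jobs (id) VALUES (1); INSERT INTO jobs (id) VALUES (2);"
  );
}
```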
My point is that the codebase is horrific, written by someone who was used to the style of development you're referring to, and they made an absolute mess.
BUT
To your point: the company realizes how bad the software is, and I get a surprising amount of respect when talking about what needs to happen. The result is that we're making great strides toward improvement, and despite how horrific the code is, I find myself enjoying the work, because it lets an aspect of my skill set be respected that sometimes gets ignored because it's not forward-facing. Good software dev is boring as shit, so when you're doing it well, no one notices.
Also, this post got me thinking: I should submit that SQL snippet to The Daily WTF.
> To succeed in programming, you have to work well in a business. You can't be Linus Torvalds, walking around with a big ego, unless you're so important it's hard to get rid of you.
I keep running into the inverse of that at companies: people who put on a smile and a nice demeanor but then fail to do any actual work, and subsequently lie and politic to cover that up, leaving the rest of the group to play cleanup and cover. I've always loved working with the Torvaldses: you do your work and all is well.
In companies that I've worked in, the traits listed would (very roughly) distinguish a junior from senior developer.
There's various ways of describing these levels, but I think programmer/developer is less clear, and seems to suggest programmer almost as a pejorative.
> Being a valuable senior developer is about 10% what technology you know and about 90% how good you are at working in a team to solve a business problem with a machine, and those fundamental skills haven’t changed much since 1960.
That's true. But if you haven't learned that in 10-15 years, there is something wrong. It also means the difference between someone with 15 years and someone with 30 years would be marginal.
I'm going to disagree. For every year I've spent doing something, I've gotten better at it. While there are definitely people with "1 year of experience repeated 30 times", I can't believe that you hit a point in software engineering where there is nothing material left to learn, or where everything left to learn has marginal benefit.
There are things left to learn, but how many of those things are valuable to an employer who is just wanting yet another software as a service CRUD app, mobile app, or bespoke internal app that will never be seen outside of the company?
It’s practically a trope that every time this subject comes up, the top-rated comment is a skeptical response...from a junior developer.
As someone who actually has worked as a programmer on the far side of 40, let me assure you that yes, older programmers have value. Your perceptions are not wrong. But that said, the article is right. The people who do the hiring and firing do not care about what you care about.
As the article noted clearly, the marginal benefit of an older engineer has to exceed the marginal costs. And if we’re being honest, the fact that a graybeard can use tcpdump without reading the docs carries little marginal benefit. Guess what, kid? You’ll figure it out, and you’ll do it quickly enough that the total cost of your learning won’t really compare to the cost of hiring me.
That’s why you see lots of anecdotes about the value of older engineers, but lots of articles from older engineers who know that the discrimination is real. And this is coming from someone who has managed to “stay relevant” a lot longer than most software devs.
The thing is as startups mature and become enterprise level shops, they need people who know how to work with enterprise software. Guess who has all the experience with that? Yep, the people with 10+ years experience. My company is hiring very senior devs like crazy right now for exactly this reason.
I think there are some orgs that definitely benefit, but it's an 80/20 thing: 80% of software shops are more than content to hire junior devs and live with the costs of their mistakes. (Especially because most of those costs never really get experienced by the people who make the decisions.)
I literally wrote that I don't know whether management values what I value.
I also never wrote that the article is wrong; I wrote that I didn't find it convincing.
I'm simply observing that after more than ten years of developing and learning the craft, I still find myself unable to properly know the old tools; instead I merely have an idea that some of them exist, and I'm sometimes able to google the correct one.
And so, I'm stating that when I see a graybeard with actual knowledge of fundamentals that I still totally lack, well, I respect that, and I _hope_ that he's not unemployable, because if he is, I fear that our industry is doomed to produce ever more buggy software due to a lack of basic understanding.
And I think that, faced with no evidence from the article or from you, it's well within my rights to be skeptical.
I honestly wasn't trying to be condescending to you. I even said, right at the very beginning of my comment, that you're right about many of your observations. My word choice may have been poor in some places, but overall I stand by the content.
You’re (self-admittedly) missing some of the life and business experience that comes with being an older programmer, and then expressing skepticism about direct advice communicated by older programmers. This is also so common as to be a meme:
Person A: “I experienced $something_rare at work”
Person B: “I have never experienced such things. I am skeptical of your claims.”
So sure, you can call it whatever you like: skepticism, disagreement, or debate. But when people tell you about their experiences, and you reject those experiences because you don't have "evidence"... well, eventually you're just confirming your own biases.
Again, it’s not that you’re wrong - I actually agree with your perceptions of older devs. I just think you’re missing critical life experiences that connect your observations with the arguments being made in the article.
> It’s practically a trope that every time this subject comes up, the top-rated comment is a skeptical response...from a junior developer.
I looked at the top-rated comment for the other times this exact same article was posted on Hacker News. For none of those was the top-rated comment "a skeptical response...from a junior developer".
Actually, it's more often (https://news.ycombinator.com/item?id=16934500, https://news.ycombinator.com/item?id=9361580) from a self-declared senior developer. And none of those top comments really corroborate the article. Talk about confirming one's biases...
But taking into account all the anecdotes from senior devs available just on Hacker News, there's no consensus emerging on whether the article makes sense or not.
So, at this point, I don't really feel the need for a plan B, even if being 40 years old is not really that far away for me. And it seems from the shared stories that whether a plan B is needed really depends on personal experience; it is not generally applicable advice. I certainly hope I'm on a path where I'm honing my skills well enough to keep being paid to engineer software after I'm 40.
Well, sure: you disagree, and you’re seeking out arguments that confirm your existing beliefs.
In any case, I've said what I have to say on the subject. I can't make you listen to my direct experience, but I'm not going to spend time arguing with you about it.
I had a recent conversation with someone who was into electronics & programming 40+ years ago as a youth. He has had successful careers in other, unrelated fields. He recently was messing around with a Raspberry Pi & remarked how little things have changed & how easy it was to jump back in.
I strongly believe an employer is very short-sighted if they are more concerned with a specific framework than with knowledge as a whole. I agree, learning a framework doesn't take much time. They all borrow each other's patterns & ideas.
If your needs are urgent for a person to hit the ground running on a specific framework you should probably hire a contractor. If you want an employee you should care more about people & project skills, plus overall programming/IT knowledge.
I'm always mirin' my current boss when he opens Midnight Commander and figures out the answer to his question by looking at the hex of the binary.
I needed to change a string in a compiled binary, and as I was pulling the project down to change and recompile from source, he suggested we just patch the binary, and we did it in 1 minute with hexedit.
I really don't buy that skills don't transfer. Where that actually is the case, it must be people getting too locked into abstractions and never really understanding the essence.
What's funny is that it's already getting dated. I remember it was ubiquitous online in the forums I frequented in 2010 just like Urban Dictionary's graph shows. And that is already a decade ago (yikes)!
> I'm not sure I'm buying that. The JVM? Still rocking it 20 years later.
Java may not be my favorite language, but I respect how the language has both progressed and maintained its heritage. It allows the developer to build upon their skills rather than relearning nearly identical skills every time a new technology comes out.
The problem is that this 70-year-old industry still behaves in very immature ways. While this may be acceptable when it comes to attitudes towards technology, the unfortunate side effect seems to be the adoption of similar attitudes towards people.
Every time someone says something like, "young people are more in tune with the current state of technology," or, "old people are more capable because they have more experience," they are legitimizing prejudice. That's true even when the statement is being made in an attempt to combat prejudice since it is applying generalizations to individuals.
Granted, it is very difficult to shed generalizations. Just look at my comment about the immaturity of the industry. That isn't universally true. Some companies are going to have a younger workforce, some older, some heterogeneous. Some companies are going to have old people working with old technologies, some are going to have young people working with young technologies, but some are going to have the young with old technologies and vice versa. Yet the generalization comes about because of how we frame the industry, which is one where the new replaces the old.
Which brings me back to why I respect Java: it looks both towards the future and back at the past. It is a language of growth rather than a language of revolution. Perhaps this inanimate technology has something to teach us about how we regard people: we should be encouraging growth and advancement rather than treating people as disposable.
> I know you love programming because you like technology, so this may go against your very nature, but no one says you’ve got to jump every time some snot-nosed kid invents a new way to run byte-code. You have invested a lot of time and energy mastering the technology you use, and your experience differentiates you. Money follows scarcity, and snow-birding on an older technology, if you can stomach it, may just be the way to protect your earning potential. The industry turns on a dime, but is slow to retire proven technology. It is highly likely that you will still be able to earn some decent coin in the technology you know and love even after a few decades.
Yes. But as the article said, there is a diminishing amount of value for each year of true experience after a certain number of years. I would say around 10.
I definitely don't believe in the "10x Engineer" (individual contributor); yes, they do exist, but they are so rare they aren't worth talking about. I do believe in being a force multiplier as a team lead/mentor.
On the one hand, sure, percentage-wise you learn less in year 20 than in year 10, because you already know a lot more in year 19. But that is no more true of this field than any other field. Is a doctor, architect, civil engineer, or auto mechanic with 20 years experience more valuable than one with 10 years experience? Heck yes.
20 years ago, much of what I use today didn't exist: there was no AWS, no C# (but C++ was close enough, I guess), no mobile development where you had to worry about semi-connected networks and syncing, etc. Meanwhile, there is no part of the human body that exists today that didn't exist 20 years ago.
You could argue the opposite is true: 20 years ago, there were dangerous misconceptions about how some body parts work. Many chemical pathways were completely unknown. Medicine and biology have both evolved a lot in those 20 years.
The big picture, however, is still valid. Concepts and paradigms hold for decades. We still use TCP/IP. Computers still use the von Neumann architecture.
Looking at the details, though, AWS is just a fancy GUI over time-sharing on a mainframe.
C# is just new syntax for concepts that are older than I am.
While mobile brings new challenges, you no longer have to deal with the challenge of customers connected through a 300 baud modem.
In 1986, I not only had to know 65C02 assembly language to get any performance out of my 1 MHz Apple //e, I had to know that I could get 50% more performance for every access to memory on the first page as opposed to any other page. If I spent time doing that type of micro-optimization today, I would be fired. Back then, I couldn't have imagined doing the types of things I can do today with modern technology.
In 1995, when I wrote my first paid-for application in college, the Internet was a thing for most colleges; I did some work on a HyperCard-based Gopher server (long story) that wouldn't have been possible 10 years earlier.
In 2006, I was writing field service software for ruggedized Windows Mobile devices. Architecting for semi-connected smart devices is a completely different mindset from terminal programming or desktop programming, and it wasn't feasible before hardware became cheap and at least 2G was ubiquitous.
Even then, what we could do pales in comparison to the type of field service implementation I did in 2016, when mobile computing was much more capable and much cheaper, you could get cheap Android devices, and 3G/4G was commonplace.
But people who think cloud computing is just “sharing mainframes” and don't rearchitect either their systems or their processes are how we end up with “lift and shifters” and organizations spending way too much money on infrastructure and staff.
Also, anyone who equates managing AWS to a “GUI” kind of makes my point: if you're managing your AWS infrastructure from a GUI, you're doing it wrong. 10-15 years ago you couldn't set up your entire data center by running a CloudFormation template or any other type of infrastructure as code.
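To make "infrastructure as code" concrete for anyone who hasn't seen it, here's a minimal sketch using the AWS CDK (my choice for illustration, not something the parent mentioned; the CDK synthesizes exactly the kind of CloudFormation template being described):

```typescript
import { App, Stack } from "aws-cdk-lib";
import * as s3 from "aws-cdk-lib/aws-s3";

// Infrastructure declared in code: `cdk deploy` turns this into a
// CloudFormation template and applies it, with no console clicking.
const app = new App();
const stack = new Stack(app, "DemoStack");

new s3.Bucket(stack, "DemoBucket", {
  versioned: true, // keep old object versions
});

app.synth(); // emit the CloudFormation template
```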
How has medicine evolved in the last 20 years? I don’t doubt your statement, but you make it sound like common knowledge, and from my point of view (average non-medical person) not much has changed.
All of which is just details, which are much less important than the fundamental skills of building systems with whatever people and tools are available. (And I'm sorry, are you implying no significant changes have occurred in the tools and practice of medicine in 20 years?)
Thinking that moving from on-prem to AWS, for instance, is trivial is how you end up with “AWS Architects” who were old-school net-ops guys, who only know how to do a “lift and shift”, and who end up costing clients more. They pattern-matched, thought AWS was just an implementation detail, and assumed they could set everything up like they would on-prem.
Just one note about 10x, because I often see people who don't believe in it have the definition wrong. 10x programmers are not ten times better than the average programmer; they are ten times better than the worst programmers. This is based on an actual study, and by the metrics the study chose, this disparity does exist. Of course, measuring programming performance is notoriously intractable.
> But as the article said, there is a diminishing amount of value for each year of true experience after a certain number of years. I would say around 10.
I don't buy it. 10 years is about when you start moving into the actual expert category. Note I said start.
At 35, I finally had real, full control over multiple languages, could pick up CLR and understand and implement any algorithm in it, finally understood exactly why concurrency was so damn hard and how to mitigate that, and would pass practically every interview with flying colors. I could finally drive my tools with some facility and started to realize gdb was my friend.
At 45, I can predict the errors I and others are likely to make and take steps to mitigate them up front, although I still get irritated when I make the mistake anyway. My comments are now psychic: my team often remarks how "I just thought that I could really use a comment explaining this, and, behold, there it was". I can reduce interviewers to tears and can surprise all but the most knowledgeable experts in their own domains. I reach for gdb far more often, but am still frustrated at how much I don't know about it.
I still only consider myself an "expert" in very few subdomains--none of them involving programming languages.
One of my heavy hitter software guys once said: "Your code is the most straightforward code I have ever read." I apologized for being so simple. His laughing response: "Don't apologize. That was a compliment, dumbass."
> At 35, I finally had real, full control over multiple languages, could pick up CLR and understand and implement any algorithm in it, finally understood exactly why concurrency was so damn hard and how to mitigate that, and would pass practically every interview with flying colors. I could finally drive my tools with some facility and started to realize gdb was my friend.
You may be an expert in multiple languages, but as the article said, if the company is looking for Ruby developers to write a CRUD app, they no more care that I spent years doing C than they care about the years I spent doing 65C02 assembly in the 80s.
No matter how many subdomains you are an expert in, if the company doesn’t need that experience, it doesn’t matter.
But some things translate much better than others: Django and Rails aren't that different (I'd also put Laravel in there). So I think the real problem is moving from a senior Django role to a senior Rails role and vice versa; I see no reason why a 10-year Django developer can't get a mid-level Rails job (other than blatant age discrimination, that is).
Isn’t that the point? That if you have 10 years worth of experience as Django developer, you aren’t as attractive to someone looking for a Rails developer as someone with 5 years of Rails development.
Simplicity is definitely a virtue. When I maintain the large codebase of a 10+ year project, it is easy to spot where someone tried to be very clever, and it is rarely a benefit for the codebase's maintainability and extensibility in the long term. Things rarely get reused enough to take advantage of an overcomplicated abstract solution. Most of the time, developers do not predict future business requirements correctly, and the simple solution would have been the better one.
Seeing someone fire up wireshark, sniff some traffic and solve a complex production issue in five minutes by looking at the raw packets never gets old.
> Admittedly, I have no clue if management realizes that!
Well, that's the key thing in many places: it's management that decides whether to hire and how much to offer, not you :)
The first instinct would be to put 'resources' that cost little into a project, and that usually means people with less experience who require less pay and are easier to drag around. Imagine the difficulty they'll have trying to grok your line of reasoning for respecting and going after these older devs.
Try to update a one-year-old npm project and you will find that literally 75% of the code in node_modules changes, and also that the app no longer works.
Sure. But learning a new framework is like 100th on the list of hard parts of being an engineer. It's new incantations, but ultimately the loop of solving business problems with automation is precisely the same.
100% of my team had basically zero C++ experience before joining. Our application is developed entirely in C++. I didn't care one bit because learning the programming idioms is so trivial compared to the actually hard stuff.
Ecosystem, sure, but I am still using the JavaScript I learned x years ago when I need to see how something is implemented in Angular or React under the hood, etc.
That’s basically how I find work. I’ve been in Unix admin since the 90s.
Do I know the latest coding pattern some adjunct professor is teaching after copying it from a math book into code, like the college grads I often encounter?
No.
Do they know much about operating at scale? How to USE their tools?
They know what ideas are important to computer science. They often don’t know how to make anything work that isn’t an insecure mess.
I think that's the important take here: I program less these days because I'm teaching people how to make shit that's more complex than their homework assignments, which were reduced to a few talking points.
We could barely get IPv6 adopted. Replacing TCP wholesale? Most probably not. The successor to TCP, if it ever gets adopted, will look more like TCP than like a brand-new protocol.
> Observables are all the hype in JavaScript? Well, great, I learnt that pattern 15 years ago...
Well, the listener/observer pattern is used to explain Observables in order to make them familiar, but no, it's not the same thing.
Observables are streams, and you work with them via the composition of streams. When viewed from that high level, the underlying protocol (the observer) becomes irrelevant; in fact, if you often find yourself working with that protocol, you're doing it wrong.
People familiar with functional programming will be very much at home with this style, because composition is very natural to an FP developer, and the composition of all kinds of streams is in their repertoire.
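To make the distinction concrete, here's a minimal sketch assuming RxJS (the library behind most of the JavaScript Observable hype): the computation is described as a composition of streams, and nothing runs until the subscription at the end.

```typescript
import { interval } from "rxjs";
import { filter, map, take } from "rxjs/operators";

// Describe the computation as a composition of streams; nothing runs yet.
const evensTimesTen = interval(100).pipe(
  filter((n) => n % 2 === 0), // keep even ticks
  map((n) => n * 10),         // transform each value
  take(3)                     // complete after three values
);

// Evaluation happens only "at the end of the world", on subscribe.
evensTimesTen.subscribe((v) => console.log(v)); // 0, 20, 40
```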
But I've seen plenty of colleagues, otherwise very capable people, really struggle with the concepts involved. The mentality shift from the imperative programming people have been taught to describing actions via function composition, and then doing evaluation "at the end of the world", is a mind fuck.
Interestingly, functional programming has been with us for some time, being older than Java.
However, people are not interested in actual functional programming, and more recently there's this trend to classify junk as FP just because you've got a shitty API that takes functions (often performing side effects) as arguments to other functions. But that's not FP.
So going back to Observables, as a piece of advice, don't mention that in an interview ;-)
Well, it was my observation from reading TypeScript code (I've never written a line in that language) with a background in Java, then C#, then Scala. The Observables part felt really easy even without knowing the language syntax, but I suppose the past 4 years of Scala made me instantly translate the code into a functional style, and it was all smooth.
On the other hand, while it's wrapped differently, the whole idea of "this object will start emitting events, and we will have functions defined in various objects as callbacks on them" really feels a lot like good old Java observables/listeners, even if it's applied with a different paradigm. The stream part really feels like an implementation detail to me; it's just less boilerplate than before.
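For contrast, a minimal sketch of the classic listener shape being alluded to (all names invented for illustration): explicit registration, explicit callbacks, the subject pushing events to whoever registered.

```typescript
type Listener<T> = (event: T) => void;

// The classic observer/listener protocol, Java-style.
class ProgressEmitter<T> {
  private listeners: Listener<T>[] = [];

  addListener(l: Listener<T>): void {
    this.listeners.push(l);
  }

  emit(event: T): void {
    for (const l of this.listeners) l(event);
  }
}

// Usage: register a callback, then the subject pushes events to it.
const progress = new ProgressEmitter<number>();
progress.addListener((pct) => console.log(`progress: ${pct}%`));
progress.emit(50); // progress: 50%
```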
Then again, it was just an anecdote based on reading code in a language I don't use ^^ JavaScript and TypeScript aren't even mentioned on my resume, for good reasons, so I should be safe in interviews!
There’s a trap in software dev careers. If you fall into it, you can really get stuck post 40.
If you work at a small-ish dev organization - especially in-house dev in a non-tech company - you can rise pretty far and become pretty senior, and your indispensability can net you a decent income. But on the open job market those skills don’t transfer as well. The senior roles in small dev shops are filled by promoting from within (because they value in house knowledge), and the senior roles in BIG dev shops are filled by hiring people who know how to operate in a big company, and you don’t have those skills.
The best way to avoid the trap: try to work somewhere big for a bit before you get to 40, to keep that door open and make it possible to be hired into a senior role.
> ...senior roles in BIG dev shops are filled by hiring people who know how to operate in a big company, and you don’t have those skills.
I'm not convinced of that. I was recently hired as a senior engineer at a large tech company that has more engineers than the sum total number of employees at every other company I worked at previously. My skills transferred just fine.
I reckon it's possible to acquire most of those skills at small companies; it just really depends on the small company. It's also easy to get stuck in a dead end of specialised practices at a small company if you're not careful, and those people will struggle to get jobs later.
Recruiters and interviewers focus too much on the "technology of the day", which will always change and go out of date, and not enough on foundational knowledge, which is always transferable to new tech.
So if you don't have the right modern tech stack on your resume, then you have no chance of getting hired, even if you are perfectly capable of doing the work.
This is where "resume-driven development" comes from. Smart and rational developers will choose tech stacks that look good on their resume, not necessarily what is best for the work at hand, because they want to make sure they stay hireable in the future. It's no good to do the best work you can today if you cannot get hired in the future. This is partly why new tech stacks are always so hyped, even though they are more or less rehashes of what already exists.
If you're saying that it doesn't matter whether you're at a small or a large company using an internal, company-specific tech stack, then I agree. Both will leave you unhireable, even if your actual development skills would transfer just fine.
To add to your comments, there are remarkable opportunities for developers of all walks of life in the market. It takes some luck to find an open door, but people are willing to pay for excellent talent if they can find it.
Yet another +1 on this: I worked at the same tiny shop (a partnership-like handful of people) for about 15 years. We navigated a few technological transitions throughout those years and made more money than we probably should have, but when I (at age 40+) found myself at a larger shop, learning the latest tech stacks was definitely not the biggest challenge. Indeed, the challenge was slowing down a bit, submitting thoughtful PRs, coordinating with designers, writing documentation... you know, actual engineering process. I got my hands smacked mightily a few times at first, and I got PRs rejected, but I got the hang of it. And yes, my long-dormant telephone magically started ringing again as the technologies, people, and companies on my LinkedIn profile started catching recruiters' eyes again like they did back in the day. It can happen to anyone, but I can see how it'd be particularly common among people over 40. After all, I spent my entire 30s with the attitude of "this is working and making plenty of cash, why change it?"
+1 This is really solid advice and what I inadvertently did.
I'm in my 30s and spent a couple years at a no-name shop out of college. I was getting 0 recruiters calling me. Once I realized it was a dead end I switched over to working at a Unicorn for a few years and have since switched again to big tech. Now recruiters are interested in me for senior dev roles.
^ This became apparent as I looked at older "role model" employees in big orgs back when I was still very junior and would wash up there after startups crapped out.
Probably the best thing is to change around every couple of years. IMO, small/medium-but-growing companies offer the best opportunities to work on your technical skills, while larger companies offer the chance to grow your ability to navigate complex organizations.
This is a repost of a comment I wrote several years ago on the same topic:
I'm 60+. I've been coding my whole career and I'm still coding. Never hit a plateau in pay, but nonetheless, I've found the best way to ratchet up is to change jobs which has been sad, but true - I've left some pretty decent jobs because somebody else was willing to pay more. This has been true in every decade of my career.
There's been a constant push towards management that I've always resisted. People I've known who have gone into management generally didn't really want to be programming - it was just the means to kick-start their careers. The same is true for any STEM field that isn't academic. If you want to go into management, do it, but if you don't and you're being pushed into it, talk to your boss. Any decent boss wants to keep good developers and will be happy to accommodate your desire to keep coding - they probably think they're doing you a favor by pushing you toward management.
I don't recommend becoming a specialist in any programming paradigm because you don't know what is coming next. Be a generalist, but keep learning everything you can. So far I've coded professionally in COBOL, Basic, Fortran, C, Ada, C++, APL, Java, Python, Perl, C#, Clojure and various assembly languages, each one of which would have been tempting to specialize in. Somebody else pointed out that relearning the same thing over and over in new contexts gets old, and that can be true, but I don't see how it can be avoided as long as there doesn't exist the "one true language". That said, I've got a neighbor about my age who still makes a great living as a COBOL programmer on legacy systems.
Now for the important part, if you want to keep programming and you aren't an academic. If you want to make a living being a programmer, you can count on a decent living, but if you want to do well and have reasonable job security, you've got to learn about and become an expert in something else - ideally something you're actually coding. Maybe it's banking, or process control, or contact management - it doesn't matter, as long as it's something. As a developer, you are coding stuff that's important to somebody or they wouldn't be paying you to do it. Learn what you're coding beyond the level you need just to get your work done. You almost certainly have access to resources, since you need them to do your job, and if you don't, figure out how to get them. Never stop learning.
I love your comment and can back you on your last point. I know plenty of people who are very successful because they took the time to become experts on some industry that also aligns well with programming. Compare:
Person A has become one of the world's ten foremost experts on the GPS system and other industry-critical location/positioning technologies. She is also a good, above-average programmer but nothing special.
Person B is an academic, a master of C++ who can recite chapter and verse from the language standards and writes bug-free code. He can point out undefined behavior, implementation-defined behavior, and memory leaks with ease in code reviews. He builds entire systems using template metaprogramming and is already an expert on C++28.
Person C is a highly productive generalist. His career jumped from a bank to an enterprise company to an operating system vendor to an online store. Always working on API-to-API middleware, expertly pushing Protobufs and JSON around and designing vast systems, but never gaining any expertise in an actual application topic.
Person A is going to be much more marketable later on in life, assuming she placed her bet on the right industry vertical. Person B and C may have good, successful early careers, but are often at risk of being replaced by Yet-Another-Protobuf-Slinger fresh out of college. Be as good a programmer as you can, but also build up knowledge of the business or detailed knowledge of a specific technology application that you know is not going away.
Indeed, this has been my career path too. I'm in my 50s and have always avoided 'going vertical', as I think of it - becoming a specialist in one little area. Stick to the bread and butter and just keep learning new stuff. Really, programming hasn't changed much at all: if you know C and assembler, a bit of hardware, and some maths, all the rest is just rearrangement of the words.
I want to say thank you for all the "youngsters'" posts.
As someone who has been in the industry for over four decades, I do get lost in some of the newfangled things. Turns out, most of the time it is a language change, not a dramatic world-view change.
Of the three points of the article, I disagree with the last two (a major shift every 10 years; each shift leveling the playing field).
I have seen some things that felt like major shifts, but once under the hood, they tend to be re-shapings and combinations of existing technologies. This has often put me ahead of newcomers: I had not only the current technology, but I also understood the underlying historical technology.
There is no such thing as "irrelevant experience" in my opinion. For example, it is unlikely that Algol, SNOBOL or Fortran will see a revival. But all that experience gave me the edge to be able to recognize shortcomings or advantages of newer programming languages.
So thanks for all the nice words here. It's time for me to swap my tapes in my PDP.
I don't know. I feel like there's a pretty clear power law governing the skill-ceiling below which additional experience adds actual value to dev work. ~80% of programming labor (testing, basic REST services, static web content, managing a small-to-medium-size SQL database, etc.) has a very low skill-ceiling. I've been in the industry for 7 years, I now run my own show doing all of these things. It's just simple grunt work. Then there's a very long tail of projects that require actual software engineering, where the skill ceiling is very, very high. Those latter positions exist all over the place, but you have to either create them or actively seek them out.
Conversely the skill-ceiling on managing technical projects is incredibly high. So although one certainly can efficiently stay on as an engineer, there is a much larger pool of opportunities in technical management, and they are much easier to find if you're grinding away at a FANG / enterprise company / growing startup / whatever.
If you want to do work that rewards many years of experience, it's there. Management work is just much more likely to fall into your lap (which, let's face it, is how the median employee finds anything).
For additional context, I also don't really think "programmer" is a profession in a vacuum. You need to level up inside a business vertical to really start adding specialized value, otherwise yes, you're a fungible cog. There's no shame in that, it just is what it is.
> ~80% of programming labor (testing, basic REST services, static web content, managing a small-to-medium-size SQL database, etc.) has a very low skill-ceiling.
You’re specifically talking about web development and have listed 0 of the things I, an embedded developer, do in an average day. All programming !== web development. There are more difficult problems out there than throwing up a web page, especially in performance-sensitive applications.
Sure, I also didn't mention kernel programmers, database engineers, machine learning specialists, or game engine programmers, or any of the other myriad of semi-specialized programming domains.
I am specifically talking about the kind of development that employs, at a cursory glance, more than 65% of the development workforce. The lion's share of the remainder works on executables (commercial software) that run on a consumer device (most of which, again, is incredibly similar to web-client development in 2019), and within those specialized domains, I suggest the power-law largely still holds.
If you have some numbers to suggest that embedded engineering is a particularly good field for engineers who want a profession that rewards a high degree of specialization, I'm all ears.
Also: having worked on performance-sensitive backends for most of my career, most applications are performance-sensitive only in the shallowest sense and the marginal returns for improving performance do not in general justify the cost. One can (again, like I said) seek out domains where understanding how to build very low-latency or very energy efficient or highly concurrent (or whatever measure of performance you care about - very durable? very reliable? etc.) applications matters. But that's not 80% of the work out there.
Edit: also I find embedded engineering to be a very strange choice of counter-point. My last job alongside embedded engineers (at TE Connectivity) saw two of them defect to become fungible Java backend engineers so they could find stable work and transition into management. The remainder worked on specialized switches. Their day-to-day was managing a piece of software that polled chip readers and a REST service that hosted that information. I don't recall ever overhearing a conversation about optimizing TCAM usage or packet switching latency under load or whatever; maybe some of them got into the weeds on that every once in a while. I suspect, again, that this is largely akin to having kernel development skills on a team that largely operates in user-space -- not part of the 80% -- but I am absolutely not an expert in the field.
The whole point of the post was that knowledge-intensive domains exist (and are quite common, 20% is a large fraction), and that some domains reward vast amounts of knowledge (hence power-law distributed, not normally/exponentially/etc. distributed), but that you had to seek them out.
I'd say 90+% of programmer jobs these days are working on web applications one way or another (whether UI, back end, or a service to support the back end). Even when they're rich-client UIs, like phone applications, the main guts of the application generally live online and require an internet connection to access.
The fact that you've summarized web development as just throwing up a web page, and that you've assumed web development is never performance sensitive, leads me to believe that although you're an embedded developer - you're likely an inexperienced one.
> Conversely the skill-ceiling on managing technical projects is incredibly high.
You don't always have to go into management to follow this path though. At companies I've worked at, as you become more senior as an IC, your role expands from contributing locally on your team to contributing to the entire org. There is also often a split between the technical leadership and the people leadership on teams, which allows lead engineers and managers respectively to focus on each area.
IMO an underrated and difficult aspect of modern tech is figuring out technical practices that scale with team size, so that you can add more people without sacrificing quality, focus, speed, etc. I'd say microservices and continuous deployment are two examples of technical practices that address this, but there are more.
I've started to propose we s/FANG/Big N/g. "FANG" leaves off other obvious big companies, e.g. Uber, and it wouldn't scale to keep adding them to the acronym. Here N is used like when one talks about a list "N items long".
But that highlights the inconsistent usage of the term. Often it's used to just mean "a company with a lot of engineers with similar practices and culture"
Amazon made $2.62 billion in profit in the last quarter, and $3.5 billion the quarter before that.
Amazon makes a ton of money, and (quite transparently I might add) they plow a lot of their earnings from some of their businesses into investing into their other businesses, which is what most investors want.
This is completely different from Uber and Lyft, where it's not clear that their profit potential supports their current valuations.
It’s not entirely about profit, but I would argue you have to show a profit, or demonstrate you could show a profit as Amazon has, to be considered a “top company” in any sense.
I agree, I guess I just meant "enterprise" - the purpose of including FANG was to call out that there's nothing particularly distinguished about working for those companies, the workloads are very similar to ABC enterprise workloads these days.
> While a technology shift doesn’t completely negate the skills of veterans, it certainly levels the playing field for recent grads.
It can more than level the playing field, especially when the hiring managers are particularly narrow-minded. I'm seeing more and more "degree in data science" (an academic discipline that didn't exist 10 years ago) as a requirement for what are effectively programming jobs. Of course, we can agree that this is stupid, and that not getting a job at such a company is "dodging a bullet", but unemployed is a pretty nasty bullet in and of itself.
It's a combination of many things: brain "degradation", ageism, simple economic math (fresh meat is cheaper and more malleable), and other, more important interests like having a real life. And when you really mute all the marketing BS, delusions, and wishful thinking, the actual work of being a developer/programmer in a professional setting is a really shitty job and a sad life for the majority of people in those jobs.
I'm a few years from 40 and not having experience or interest in large organizations, I don't know what to do.
Programming is cool, but I always found it to be a tool to get somewhere, and that somewhere hasn't happened and I don't even know what it is or looks like.
But either way, always have a plan B and don't forget about it.
> the actual work of being a developer/programmer in a professional setting is a really shitty job and sad life
The thing is, it doesn't have to be and, until relatively recently, honestly wasn't. It's a shitty job and a sad life because we have open offices, ticket-tracking systems and daily standups. There was a time when programming was exciting and rewarding.
I don't have a problem with standups. I actually like having ticket tracking systems (how else are you supposed to know what is high priority to work on?). But, open offices are objectively terrible. Research shows it, people here complain about it all the time, and yet companies keep building them.
Where I'm at, the floor is divided in such a way that teams sitting together can sometimes have their own little "bull pen" area that's a little bit separated from other teams (if only by a whiteboard wall). This helps, and I like working in an area with just my team, but I would prefer real walls.
All a standup to me is an interruption 30-45 minutes into the day, one which causes me to lose an hour or more every single day. I basically just fart around with things that don't need much focus until the standup happens.
The worst part is that I suggested that instead of standups we just send what we would have said in the standup via slack. That lasted 3 days. The reason no one liked it? "I don't read what other people put into slack". Yeah, ex-fucking-actly. It's literally not helpful because you don't actually need to know what someone is working on that day, and if you do, it's because you're also working on it and you can communicate that privately.
Sounds like your team might be doing standups wrong. Firstly they should be 3-8 mins tops. Secondly, I consistently find that when someone mentions what they're working on, more days than not, someone shares something useful (hey, we did something similar last year . . .) and when someone is blocked, someone agrees to get them going again. It's also just nice to have a sense what the whole team is working on.
[EDIT] I also misread the length of your standup. And agreed, it's tough to have an interruption 30-45 mins in, as you lose that time. I wonder if you'd be better off with middle-of-the-day standups or something, so at least they'd fall between worthwhile chunks of time?
If your standup is taking 30-45 minutes, you are doing it wrong.
All you need is: What you did yesterday, what you are doing today, and what you are stuck on/need someone else's time with.
Some clarifying questions/interruptions from others in the stand-up are fine ("yesterday I got stuck on foo and I-" "Was it foo-bar, or foo-quux?" "Quux" "Oh did you try the flibble button?" "Why would I need to do that?" "Well since last month's release if you don't use the flibble button..." etc), but if they go on for more than 30 seconds they need to be shut down and taken outside of the standup.
My team has up to 20-25 people in a standup and we're usually done well inside of 15 minutes.
Other tips are scheduling them right before a set event (e.g. before the canteen opens, so if you over-run you are late to lunch), or doing them early in the day so they don't interrupt the daily grind too much: get in at 8:30, get your coffee, 20 minutes of email triage, 10 min standup, and it's 9am and you've got the whole rest of the day free to work.
I would add code reviews to that. The advent of those was when it really stopped being fun for me, because it's really hard to keep that from turning into "your boss nitpicking every line you write."
In my opinion this is much, much better than trying to decipher what the heck that one co-worker pushed to production and how exactly it broke things. Of course there are ways to smuggle rubbish through code review, but it's still better than no reviews.
Both have problems. The best system is one where a passing review means "I feel comfortable maintaining this;" too often a passing review means "this is how I would have written it."
I have suffered under far too many "this is how I would have written it" reviews. It can be really demoralizing to write perfectly good, working, readable, performant, well-tested code only to have it rejected and have to rewrite it because the reviewer wishes you had used a different C++ feature.
I tried something like your "I feel comfortable maintaining this" approach recently when I was the reviewer. We have an internal geodata visualization tool that I've done some work on lately, and a talented C++ developer added a cool and useful new feature to the JavaScript front end.
He asked me to review it and said, "I'm sure you will find a lot of things I should change! This is my first real JavaScript project other than some hobby stuff."
And he was right. There was a lot of stuff I would have done differently. There was a mix of jQuery and document.getElementById, a fair amount of repeated code, var instead of const or let, and so on.
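To give a flavor (hypothetical code of my own, not his actual feature), the comments would all have been about changes of this sort:

    declare const $: any; // assume jQuery is loaded on the page

    // Before: jQuery here, raw DOM there, `var` everywhere, repeated lookups
    var label = $("#status");
    label.text("Loading...");
    $("#status").addClass("busy"); // looks up the same element again

    // After: one consistent style, a single lookup, block-scoped `const`
    const statusEl = document.getElementById("status");
    if (statusEl !== null) {
      statusEl.textContent = "Loading...";
      statusEl.classList.add("busy");
    }

None of it broken, all of it nitpickable.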
Our "house style" for reviews would have been for me to comment on every single one of these things line by line and expect him to change them all to my satisfaction. It would then require a second review pass to clear up misunderstandings, and a third to see if it all ended up OK.
Instead I told him:
"Your new feature is awesome! You're right, there are a lot of nitpicky things I would change in the code to bring it up to more modern JS style, and I'm glad you asked about that. But I didn't find anything that looks broken or dangerous. And the feature works, right? I could spend half a day writing comments explaining everything I would do differently, and that would keep you busy for another half day fixing it up. But I know you have more important things to work on right now, so maybe we can try something different. Go ahead and submit the code as is so people can start using it. When I get a little time I will just go ahead and make the changes I'm thinking of. I'll add review comments on my own changes and send it to you for review so you can see what I changed and why. That will avoid a lot of back-and-forth. I think it will save us both a lot of time and be a more pleasant experience too."
Needless to say, the developer liked this idea. And his manager was within earshot when we discussed it and thanked me for thinking of this approach.
I still haven't made that update, and this reminds me to do it sometime soon. But the code still works, people are using it productively, it has not failed once, and who really cares if there are some minor imperfections in the code style? None of our code is perfect!
You'll never do that update right? The more you wait, the more that developer will get annoyed when you interrupt them with it. :)
In this case I would actually have taken the half hour and sat with them to explain what I think should be done differently and why, prioritized by importance.
Reviews done through tools work if there aren't many comments and if the basic design is ok.
I've seen reviews where someone's not skilled enough and they have to be taught idioms in dozens of comments in 10-20 patch sets. It's pretty horrible, but the positive side is that at least they're not merging crap.
I resemble that remark! Yes, procrastinator here. ;-)
The thing is, his code is plenty good for now. Even if I never make that update, the code works and people are getting useful results from it. There's nothing overly complicated about it either; I or anyone else could pick it up and easily make any simplifications and improvements whenever needed.
And this developer is working on far more important things for the company, mostly in C++. I made the call that it would be better for our business to let him get back to that, since any improvements to the JavaScript code style on this internal tool simply weren't that urgent.
But I do appreciate the reminder and I will get to that update soon!
That sounds very pragmatic. I probably would have suggested sitting down and making the changes together. However, anything that avoids a couple of cycles of review is a bonus; that churn is super corrosive to productivity and often morale.
A few things to consider:
- Linters, to help move away from code style comments
- Peer code reviews (if possible)
- Inform your team/boss about bikeshedding
- Use a tool like reviewable.io where comments can be marked as blocking or not. I'll often comment on preferences and block on issues.
For me daily stand-ups are a great tool for closely collaborating with colleagues. I get that managers have often turned them into top-down status report meetings. (And that almost anything "Agile" has similarly, despite initial aims, been turned into an instrument of control.) But my best working environments have been in strong teams, and I'd hate to see the baby get thrown out with the bathwater.
> The thing is, it doesn't have to be and, until relatively recently, honestly wasn't.
Indeed. It's almost as if it's a bad idea to take people working in a field where creativity and careful thought are fundamental and a wide variety of experience with different but related ideas is often helpful and then try to turn them into interchangeable commodities with the whole process dumbed down to the level of the least competent developer or manager in the team.
For my own career, escaping that sort of foolishness was the single biggest benefit when I made the jump to freelance work and even more so when I first became a founder. As soon as you go independent, you're no longer dealing with a subordinate employer-employee relationship, where everything you do is subject to the whimsy and caprice of whoever pays your salary. Instead, you're dealing with a business to business client-service relationship, where your objective is to provide the service the client needs. How you choose to solve whatever problem they have in terms of technology or process is much more down to your own professional judgement. If you do follow entrepreneurial route, you become effectively both client and service provider in that sense, and it's what you learn from your market research and customer relations that guides your direction in often even more general terms.
> There was a time when programming was exciting and rewarding.
IMHO, it still is, as long as you find an environment where using fun things to get useful results is the emphasis. Mind-numbing corporate box-ticking is soul-destroying in any field.
We're privileged to work in a field where all we really need to do useful work is often a laptop, an Internet connection, and a willingness to use our brains, and where that work can still be valuable enough to others to make a very nice living off it, and where there is no shortage of potential customers.
Now, there's nothing wrong with being a competent professional who turns up and writes software during office hours and then goes home to enjoy life with their family/friends/hobbies like anyone else, and maybe for those people the structured corporate environment is helpful. And of course going independent has other challenges that aren't just about technical skills, and those are not for everyone. But for anyone who wants more than an office job and doesn't mind taking on a broader skill set to operate independently, I don't understand why they would continue to work in the kind of toxic employment environment we are discussing today. I suspect that in many cases it is simply ignorance (meant literally, not derogatively) of the possible alternatives and the paths to transition to them.
Incidentally, as a convenient side effect of working as an independent professional or through your own hopefully more enlightened business, the sort of ageist nonsense that motivated today's discussion also largely disappears. For a lot of clients outside the tech bubble, a 25-year-old in trendy clothes spouting the latest buzzwords is much less impressive than a 45-year-old who immediately gives off a competent, professional vibe. And if you're the 45-year-old who did keep up with developments and has had an extra 20 years of honing their skills, your rates can reflect your greater capability and productivity, which is much harder to achieve if you're still someone's employee at that point.
I love your comment and agree with everything you said.
I am a few years away from 40 as well, but as a freelance software developer who works remotely, I can only say I /love/ my job (most of the time) and hope to do it for many more years in this capacity.
I started by talking with contractors who worked where I was full-time. Many had long-term contracts & were making 2x what I made, doing the same work. Test the market before making the jump. Also, keep in mind that staying at the same job is not as secure as it seems; contracting is less risky than full-time these days, especially as you get older.
Fair point, but to be honest nearly all modern work is borderline dystopic, regardless of how "important" it's deemed by society.
If you think being a programmer in BigOrg, Inc. sucks, then take a summer and work on a construction site or with a landscaping crew. Trust me, you'll yearn for that loud, open-plan office soon enough. Modern society would quickly implode without plumbers, yet actually being a plumber and crawling around under houses in rat shit is not very fun.
This doesn't work for me. It seems depressing as hell that we have to put in earplugs and pipe distracting media into our brains just to make work tolerable. How is cutting off one of our senses from our environment a solution?
At 40? If you're suffering from loss of mental capabilities to the point that it affects your performance at work at 40, you have medical issues that need immediate attention.
> simple economic math ( fresh meat is cheaper and more malleable )
I've seen this stated many times, but nothing is forcing anyone to offer a particular salary or demand a particular salary.
Not sure that I agree on the first point. In my 20s, I could easily work 12 hour days, keep tons of complex state in my head, and almost never walked into a room and wondered why I went there (both literally and in the programming metaphorical equivalent).
In my 40s, much of that has changed. I'm beat mentally after a 6-7 hour day of coding. I can only keep smaller portions of the system in my head easily.
It's possible that I was blissfully unaware of how shallowly I understood things or how ineffective I was in hours 7-12 of a workday in my 20s (and both of those no doubt have shreds of truth in them), but it seems way more likely that I am noticing a genuine difference in mental ability over the intervening two decades. None of that seems medically abnormal to me.
I'm mid 30s, and if anything I can keep more of the code in my head than when I was in my 20s. My abstractions got better, and I've seen lots of stuff before; I think this has had the effect of compressing everything.
I think of all the edge cases and pitfalls that I wouldn't have in my 20s. I have an easier time reading and understanding documentation, using libraries, reading and understanding other people's code. I think I'm also much more sympathetic to the poor sod who wrote this broken code under a time crunch 5 years ago.
I think you're selling yourself short. In your 20s, you have a lot less mental baggage to deal with, on so many levels. This is a good thing when you need to work hard (which we all did when we were young :) ).
Even if you can only work 6-7 hours a day now, you're most definitely spending that time a lot more efficiently than you did a decade ago. You don't need to keep more information in your head at the moment because you have decades of wisdom guiding your decisions.
It's hard to deny that one's memory deteriorates before age 40. But does it really matter that much to job performance? I'm not sure, but then I'm not a professional software developer, so that might explain our differing views.
I do have to disagree about the number of hours worked being a sign of mental degradation. I'd call that physical degradation, and to be honest, I'm able to work longer days now than when I was 30. Everyone's mileage will vary on that, though.
For me, what dominates everything else wrt mental performance is that I'm much more efficient at learning things now than I was in grad school 20 years ago. Whether it's a new technology, a mathematical proof, or reading someone else's source code, I'm massively more productive than I was at 25. If anything causes me problems, it's that I enjoy learning new things too much, and I spend less time than I should doing the work that pays the bills.
Executive and working memory decline is a real thing that happens with age. I didn't say I'm demented to the point where I suck at my job, but I definitely felt a slight decrease from my 20s to my 30s.
> I've seen this stated many times, but nothing is forcing anyone to offer a particular salary or demand a particular salary.
I didn't think bosses & hiring managers trying to pay the minimum and get the maximum was a contentious point. Our industry is still made up of young people who prefer younger people to do their work, for both economic (many times short-sighted) and social reasons.
Literally everywhere outside of Silicon Valley you can get great jobs in Software that move at a reasonable pace. SV is very different from the rest of US
Not to mention the great overall quality of life that a regular 9 to 5 job as a software developer provides in the rest of the country outside of SF/NY/SEA/LA/BO.
Be a median software developer, pair up with a spouse making the national full-time median, now you've got a $160,000 household income (which doesn't sound special in SV). Married that's $115k-$125k in take-home pay in most of the country.
$160,000 family income vs a $350,000 house. That's the kind of ratio that tilts life a lot further in your favor in many regards.
Absolutely. I admire the idealism in Silicon Valley 100%, and personally enjoy working here. But it’s important to point out that a) Most people who come here don’t become millionaires and b) You don’t need to be a millionaire to have an amazing quality of life in other parts of the US.
"The actual work of being a developer/programmer in a professional setting is a really shitty job and sad life for the majority of people on those jobs."
I'd love to give you a tour of what people do for work, and then you can personally decide how bad programming is as a profession.
I’m sorry, but I can't agree that programming is a “shitty job and a sad life.”
While individuals certainly experience really bad situations in every walk of life, programming is an exceptionally valuable and versatile way to make a living, and offers a level of freedom and economic mobility that most people in most fields could only dream of.
While I'm aware of and fully agree with your University of Phoenix pitch, please try to listen and understand that it's only true given the right context (economic, social, time, etc.).
Most people sit at an office in front of a computer for at least 8 hours a day; some of them are developers. Most "problems" they "solve" are very distant from "Changing the World and giving meaning to your life, one PHP line at a time".
It's a great gig, but a sad life when you'll look back on your deathbed.
I'm not sure where you're working but where I work, being a developer is awesome.
Fresh, healthy, high quality, free food catered for 3 meals a day? Check.
Best healthcare possible provided at no cost? Check.
Stocked kitchen with all kinds of snacks, high end coffee, kombucha etc with ability to make requests? Check.
Freedom to come in when I want and leave when I want? Check.
Work remotely when I want? Check.
Any equipment I want at any cost? Check.
Top of market pay for size of company? Check.
Beautiful office with natural sunlight, plants everywhere, and fresh air? Check.
Autonomy and creativity in my role? Check.
Top percentile talent as coworkers who are genuinely amazing collaborators, interesting people and great to work with? Check.
Like with all professions, there are depressing jobs and great jobs. If you're a dev working at a paper mill in the midwest with draconian dinosaurs as management, yeah it might not be the best gig. But if you work at a company that values software and understands the leverage of great developers, then I can't think of a job that's more cushy and fulfilling.
What you're saying is that for you, all those benefits outweigh the actual fact of sitting in front of a computer all day. It's hard to give them all up, that's true.
Could you imagine working retail, where you have to be on your feet for hours a day with no break? You have no choice.
I get to sit or stand when I want (I have a motorized, adjustable standing desk). I can go for a run mid-day if I want. I can hit the gym if I want. I have any number of options. If I choose to sit in front of the computer all day, that's usually MY fault (but to be fair sometimes there's a launch date looming that results in me working extra).
It's all about choice. I have that. Many professions don't.
When I was younger with no profession I tried painting houses and waiting tables. Now *those* are hard. Programming is paradise compared: fun, challenging and comfortable.
I understand if being indoors isn't for you, but programming can categorically be the highest paid job with the lowest physical requirements. Since all I need is an internet connection and a laptop, it has allowed me to do things like live out of an RV full time while traveling, or fly to different cities, or spend more time with my family/friends while we are all healthy.
I go to my family reunion and out of 5 of us in the same age bracket, everyone else has to rush back to their jobs in 2 or 3 days, whereas I can spend the full week with the "old folks" - that's time I can't get back.
Am I changing the world with my code? No, not really. Am I helping people in turn work less while still getting to solve increasingly harder problems? Yes. I can only think of a few other professions that would keep me as happy, and none offer the reward/effort level of this.
Maybe the one difference is I started with computers out of a love of problem solving, not for the other things I mentioned. Those are just a nice side benefit I won't give up now that I'm a bit older.
I've been the opposite - preferring to work at big, rather boring places. The work itself is not as exciting or fast paced, but things are stable. And I've not seen any signs of age-ism. The two guys on my team pumping out the most code are both 40+, I think 50+ actually. Sharp as tacks. And it doesn't have to be old tech. We're using a lot of Go and React now, for example.
With all that experience you must have forgotten about or never held a truly crappy job. Try working in a call center or retail. Any job where I can get paid to code is better than 90% of the jobs out there.
> the actual work of being a developer/programmer in a professional setting is a really shitty job and sad life for the majority of people on those jobs.
Oh man, I cannot relate to this at all. I feel very lucky to be a software developer - there are far more boring jobs out there, that pay far less, with less flexible hours.
I’ve admittedly only had 4 jobs in 20 years but I wouldn’t describe any of them like this. Maybe this happens more at scrappy startups and I’ve worked at more established companies?
Sitting in front of a computer for years, trying to fix mundane problems and bugs, implementing "very important and urgent" features that are actually useless and not really urgent.
You know, being the brick mason of yet another cathedral of delusions and failure, as most businesses are. Which is life, and I don't really have a problem with that.
But we are all human and after a decade of this shit, it gets hollow and meaningless.
I did, but I got lost in my own mind, and the solo business is obviously a lot harder (if you're just a grunt programmer).
Without maturity, a strong social network and discipline every solo venture will end up in isolation and depression.
But I still try the usual side projects that get started and not finished like everybody else. :)
Right now, I just wish I could find something 80% business and 20% tech, with a feedback loop comparable to a plumber's (work for a day or 2; when it's done it's really DONE; move to the next job).
Oh, it's you again. I don't mean that in a mean way, but your comments do stick out as being burned out/not happy.
I'm happy you opened up a little as to how you feel. If you want something that's 80% business and 20% tech, I can highly recommend getting into "high end" consulting. "High end" just means rich people, their houses, and a small smattering of businesses that themselves have a high earnings-to-effort ratio.
I don't mean writing code; I mean doing networking/server work/maybe some desktop stuff. I know that doesn't sound glamorous, but you get to make good money (really good money), most jobs are DONE when you're done, and it's generally very low stress. (Learn and understand VLANs and that level of networking and you're already in demand with a lot of people.)
Most of the job is learning to setup/meet expectations, know how to talk to those kinds of people, and other business end skills. You'll also never age out - these kinds of people have long ago learned to identify other markers and often are older themselves.
I did this for years and it fit all the things you say you'd like and it left me with plenty of time to write code on the side (or not, the time is there to do what you love.)
Age discrimination is absolutely a thing, but (and this is only based on anecdata) I would say it is much, much worse in the valley than just about anywhere else. The rest of the world seems to have a less damaged culture.
I'm a 56 year old programmer from New Zealand. I moved to the US to join a hot Silicon Valley RISC-V startup earlier this year. I think the average age of our software team is probably over 50. Certainly the median is.
My previous job (and my first ever outside NZ) was working for a multinational electronics company in their Moscow Russia R&D lab, a job I got when I was 52.
Working on semiconductors/electronics is "old tech" that is still very important. It's been around for quite a while.
Compare that to Fortran or supercomputers: they were important, aren't so much these days, although they're still used.
What I'm saying is that you got on a train that has been running longer than some other trains, hence you meet older people on the train. As someone who has little interest in management I hope to be as successful as you, in this sense of picking long distance trains.
I’m in Portugal and everyone who’s moved their tech hubs here in the past 2 years is only hiring juniors. And we’re talking about a mix of big companies (who ought to know better) and relatively fresh (<5y) startups.
Because juniors in Lisbon will work for peanuts while anyone with more than a year experience knows they can demand a higher salary for living in Lisbon.
I'm 50 and still programming. Programming is all I know how to do. If that was taken away from me, I would just rot to death. I don't want to be anything else, and I refuse to be forced to do something I don't want to do. I'll learn to farm and support myself that way.
So, is it still preferable for the manager to not hire anyone and let the work sit undone rather than to be forced to work with a detestable "older" worker?
Perhaps the government should start giving disability status to people because they are over 40.
> I'm 50 and still programming. Programming is all I know how to do. If that was taken away from me, I would just rot to death.
Interesting statement. Someone recently told me the biggest problem with US Presidential Candidate Yang's Universal Basic Income plan is that people who have had a successful career in one field their whole life often do not want to learn a new career even if you help them financially while they transition. For reference, the discussion was about winning an election more than solving a problem.
It's very interesting to hear the above statement from someone in tech vs a different industry. Reminds me how similar people across all industries are when often some of us tend to feel unique for some silly reason.
They probably meant farming on a backyard gardening level, I know plenty of people who got into it in middle age, particularly Mormons who try to be relatively self-sufficient. Not sure about the backyard size needed to actually support yourself though.
Actually if everything folds in completely the way the alarmists are telling us it will, being able to grow your own food will be the only thing that will save you.
Just like all the other predicted climate apocalypses the past 50 years. For a group that should be good at picking up patterns they sure are slow to pick this one up...
Meta: I really wish HN had some way to automatically aggregate and display this data when a link was previously posted, so you had some sort of temporal representation of the various thread discussions that took place, adding a "+" expander and drop down to "past". I'm aware "past" takes you to HN search results for the link, so it's mostly good enough, I'm just throwing a wish into the ether :)
The solution to this problem is as simple as it is revolting to programmers - realize that programming is a low-value task. The high-value task is solving business problems, if it happens so that programming is the easiest path then go for it. The real value to the business however, lies in domain expertise - something programmers are forced to become in order to be able to write code for it.
The trick is to find a domain that is stable, fairly interesting, and has enough companies that you can jump around. I myself chose occupational pension insurance (kinda-sorta 401k, but in the EU) and spent the last year or so explaining fairly easy edge cases to top brass.
Well, the topic here being the employability of programmers over 40, the statement seems fairly spot on as to how they will be judged - purely based on "business value".
On the other hand, programming (at least to me) is so much more. It's a creative activity like thinking or writing; it can be fun and "pointless" play; it can be insightful philosophically. Bringing productivity and efficiency to a business is just one of the many applications of this medium of expression. I suppose by this point it's become a way of life, whether I'm solving a business problem or not.
It's one of those statements that doesn't actually have to be said, since everyone implicitly understands it (because it's not unique to programming), but that someone feels the need to say anyway because it makes them feel insightful. Related: bikeshedding.
What others have said about keeping up with technology is certainly true. However, I've found the opposite to be true -- there are many more young programmers because the industry has expanded so much in the past twenty years, there is a relative lack of more senior people, and in my experience they are in high demand.
That said, if you're going for a straight individual contributor role rather than a lead or manager, there may be questions about why you haven't moved up and whether that might indicate a lack of ability to learn new things that might make you a less valuable individual contributor.
>That said, if you're going for a straight individual contributor role rather than a lead or manager, there may be questions about why you haven't moved up and whether that might indicate a lack of ability to learn new things that might make you a less valuable individual contributor.
Continuing as an IC (doing things) rather than becoming a manager where you help people do things indicates a "lack of ability to learn new things"?
Mid-40's here. I'm in management but considering getting back into development/architecture. I've kept up with my skills in Java and DevOps and Kubernetes so I have a breadth of knowledge and skills that I know people are looking for. Management was my plan B and honestly I hate it.
A couple of years ago, the company I worked for abruptly went out of business (like, I came into work on Monday and they said, “we’re closed effective immediately, go home, we’ll e-mail you severance details”). I sent out about a dozen resumes, got two interviews, flunked the first and got an offer at the second - the whole process took about two weeks and I was no longer unemployed. I was on good terms with one of the VP’s at that company, who also lives close to me - I ran into him about a year later and I asked him how things were and he said, “well, I’m still looking for a job…” We were both around the same age (over 40, anyway) - I’m not so sure management is as unemployment-proof as some of us have been led to believe.
Development skills are absolutely 100% more portable than management skills, because a) basic math says that everyone hires way more devs than they hire dev managers, and b) many dev managers get promoted from within so there are even fewer hires out there than there are positions. The further up you go, the more true this is.
Having closely watched someone go through a director-level job search vs. having jobs thrown in my lap as a developer, empirical reality matches theory here, too.
When you need 1 manager for every ~8-10 ICs, and one 2nd level manager for every ~8-10 first level managers, and so on, the jobs become fewer and further between the higher you go up. Management does not seem like the path of job security to me.
Right. It's a path to faster financial security though. Realistically, the amount of time it takes for them to find a new VP role is baked into the compensation.
This is especially dangerous if you've been promoted rapidly within a company. If you add value and do well someone can be promoted to say director / second line manager within 3-4 years. But with a total of 3-4 years of management experience, it'll be tough to find a comparable position outside.
Yup, I know an engineering director who had a similar story. It took him about 8 months to find a new job. He did ultimately get another engineering director job though.
Which is fairly common, and often why compensation packages for folks in those positions are structured the way they are.
The assumption is that someone leaving a director/VP level position will not find another job quickly.
They often have severance agreements that provide either large lump sum payouts to cover them while they look, or multi-month notice that they're being let go (I've seen 3 months and 12 months, with the expectation being that they'll continue in an advisory position and draw salary during that time, but focus mostly on job hunting).
I think that's great to realize! Management is really a career change, not a promotion. Too many managers aren't good at it but stick with it because they like the status.
I think you'll also find hiring managers receptive to that if you pitch it right. I know I'd be happy to hire somebody in that situation as long as they were really excited to get back to building things. The easiest people to manage are the ones who appreciate what good managers do and work with that, which should come naturally for you now.
I did the same thing. I was in management for 5 years, it was ok but it can be a dangerous place to be. Your tech skills start to atrophy and companies continue to embrace a "lean" model with less management and more developers.
Software engineers don't depreciate as they get older. Software engineers who stop learning depreciate whenever, at any age. I'm 58 and I've seen it over and over again: the people who find it hardest to stay employed are the ones who are least eager to grapple with new ideas and approaches. It's so easy to grow comfortable with that thing you're already good at (I'm a Powerbuilder developer and everyone needs Powerbuilder developers!). I don't know, do people think engineers in other disciplines just get to stop learning and cruise through the last twenty years of their careers? I'm not one, so I can't say, but I would guess civil engineers and mechanical engineers and aeronautical engineers and chemical engineers all have, like, disciplines they have to keep up with :). I know our family doctor isn't working with 20 year-old ideas and tools either.
Ok I'll bite. I had the realization yesterday that I peaked at 25 (I'm 42). I got my computer engineering degree in 1999 and used to do things like really complicated integrations in calculus, and solve weird 3D intersection formulas for different variables to do change of coordinates on various surfaces for rasterizing before wolframalpha.com was a thing. I look at my invention lists from 20 years ago and they were pretty far out man. Like, they expected a future where maybe we had real multiprocessing with FPGAs or something instead of proprietary video cards.
But now, I can barely install a new software stack without this primal sense of exhaustion. I can't explain it, but it feels like everything is just.. wrong somehow. Or a little off. Like maybe the profit motive and disruption became the primary motivators instead of progress and scientific discovery.
It's to the point where I could probably write my own programming language more easily than learn the various idiosyncrasies/dealbreakers/errata of whatever industry standard language I'm supposed to use. I find major problems with most any framework in the first 5 minutes, and then spend most of my time after that figuring out how to use tricks with inheritance or pointing the package manager to a forked version of the repo with fixes so that I can extend the framework to do very first principles kind of things.
I think what it comes down to, is that I see programming as this giant spreadsheet where data reacts to other data changing, constraints get satisfied or exceptional situations happen. But unfortunately, the most overly hyped languages (like Ruby for example, or Java, not to pick on them) seem to go out of their way to block the developer from staying in a functional paradigm. Almost every codebase I've seen has descended into spaghetti code hell.
And unfortunately, that nebulous ball of cruft is where the money is in software development. I've spent the last 20 years knee deep in code so ugly (from assembly code up through SQL) that I feel like a plumber who sees code as.. sewage. There is just so much code, filling every crack where declarative and data-driven solutions would accomplish the exact same thing but with no side effects.
But new devs don't seem to care to learn where I'm coming from or why my code might be better in fundamental ways even though it looks a little strange to them. Sometimes in a conversation I'll see myself and think "wow, I really do resemble a crazy person". I just feel too old to begin the training, but too young to be this burned out. I fantasize often about giving up all worldly possessions and moving to an artist compound in the woods.
I worry I've just reached the point where I'm a fish trying to ride a bicycle but I don't know of an ecosystem that would value my natural and hard-earned talents for making the world a better place. There's just an endless stream of soul-sucking contracts, each more punishing than the one before. All of the money is tied up in the establishment now. No room for creative daydreaming along implausible trains of thought in the universe of possibilities. Just the grind. Always the grind.
If I'm depressed then it's environmental, and I feel stuck with it because I can't change the world. But that's why I learned to code in the first place 30 years ago. So now what? I guess I'm just curious if any older devs here have reached similar conclusions. And like, were you able to rise above it? Were you lucky enough to start 5 years earlier than me and capitalize on the booms to the point where you're independently wealthy now? Or is that a widespread fallacy, that "if I was just a little older" maybe I could have ridden a bubble? Or do you fantasize about being 20 years younger, starting in the world today where you have access to all the secrets of entrepreneurship online and still have a beginner's optimism?
I think the gist of my melancholy is that I have a knack for solving very difficult problems handily (nobody seems to need that skill anymore) but my short game stinks and I struggle to summon the motivation to do the mundane things (that are in demand). I'm tired of learning new languages and frameworks because they are each terrible in their own way but collectively the software engineering industry can't see the flaws so they inevitably remain.
Sorry this was a bit of an overshare but hey it's Friday.
Your comment resonated with me quite a bit, in fact enough to persuade me to finally create an account here (longtime lurker). I'm an older developer as well and have had many similar thoughts to yours lately.
> But now, I can barely install a new software stack without this primal sense of exhaustion. I can't explain it, but it feels like everything is just.. wrong somehow. Or a little off. Like maybe the profit motive and disruption became the primary motivators instead of progress and scientific discovery.
I have felt this way recently too. I have thought that in my case perhaps it's burnout, but I think I could put it even more accurately as a sense of futility. Learning another framework that largely solves the same problems, but with a new author or a new company backing it. I think some (many, most?) of these new frameworks arise because thoughtful engineers want to dig into a problem (and learn by doing), and the really good ones come up with these useful frameworks/languages. And maybe that is the best way to really learn something. I'm sure many folks posting here could easily, like you mention, write their own language (or lisp interpreter!), or web framework etc. In fact, I guess I don't even necessarily see it as a bad thing - I'd just rather be driving my choice of technology and I feel the futility when something is /oversold/ to me and I'm /forced/ to learn it or be considered a dinosaur. Maybe exhaustion is an even better word than futility, exhaustion at doing the same thing over and over again (maybe even new framework or not).
For me though, I have felt this sense of exhaustion/futility more profoundly since I've gotten into Lisp. I think it really is a different, and fundamentally better, approach to language design. It's gotten me interested in the history of computer science, and I've been reading a lot of foundational papers in the field. I'm honestly pretty disappointed my undergrad school's CS curriculum didn't give any of this stuff more than a cursory look in any of the classes I took. It's made it that much harder though for me to get into the framework of the day -- it's almost ruined regular development for me :-). Thought exercise: look at how the "walrus" operator in Python has drawn such strong reactions from so many people -- it's just a simple macro in a lisp: developers could write a macro that accomplishes this functionality in a few lines of code and use it without forcing any consumer of their code to even update their language runtime! Now, while the technical solution here is so obvious and perfect, maybe the strong reactions people have would be the same -- over whether or not one /should/ write such a macro, etc. :-).
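To make that concrete, here's a minimal Common Lisp sketch (while-let is a name I'm making up for illustration, not a standard operator) of the main use case the walrus operator was added for: binding a value and testing it in the same expression, as in Python's while (chunk := f.read(1024)): ...

    ;; Hypothetical macro: rebind VAR to EXPR on each iteration and run
    ;; BODY until EXPR returns NIL. A plain library definition, so no
    ;; compiler release or runtime upgrade is needed to use it.
    (defmacro while-let ((var expr) &body body)
      `(loop for ,var = ,expr
             while ,var
             do (progn ,@body)))

    ;; Usage: process lines until end of file.
    ;; (while-let (line (read-line stream nil))
    ;;   (format t "~a~%" line))

That's the whole "feature": a few lines that any consumer of your code can take or leave.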
> And unfortunately, that nebulous ball of cruft is where the money is in software development. I've spent the last 20 years knee deep in code so ugly (from assembly code up through SQL) that I feel like a plumber who sees code as.. sewage. There is just so much code, filling every crack where declarative and data-driven solutions would accomplish the exact same thing but with no side effects.
Yes -- I've had the same experience. For me though, I think this is mainly a dissatisfaction with writing, or being a part of, so many similar apps (yet writing them all like the first time, in different ways). In the long run I'd rather have written them in one language (Lisp, but I'd even take just sticking with C over the long run) -- and be able to build on my past work. Then at least if another app was truly like something I'd done in the past I could just instantly complete it, or better yet, spend my time /just/ learning the real domain of the problem, rather than wasting time with the shiny technical wrapping paper around it.
> If I'm depressed then it's environmental, and I feel stuck with it because I can't change the world. But that's why I learned to code in the first place 30 years ago. So now what? I guess I'm just curious if any older devs here have reached similar conclusions. And like, were you able to rise above it? Were you lucky enough to start 5 years earlier than me and capitalize on the booms to the point where you're independently wealthy now? Or is that a widespread fallacy, that "if I was just a little older" maybe I could have ridden a bubble? Or do you fantasize about being 20 years younger, starting in the world today where you have access to all the secrets of entrepreneurship online and still have a beginner's optimism?
I'm not independently wealthy; since you have a few years on me, I missed all the bubbles too. I do live minimally in the hope of getting out as early as possible though, or putting myself in a place where I could maybe contemplate a career change. What depresses me more though is this: I'm pretty sure I could continue on this treadmill pretty easily if I believed more in where the industry was going. I've recently read "The Age of Surveillance Capitalism" and am fully in agreement with the author's views on the large tech companies. In fact, I've already declined to work for one of the big tech companies for these reasons. I fantasize about getting out of tech, or if I do stay in tech, maybe starting a non-profit similar to the FSF or EFF (and figuring out how to sustain my family while doing so). That, or getting into tech related to medicine. I've always wanted to teach, but I fully believe that if I ever were to teach CS (and I absolutely /love/ it and got an advanced degree in the hopes of eventually teaching), I would have to believe in the types of jobs out there. I don't want to quit industry to teach; I want to /retire/ happily from industry and teach, believing in where industry is going. I also don't want to teach people whose only job prospects are ones that further the age of surveillance capitalism. Though I don't blame people who take them. In my younger years I never cared about a "why", but was always just hyper-actively interested in whatever the technical problem was.
I don't long for being 20 years younger and getting into tech today; I long for the age when Microsoft was the big bad tech company. At least then they weren't spying on everyone, Linux/open source was a bright spot, and we controlled our own hardware. If anything, I wish my prime had been when comp.lang.lisp was in its prime. Mostly though, I fantasize about working with deep thinkers to solve problems actually worth solving.
> But now, I can barely install a new software stack without this primal sense of exhaustion. I can't explain it, but it feels like everything is just.. wrong somehow. Or a little off. Like maybe the profit motive and disruption became the primary motivators instead of progress and scientific discovery.
The world hasn’t changed, only your knowledge of it. You started fresh at a given level, call it “software”, and you implicitly assumed that the hardware was a given, existing property of the world. Now you know that most software is flaky and bad, and you feel disillusioned about the software you were imagining to have by now, and you feel that the world is worse with all this bad software in it. But the world was always bad, you just couldn’t see it before, because all the badness was in hardware, and that wasn’t your “level”, so to speak. The hardware you were imagining to write all this beautiful software on, that very hardware was, and still is, ugly and full of cruft, just as the software you now decry. This fundamental property will always be true.
To not be exhausted by the world, you must learn to accept it the way it is, the way it was and the way it will always be. You must accept the existence of badness, and build your creations with badness in mind.
That's a good post, but I think that I went down a different set of windy little paths. My worlds (graphics and video products) had tons of companies disappear, and the ones still standing matured to the point of offshore manufacturing, maintenance mode, and decreasing margins. It hasn't all disappeared, but I think a lot of the fun is gone.
I'm older than you, never bothered with internet technologies and languages, and feel a bit of ennui about the whole deal. My guess is that a world that only consisted of workstation software and embedded systems, slightly fleshed out with an internet made up of Amazon, email, USENET (and cars without video games on the dash) is just the fantasy of a Luddite.
My temptation is to take on increasingly archaic skills. Rebuilding a carburetor or playing a musical instrument is already in the palette, so maybe bookbinding or blacksmithing should be on the horizon.
> My temptation is to take on increasingly archaic skills.
I am reminded of the protagonist in Tlön, Uqbar, Orbis Tertius who, at the end, when the whole world is changing into a new world with a new culture and languages, retreats to spend his time translating a classic 17th century English text into Spanish, both of which are soon-to-be dead languages.
Agreed. Those of us over 40 lived in a time when there was no such thing as a "web developer," and only some of us jumped over to doing "web development," so the actual pool of people available to do the job with Gen-X skills is insanely small compared to the overall demand the industry has generated for developers.
Almost all of the developers I knew from my younger days have either gotten rich, gotten out (one bought a worm farm), or gone into some form of management. All due to their own decisions; no one was forcing them out of development. Rather, development has a way of chewing people up to the point where they want to be spit out. It's the reason Office Space was such a resounding hit with us Gen-X'ers: it struck at the heart of what it can be like to work in IT.
> The demand for programmers is today exceeding the supply.
Tell that to the company having to screen through 100 resumes and interview 10 candidates for every open development position they post. Or to the developer who has to apply to 100 jobs in order to get 10 interviews and maybe one offer.
The article uses a Ruby on Rails argument from 2009, saying that 10 years of C++ development experience is no longer relevant.
Apart from that statement not really ageing well, it doesn't really make sense, as there will always be consistent demand for C++ in embedded etc., and likewise Java for enterprise, since they are mature and proven technologies in institutions.
Just as there is an established market for RoR with companies that adopted it early on.
I think you're making the author's point. Java at this point is the new COBOL. C++ has gone from important mainstream language to narrow niche. Ruby on Rails is on the same declining curve, just not as far along; much of the excitement has shifted to Node, React etc.
Demand for old languages isn't consistent. It declines over time as code bases get replaced. A programmer betting that they will die before their chosen language does is taking some terrible risks.
You can say it, but I'm not sure why you'd think it's true. As I mention elsewhere in this thread, Java is mainly being written in contexts where they already have a lot of Java.
And C++ has been declining for years. I'm not even sure you can write C++ for the common mobile platforms. I very rarely see it used in new server-side code bases. Or even old ones these days. It was a popular language for Windows apps, but that's a declining market these days. 5 years ago, C++ was already not getting taught much in schools [1], and surely that hasn't changed.
I met a COBOL contract programmer 15 years ago. At the time he made a lot of money from it because there were few COBOL programmers left -- no new work, just maintaining old code bases.
Java today, OTOH, is nowhere near that point yet. People still build plenty of new things in Java and it's easy to search a job site and see tons of Java jobs. The JVM is widely used and no-one will blink twice if you want to build something on it.
I think it's a mistake to confuse JVM languages with Java. But that aside, by "Java is the new COBOL" I'm not saying that Java today is COBOL today. I'm saying it's a dominant language for legacy apps and a lot of "enterprise" coding in shops that already have a heavy Java investment.
But relatively few new companies are doing new work in pure Java, so it's definitely headed down the same road. And there are definitely plenty of companies where people will blink a lot more than twice if you propose building something in Java.
That's not to say it's bad; I've written a bunch of Java for money, and may do so again. But I think it's hard to justify in a lot of environments, and correctly so.
According to angel.co there are 88 startups in London alone using Java. Whenever I attend London's Silicon Milk Roundabout bi-annual jobs fair the dominant languages are Python and Java. Java's status in the top 3 programming languages worldwide cannot be explained away by its dominant position in legacy enterprise software. What about Android, for example?
I personally have done a startup in Java in 2004, and have written Java code for money as recently as last year. So I feel comfortable saying Java is generally a bad idea for startups.
Java was the core language for Android because it started in the early 2000s, when Java was the new hotness. But it is no longer the primary Android language; that's now officially Kotlin, because the majority of developers had already switched away from Java: https://techcrunch.com/2019/05/07/kotlin-is-now-googles-pref...
I'm not disputing whether Java is a good choice for anything, just refuting your assertion that Java is the new COBOL and is mainly used in legacy enterprise software. Jobs stats on Indeed.com (corporate) and Angel.co (startups) show that's clearly not the case.
Java is the new COBOL in that it dominates in enterprise contexts because a) it's seen as a safe choice and b) they already have a lot of Java. Yes, there are plenty of jobs in it. There were plenty of jobs in COBOL too for decades.
If you have stats that suggest it's a popular language for startups relative to other languages, I look forward to seeing them. But my experience is otherwise, and it has pretty clearly been on the decline for years. You seem eager to ignore the fact that it's now a second-class language on Android, but Google Trends shows the same decline more generally: https://trends.google.com/trends/explore?date=all&geo=US&q=%...
Sure. All of those companies are over 20 years old, so it's not surprising they'd have codebases in a language that was very hot early on in their lifespans. There was also a crapload of COBOL out there for decades after new companies turned away from it. There is still a fair bit.
I became my own boss, and specialize in projects where my expertise is still valuable. With the dwindling supply of engineers still working in these spaces, contracts are pretty easy to find. As time goes on and the space becomes sparser, my rate goes up.
A professor once told me that I shouldn't become a software engineer because my cognitive abilities will decline with age, thus making it difficult to keep a job :(
I'm at Microsoft now and see plenty of older engineers that have very, very valuable experience and skills. This myth is dumb.
While I’ve no doubt that age discrimination does exist, when you see people who are former programmers at 40 (or 30 for that matter) it’s often just that they started out in programming but got bored of it and moved on to other careers. Or gradually became managers rather than programmers.
Statistics may certainly show that programmers skew young, but it's not always due to being forced out, or denied opportunities, because of age.
I don't think those are cleanly separable. When you talk with people who left the industry, it's very rarely, "I was successful and engaged and really enjoying it, so I decided to switch to something entirely different where I had to start over and earn a lot less."
For a lot of people who experience discrimination, it's not one big blow-up that forces them out. It's the steady drip-drip-drip of being treated poorly, denied opportunities, etc. Eventually they just get worn down.
And sometimes when people aren't that good at their jobs, they blame external causes instead of accepting responsibility for their own performance. (Note: I do realize that discrimination and bad managers are real things, but then again so is the Dunning–Kruger effect.)
This is true, but at best it's irrelevant to the point being discussed. And at worst it's the sort of victim-blaming that is used to undermine every time discrimination comes up.
I'll second that. I'd also add that most programmers have high fluid intelligence. This makes it rather easy for them to retool into another field and end up being talented at it.
Industrial psychology has very consistently found that fluid intelligence is a stronger predictor of job performance than even experience. It's often the case that the software engineer who becomes a manager at 45 ends up being better at it than the MBA who's been doing it for two decades.
> Over time, the validity of job experience for predicting performance declines, while that of ability remains constant or increases. Path analyses indicate that the major reason ability predicts performance so well is that higher ability individuals learn relevant job knowledge more quickly and learn more of it.
Recently I found a relatively cheap goat farm for sale back in my home country. That would be my plan B.
Forget consulting, forget management; my plan B is taking everything I know about automation, optimisation and continuous learning, and applying it to a completely different domain.
I think the entrepreneurial route is the most realistic route in the long term. The idea is to use your experience and knowledge to build a software product that a niche market needs, and sell it.
If you live outside the main city centers like Silicon Valley, New York, Los Angeles etc., it's not hard to make a full-time income online.
A key thing to understand is that, as an employee working for someone else, your revenue will always be capped at the rate the job market sets for that profession. There might be a 10% difference between jobs, but that's it.
Working as an entrepreneur, by contrast, your revenue cap is that of the market you are in, whatever that is.
I completely agree with this. I'm nearing 40, have been consulting remotely for the past 5 years, and have a side business that will eventually be my full-time income.
The problem is that it's impossible to rely on any one company until you retire. Although I've been keeping my skills updated to the latest technology and trends, if I suddenly lose my job at 50, it's going to be really tough to find another one.
I started working as a programmer, and for all practical purposes started programming, at age 37. I'm over 50 now. The vast majority of my professional career has been spent in violation of this article's thesis.
https://www.rosshartshorn.net/RossHartshornResume.pdf
A better take on it would be, "don't expect to be able to avoid learning new stuff every few years, just like younger devs".
I love learning new stuff that builds on what I know and increases my understanding of the world, but this is not what's usually happening in SW, where it's common to have to learn a different version of the old stuff because the market shifted.
There are many domains, esp. finance, where you move up or move out. Programming also tends to be one of them.
Scott Adams wonderfully says to develop a talent stack; programming is just one talent. Add math, sales, communication, leadership, etc. Become a triple threat! Being above the 80th percentile in 3 skills puts you in a unique place. Many people here have the mental mechanics for it.
This one thing is true, and I resisted it: learning is a lifelong thing! 5-10 years after graduation your credentials do not mean one damn thing (with some exceptions).
At forty I was just getting started. I'm 63 and I have done lots of super hard projects and I keep learning more and I have advanced up the tech ladder. And, the problems are harder than ever. For me the key is going after the hardest problems because that is what I like the most. Think system software, compilers, AI. It just seems to have been a good career move though that was never my intention. I've worked in areas of the software field that are thought of as difficult, and that has helped me.
Yup. I moved into compilers and related things by starting to contribute to an open source compiler project in 1999 when I was 36 and got my first full-time compiler job when I was 43. Now I'm 56 and in the last couple of years started moving into designing new CPU instructions as well as the software to use them.
I probably felt this way 10 years ago, when I was in my early 30s and this article was written. Now that time has caught up with me and I'm in my 40s and still slinging code? I'm doing technical consulting now and have had the best 3 fiscal years of my life, with an upward trend. Getting out now would be foolish...
Suppose that it is true that it takes 10,000 hours to become expert at software development, just like for other arts and skills. Then, if the half life of IT is 10 years, you have to accumulate 5,000 hours of new learning every 10 years. Consequently, you have to devote about 10 hours/week to learning new stuff in order to keep current. If you are lucky in managing your career much of this can be done on the job. Otherwise it must be done on your own time. It is the inability or unwillingness of software developers to devote this much time to continuing learning that accounts for their early obsolescence.
> Suppose that it is true that it takes 10,000 hours to become expert at software development, just like for other arts and skills.
The 10,000 hour rule is just something Gladwell made up on the back of a study which found that people at a certain level in certain fields (classical violin and classical piano) averaged that much experience (EDIT: actually, deliberate practice), while people at lower levels averaged less. It isn't a threshold.
Totally anecdotal, but I have long thought distribution of programmer skill becomes increasingly bimodal as they/we age, with them/us either becoming particularly skilled or particularly unskilled.
Either the one language/framework/paradigm that the aging developer is locked into becomes ever more aged and irrelevant, or else if the programmer has made good use of their time and experience, becomes particularly skillful and insightful and valuable with time.
I've known some amazingly sharp older developers, and I've also known some particularly poor ones.
The current generation of 50+ year old developers came from a way smaller pool of developers when they all started 30+ years ago, and that's the reason only a minuscule fraction of them still code. The percentage of today's younger developers still coding in their 50s will be much closer to other engineering fields, given the number of developers entering the field every year for the last 20 years (even considering how many will simply abandon development once hitting 30 or 40) -- not everybody will become a manager or start their own business.
I've always gone with the assumption that I'll have a really hard time getting a job as a developer after 50. It's something I think about often and try to plan for with my finances.
Probably, system administration or the rest of IT is the closest natural fit for a programmer going into other things. For people with the MS programming certifications this may be a pretty clear path, aided by their programming skills.
I've got a family member who seemingly bucked this problem, though. He has maybe going on 40 years of C experience now. I think he's said it's become harder to find work that really needs his skill set, but he's been doing the contractor thing for much of that time and has always kept employed.
Software engineering is a field that evolves and you need to stay up to date. It's far from the only field like that.
When you get out of school (or whatever path you took), you are knowledgeable about the current state of things. If you're an entry-level engineer, chances are you'll have people teaching you what you don't know, mentoring you, and generally expecting you to learn with them. Of course, as high as software salaries are these days, your salary expectations then are still lower than the older folks'. Chances are you have fewer things to pay for, outside of student loans in the US.
As you get older, you can do 2 things: you can passively coast as long as you can, or you can actively keep yourself up to date. If you choose the latter, your knowledge stays current and you keep adding experience, making you more and more valuable. Short of health issues getting in the way, that will never stop. You just keep getting better.
If you passively coast though, what you know becomes less and less relevant. For the first couple of years, experience gained >> the part of your knowledge that is outdated, so you still go upward. There's a limit to that. That 35-40 yo line, assuming you started in your early to mid 20s, sounds about right for when you've lost more than you've gained. And thus you're no longer worth the salary you're likely expecting in order to pay for your kids/mortgage/401k/whatever. And that's when things get rough.
But if you play your cards right? There's no problem.
Well, I'm already a few years into my plan B, which became working for gov. This article assumes that government tech is static, but it is changing rather quickly. I'm knocking on the door of 40 and regarded as one of the younger leads.

My commute is a 6 minute walk home, and the high quality projects sometimes get scheduled out for completion in 3-6 months, or even a year if it's a big one. Medical care and family time for all devs is a huge priority for management. Lots of vacation time, amazing benefits, and employer-matched savings (or a pension if you go that route). I've worked for a startup before and for a big corp as well, but feel more secure now. Also, there is a tech allowance fund for equipment for home use, and if you want to take classes part time they pay for some of the tuition. They pay for training and events (room and board, food, transportation). I've nearly doubled my salary since I started. I also get to mentor recent grads at times, one of whom just got a position with Mozilla.

So there ya go: with all the negativity this topic invokes, there CAN be a silver lining (just like the silver hairs beginning to appear on my head). For the younger devs: whenever you get tired of the 'culture fit' shenanigans, or feel like you are playing a role like an actor more than solving real problems, you may find yourself in my shoes. Whenever you get tired of practically living with 'the team', knowing that your reward for good work is even more caffeine, you may find yourself wanting to get off the tilt-a-whirl just like I did. That day may come where you actually want to have your own uninterrupted thoughts as you work -- then it's time for Plan B.
It really depends on how you manage your career. I for one plan to keep coding into my 70s; I just love doing it. My theory is that, since many professors are still doing research and teaching PhD students in their 60s/70s, software development is no different, unless you give up learning. Anyway, just want to say: if you really like being a programmer, there's no need for a Plan B at 40. Just keep learning and coding like a 'professor'.
So I'm guessing if you're approaching 40 and want to become a developer you should seriously consider other options.
I've actually seen this talked about a lot. The arguments against it are usually anecdotal, but the arguments for it are usually statistical. It's a tough thing for people to accept, I think. Are there any statistics out there that actually dispute this claim? Not arguing for or against it, just curious.
I used to work in advertising as a graphic designer. Age discrimination is rampant there, I would suspect more so than software development in some ways.
Therefore, software development is my plan B. I have a friend who is older than I am, and he's specialized in SQL. He only does SQL, and he's still finding jobs. If I needed a plan C, it would be taking on legacy code bases, which never seem to disappear.
Firstly I am a proponent of the idea of software as a new form of literacy - it is eating the world and as such if you are literate you run rings round the illiterate - no matter what your age.
But
companies are pyramids - and some roles (like sales) have value to the company that can be measured on an individual basis. But most roles do not - and the further away from the customer you are, the more true this is
As such the only metric available is "number of people reporting to you"
It would be nice to find some other metric - there may not be one.
As such the push away from coding into management is inexorable. If we look at highly literate companies (say, the Washington Post), the managing editor spends most of their time not writing but managing (with some reading involved)
tl;dr - we have two antagonistic forces: being software literate is for now a force multiplier, so why would you not keep doing it? But most organisations can only measure management ladders - so if you want pay or security, you climb.
Perhaps the book "Developer Hegemony" offers some answers?
This is why you need to have Senior and Principal engineering roles. As a programmer or operations engineer you should be able to choose between a management or engineering track for advancement.
The managers work as an interface between the management hierarchy and the individual contributors.
The top engineers work across teams to steer technical solutions and provide cross team focus and continuity. Your principal engineers are charged with breaking down the silos and sharing and setting technical standards and encouraging the reuse of existing services and systems.
Tech managers and team leads don't have time to do the technical leadership and work properly with business. Workplaces without this structure invariably fail to have well articulated, cross team technical decision making capabilities. Everything winds up fragmented and reimplemented.
People here hinting at outdated technologies are mentioning things like Fortran. I'm 41, and some of those are technologies my dad used. So it seems that people are off by one or two generations.
I went through phases with PHP, C#, SQL, and then many years mainly focused on JavaScript and Node. I did some Angular and a little React with recently C++ and Lua on the side. I built about 90% of a Docker orchestration system. The front end stuff I am most familiar with at this point is Vue. I have been looking into decentralized technologies in the last few years like IPFS and dat etc.
It might be nice to have more experience with Kubernetes or Go, but the experience I do have is still relevant. And if I needed to pick that stuff up for a job I could learn it.
But anyway I think that the idea that older developers don't have relevant skills is false.
This is definitely a problem, particularly for those that aren't in tech cities with endless jobs, but the alternatives discussed in this article are all....not great.
* Work as a consultant - I'm biased here from my own bad experiences, but consultants are often treated as a source of future problems, not solutions. (largely because of communication and incentives, not because of programming skill)
* Work in management - which the author admits is not because it is a good match in anyone's mind but the employer
* The third option is not easy to solidify, but it basically sounds like becoming a solo contract coder in your area of expertise
I suppose any better answer involves a change in the industry that is beyond one person's ability to control, but I was hoping for...I dunno, maybe advice on how to plot your savings to manage lower income in your later years?
Joining or creating a small agency is an option I could see looking into eventually. Something with partners and associates, similar to how law firms are structured. It kinda splits the difference: being a contractor without being self-employed.
Re: consultants, depends on the engagement. If you engage with clients where you are effectively the only way for them to get anything software done (ie, not in addition to an on-site engineering team), you can be seen as an invaluable resource that just gets things done for the business.
Doesn't that fall apart over time? The consultant quickly becomes the monopoly provider, and the large amount of previous investment means that the customer can't afford to switch to another provider (since anything from the previous company will either go away, if there's a bad contract, or just end up being very difficult to maintain). And the consultancy has incentives to create this rather than fight it.
I'm not saying every consultancy is bad/evil, and it is entirely believable that they might choose to be a "good" player and bank on getting a good reputation to make up for any business they lose by doing so - but the overall trend looks like such "good players" lose in the larger market or stay in a small niche.
As I said originally, I'm biased, having seen a few BAD relationships, but I'm trying to remain objective and look at incentives and counter forces. Totally open to hearing counterpoints.
Say a business needs some web dashboards, you write them in a way that should be clean and maintainable, and has documentation for the next person to come in and help if needed later. They like what you did so ask for more of your help. They could decide to change at any time but like working with you and how you get things done, so keep going with the contract.
I don't see how that could be seen as bad. You are helping the business, they are happy, paying, and you've even made it so if they wanted to switch it could be done in the easiest possible way. Did I miss something you'd be concerned about?
I see nothing concerning in what you've described, except that they are totally at your mercy for whether you are delivering what you describe.
EVERYONE will say that the code is clean, maintainable, and has documentation that is meaningful for the next programmer to come along. Not everyone will be lying about that, and if I made it sound that way, my apologies.
But if the next programmer comes along and says the code is a mess and poorly documented, the company is STILL not sure what's true. Is this next programmer the one with an issue? Was the original not stating the truth? Are both fine and this is a philosophical difference of approach? You can ask around...but you have no real way to judge the comments you get, and they have all the incentives to create misleading info.
There's a power imbalance. I'm not saying consultancy is evil, and I'm not saying the power imbalance is avoidable, but in the context of the article - I'm not a big fan of encouraging the _furtherance_ of such a power imbalance, because once you get in, there's very little means of getting out.
You make a good point, and I don't disagree there are possible downsides. To be fair, some of the details you described could be issues with in-house employees as well, imho (folks/teams claiming clean/maintainable code when it's not, wanting to seem necessary, having the power to claim what the team is doing is right and good for business without business knowing any better, etc).
I wish it'd be emphasized a little more - as a software dev, you'll likely either end up having to move laterally to a new skill set that isn't entirely related to your previous experience (moving from C++ to Ruby on Rails and web dev as the example in the article) or move 'up' to management or architect positions.
Both of those are difficult and fairly unique to software development. Most jobs don't require these types of shifts. A number of jobs do simply wear your body out and leave you in difficulty in your 40s however, and that requires a complete career shift or that move 'up' to management, both of which are non-trivial paths to take.
Keeping up with the latest technology has a side effect: you will still be older than most of the people who have learned the latest technology. So you will still face a built-in cultural mismatch that does not work in your favor.
I moved from the Bay Area to LA about 7 years ago. In the Bay Area, they treated software engineers like kings, but here in LA, I've worked at 3 companies so far and engineers aren't treated that well. LA is still very much old-fashioned corporate style (unless you work for a Bay Area company that has an office in LA, or somewhere like Snap). Managers get the respect and micromanage engineers. One engineering manager at a big entertainment company in LA once told me: "oh, engineers are just commodities." I sometimes miss the Bay Area vibe of engineers being treated like celebrities.
This goes doubly for ops guys (who used to be called sysadmins but now go by all kinds of fad-of-the-moment names; I still like sysadmin, but I notice many devs and CS types tend to look down on it... but I digress).
IaC (infrastructure as code) and other newer tech have largely shifted the perception of sysadmin'ing and caused a huge swath of people who haven't been keeping up to have to turn to plan B. Even when you see a Linux admin or engineer position, they often really mean SRE or something else.
This sounds dramatic and dire but I feel like the reason for the statistic is that by the 20 year mark in someone's career, if they're working for a big corporation, they've likely been promoted to management if they have any social skills at all.
Not because they can't program anymore, but because corporations try to promote people to management if they've been around a while and haven't screwed up, and they tend to pay them more if they accept.
I'm 51, an electrical engineer who has migrated over time from ASIC design, to board design, to ASIC validation, to embedded software development, to team lead, to development manager, and back to IC as architect and eventually principal engineer for media and real-time software development. You shape your career. You make it happen. Find what you like, and go for it.
This type of article keeps being posted on HN, and immediately someone posts something about how it can't be true since he/she sees older techs all over the place.
They don't realize that they are seeing survivorship bias in full force. To all those who doubt the validity of the article: keep in mind that eventually, statistics will prove you wrong. Good luck!
50 yrs old, just started a new job as a Sr. Software Engineer, and now learning go, kubernetes and google cloud platform. Been in management in the past, but grew tired of that route. I do find that my experience (C++ -> Java -> Scala) with healthy doses of (Ruby, Groovy, Python, Perl) is a good foundation to work from.
The strangeness of Silicon Valley right now is that if you're lucky you can "retire" long before 40. I'm 31 and have through a bit of luck accrued a small 7-figure net worth. I don't plan on being a "rank and file" employee by 33, let alone 40.
> The strangeness of Silicon Valley right now is that if you're lucky you can "retire" long before 40.
The entire Financial Independence (FI) movement is all about doing this everywhere (not just the valley), and with no luck required beyond a household income of around $45,000/yr in the US.
You're assuming zero real investment growth over 30 years. I totally agree with stacking as much as humanly possible, but that's not a realistic assumption, unless we have worldwide zero economic growth for a third of a century.
Plus modern retirement isn’t necessarily “never work a job again.” It’s more a freedom to do what you want (income generating or not) without having to take a job simply to cover your cost of living and retirement savings.
The problem is, when everybody thinks like this, it stays like this. With rising age, people tend to move into people jobs if they can, because somehow the air gets thinner: there are not that many advanced software jobs -- although there certainly are a lot, and even more places that do appreciate advanced abilities. On the other hand, a lot of skills and experience are transferable. E.g. some weird DB connection keep-alive bug might be analogous to debug in C++, node.js, and Rust. Developers without enough experience might give up, change DBMS, refactor the whole DB layer (hoping it's MVC'ish), or even give up on the project if it's not an important one.
Also, most new tech builds on the learnings of the old tech and discards outdated patterns that were possible with the existing stacks, though advanced users might have stopped using those anyway. For example, Java has no multiple (class) inheritance, in contrast to the much older C++.
Most graduates think they are programmers. Then they fail miserably. Just get over it: it's very rare to become a good programmer when you start coding at 20. Unfortunately only nerds survive, and only nerds drive the real technology.
Joking aside, I don't know how much of this problem is about older devs insisting on having the same dev employment they've always had, rather than actually being old.
Go into management.
Become self-employed.
Start your own shop, service or product based.
Teach younger people what you know for money.
There are many jobs besides just being a "senior developer".
I've also seen many non-tech occupations becoming more technical.
Learn something unrelated on the side and try to apply your dev skills to it.
Being a dev is cool, but being a dev-lawyer, dev-MD, dev-teacher, etc. is probably gonna let you have a good job later.
You joke, but you would be surprised that there are some developers who, once burned out, become Luddites. I know a few: one literally bought a worm farm, one bought farmland and started farming lavender, and another bought a tackle shop / dive bar down in the Caribbean.
That only works if you start at 25 when you graduate. Not a viable plan for someone approaching 40 who wants to make sure they can continue to support themselves into their 60's and beyond.
A solid Plan B is to learn an older tech that people are already retiring out of. Examples would be anything IBM used in the '80s and early '90s: C/C++, COBOL, JCL, UNIVERSE, VB6, etc. Young people are not competing for this work, and you would be surprised at how much it can float you through lean times if you need to lean on it.
Obviously you need to plan ahead, perhaps read some personal finance books now and then. "The Millionaire Next Door" is still relevant. I read it when I was 22, followed most of it...
Why? Someone who starts at 40 is probably earning more than someone at 25. And someone at 40 has fewer years left to support themselves after their 15-20 years of saving are up.
Because 15 years of compound interest in the stock market more than covers the difference in income between a 25 year old and a 40 year old. The 25 year old has enough time to take risks with money, while the 40 year old needs to be thinking more about preservation of capital.
I don't think so. The 25-year-old needs the money starting in 20 years; so does the 40-year-old. The 25-year-old needs the money to last 45 years after retirement; the 40-year-old needs it to last 30 years. Those aren't significantly different time horizons. The 40-year-old can therefore invest with a very similar risk profile, and get very similar returns.
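To put rough numbers on the compounding both comments are arguing about, here's a back-of-the-envelope sketch (sticking with the Lisp from earlier in the thread; the 7% real return and $20k/yr saved are made-up assumptions, not figures anyone here gave):

    ;; Value of saving ANNUAL-SAVINGS at the end of each year for YEARS,
    ;; compounded annually at RATE (an ordinary annuity).
    (defun nest-egg (annual-savings years rate)
      (loop for n from 0 below years
            sum (* annual-savings (expt (+ 1 rate) n))))

    ;; Start saving at 25 for 40 years vs. start at 40 for 25 years:
    ;; (nest-egg 20000 40 0.07) => ~3,990,000
    ;; (nest-egg 20000 25 0.07) => ~1,260,000

Whether that gap decides anything comes back to the horizon question above: the later starter also needs the money to last fewer years, so the comparison isn't as lopsided as the raw totals suggest.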
Try like hell to save your 1.5 - 3 million and strangle your living expenses down to the bone. Then after your "retirement date" (which is code for "slowly drain my principal and hope I don't live past age 80"), hope to continue to land contracts on the side for clients that are smart enough to hire you.
Talk about out of the frying pan and into the fire! Unless you have a guaranteed biglaw job lined up, and you enjoy the 80-hour weeks necessary to survive as an associate in biglaw, becoming an attorney has far worse income and prospects than being a developer.
That's why I told my students when I was a professor: don't become a "programmer," become a "computer scientist." Could Alan Turing get a job if he were still alive?
Why not just enjoy the moment we're living in right now, trying to appreciate it every second, gratefully accepting the decisions of fortune and trying to do what we like!
I think this must be a US thing. There are plenty of older programmers in my current job. Probably helps that it is C++ and not something like Typescript, but still.
You're assuming that "write websites" is the only thing a web developer does. Running web services at scale requires polyglots who understand multiple tech stacks as well as distributed computing, databases, networking, and operating systems.
I went from being the spry youngster (mid-30s) at one company to being the old veteran at a startup, so this is a subject that is often on my mind.
I have definitely seen things like the young people saying some piece of knowledge I have is "obsolete" because they think, for example, that ORMs are a bold new concept that will invalidate the need for databases, just like they were 15 years ago. Especially in node, which trends young, there's a lot of reinventing the wheel, mostly due to a lack of awareness of older tools, although also because of contempt for anything older than 12 months.
I think another side of this is that the old folks are always telling the younger programmers "no." We've seen all the pitfalls, usually because we ignored our elders when we were spry, and we don't want younger programmers to make the same mistakes. What this comes out to is a lot of "do it my way," and then a bunch of predictions of doom that may befall you a year down the road. Nobody wants to hear that.
I know from my own job hunt experience that I was largely ignored by many for lacking experience, yet after I hit the magical peak number I got call-spammed for short-term positions. And that looks sensible compared to the absolute madness of refusing to hire people for being unemployed, which was more common in the Great Recession.
Maybe some folks here will find my perspective informative, speaking as someone who made the transition from individual contributor to engineering manager around the age of 40.
I've worked at big companies since I graduated college. As a software engineer, I remained hyper-focused on specializing in pretty much one specific thing. I've never considered myself particularly brilliant in comparison with some of my peers, but the thing I chose to specialize in just so happened to become super-important to the technology sector about five to seven years ago. I was in the right place at the right time. I cleared my calendar for about half a year, shut out the world, and focused entirely on embodying all the expertise I had accumulated up until then into a new feature.
Now the thing I created is a critical feature in the phone you probably have in your pocket right now. As in, if I hadn't created the feature, it's very likely that a team would have been spun up in a company like Google or Samsung to create it instead.
This catapulted me into a relatively senior position, and then I found that my hyper-specialization couldn't keep me competitive among my more capable and more generalist individual contributor peers. I got to my position by pulling a rabbit out of a hat at a critical point in time, and then I (proverbially) got promoted to my level of incompetence. I can't possibly keep delivering the same super-high impact on a continual basis.
Shortly after I got promoted, my organization started to grow very quickly. Upper management was scrambling to build levels of hierarchy to absorb the growth, and they pretty much pushed me into managing a team in addition to working on some of the residuals of the technology I had built. I soon realized that I didn't have the capacity to continue designing systems and writing code while effectively managing people at the same time. I felt that I had to make a choice: Either give up management and focus on individual contributor work, or make the switch entirely into management.
Of course I had formed for myself a false dichotomy. I could always go into consulting, switch career ladders to sales or product management, or something else along those lines. But getting a taste, I was finding that I liked management. Politics started becoming less of a dirty word for me, and I felt fulfilled and useful when I successfully negotiated a mutually beneficial compromise among parties in the organization. I loved figuring out what my reports needed and helping them to achieve their goals. I loved that I could hack some code here and there and not worry at all about whether it ended up shipping in the near future, since my impact was not being evaluated by that metric any more.
And so now, a little bit past 40, I am 100% an engineering manager. It's something that I just sort of grew into, but at the same time it feels sort of inevitable if I am to stay at my current company. At my level of seniority, upper management would only be happy with me going back to an individual contributor role if I were to pull more rabbits out of hats. I suppose I've accepted that those days are probably behind me, and I'm ready for the next set of challenges.
I'm 42 and still coding. I have a lot of rambly thoughts. . . .
I love programming and don't want to go into management, but it becomes really hard to keep upping your salary after the first 5-8 years or so. (And isn't it silly how many job ads for "senior engineers" ask for 3-5 years of experience?) I could join a FAANG but I don't really want to.
You do keep getting better, even when you keep changing the tech. It feels almost unfair that I started on the web in the early 90s (and was coding long before that) so I got to progress along with today's complexity, starting with early HTML, CGIs, debugging HTTP & SMTP with telnet, supporting IE 5 Javascript & CSS, building websites with Perl & Python run from an in-house app server we built in C, on & on. You can go way down the stack and debug things quickly. You know all the junction points to do a binary search on where a bug is happening. Your intuition is well-honed. You have lots of perspective and good judgment. You get better at seeing the big picture, including crossing from the tech stuff to the business.
Back around 2005 I started doing freelance development ("and consulting"), and that has been a great path for me. I earn more, I work from home, I don't live in the Bay Area, I get more responsibility and respect and variety and control. I've neglected to follow most of the patio11 and tptacek best practices, at least most of the time, but I'm still happy with how I'm doing. I've raised my rates a lot, and should probably raise them again. (At the moment I have four customers who all wish I'd give them 40+ hrs/wk. . . .)
I'd like to experiment with value consulting. I'm much better at estimating than other devs will admit to.
My Plan A has been to start my own dev partnership, although it's been hard to find the right people to get it off the ground. I really like Managing the Professional Service Firm by David Maister, and I think the partner/associates model has great potential for making programming viable as a life-long career.
I haven't tried hiring any people (full time or as contractors), except occasionally designers. Maybe I will soon but then you bill less yourself, you have to sell more, you have to manage them and their work, and of course you have to find people willing to charge less than you. (Tech is obviously supply-constrained right now, so getting good workers at a low cost is the #1 competitive advantage. Like everyone else, I believe I can recognize good developers when I meet them, but I don't really have any great process or insights to make that part of a business plan.)
The longer you do it the more T-shaped you become. Eventually your T has more than one vertical line. :-) But still you kind of have to pick one focus to advertise, if only for credibility reasons. I say my specialties are Rails, relational databases, and devops. But then I also do lots of Python, Angular, and React, and I'm happy doing Java or C, even Android & iOS. Right now I'm almost full-time with C#. :-/
I've enjoyed finding researchy outlets for my spare time. In my case it's temporal databases, but there are so many other cool things happening. I wish I had forty lifetimes to explore them all.
I'm glad I've fended off management so far. I've still had to learn lots about sales, finances, project management, contracts, collaborating, leadership, etc. I already spend plenty of time in Jira, customer meetings, and code reviews.
Probably ageism matters less when you're independent. For one thing there are no 8-hour on-site interviews (even though you cost more!).
Anyway, I'm 42 and coding all day, and I'm very happy. I feel totally lucky to be in this career. I feel tremendous gratitude to our community. HN, Open Source, Linux, bloggers, O'Reilly, Stack Overflow, tech meetups, tech companies, friends & colleagues: all have given me so much.
As someone in a similar age group and position, I enjoyed your rambly thoughts. It gives me the warm fuzzies to hear about your hard work, satisfaction and gratitude.
I'm right there with you in feeling grateful to be part of a generous and intelligent community. I feel lucky to be able to make a living doing what I love to do, and even though the business can be tough, it's a joy with so many bright colleagues around the world.
>the software engineer depreciates only slightly more slowly than the machine he or she toils behind
The author takes a debilitating credibility hit with this statement. A computer can be worthless after 5 years. So an engineer is what, worthless after 6 years?
He also chases this experience paradox too far. When I (and believe it or not more than a few others) hire someone or negotiate pay I assign zero value to years of experience.
If being a seasoned veteran has any value, I want to discover it in relevant interviews through conversation, problem solving, wisdom, etc. I could make a mistake and not see it but I'm not paying more because of a number.
Same with a recent graduate: there's quite a bit of variation in maturity, communication skills (yes, a fine weapon for an engineer to wield), and ability. Some 22 year olds have been coding since they were 12 years old. If all this seems to manifest value during the process, why would I possibly not pay this person more than the standard "new graduate" scale and risk losing them to a competitor?
I realize not all will think this way and yes, some of the things written do happen but the tone is too alarmist. It's just something that can be unfair and a challenge but it's not insurmountable. It's not like being black in Alabama in 1950.
One thing older engineers should do is the same thing younger ones should: don't just naively learn tech that's interesting. "Interesting" is important because it can grant effortless motivation and passion. But at least research and rigorously sort the salary differences for 20 different paths, along with variables like generalist vs. specialist, company size, etc.
Then choose whatever the heck you want, because you've made a conscious decision about your preference balancing interest vs. money, and chosen to make 75k or 250k a year in a state that doesn't have to be on a coastline.
You can even try to put a less enlightened interviewer at ease by being honest. You could tell them you can't be demotivated even if you did take a salary cut, because you don't build your identity on that: "It's not about me. It's about how much market value I can deliver via tangible and measurable results. It's about whether I can actually help the company reach goals faster, add productivity, and help the company be more successful. And if I can convince you such objective things are likely after our interview, I believe they should be the most decisive criteria for your decision."
Here's some random 81-year-old guy, probably all washed up, no doubt. Would you be willing to give him a shot at helping you out on some algorithms? For some reason, after speaking with him, I just feel it wouldn't seem like much of a risk.
That is the basis of the assumption of 40 as a boundary as well, isn't it? I don't think most people really believe there is some biological limit of 40 for programming, but more an apparent shelf-life for your run of the mill programming career. This all strikes me as a specific case of the more general character of any ~20 year or "generational" endeavor.
You experience phases of intellectual and social stimulation, and your response is probably somewhat stereotyped as a normally functioning human. I think it is a bit more acute if you are also doing this in a cohort with many of your peers in the same phase at the same time. Things like the "hype cycle" are observations about these effects in a population or market as a whole, but are rooted in the similar experience of the individual participant.
So, I might rephrase it as: you should give some thought to subsequent phases of your life, and not expect this one to go on indefinitely. There will be changes. The only choice is whether you attempt to steer the changes and/or mitigate the challenges that might be beyond your control...
You don’t seem to understand how ageism works. The limit is “you look old,” whether that’s in person, or because you have 20 years of work experience on your resume. Think “overqualified,” or “not a culture fit.”
It isn’t rational like you seem to think. If I were 40 and thought I could have a 15 year career in software, I wouldn’t be worried, either.
You might be right. I am ~45 years old and have been doing R&D programming for about ~25 years. I have known strong performers of all ages, and seen attrition happen at all ages as well.
I did go through some FANG type interviewing about 10 years ago, ending without an offer. I don't think they rejected me for being ~35, but I imagine they may have decided there wasn't a "cultural fit". There was something a little Lord of the Flies about the experience, which I might have grossly summarized at the time as having been interviewed by a bunch of little kids. I didn't resent this, so much as think I had dodged a bullet.
> There was something a little Lord of the Flies about the experience, which I might have grossly summarized at the time as having been interviewed by a bunch of little kids. I didn't resent this, so much as think I had dodged a bullet.
I recently had a similar experience at Cruise Automation. Of my 4 interviewers, none had been with the company more than a year, and at least two looked like they were in their 20s. The younger interviewers were the ones doing "coding" interviews, rather than system design, and their questions were straight up pulled from leetcode.com.
Apparently, I "struggled" in one interview, because I was only able to answer one question and part of the second one. What I actually struggled with was trying to prove that the algorithm I was using was optimal, which seemed important at the time.
C'est la vie. I later heard from some fairly reliable sources that the place was kind of a shit show, so I also feel like I dodged a bullet or two there.
The article is a bit ironic. It talks about people who don't learn and don't change, yet suggests that they just go and "find a new profession"...
A new grad, compared to a 10-year veteran :D. Yes, sure, it levels the playing field, totally. As if languages and frameworks meant anything. I don't even want to know what kind of "veterans" would compare to a new grad. If you are worth anything at what you do, you would outperform any small team of new grads, even working alone, in a language you don't know and with a framework you don't know. New grads essentially know nothing about software engineering, nada...
It's baffling to me how any company would think otherwise. Perhaps they should review their hiring policies then.
"Skills pay the bills." Generally, the kind of people you want to work for (by the time you're 40-50) will not give a fig about your age if you can deliver.
Here's the thing about being an "old" programmer. If you can't keep up with the newfangled shit, then you probably shouldn't be competing against younger programmers, because at skill-level-parity they will beat you on speed and total hours spent, eh?
For example, the fact that FP is finally making it into mainstream development shouldn't throw you if you have had twenty years to experiment with e.g. Haskell, eh? The only thing close to surprising in decades is how well backpropagation works when you have hella data and CPU power.
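To make that concrete, here's a minimal, illustrative Haskell sketch (the type and the numbers are invented for the example) of the filter/map/fold idiom that has been ordinary Haskell for decades, and that mainstream languages only recently repackaged as streams and array combinators:

    -- Illustrative only: the decades-old higher-order-function idiom.
    data Engineer = Engineer { name :: String, years :: Int, salary :: Int }

    -- Total salary of the 10+ year veterans: filter, map, then fold (sum).
    totalVeteranSalary :: [Engineer] -> Int
    totalVeteranSalary = sum . map salary . filter ((>= 10) . years)

    main :: IO ()
    main = print (totalVeteranSalary
      [ Engineer "a" 3 100, Engineer "b" 15 180, Engineer "c" 22 200 ])

The same pipeline is what you'd write today with Java streams or JavaScript's filter/map/reduce; the concept predates the hype by decades.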
By the time you have been programming computers for twenty years, you should be really good, or you're just a "jobber". There's no shame in that, the world needs jobbers, but yeah, just like in wrestling, there are very few old jobbers.
If you're not a jobber, then at some point you see through the fashion-driven parts of the IT ecosystem and figure out what you really want to do with yourself. At this point a lot of people switch to management, or start their own company. But that leaves the field (of programming) to the die-hards that really give a fig about it. So if you're still in the game, late in life, and you're capable, then yes, there is high demand and you can e.g. command hefty salaries.
I myself am 40, gainfully employed, and paid well at a top SF tech company, and I have several mid-to-late-40s friends who are highly paid and appreciated at their jobs and have turned down offers from FAANG. I think the key is realizing what more junior engineers are able to do well, and competing at something else, like depth (or extreme breadth) of knowledge, soft skills, organizational skills, and of course leadership.
I wish this comment would get more visibility. Experience is extremely valuable when building, maintaining and scaling software systems. The caveat here is of course that you actually spend your early years learning rapidly and not just doing mundane tasks. Self study and keeping up with technology isn’t easy but it’s doable, not impossible. And great engineers with tons of experience will get hired regardless of age.
I'm over 40 - I found myself suddenly back on the job market a couple of years ago when the company I worked for abruptly went out of business. I was only out of work for three weeks before I found something else. I keep current, though - one of the realities of this profession is that you have to keep learning, and you have to do it "on your own time", if you want to stay in it. Personally, I find that to be a reasonable trade-off, but I know a lot of people don't.
The $64,000 question is "Does the experience and skills that you have as a 40+ year old make you worth hiring versus a junior developer or fresh graduate?" Unhappily for a number of people, the answer is "not really".
This is not a profession in which one can coast. Every year in which you're not learning something that makes you a better and more capable professional, whether at your job or on your own time, is one year in which you're falling behind the rest of the pack.
I think software has a bit of an "up or out" mechanism built in. If you're 40 and a seasoned team lead, effective manager, unflappable consultant, wizardly engineer, or irreplaceable specialist, then you're fine. If you're 40 and just another serving of cannon fodder, you probably aren't.
I know plenty of people well over 40 who are going great guns in the software industry, in leadership or consulting roles. I also have one former colleague who, sadly, isn't - he was a senior developer at a company for several years, but got laid off; he's experienced, but not as quick as he once was, and doesn't have any particular special roles on his CV, so couldn't immediately bounce into another job.
I turn 40 next year, and can't honestly put myself into any of the categories I list above, so I should probably start worrying about this myself!
It seems to me that older developers tend to fall into two groups. One group goes, “the old ways were better, this new stuff is crap, I can’t believe they make us use it.” The other group goes, “there’s a lot of good stuff in the old ways, here’s how we can use that experience to our advantage now.” The latter has a much better time.
In general, if you don't keep up with the pace and the trends, you will become out of touch. But I would suggest this applies to programmers who stop learning as they age.