Yes. But as the article said, each additional year of true experience adds diminishing value after a certain number of years. I would say around 10.
I definitely don’t believe in the “10x Engineer” (individual contributor) - yes they do exist but are so rare they aren’t worth talking about. I do believe in being a force multiplier as a team lead/mentor.
On the one hand, sure, percentage-wise you learn less in year 20 than in year 10, because you already know a lot more in year 19. But that is no more true of this field than any other field. Is a doctor, architect, civil engineer, or auto mechanic with 20 years experience more valuable than one with 10 years experience? Heck yes.
20 years ago, much of what I use today didn't exist: there was no AWS, no C# (but C++ was close enough, I guess), no mobile development where you had to worry about semi-connected networks and syncing, etc. There is no part of the human body that exists today that didn't exist 20 years ago.
You could argue the opposite is true: 20 years ago, there were dangerous misconceptions about how some body parts work. Many chemical pathways were completely unknown. Medicine and biology have both evolved a lot in the last 20 years.
The big picture, however, is still valid. Concepts and paradigms hold for decades. We still use TCP/IP. Computers still use the von Neumann architecture.
Looking at the details, though, AWS is just a fancy GUI over time-sharing on a mainframe.
C# is just new syntax for concepts that are older than I am.
While mobile brings new challenges, you no longer have to deal with customers connected through a 300 baud modem.
In 1986, not only did I have to know 65C02 assembly language to get any performance out of my 1 MHz Apple //e, I had to know that I could get 50% more performance for every access to memory in the first page (the zero page) as opposed to any other page. If I spent time doing that type of micro-optimization today, I would be fired. Back then I couldn’t have imagined doing the types of things I can do today with modern technology.
In 1995, when I wrote my first paid application in college, the Internet was already a thing at most colleges. I did some work on a HyperCard-based Gopher server (long story) that wouldn’t have been possible 10 years earlier.
In 2006, I was writing field service software for ruggedized Windows Mobile devices. Architecting for semi-connected smart devices is a completely different mindset than terminal programming or desktop programming. That wasn’t feasible before hardware became cheap and at least 2G was ubiquitous.
Even then, what we could do pales in comparison to the type of field service implementation I did in 2016, when mobile computing was much more capable, much cheaper, you could get cheap Android devices, and 3G/4G was commonplace.
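To make that "semi-connected mindset" concrete, here is a minimal sketch of the outbox pattern those apps lean on: every write lands in a local store first, and a sync pass pushes whatever it can whenever the network happens to be up. This is an illustration only; the Outbox class and the endpoint URL are made up, and a real field service app layers conflict resolution, retry backoff, and auth on top.

    # Illustrative outbox pattern for a semi-connected device (not from any
    # real product): record() always succeeds locally, sync() drains what it
    # can when the network is reachable.
    import json
    import sqlite3
    import urllib.request


    class Outbox:
        def __init__(self, db_path="outbox.db",
                     endpoint="https://example.invalid/api/jobs"):  # placeholder URL
            self.endpoint = endpoint
            self.db = sqlite3.connect(db_path)
            self.db.execute(
                "CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, payload TEXT)")

        def record(self, payload):
            # Works with zero bars of signal: just persist the write locally.
            self.db.execute("INSERT INTO outbox (payload) VALUES (?)",
                            (json.dumps(payload),))
            self.db.commit()

        def sync(self):
            # Push pending rows oldest-first; stop at the first failure and
            # try again on the next pass.
            sent = 0
            rows = self.db.execute(
                "SELECT id, payload FROM outbox ORDER BY id").fetchall()
            for row_id, payload in rows:
                req = urllib.request.Request(
                    self.endpoint, data=payload.encode(),
                    headers={"Content-Type": "application/json"})
                try:
                    urllib.request.urlopen(req, timeout=5)
                except OSError:
                    break  # still offline or server unhappy; keep the row
                self.db.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
                self.db.commit()
                sent += 1
            return sent

The key difference from terminal or desktop programming is right there in sync(): failure is the normal case, not the exception, so both the data model and the UI have to be designed around "eventually" rather than "now".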
But people who think cloud computing is just “sharing mainframes” and don’t rearchitect either their systems or their processes are how we end up with “lift and shifters” and organizations spending way too much money on infrastructure and staff.
Also, anyone who equates managing AWS to a “GUI” kind of makes my point: if you’re managing your AWS infrastructure from a GUI, you’re doing it wrong. 10-15 years ago you couldn’t set up your entire data center by running a CloudFormation template or any other type of infrastructure as code.
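For anyone who hasn’t seen infrastructure as code in practice, a minimal sketch (assuming boto3 with credentials already configured; the stack name and the single S3 bucket are placeholders, and a real environment would keep the template in version control and run it from CI):

    # Declare infrastructure in a template and create it by running code,
    # not by clicking around a console. Stack name and bucket are placeholders.
    import boto3

    TEMPLATE = """
    AWSTemplateFormatVersion: '2010-09-09'
    Resources:
      ExampleBucket:
        Type: AWS::S3::Bucket
    """

    cfn = boto3.client("cloudformation")
    cfn.create_stack(StackName="example-stack", TemplateBody=TEMPLATE)
    cfn.get_waiter("stack_create_complete").wait(StackName="example-stack")
    print("stack created without touching the console")

The point isn’t these particular ten lines; it’s that the whole environment is reviewable, diffable, and reproducible, which is exactly the part a lift-and-shift mindset misses.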
How has medicine evolved in the last 20 years? I don’t doubt your statement, but you make it sound like common knowledge, and from my point of view (average non-medical person) not much has changed.
All of which is just details, which are much less important than the fundamental skills of building systems with whatever people and tools are available. (And I'm sorry, are you implying no significant changes have occurred in the tools and practice of medicine in 20 years?)
Thinking that moving from on-prem to AWS, for instance, is trivial is how you end up with “AWS Architects” who were old-school net ops guys, who only know how to do a “lift and shift,” and who end up costing clients more. They pattern matched, thought AWS was just an implementation detail, and figured they could just set up everything like they would on-prem.
Just one note about 10x, because I often see that people who don't believe in it have the definition wrong. 10x programmers are not ten times better than the average programmer; they are ten times better than the worst programmers. This is based on an actual study, and by the metrics the study chose, this disparity does exist. Of course, measuring programming performance is notoriously intractable.
> But as the article said, each additional year of true experience adds diminishing value after a certain number of years. I would say around 10.
I don't buy it. 10 years is about when you start moving into the actual expert category. Note I said start.
At 35, I finally had real, full control over multiple languages, could pick up CLR and understand and implement any algorithm in it, finally understood exactly why concurrency was so damn hard and how to mitigate that, and would pass practically every interview with flying colors. I could finally drive my tools with some facility and started to realize gdb was my friend.
At 45, I can predict the errors I and others are likely to make and take steps to mitigate them up front, although I still get irritated when I make the mistake anyway. My comments are now psychic: my team often remarks how “I just thought that I could really use a comment explaining this, and behold, there it was.” I can reduce interviewers to tears and can surprise all but the most knowledgeable experts in their own domains. I reach for gdb far more often, but am still frustrated at how much I don’t know about it.
I still only consider myself an "expert" in very few subdomains--none of them involving programming languages.
One of my heavy hitter software guys once said: "Your code is the most straightforward code I have ever read." I apologized for being so simple. His laughing response: "Don't apologize. That was a compliment, dumbass."
> At 35, I finally had real, full control over multiple languages, could pick up CLR and understand and implement any algorithm in it, finally understood exactly why concurrency was so damn hard and how to mitigate that, and would pass practically every interview with flying colors. I could finally drive my tools with some facility and started to realize gdb was my friend.
You may be an expert in multiple languages, but as the article said, if the company is looking for Ruby developers to write a CRUD app, they no more care that I spent years doing C than they care about the years I spent doing 65C02 assembly in the 80s.
No matter how many subdomains you are an expert in, if the company doesn’t need that experience, it doesn’t matter.
But some things translate much better than others: Django and Rails aren't that different (I'd also put Laravel in there). So I think the real problem is moving from a senior Django role to a senior Rails role and vice versa, but I see no reason why a 10-year Django developer can't get a mid-level Rails job (other than blatant age discrimination, that is).
Isn’t that the point? That if you have 10 years’ worth of experience as a Django developer, you aren’t as attractive to someone looking for a Rails developer as someone with 5 years of Rails development.
Simplicity is definitely a virtue. When I maintain a large codebase from a 10+ year project, it is easy to spot where someone tried to be very clever, and it is rarely a benefit to the codebase's maintainability and extensibility in the long term. Things rarely get reused enough to take advantage of an overcomplicated abstract solution. Most of the time, developers do not predict future business requirements correctly, and the simple solution would have been the better one.
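A toy, entirely hypothetical before/after of what that cleanup usually looks like: the "clever" version builds a pluggable pipeline for a single discount rule, while the simple version is a function the next maintainer can read and change in thirty seconds.

    # "Clever": a generic pipeline framework for requirements that never arrived.
    class Step:
        def run(self, value):
            raise NotImplementedError

    class DiscountStep(Step):
        def __init__(self, rate):
            self.rate = rate

        def run(self, value):
            return value * (1 - self.rate)

    class Pipeline:
        def __init__(self, *steps):
            self.steps = steps

        def run(self, value):
            for step in self.steps:
                value = step.run(value)
            return value

    clever_price = Pipeline(DiscountStep(0.1)).run(200.0)   # 180.0

    # Simple: what the business actually asked for.
    def discounted_price(price, rate=0.1):
        return price * (1 - rate)

    simple_price = discounted_price(200.0)                  # 180.0

When the second requirement finally shows up, it is almost always easier to grow the simple function than to bend the framework that was built by guessing at it.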