It's a legitimate concern. I spent 4 months of evenings and weekends (could have been one or two, to be honest) prepping for the Google Cloud Architect exam. I have used NONE of this knowledge beyond what I already knew (load balancing and spinning up VMs/containers). I think I used Dataprep instead of Excel once too, for shits and giggles.
I am looking to spend another 4 months learning ML, which I likely won't apply in any way.
That's a year down the drain with ONE cloud provider and ONE way of doing ML. It's easy to completely waste your life like this WITHOUT getting better at your job.
Carry-over is far more limited than people make it out to be, unless you REALLY know a lot, but those people are rare.
Being 35, having worked in IT for 15 years now, and seeing the rapid acceleration into DevOps/Cloud/nix/Programming/Stacks, I fear for my future. I want to learn a ton of stuff, but the vast amount I'd need to learn in order to move up in my salary bracket is stifling: AWS/Azure plus everything I just mentioned, then Python, YAML, and cloud networking. I'm good at some of it, but the industry is moving so ultra fast now that it's hard to keep up.
I've been contemplating getting out of IT altogether because I'm not fully confident in career growth at this point unless I murder myself with study and ignore my family.
I've been an MS SysAdmin for 15 years; moving into nix devops (the new way of sysadminning) isn't easy.
"unless I murder myself with study and ignore my family"
I think this is completely true as well. To move past senior developer, you fall into one of these camps:
1. Very bright.
2. Spent a LOT of time studying or messing around with the right tech on your own.
3. Sell your soul, i.e. ignore (or not have) family/kids.
A lot of people from camps #1 and #2 don't understand that for most, #3 is the only option (in the short term). There is also the very real tradeoff of skipping #3 and risking declining job prospects/salary.
I think this is doubly painful for devs, because they are generally used to quick career progression / salary bumps, and then it stalls hard at senior dev.
Why do you need to progress past senior developer? Senior developer jobs provide interesting, fulfilling work, and excellent salaries (probably 90th percentile).
There is strong cultural pressure (at least in the US) to constantly progress in your career. This is measured primarily by title and compensation.
I've been a senior developer at the same company for 13 years. I feel that most of the time, I am progressing in my knowledge and experience, so, in my mind, I am making progress. It just doesn't seem that way on my LinkedIn profile.
What prevents title changes, if only to please the LinkedIn profile? Junior to Senior is a change that occurs within the first 5-10 years. Couldn't you discuss title changes with your manager, just to show you haven't been doing the same thing the whole time?
Deputy Developer? Elder Developer? Doyen of Development? Development nestor of company X? Director of Engineering at Sub-sub-sub-sub-department that happens to be just your team? Level 20 Wizard? Does it even matter if it sounds good on LinkedIn?
I'm a little confused, because I wasn't aware that there was career progression beyond Senior Developer. I mean, you can go and lead a team or something if you want I guess (in fact, I'm doing that at the moment), but most older developers I know have been and done that, and settled back in highly-paid, highly-respected, and much easier individual contributor / architectural roles. Looks like a good life to me!
The best developer I've ever had the pleasure to work with was a 50-year-old senior developer. He cut his teeth doing a lot of C/C++ stuff back in the day, but was also (pretty successfully) leading the company's adoption of Angular. If you have a sharp mind, and you don't get stuck in your ways, then people will be begging for you to be their 50-year-old senior developer.
I know more than a few places that have Principal Developer positions. This is basically for senior devs who have tons of domain knowledge that companies don't want to lose.
Um, I know lots of 50-year-old (and older) senior developers.
I'm not ignoring the fact that ageism is a real thing (it most definitely is), and it is more difficult for many older programmers to "keep up", but that doesn't mean that no one is doing it.
Our engineering job ladder does not recognize any differences in core software engineering skills after Senior. It is entirely about social and organizational skills.
Running contentious meetings and herding directors are difficult skills to even begin to practice on your own time, but navigating family life is probably as close as you can get.
Maybe this is a good place to articulate what's been on my mind lately regarding growth. I don't think there's much local opportunity for developers beyond starting their own businesses past the senior level. Instead you have to be willing to move. Individual companies will not be able to challenge you beyond their business needs.
The other thing I wanted to talk about is how I solve technical problems. The first thing I do is get a representation of what the problem is in my mind enough to where I can see a clear path forward. This leans on my ability to take a 10,000 foot survey of a problem space. My current role deals with microservice architecture. Microservices is right in my wheelhouse due to my better-than-average sysadmin skills.
But I end up having to learn a lot within a short amount of time. So in order to cut down on what I call the "sheer mass of information needed for mastery" any time I look at a new tool or tech, I make a beeline for the "architecture" or "concepts" page. This is where I work out exactly which concepts and which level of the architecture I'm working at.
I then use the problem statement and vision above to home in on a perfect implementation. Then I look at the actual state of the system and work to bring it more in line with the perfect one.
I was recently tasked with getting one of our guys unstuck. He was having a tricky issue with aquasec that he'd been beating his head against for a week. It took me five minutes to understand the problem, then I went to my desk and spent twenty obtaining a reproduction. I didn't want to redo his work, so I then asked him what happened when he did X and Y. From his answers I had a clear path to demonstrating that it was aquasec throwing a false positive on an npm library, and I was already in talks with our devops team about next steps. It took 30 minutes for me to move his issue forward.
I feel like this manner of solving issues with techs that you don't necessarily have full understanding of could revolutionize the industry. But I can't really grasp how to teach it. It looks like magic to people when I show it to them, they think it relies on years and years of experience. I mean, it kind of does, but I was able to avoid ever getting fully stuck on problems even as a teenager.
But stretching out the problem space, treating each barrier in turn, and diving a little into complex techs along the way: I don't see a lot of coders doing that. Instead they just kind of muddle around with what they know; believing you need perfect understanding of a tech before you can solve problems effectively with it seems to be the norm. And so we have a tech landscape where years of experience in specific technologies becomes the primary way most employers judge candidates.
I think the increasing march of devops and other techs that purport to unite the whole world into one walled, splendid garden will eventually bifurcate the tech world into supermen who know everything, and an underclass who can only work in one garden. If that's not how things already are.
Maybe a secondary school for advanced coding or bringing guilds back.
As an MCSE from the late 90s, I can tell you with oldfart-confidence that you can rely on Microsoft maintaining a landscape where you can be an MS-only sysadmin for decades to come. Sure, the salaries might not be stratospheric, but corporate AD & GPO work will be never-ending. There are lots of businesses running on Windows laptops and desktops and that ain't changing.
yeah, I would agree with this. You may not have the servers locally, but there is still a lot of admin stuff to do, and Azure doesn't configure itself.
"I've been contemplating getting out of IT altogether because I'm not fully confident in career growth at this point unless I murder myself with study and ignore my family."
That's a real problem. I had a few years when I worked on pretty cutting edge stuff so most of my learning was on the job and could be applied quickly. In my current job there is a lot of repetition and stagnation so you have to spend a lot of time outside the job learning stuff which you then never apply. This gets really old after a while.
You seem to need to study programming before being able to do real DevOps. nix is kinda advanced in this sense, even if it is a small language. In my experience, very few ops are able coders.
Interestingly, earlier in my career, most of the developers I worked with were also capable at system administration, and could easily fill that role if needed.
The sysadmins during that time were quite capable, but had no desire, or an admitted lack of ability for a development role.
Given their background I think they meant nix as Linux/Unix and not Nix the programming language/package manager.
Despite its clear fit with the current hotness of immutable systems and functional programming, I don't think I've seen anyone in for-profit industry on a Nix/NixOps stack.
I feel a similar way, and I'm a little bit older with a fairly strong desktop/web/mobile dev background. My focus was heavy on Microsoft technology before moving 50/50 to Linux.
I look at industry as a series of different waves happening where I just need to get on one so that I can find the next. A wave is going to cross the startup world before it moves into established business. If I time it right then I can hang in there for a while. It doesn't help with my anxiety, but I've done this before so I know I can do it again. I suspect I'd find the same pattern in a closer examination of your background.
There are very few people learning all of these technologies, because time is required to learn the basics, as well as to put them into practice within industry. That's deep learning. Not many people are lucky enough to work in places that will put unproven technology into production. The number of places willing to take this risk is increasing, though, because cool technology is a requirement for attracting top talent. If you are in a more risk-averse organization it may seem scary to move to a faster-moving, less risk-averse one, but you might find they actually have more leeway for mistakes and learning.
I watch for particular patterns around how a technology is hyped and who is applying the technology. Thoughtworks publish their perspective (https://www.thoughtworks.com/radar) and I try to read this, plus other analyses to understand what direction things are moving in. You have to take a longitudinal view. It's not good enough to just compile your research and make a decision. You need to know how a technology has moved over time. Once you start doing this you start to develop some "spidey" senses when you see something at the top of HN - and you don't have a ramp-up cost each time you have to make a switch.
Other than that, focus on a few fundamentals, including knowing one programming language reasonably well. I'd strongly suggest learning Python, with an incremental two-year development plan so you don't half-ass it. A lot of the Linux world is built on stable skills. Rather than focusing on AWS or Azure networking, learn networking and TCP/IP beyond a cursory level. There are a lot of folks building cloud infrastructure badly and slowly because they don't understand these fundamentals. The references are old and boring, like TCP/IP Illustrated Vol. 1.
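To make the "stable fundamentals" point concrete: every load balancer, VPC, and health check in every cloud is ultimately the same TCP byte stream. A minimal sketch using only Python's stdlib socket module (an ephemeral OS-assigned port, an echo of one message) shows the layer that never changes underneath the vendor branding:

```python
import socket
import threading

def echo_server(srv: socket.socket) -> None:
    # Accept one connection and echo its bytes straight back.
    conn, _addr = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# Bind to port 0 so the OS picks a free port; listen() before the
# client connects, so the connection just queues in the backlog.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=echo_server, args=(srv,), daemon=True).start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", port))   # TCP three-way handshake happens here
    cli.sendall(b"hello")
    reply = cli.recv(1024)

srv.close()
print(reply)  # b'hello'
```

Nothing cloud-specific in those twenty lines, yet it's the mental model you need to debug a misbehaving security group or a half-open connection on any provider.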
I would suggest trying to learn the general principles behind the one way of doing ML/cloud/whatever that you have studied. That type of breadth of knowledge is the useful kind: it lets you gain further knowledge much more efficiently. As I mentioned above, to a great extent breadth of knowledge is about learning how to be an efficient learner.
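As an example of a general principle behind "one way of doing ML": nearly every ML product, from a managed cloud training job to a hand-rolled script, is some variant of minimizing a loss by following its gradient. This toy sketch (made-up data, a single parameter, plain Python) is the skeleton that transfers across every framework and provider:

```python
# Fit y = w * x to toy data by gradient descent on squared error.
# This loop is the core idea every ML framework elaborates on.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # generated by the "true" w = 2

w = 0.0      # initial guess
lr = 0.01    # learning rate (step size)

for _ in range(1000):
    # dL/dw for L = sum((w*x - y)^2) is sum(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys))
    w -= lr * grad   # step against the gradient to reduce the loss

print(round(w, 3))  # 2.0
```

Learn this once and the vendor-specific APIs become thin wrappers to look up, not four-month study projects.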
I would say the most important knowledge is the kind that lets you ask "why is this still not done?" or say "well, they did it, but it won't work well in situation X", and then either find the products that do exactly that, or know that in a few years there will be an industry providing exactly that kind of service.
Saying that you can't move between cloud providers because you need to learn something new probably implies you don't understand the fundamentals constraining the design of e.g. databases provided by cloud.