I think it's possible to agree with both this post and with the other programmer he's railing against.
Yep, I know that if I don't take care of my own training, it's just not going to happen; most companies aren't so altruistic that they'll hand me everything on a silver platter.
But at the same time, a company that never hires people unless they already have the exact skillset they're looking for, a company that fires people on a whim because priorities change, and a company that provides zero incentive for people to keep learning (e.g. with 20% time or a willingness to let employees experiment with new tech) – well, those are not companies I want to work for.
You hit the nail on the head. It's the programmer's responsibility to stay relevant, not some company's. But the best companies will take an active role in ensuring their programmers stay up to date (since it's in the company's best interest, too).
I've been just dumb lucky in that the companies I've worked for have been very supportive of individual engineers' efforts to learn new things, whether through funding conference attendance and travel or simply by giving employees time during work hours to learn on their own. But the biggest gift my past and present coworkers have given me was demonstrating that learning outside of the job is an essential component of doing great work.
It wasn't until my first few times sitting in on hiring discussions that I encountered professional engineers who didn't take an outside interest in expanding their toolset.
There is absolutely nothing wrong with them. They are fantastic folks who are very good at their chosen profession. And for the majority of professions (I only have anecdotal knowledge of this, but I would love to hear examples from other fields) that is enough to ensure stable employment. But I can't imagine the fear, uncertainty, and doubt that accompanies company layoffs and downsizing when you haven't played outside of your comfort zone in a while.
This is just a long-winded way of saying, “great summation, SheepSlapper!”
I have been writing and rewriting a post for the better part of a year on this same thing. I think I've finally accepted that a majority of programmers actually are not interested in learning new things.
I got into software because there was so much to learn and explore, so this realization still baffles me. Why on earth would someone want to do this job and not want to learn new things? It's like a baseball player who hates being outside.
Not only that, but oftentimes I'm faced with situations like the author's, where people I've worked with actively prevent those around them from learning new things on the job. "No, don't write this standalone module in Python; our standard is PHP. It was good enough 5 years ago, it's good enough now!" (in a four-man shop).
As someone who loves constantly learning more, it's suffocating to be around people who are so paralyzed. I simply cannot fathom the fear that drives someone to respond to the offer to learn something new on the job with, "No thanks, I'm happy becoming obsolete, and you can't learn it either, because I might have to support it one day, and I'm not interested in learning anything new!"
No, they are interested; learning is fun for them too. But not everyone is interested in learning the same things you are. For me, for example, there is absolutely nothing more boring than some new fancy framework, API, or language. I am, however, very excited to try things that might reduce the number of bugs and potential security issues in my code, or make it cheaper to run and cheaper to support.
So long as you don't have to learn a language, API, or framework to do it... I'm not sure what things are left you could do differently? Your list excludes all technology-related changes, so what remains is meta-activities: pair programming, documentation, planning, a good night's sleep, etc. Those are excellent, but I'm curious why you're willing to change those other areas yet are resistant to technological ones.
That irrational fear of technological change is exactly what I'm talking about. There's a joke among some of my friends: "What's the best way to get a mediocre .NET programmer to stop talking about safety and reducing bugs? Bring up the safest, most secure .NET language that exists: F#."
As the joke goes, most of these developers are confronted with the reality that they don't really care about bugs enough to bother learning a different, fully supported .NET language that will find entire classes of bugs automatically, is orders of magnitude safer than C#, and is faster to write.
Most people who have told me they "care about bugs, not languages" were just saying something nebulous to kill the discussion and defend their life choices.
I'm not trying to be snarky to you, I'm actually very interested to know what changes you've made that are not technological to reduce bugs, fix security holes, etc. I'm always on the prowl for such stuff, and this sounds intriguing.
Let's see: I'm learning about unscented Kalman filters, moving-object tracking in computer vision, PCA, and more. I'm learning about the problem domain that I'm working in. I have no interest in learning yet another API unless I absolutely have to, because the fact that I have to call the fibble() function before calling xacxtyor() in some poorly documented API isn't really interesting knowledge. That's not learning; it's the accumulation of facts.
Of course, sometimes learning a new API or language enables meaningful learning, and I'm all for it. Want to quickly get a handle on Kalman filters? It's probably a lot faster to put something together with numpy (say) and experiment at the REPL than to program it in a performant language like C or C++, even if that is where it will eventually end up running in your project. So, yes, there's a great reason to learn those languages (or Julia, or whatever).
But the endless march of APIs is pretty tiring to me, and I do everything I can to isolate myself from that. I'm pretty happy writing my C++ and Python (2.7, btw) while trying to solve rather hard problems.
Another way to put it: what I value is intellectual work, not memorization or puzzling things out. Learn a new math technique? Awesome! Learn the byzantine set of calls needed to make a widget widge on the screen? Not so much.
Well, it sounds like you have no qualms about learning the right tool for the job. I would consider numpy to be an API, but the difference here is that it's one that helps you with your domain. If widging widges were your domain, I'd expect the professional thing would be for you to learn that API. I'm talking about people who, in your job, wouldn't bother with Julia or numpy and would just trundle along with C.
This is great, but it's not addressing the point, which was: "I am, however, very excited to try things that might reduce the number of bugs and potential security issues in my code, or make it cheaper to run and cheaper to support."
Also, if you like math, go learn some functional programming. The ideas of abstract algebra, which you will encounter in FP, are increasingly relevant to machine learning as well. E.g. http://jmlr.org/proceedings/papers/v28/izbicki13.pdf
> But the endless march of APIs is pretty tiring to me
I hear you on that one.
> and I do everything I can to isolate myself from that.
But still, you can't stop thinking about design entirely. When a new ideally-pure-C-but-C++-if-you-really-need-STL library is ready to actually be put to use, you'll have to put an API of some sort on it. Hopefully that API will be informed by your use of other people's APIs, both good and bad.
Don't forget that the whole point of computers is that they're immediately useful. You can still do most of the math you want with a pencil, paper, and a stack of books.
> I'm not sure what things are left you could do differently?
There are so many things that are left, one cannot do them all in a lifetime! Analyze as many bugs as you can, invent your own language that prevents them from happening, write your own compiler for this language, make it fast, give it better memory management (not some stop-the-world GC, but something that honors low latency), and so on.
EDIT: As RogerL is saying, what I want to learn is intellectual knowledge that is useful to me. It's pretty much never the same thing anyone else in the room wants to learn.
So clearly you are not the class of person I have ever dealt with in the past. It would be an incredible breath of fresh air to work with someone who cares enough about their work to write their own compiler for it.
Right now I'm working through SICP, having just finished PLAI. Do you have any suggestions for a book on compilers I could do next? I was recently steered away from the Dragon Book because it's "missing a lot of recent compiler research", but that person had no good alternative.
I think there are definitely three main groups of software people: those who wire together languages and frameworks, those who write languages and frameworks, and those who use a tiny subset of languages to do research. My problem is that I think I've been heading down the learning path that leads to the second, and you sound like someone who manages to do a job similar to mine by being good at that second, far more interesting path. I'd be really interested to hear any advice you have. Do you basically just have to work alone?
Oh sure. Right now I've been deep into Scheme, since I just finished PLAI and am halfway through SICP. I'm not chasing every new Perl derivative every six months, but I'm mostly stuck using C# at the day job. It feels constraining to be learning all this awesome stuff about languages and not get to use most of it, hence the desire to sometimes try other languages at work.
What sort of things have you been learning that you would recommend, off the beaten path of just another MVC framework? I'm thinking of learning about compilers next, but I'd like a book to work through rather than just "the internet". Also, I'm trying to catch the type-safety bug by working through Real World Haskell; I'm open to other suggestions for that too.
> there is absolutely nothing more boring than some new fancy framework, API, or language
Hear, hear. There must be 100+ "frameworks" out there, all for the task of rendering a web page. If you learnt them all, you would be stupider than when you started. Pick a handful of technologies - ones that will last - and get deep into them. Step off the crazy treadmill and go for quality, not quantity.
If you had gotten into a "full stack" (lol) of Unix, Oracle, and C++ in 1994, you would still be very, very employable today, as long as you remained more or less current with them, and you still will be in 2024. Whereas if you learn "frameworks", you'll be starting again from scratch every year or two.
You are assuming that most of the languages people use don't have safe automatic memory management, right? This is of course incorrect. Now, let's compare Rust and Go: what makes Rust programs have fewer bugs than Go programs? Is there any supporting research showing that Rust eliminates any particular class of high-level bugs? No, there isn't. So, no, Rust doesn't focus on eliminating bugs.
I'll make one effort in good faith to answer your questions, though I expect you're not interested in hearing the answer.
There is a class of programs where manual control over memory layout is important. For example, if you're writing an OS, as Mozilla is, you need this control to talk to hardware. It is also important in some domains where performance matters (e.g. games and big data). Rust is the only language (outside of research) that offers control over memory layout while also providing memory safety: no access to uninitialized memory, no use-after-free, and so on. This clearly eliminates a huge class of bugs relative to C and C++, the only languages with substantial usage in this space.
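To make the combination concrete, here's a minimal sketch (in present-day Rust syntax, which may differ from the Rust of this discussion): a struct with a fixed, C-compatible layout, stored in a contiguous buffer, where an out-of-range index panics instead of silently reading past the end of the allocation.

    // A fixed, C-compatible layout -- the kind of control you need when
    // talking to hardware or foreign code.
    #[derive(Clone, Copy)]
    #[repr(C)]
    struct Pixel {
        r: u8,
        g: u8,
        b: u8,
        a: u8,
    }

    fn main() {
        // Contiguous buffer, laid out exactly like a C array of structs.
        let buf = vec![Pixel { r: 255, g: 0, b: 0, a: 255 }; 1024];
        println!("{} bytes", buf.len() * std::mem::size_of::<Pixel>());

        // Unlike C/C++, an out-of-range access panics at runtime rather
        // than reading whatever memory happens to sit past the buffer:
        // let oops = buf[4096]; // panic, not silent memory corruption
    }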
Race conditions (data races) are considered important enough by the Go developers that they ship a tool to detect them: http://blog.golang.org/race-detector In Rust these errors cannot happen, as programs containing data races do not compile.
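For example (again a sketch in present-day Rust): mutating a plain shared counter from several threads is rejected at compile time, so you have to opt into a safe sharing mechanism such as Arc<Mutex<_>> before the program will build at all.

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // Sharing a plain `let mut count = 0` across threads and mutating
        // it directly is a compile error; Arc<Mutex<_>> is one safe way in.
        let count = Arc::new(Mutex::new(0u32));
        let mut handles = Vec::new();

        for _ in 0..4 {
            let count = Arc::clone(&count);
            handles.push(thread::spawn(move || {
                // The lock is released automatically when the guard drops.
                *count.lock().unwrap() += 1;
            }));
        }

        for h in handles {
            h.join().unwrap();
        }
        assert_eq!(*count.lock().unwrap(), 4);
    }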
Then there is the usual modern type system stuff of eliminating nulls and so on.
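Concretely (find_user here is a hypothetical function, purely for illustration): a value that might be absent has type Option, and the compiler refuses to let you touch the value without handling the None case, so the classic forgotten null check can't be written.

    // Hypothetical lookup: Option<T> replaces the null pointer entirely.
    fn find_user(id: u32) -> Option<&'static str> {
        match id {
            1 => Some("alice"),
            _ => None,
        }
    }

    fn main() {
        // The missing case must be handled before the value can be used;
        // there is no "forgot the null check" code path.
        match find_user(2) {
            Some(name) => println!("found {}", name),
            None => println!("no such user"),
        }
    }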
I don't like arguments like that, as they don't have any supporting data. I happen to analyze a significant number of bugs in a large C codebase, and the things you are talking about don't seem to be as important as you claim. But well, this is what's wrong with programming languages: nobody cares enough to do the research. That's OK though; we just think about languages differently.
I didn't say that memory safety is not important. I was talking about the need to control memory layout and other bells and whistles.
Anyway, you suggested Rust under false assumptions. Rust doesn't care about reliability and security any more than most modern languages do. Even Perl with its taint mode is more secure than Rust.
Mozilla's Gecko team is incredibly interested in both uncompromising performance and strong security. Rust is the first language in industry to offer the zero-overhead abstractions of C++ while retaining memory safety (with the possible exception of Ada, which has never taken off outside the government sector).
You seem remarkably uninformed as to what Rust's goals are. Allow me to enlighten: reliability is a big, big deal to the Rust developers. Security is a big, big deal to the Rust developers. Speed is a big, big deal to the Rust developers. Memory efficiency is a big, big deal to the Rust developers.
As for your mistaken assertion that such efforts at memory safety are unnecessary in real-world code:
>>> * Do we have data showing how many security bugs we could be avoiding in
>>> Servo in comparison to Gecko? Is the security benefit truly as valuable
>>> if expected performance benefits don't pan out?
>>
>> We've been talking to some members of the security team (Jesse, Brian). In
>> general the main class of security vulnerabilities that Rust offers a layer
>> of defense against is memory safety problems in layout, rendering, and
>> compositing code. Use-after-free is the big one here, but there are others.
>> I'm not in the sg so I can't run the numbers myself, but I am told this
>> constitutes a large class of security vulnerabilities.
>>
>
>A quick scan suggests that all 34 sec-critical bugs filed against Web Audio
>so far are either buffer overflows (array-access-out-of-bounds, basically)
>or use-after-free. In many cases the underlying bug is something quite
>different, sometimes integer overflows.
>
There are 4 sec-high bugs --- DOS with a null-pointer-deref, and a few bugs
reading uninitialized memory. The latter would be prevented by Rust, and
the former would be mitigated to the extent Servo uses the fine-grained
isolation Rust offers.
There are no sec-low bugs.
Web Audio is an example of a feature which has very little security impact
of its own. Its security impact is entirely due to bugs where violation of
language rules can trigger arbitrary behavior. Rust prevents such bugs. A
lot of Web features are in this category.
TL;DR: Firefox's Web Audio component, which in theory ought to have practically zero attack surface, contained at least 34 critical and exploitable security vulnerabilities. All of these were a result of the lack of safety afforded by C++. Rust would have made these vulnerabilities impossible.
This is pointless. Of course memory safety is necessary; I never said that it isn't. I was assuming that any sane person would understand that. And guess what? Most of the mainstream languages are safe in that regard. So you cannot claim that Rust is particularly safe; it isn't. It's safer than C and C++, but that's about it. And that's OK, no need to be offended.
I'm out; this discussion is not productive for me. You have given no evidence to back your claims -- you have no data, to use your phrase, though you are so keen to see mine.
Specifically:
I gave examples of the need to control memory layout and memory allocation.
I gave examples of Rust's features leading to reliability and security -- memory safety and absence of data races.
A former coworker of mine said it straight out: "I don't want to learn anything new. I'm too old for that." He is 50: a terrible C++ programmer, a good assembler guy, and quite the hardware wizard. But he refuses to learn anything new. He still can't type with more than four fingers, despite spending half his life programming. He writes object-oriented C++ with an obvious lack of understanding of almost everything about OOP.
I am baffled by him. After spending two years with him, I still can't understand his motivations. I quit that job in large parts because of him.
Technical decisions should not be made on a "which technology sounds cool on a CV" basis.
Unless there is a strong reason, a codebase should be in one language. Having it in multiple languages makes the learning curve for newcomers unnecessarily steep, especially if the company is willing to hire juniors. It also makes maintenance more difficult and expensive. Too many languages will also force your people to have very shallow knowledge.
"We are going to do this one module in Python" should happen either because you are considering moving to Python altogether or because the usual language is a really bad fit. "Someone wants to learn it" is both a bad reason and unprofessional.
> Too many languages will also force your people to have very shallow knowledge.
I spent the last 10 years learning programming languages and PLT; it's a hobby of mine and I've put quite a lot of effort behind it. While I certainly forget some of what I learned (I regret not using C on a daily basis and not keeping up with C++ advancements), at any given time during those past ten years I was fluent in at least 3-4 languages. Without needing a refresher, right now I can speak, and really know in depth, the following languages: JavaScript, OCaml, Erlang, Racket, Python, LiveScript, and Pharo Smalltalk. There are about 20 other languages I could become similarly fluent in with a week's effort. It's not shallow knowledge; it's just 10 years of work. There is really nothing stopping you from having a deep understanding of many different languages and technologies.
On the professional side: I have quite a big system under my care that is a mix of Python, JavaScript, Ruby, and Erlang, not to mention bits of C, shell scripts, Makefiles, two compile-to-JS languages, and some compile-to-CSS and compile-to-HTML languages. The system works, and believe it or not, working on it is a pleasure; using the right tool for the job really feels liberating. It lets me move twice as quickly, with twice as good results, as I would get trying to write, say, a fault-tolerant, concurrent backend service in Python instead of Erlang (or a quick data-mining script in Erlang instead of Python, for that matter).
I believe using the right tool for the job is the very definition of professionalism. Of course, the learning curve is probably steeper for newcomers, but for professionals above a certain level, language specifics are rather easy to grok, and becoming fluent in a language takes a few weeks tops. Besides, it's not like every team member is required to know every technology used in a project; that's what a tech lead is for, and I wouldn't want to work with one who can't easily convert iterative algorithms to tail-recursive ones and switch from Algol-like to Prolog-like to Python(-like) syntax on the fly.
Anyway, I know what I know and I know what I do and you're basically saying that these skills are irrelevant and using them would be unprofessional. In short, what I'm saying in response is: bullshit.
Some people feel that their lives do not revolve around programming; there are other interesting things, like traveling and raising children, that take a chunk of their ever-precious and limited time.
So it's not that they aren't interested, but rather that they lack the time to dedicate. It therefore helps if the company they work for can offer a little time to learn new things.
I'm talking about developers who, when offered a chance to learn something new at work, reject it, usually while getting pretty defensive.
My current hypothesis is that people get so used to internally defending their neglect of the skills that earn them travel and shelter that, when the opportunity arises to actually get paid to improve, all the reasons they normally use prevent them from even doing that.
I think there's truth on both sides of this. There are many Blub programmers who know some narrow scope of Java or C# and a little SQL, and that's their whole programming life. This is true regardless of age.
Interesting article, sounds like the author has really taken charge of their career and managed to do well.
Now, what about an alternative world where he did not "get OO"? Or a lifestyle where he had children and no time at work to learn? Or one of these newer, not-quite-as-successful software companies with no money and no extra time?
Keeping up with new tech requires time and money. Startups provide neither. Even bigger "startups" attempt to keep up the illusions of a smaller company, including mandatory overtime and no extras (e.g. tuition reimbursement, sabbaticals, more than 2 weeks of PTO a year, etc.).
The other thing: computing as a career is quite a bit harder, more complex, and more competitive than when the author had his formative years.
The real rallying cry is how do you make an industry that respects career advancement?
> where he had children and no time at work to learn.
Using children as an excuse is laziness. It might not be as easy, but having children does not preclude you from learning new things or advancing yourself. It requires effort and planning, but frankly, using children as an excuse is wrong.
> Keeping up with new tech requires time, and money.
It requires time and effort. Money is rarely an issue.
> Even bigger "start ups" attempt to keep up the illusions of a smaller company
I'm going to assume that you just have bad experiences, because this is hardly my experience.
Though I have no children, I do have a wife I want to spend my evenings with, so I organized my professional life such that I could take a train into work 25 minutes each way. That means I get almost an hour of uninterrupted study time to advance my skills. It has been fantastic. When I get home, it's us time all night. I've been doing this for over a year, and I've seen a profound difference in my skills.
Even though he advises people to keep up, he actually kind of admits how ridiculous this is today.
It's not possible to keep up anymore; there are just too many people creating too many things: languages, frameworks, technologies.
I think what he's trying to get at is to stay on top of large industry changes in general. Even though there are many frameworks and libraries being created everyday, it's still reasonable to keep informed about the ones that have a substantial amount of traction.
And you don't have to go too deep into everything. Take frontend JS frameworks: you could learn one, say Angular, and then read about how the others differ. That gives you insight into the whole field.
I don't know if I'd call them "large industry changes," simply because it raises the question of how one defines "large" and "industry." But I know personally I'm facing some frustration as a web developer in my mid-40s due to how much the concept of web developer has become an accelerating target, not merely a moving one. It's only been over the last couple of years that I've started being genuinely concerned that I'm falling behind, but "we're looking for someone who knows Python and Ruby and CoffeeScript and AngularJS and MySQL and MongoDB and has previous experience scaling web sites to ten million hits a day" is becoming de rigueur in Silicon Valley.
I don't consider myself too old to learn new things, but there are certain things that are difficult or even impossible to learn noodling around at home -- and even in this Show Me Your Github era, professional experience counts. I'm not sure I'm ever going to be given the chance to learn how to scale web sites to ten million hits a day. I agree with The Codist that as programmers we really are responsible for our own ongoing education, but I'm not convinced that's always sufficient.
I agree with the poster about the general idea (every programmer should stay in touch with developments). However, I have the distinct impression that he likes to stay at the very edge of technology advances:
> I was writing web applications when I first heard of Ajax (a few months after the term was coined) and I started using it; again I wound up teaching my teammates about the new thing first. Sadly it scared the architecture team who thought I had bought some new technology without approval and wondered if it was supported. None of them had heard of it (since they didn't pay much attention) and when I told them it was just Javascript they were only barely mollified.
I can imagine being an architect and having a programmer like that, bringing up every hip thing he encounters, just because it is cool and new... probably without even considering all of the ramifications. Yeah, sure, AJAX is here to stay (as we know now), but how many "prospective" technologies are now long dead?
I like staying a bit further behind the edge. I follow the direction of technology, but I use things only when they are proven and supported well enough. Well, usually. :)
Terrific post. This is exactly the type of individual I want to work with - someone who recognizes that they're in charge of their own advancement, and doesn't lay blame on any outside factors. As developers/builders/hackers we are ultimately responsible for our own success or failure.
The article was a bit heavy on the narcissism; I wouldn't want to work with such a person. I'd rather work with the Brazil-esque Robert De Niro type: "Listen, kid, we're all in it together."
I see it this way: the better the company you work for, the more responsibility it will generally feel for your continuing education, and the more likely it is to provide opportunities for you to stay current in the field you're paid to work in. The poorer the quality of the company, and the more you are the primary individual responsible for getting your job done, the less chance you will have to learn new technologies. Unless your company feels external pressure for you to pursue these new technologies, you are on your own. Small shops with bad scheduling will make it impossible to set aside time for new technology stacks. You inevitably end up pigeonholed into sticking with what you always use.
I think that, ultimately, many people in the industry only get to learn new tech when they leave for their next job. The pressure is momentarily reduced while they learn at the new job.
This is absolutely true, at least from my perspective. All of the programming jobs I've had in my career have been for medium- to freaking-huge companies, and most of my projects at those jobs have required me to do a ton of self-teaching to get up to speed on a bunch of technical (or scientific, or mathematical) stuff that I had no previous exposure to, in order to get my job done. On only one occasion did I ever get any training for anything, and that was just for two days.
I guess that's a combination "back in my day/get off my lawn" statement, plus a little whining, and maybe a humblebrag, but I don't think that's an unusual story at all for software developers.
I interned at IBM during grad school with a team of consultants who all did enterprise Java stuff for financial institutions; that was very different. IBM would frequently send those developers away for a week or more at a time, multiple times a year, to get training on specific technologies. I'm not sure how common that is anywhere other than IBM though, or if IBM even does that anymore. Maybe Google does it? I don't know.
Sometimes I deal with developers who either can't or won't teach themselves anything, and can't or won't learn by doing. They absolutely need someone to hold their hand and explain things to them every step of the way, and they will just throw their hands up in the air and fail before putting any time into trying to read up on whatever topic is giving them trouble. I don't know what to attribute this to, so I'm trying really hard to not jump to the conclusion that they suck or they don't care or whatever. I'm sure a lot of them do just suck at their jobs and/or just don't care, but maybe some of them have genuine problems with learning that aren't their fault. The only thing I can say for sure is that this is a trait that is a major impediment to their careers and getting their jobs done without sucking up too much of their cow-orkers' time (as we all know, orking cows requires long stretches of uninterrupted concentration).
TL;DR Spot on, and being able to develop your own technical skills to keep up to date and expand your horizons is absolutely critical to being a really successful developer. You are also the only person that you can count on to do this for you. You can't really count on any employer, even some mythical ideal company with bottomless resources that treats each employee as a magical snowflake, to do this. Even if your company does provide training, it's not necessarily going to be the training you want or need to receive.
I disagree with that, and with the notion of constantly learning new languages on your own. I happen to know a couple of companies that were unable to find Perl developers, so they decided to train their own, and now they see that as the only way to hire such developers. What you know is no longer a requirement for them; just be good at learning things, and preferably have a CS degree. And I see this becoming the norm in a heavily fragmented future of software development.
I find it is usually possible to convince your employer to give you time to study up - whether this is extra time tacked onto a project because "I need to learn the framework", or actual formal training.
I think not making your employer pay a fair share of your training is like being one of those people who stay 4 hours extra when their project isn't late. People do it through a mixture of anxiety, peer pressure, and possibly not liking their children all that much.
To me, it is your responsibility to learn the tech you want to use in your next job, but it is fair to ask the company to pay for new skills you need for them.
From the company's perspective, it is worth spending the money up front so that you don't mess up a project by not really knowing what you are doing.
This is basically the choice I finally had to make for myself over the last couple years.
It was just three years ago that my main responsibility was maintaining code on a black-and-yellow terminal for a VMS server. Another couple of years and I could easily have been one of those people pushed out of the industry with no easy way back in.
Although my company has provided an avenue for me to transition to doing things with the LAMP stack, it is still in some sense legacy: a large website codebase that was started over a decade ago.
I have made the choice that I'm done with being legacy and am doing whatever I can to learn current tech. I will even be willing, sometime later this year, to take a new job at a junior level just so I can cut loose the legacy-code crap I am tied to. At this point it feels mostly like a bunch of anchors holding me down. I want a new job where I can learn from the people around me and be truly focused on my direction.
When I was first breaking into the field I was also forced to work with legacy mainframe code. It was a nightmare. The code was horrible, the pay was low, and we had no respect. I spent a little over a year learning Ruby and JavaScript in my spare time, joined a startup, and have had a happy and successful career in the many years since.
I can only speak for myself, but the transition improved my life immeasurably. I can't even imagine how different things would be if I had stayed. Keep at it. If I can do it then you can too.
If I were you at this point (actually 2-3 years ago), I would already be going back to using Java for Android, too. iOS will be on a billion devices in 3 years, but Android will be on 3 billion, so the impact is much greater, and probably the revenue, too.
I'm in the same boat as the author: I've been developing software professionally since the '80s, and have navigated my way through a swamp of different platforms and frameworks and languages, oh my!
I'm currently of the mind that the true way to stay ahead is to keep studying something new, every week. Yes: every week. I take at least 15% of my work time and use it for self-enlightenment, whether it's learning how to embed the Lua VM somewhere, tinkering with RethinkDB, sharding my Mongos, or whatever. Constant change is the only constant in this industry; one must change oneself, constantly, to catch up.
This isn't so easy to do if you're not into enlightenment, alas.
If it has nothing to do with increasing production, then the company has no business investing in your learning it. BUT, if the company can benefit from you learning those skills, then it could be a missed opportunity not to give you the resources to do it (learning it on the company's time).
Of course it's all a product of culture and supply-demand (systemic), if there are enough great programmers that are willing to learn everything on their own time, then of course it will become the norm that programmers should learn everything on their own time. And, of course, that's great for the employers.
I get what you're saying, but at the same time, how can a company possibly know whether any given technology is useful to them if they don't have anyone evaluating it?
You may as well say that your job advancement is your responsibility and not your employer's. And it would be true; strictly speaking, you're rarely owed a promotion -- even if you perform incredibly well, there are no guarantees. However, someone could still not wish to work in a dead-end job.
It feels to me that that's the sense in which the young man's comments were meant. It doesn't seem unreasonable in that light. So the compensation he'd like isn't entirely monetary in nature, that's hardly unique.
I've talked to other developers, and what I've understood is that the best way to learn new stuff is to get paid to do so. I'm just wondering how you keep up with things if your job demands so much of you that you can't keep up with anything besides your main stack.
I reckon that learning Maths, Statistics, Electronics, or any other related field would make you a better programmer than having a superficial understanding of a plethora of technologies.
I've devoted this year to learning some Statistics. Will check if my hypothesis is true in a few months.
That's why you work for a company where your skills are the product. Everyone loves selling a better product, so you'll get upgraded... If you are a cost center, then you'll be nicked and cut and eventually hacked at until there is next to nothing left.
Yeah, I think this is fine. But is it also not irritating to think that following every web mvc framework fad is really "keeping up to date" with programming? This seems to be a very common view and I don't think it is any less irritating. :)
I agree completely with this blog post, though honestly, to me it makes the field sound pretty grim. I highly recommend you follow the link to the "technology steamroller" (earlier post by same blogger).
"If you don't keep learning, keep reading, keep improving your skills eventually that nasty steamroller behind you will flatten you permanently. Then your career is likely over."
and
"And that clanky monster breathing down your neck has an endless supply of fuel."
Egads. Not blaming the messenger here, he's right. It's a tough field. So the pay is extraordinary, right?
Take a look at these jobs, and in particular, look at the pay in higher salary regions. The best job, software developer, earns 116k a year on average in San Jose.
The average registered nurse earns 122k a year in San Jose. The average dental hygienist in SF earns about 106k a year. Nurse practitioners clock in at 125k a year.
There are all kinds of ways to interpret this data, and in the end, I'm talking about the greenness of the grass somewhere else. Not that I wouldn't welcome comments about these comparisons, I just want to make it clear that I acknowledge these other fields come with their own stresses and challenges and barriers to entry (and I don't object to good salaries in these fields at all). And everyone has to keep learning...
But is there a steamroller that threatens to make dentists obsolete? Do dentists have to bet the farm, so to speak, on whether to learn "Enterprise JavaBeans"? It does seem particularly relentless (and difficult to predict) in software, and the career stakes are very, very high.
I think programming can be a wonderful career for some people. I think the main reason I pay so much attention to this sort of thing is that I often think about pay and work conditions for software developers within the context of claimed "shortage", as this is frequently discussed (and until recently, often accepted without question) in the mainstream media.
Judging from this informative blog post, it takes a unique wiring to really thrive in a career as a software developer. Can we really say there's a shortage of people willing to put themselves in the path of a steamroller? (The author of the blog post in no way made this claim; it's just a question I'm turning around in my own mind.)
The biggest issue I had with that comment is: it's your responsibility as a programmer to keep yourself educated and up to date, not some employer's.
I agree and disagree. It's a moral responsibility of the employer. Work takes up such a large portion of a person's time and energy that if the company isn't invested in the employee's progress, he owes that company nothing. My work ethic is strong as hell, but if I get the sense that management isn't interested in my progress, I slack as a matter of principle. If your manager isn't looking out for your career and you put more than about 10-15 hours per week in on your assigned work, you're just a chump. (In the MacLeod analysis, a Clueless.)
That said, expecting your employer to manage your progress and education is unreasonable, because no company can possibly account for the variations in peoples' abilities and desires. Even if your employer is genuinely well-intended and wants you to advance-- let's ignore the 80% of companies that aren't this way-- your company will figure out where you should go much later than you will. That's why open allocation is the best solution: the workers can figure out what's worth working on faster than central/upper management.
So, yes, it's the employer's moral responsibility to give the employee time and resources to look out for her career (and, if it doesn't, engineers should slack). However, for the employee to put the self-executive responsibility of picking what to learn on the company is, in practice, an irresponsibly bad idea.
> By my third year I saw the microcomputers were going to be the future and wiggled my way into the group that worked with them.
The problem is that most modern companies have such mean-spirited, insane policies regarding performance reviews and internal transfer that internal mobility is pretty much impossible in them. At a closed-allocation tech company, the only time you can realistically get a transfer is when your performance history is in the top-10%-- in which case, lateral transfer is a terrible idea anyway, because you should wait for the promotion instead of restarting the clock. Closed allocation and Enron-style performance reviews are all about inhibiting mobility, i.e. keeping the poors in their place.
> But once you discover you are obsolete it's too late. Assuming your employer will retrain you is a fool's pipe dream. These days employers may drop you, your job, your projects, or even the whole company without much notice, and then you have to find a new job. Expecting them instead to retrain you is not going to happen.
This is why I hope to see a French Revolution-style uprising. Silicon Valley looked like a way out, a "middle path" between serfdom and violent revolt. Now that that middle path is closed due to the VC good-ol'-boy network, I think that a (probably global) class war is just an eventual necessity. It may come next year, and it may come in 50 or 100, but I hope that it's the last major war humanity has to endure.
> In programming you need to look forward because the only thing behind you is that nasty steamroller.
Honestly, I get the feeling that this guy was very lucky. He had the autonomy to pick new technologies and he picked winning horses. Imagine what he'd be writing if, instead, he'd learned Blackberry app development. Or, what he'd be writing if his manager, long ago, had fired him for attempting the transfer to the microcomputer team (possibly forcing him to take a suboptimal job due to financial pressure, with long-term effects on his career). He should at least attribute some of his success to having been luckier than most engineers.