I think that article makes a number of faulty assumptions, but one stuck out to me:
>We need to start with the assumption that engineers are smart learners eager to know more about their craft
I have encountered many people in the field who do not support this assumption. Code reviews with these people always end with them on the defensive - you will see a lot of remarks along the lines of, "isn't it good enough though?", "it works with the one input I tried", "yeah that would be better but that would require me to do more work to change something, vs just hit merge on the pull request", etc.
You can try to help improve their workflow, give tips, and point out faster or better ways of doing something, only to come by to help them two weeks later and find them still stubbornly doing it the slow way.
They treat code they didn't write as a black box, or as "owned" by the person who wrote it. You will encounter, "Hey, I was having a problem with X, and I traced it down to this function in the Foo subsystem. I think I heard you wrote the Foo subsystem. Could you take a look at this?" When you try to help, you realize they navigated to this point and then didn't spend 1 minute trying to read the code and see what's going on. You might take the time to give a good overview of it, how it works, offer to answer further questions, etc. It goes in one ear and out the other. Three days later the exact same question comes up.
I've seen a programmer get thrust into the node.js world without prior experience. I suggested a few learning resources and offered to lend him a book. He said, "Yeah, I really try not to read books like that or spend that kinda time outside the office. I'll figure it out."
I could go on and on with anecdotes, but I think the reality is there are people who are very into the craft and WILL be quick learners (of that which they don't already know), and there are plenty who aren't. For those of us who do strive to improve and get better, working with the latter is a terrible demotivator.
A step further, I think that the latter group will slowly contribute to technical debt, or just over time make your code base harder to work with. You can't micromanage forever, and having to harp on the same things time and time again in code reviews is draining for both parties. Eventually you start giving in and letting sub-standard work through, because to do otherwise will make you seem like a nag, a nit-picker, etc. Then one day you have to edit something they touched, and your own task takes 10 times longer due to tons of coupling, faulty assumptions, things they didn't bother to learn about how the overall component works, etc.
The "hazing" style of interview surely does suck, and I myself don't go for standard library trivia, or ask algorithm questions that would never come up on the job. But more and more, I feel that they are used throughout the valley for lack of a better way to defend against someone joining the team that's going to be a net negative. I think the thought might be that it's extremely hard to detect the person that can't be bothered to read through the code base, and figure out how things work a little before giving up and asking someone else to hold there hand - yet if one can make it through one of these tough interviews, there's a higher chance they aren't that person.
Another aspect is that I feel like once people get in, few companies in tech are willing to make corrections. Someone can consistently write bugs, submit code to the repo that they clearly didn't even test against the base cases of input/usage, give estimates that are WAY too long for simple tasks, not pay attention in meetings, etc. - but many companies are afraid to fire or give harsh feedback. From what I've read, Netflix is one company that does not seem to operate this way.
Perhaps we'd all be more trusting in our interviews if we didn't know that the false positive would be so hard to correct.
>When you try to help, you realize they navigated to this point and then didn't spend 1 minute trying to read the code and see what's going on.
I'll take the other side here...
I track my problem down to a particular part of a submodule. The person who owns it didn't document it well, didn't write an overview of how the thing works, and has generally made "just reading the code" a poor and error-prone substitute for asking them directly about context they've chosen not to share with anyone else.
The person becomes annoyed at my continued questions instead of "reading the code", and eventually complains about my lack of motivation. The code, meanwhile, never gets fixed, and is a ticking time bomb of technical debt.
Whose fault is this? Mine, for not "reading the code", breaking encapsulation, and changing my code to match undocumented behavior? Or the other person's, for failing to behave like a professional engineer and recognize that they are knowledge-hoarding?
Who's to say, really. It doesn't matter...I've got another gig lined up.
I think what you're describing exists; it's just not what I was referring to. Quite the opposite: I've seen this happen when there WAS very clear documentation, possibly with a link to a wiki or readme file in a comment that points to an overview, and yet the person clearly hadn't bothered to read any of it.
This description is uncanny. Having this person on your team is far worse than someone who flat-out can't code. You can fire someone who knows nothing. Everyone knows what the problem is, and it's easy to talk about. If your team is competent, they'll recognize incompetence.
It's almost impossible to fire someone with the more sinister qualities described here. You always hope they'll catch on: they're smart and they've successfully written working software in the past (this is why you hired them). You end up waiting forever for them to 'click' with the team. It doesn't happen. They're a constant drain and their mistakes compound until your code is unmaintainable.
Note that I'm not picking and choosing from OP's complaints - they come as a complete package. I've worked with a person who had every single bad quality listed. As I read the OP, I was thinking we must be talking about the same person. I couldn't have described it better.