It is, regrettably, not the case that all people who work as software engineers can e.g. code a for loop which counts the number of lower-case 'a's in a string. This is true even if you spot them the syntax for a for loop and for finding the Nth character in a string. It is equally true if you allow them to complete the task in isolation, at their own computer, given an arbitrary amount of time.
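For concreteness, the task in question is about this much code (a minimal Python sketch):

```python
def count_lowercase_as(s):
    """Count occurrences of the lower-case letter 'a' in a string."""
    count = 0
    for ch in s:        # visit each character in turn
        if ch == 'a':   # compare against the literal being counted
            count += 1
    return count

print(count_lowercase_as("banana"))  # 3
```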
I am, naturally, constrained from saying "Here's a list of three of them." It would not be difficult.
It is also, regrettably, not the case that all applications one receives to an advertised position of Senior Ruby on Rails Programmer are from people who have ever opened a command line.
Both of these are very difficult things to accept. They may be even more difficult to accept if one is extraordinarily smart/diligent and one studies/works solely in organizations which apply brutal IQ/diligence filters before one is granted even a scintilla of the admission committee's time.
I remember, rather vividly, the first time I figured out that an engineer couldn't program. I was attempting to tell him where a value was being assigned in a program. After failing to do so over email, I went over to his desk and asked him to navigate to the file at issue. He was unable to do so. I told him I would do it, assuming that he was unfamiliar with the directory structure in that part of the program, and opened the file. I then said "So you see, the assignment is made in the fooBar subroutine." He couldn't find the fooBar subroutine. I said "It is the third method on your screen." He couldn't find it. I said "It is this one, here, which I am pointing to with my finger." He said "OK, that one. Where is the assignment?"
The fooBar subroutine was one statement long.
A coworker, having overheard the conversation, stopped at my desk later, and explained to me that, if I had pressing engineering issues, coworkers X and Y would be excellent senior systems engineers to address them to, but that Z should be allowed to "continue to devote his full attention to work which he is well-suited for."
The industry has many destructively untrue beliefs about hiring practices. That there exist at least some people who are not presently capable of doing productive engineering work is not one of these beliefs.
> I am, naturally, constrained from saying "Here's a list of three of them." It would not be difficult.
I can't. Over my career I have worked with hundreds of engineers and scientists who wrote code. I know zero people in positions where they have to write code at all, let alone as software engineers, who cannot do that.
> Both of these are very difficult things to accept. They may be even more difficult to accept if one is extraordinarily smart/diligent and one studies/works solely in organizations which apply brutal IQ/diligence filters before one is granted even a scintilla of the admission committee's time.
I spent my first 9.5 years at Raytheon, a company that is very light on the detailed technical aspects of interviewing and very heavy on the behavioral aspects. There are not committees, just the hiring managers and the interviewers. You know, the system that is supposed to have companies awash in these people who supposedly can't construct a basic for loop but can bullshit a good game. Never met one. Not at Raytheon, or Northrop, or Lockheed, or Boeing, or any of the other companies I worked with during that period.
Sure, most of them were what this community would disparagingly label as "9-5 engineers", but they all produced code that basically worked and provided some value to the program they were on. Normally, if there was a problem, it was a political or attitude problem, not one of technical ability.
The Matasano hiring process (to which this article refers) was designed in part because of this bad experience you've never had. I've seen it happen repeatedly over my career: people who'd pass interviews and then, in 15-20 minutes of face-to-face discussion a couple of weeks into the job, prove unable to comprehend the idea that a C pointer needs to point to valid memory.
That's what's so mind-blowing to me about our terrible process today. It aggressively tries to find and dispatch unqualified candidates, so much so that it forcefully repels talented people. And it still fails miserably at screening out people who can't code! It is the worst-case scenario for a process.
It's also really important to remember that the problem is People Who Can't Code, not "People Not Smart Enough To Code". Time and again I've "worked" with people who could very effectively critique my own code, who could participate in design meetings down to the bits-and-bytes layer, and, having taken responsibility for actually implementing some portion of those designs, could not commit usable code to save their life. I don't know what the deal with these people is, but they are out there: plenty smart, totally fluent, and completely ineffective.
I was talking to Erin this weekend about the phenomenon and realized that these people not only exist, but are actually favored by the way the market works. A smart person can last many months in a dev role before anyone realizes they're ineffective. Then, it takes many, many more months to manage them out of the team. By the time the story plays itself out, the bad team member has more than a year on the job, and when they parlay that resume line item to another prominent job elsewhere, their previous teammates aren't outraged but instead relieved. They end up with gold-plated resumes and tons of credibility.
The software industry is broken for older engineers. Look at the typical algo/CS heavy interview today. It favors and selects for these characteristics:
1) Young, fresh graduates who still remember the CS material well, or those who have a lot of free time and can study for these interviews. (For example, see this Quora link: http://www.quora.com/How-much-time-did-you-spend-preparing-f...; the first reply mentions spending 28.5 hours per week for 5 weeks, which is 142.5 hours total.) From a senior software engineer's perspective, that's a ridiculous commitment for something where an arrogant asshole interviewer can shut down a whole onsite interview process because they're having a bad day or have some favorite question they think is the One True Question for determining whether someone is a software engineer.
2) The software industry tends to overvalue youth. Salaries start very high, but they also cap very quickly. And most companies have no technical ladder; even at those that do, it's a joke compared to the manager ladder. None of this is surprising: since the industry is so heavily biased towards hiring and rewarding youth, older experienced engineers are not valued.
3) Senior software engineers face heavy pressure to leave the software field. I'm feeling this now. I'm a senior at my current place and my salary is basically capped. There would be little to no benefit in going to another company. And even if I wanted to, I have a family now and don't have the free time to study for software interviews. My options are basically to go into management, go into consulting, or start a business. I have no desire to start a business or go into consulting because I simply don't like doing those business-related things. I want to be an engineer, not a business person. But if I just stay an engineer, I will probably get laid off someday due to ageism. And if I'm an old engineer who is laid off, job mobility is basically gone, due to ageism and the CS-heavy interviews that I don't have the time or motivation to study for. So the only safe path is management. A lot of senior engineers face this decision, which again reduces the number of mentors and experienced engineers available to teach the new, younger engineers.
4) And doesn't anyone notice just how wrong the whole concept of studying for an interview is? There's a whole book-publishing industry dedicated to technical interviews, such as Cracking the Coding Interview. The technical interview has become like the SAT: many could just study to pass it, yet be horrible engineers.
Interestingly, the author of that book also wrote the top comment on the TechCrunch story, disagreeing pretty strongly with me about hiring processes. I got to talk to her a bit about it on Twitter:
It seems to me that Google's perceived hiring issue is "we can't tell the candidates apart, because they all studied our interview questions they found online." (and not "our process is fundamentally broken to the core" like it probably should be).
In that context, "give take-home tests" and "ask better questions" sound like the exact opposite of good answers.
I think this is a problem that's unique to very large corporations. Not only do thousands of people interview there, meaning questions and answers are likely to be traded verbatim online, but there's also actually an incentive to cheat your way in: you can hide out for 1-2 years and acquire a horde of cash and resume fodder.
Google have actually made this problem worse for themselves, too. By overvaluing their interview process, they've made cheating even more lucrative. Cheat your way into even an offer there and you have something valuable to put on your resume!
I suspect you haven't seen a lot of cheaters on take-home tests before because at a company like Matasano, there's little incentive to cheat your way in. You certainly can't hide out in a corner doing no work there.
Bummer. Her career seems to be predicated on a correlation between algorithm tests and performance. Can't expect her to seriously entertain disagreement as long as the current model is profitable for her.
Google is no more a proof of the value of algorithm tests than Github proves the value of manager-free organizations.
I was a bit horrified at the acquisition coaching comment. I did not realize that happened but of course it makes sense. Can't manage an acquihire if your devs can't pass the algorithm tests at the big acquirers. Shame.
It is actually more profitable for me if I say that these algorithm tests are BS and completely study-able, and even the worst engineer can pass them. That easily translates into "here! buy my book and even YOU can get a job at Google!"
It's actually a much harder sell when, really, it's more like studying helps someone perform better, but only to an extent.
Oh, and the acquirers encourage studying and prep, just as most of the top tech companies do for their normal dev (and non-dev) candidates.
To be clear: my experience is that the algorithm interviews, when done properly, work reasonably well to identify intelligence/problem-solving and coding skills. Those are more important at some places than others. They can work well for certain places -- but not all. I do not think that all companies should use this process. It's not right for all places, but it's reasonably good for some.
As for Google being proof of the value of algorithm tests: I think you're pulling that from a twitter conversation very much out of context.
Context: Some people were arguing that the algorithm interviews offer zero value - that you'd be at least as well off, if not better off, hiring at random.
If they truly believe that (and they appeared to be sticking to this argument), then it's absolutely relevant that not just Google, but basically all the top tech companies, use this hiring process. Could a randomly selected group of engineers build the technology at companies like Google, Facebook, Amazon, etc? I suppose that's what they must believe, and I find that pretty absurd.
If they had argued that these interviews sort of work, but aren't nearly as good as some other interview processes, then you're right that the point about Google and other companies wouldn't be relevant. And I also wouldn't have brought it up.
"To be clear: my experience is that the algorithm interviews, when done properly, work reasonably well to identify intelligence/problem solving and coding skills."
Given that there is no correlation between job performance and interview score, are you saying this based on your gut instinct, or do you have objective data somewhere?
My best estimation is that the Google interview process is just a glorified fizzbuzz presentation, and that the hiring decisions are more or less made by arbitrary factors, including the nebulous "culture fit".
While it may be amusing to work through the details of doing a merge sort on a linked list in 30 minutes, it's certainly not germane to the quotidian experience of an SWE at Google.
>> Given that there is no correlation between job performance and interview score, are you saying this based on your gut instinct, or do you have objective data somewhere?
That's not a correct assumption. I suspect that you are referring to the (infamous) Google studies on this that were alleged to show this. The articles about this got many details wrong, and the study was fundamentally flawed. Remember: they only studied the people who did very well. That's like finding no link between exercise and health when you only study people who run regular marathons.
Merge sort on a linked list is a bad interview question, so I agree with you there. Good questions are problems that actually make you design a new algorithm.
It may be that the studies were flawed, and that the reporting was bad, but that flavor of study has been replicated in other venues besides Google. Given that, it essentially comes down to culture being the deciding factor in these interviews. Given the stark cultural skew at Google, it's possible that the pseudo-analytic SWE interview is actually masking systematic bias in the hiring process.
The question for Google is whether Page, Brin, Bak, Buchheit, and Dean passed an algorithm interview before building Google's core technologies. Hiring doctrine after the fact does not ensure a billion dollar company.
I disagree with your assertion that lambasting Google's hiring practices is an easy sell in the market. The last time I tried I was dismissed out of hand by a talent consultant.
>And doesn't anyone notice just how wrong whole concept of studying for an interview is?
This has been going on for ages. A lot of coding bootcamps and similar things are just interview prep programs. A lot, I mean a lot, of people I have worked with can nail interviews but can't do real work for shit.
Unfortunately, to land a job you have to go through this process. Developing interview skills is a separate skill, parallel to being able to do actual work.
If your engineering org is 50-weeks-a-year structured around delivering stories and fire-fast and doesn't build in time for ramp-up, it has a good chance of spitting out ineffective hires. But most organizations --- and we worked with lots of them as a high-end engineering consultancy --- aren't structured like this. More importantly: it's disproportionately the larger organizations with bigger brand equity that tend not to be run entirely out of Pivotal Tracker, which feeds the resume effect.
At many large organizations, there's even political pressures that keep employees from being let go early in their employment. If a hiring manager says yes to a candidate and they go through training/onboarding, there's a certain amount of clout to be lost if you jettison that employee early on. It is, instead, better for the manager to make the employee appear to be productive to make it seem like the manager did a good job in hiring.
Team accomplishments can be embellished, stronger team members can cover for the weaknesses of weaker team members, but admitting that a decision you made was incorrect gets punished. It's obviously better to hire someone competent in the first place, but the next best alternative, for the manager, is to live with their bad hiring decision.
I used to work at a company with thousands of staff, and the HR policy was pretty clear:
As engineering managers, we were expected to "manage out" about 10% of the lowest performing staff on a yearly basis. The way that worked was basically that the 10% under-performers were to be put on strict personal improvement plans, and were under no circumstances given raises at all, and it should be made clear that no raises would ever come their way unless they improved.
That was it. It sounded harsh if you were used to 10% a year raises, as many of the good performers got. Not so harsh for those who were used to living in fear of getting "exposed" and fired.
And as a manager, if I were to designate a lot of my team as under-performing come review time, it'd reflect badly on me, so managers also had an added reason to push for good reviews for their reports: you'd end up getting paid more by getting good reviews yourself if your reports were helping create a good impression through rankings that you as a manager could trivially fudge (we did "360 degree reviews" where the manager and report could both designate some co-workers to review the person, which was of course trivial to game for those who wanted to).
There was also an implicit expectation of a bell curve. If you had an under-performing team, that'd be awesome news for at least some of the weaker members of your team (and conversely, I several times had reports who got short-changed because I was told too many of them were designated "above expectations", even though my manager and other people who had worked with them agreed with the designations, and so several of them were ranked lower to get the "correct" spread - it was absolute lunacy).
Unsurprisingly, come hiring time, I felt like we were under siege by an army of ineffective developers hoping to find refuge somewhere safe and hard-to-get-fired.
There are scarily many places like this for ineffective people to hide. Sometimes for many years.
"comprehend the idea that a C pointer needs to point to valid memory"
I ran into a C programmer who did not get the relation between pointers and parameter passing. It was some scary, awful code. Damn prep for that code review almost got me in a car wreck.
Is there a process in place to handle such people? Do they get asked to leave (shortly after being hired), or are they moved to a role which doesn't require programming?
I've been coding professionally for 15 years and only have one such story. I once worked with a guy who had supposedly been coding professionally for over 10 years, the most recent 5 of those in Java. He, in all seriousness, asked me how to sort a List. I can forgive not knowing the Collections interface in Java, but not being able to google it and figure it out as a senior developer and team lead is troubling.
Conversely, there have been plenty of people who I would call anti-workers. They technically accomplished "something" but the quality of work was so bad that the team would have been more productive without them. They were a net negative productivity for the team. At most companies I've worked at there was one of those.
The problem is not that these people "cannot code"; the problem is that they cannot implement even the simplest things when the task cannot be done using existing libraries and frameworks they already know of, with designing any kind of problem-specific data structure or algorithm being completely out of the picture.
For example I've been told that iterating over list of objects and aggregating (min/max/sum) some of their properties would involve introducing dependency on local installation of SQL server.
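For scale, the task being described is roughly the following (a Python sketch; the Order class and its amount field are made up for illustration, standing in for whatever domain objects were involved):

```python
# Hypothetical record type; the real objects in the anecdote were
# application-domain objects, not this illustrative class.
class Order:
    def __init__(self, amount):
        self.amount = amount

def aggregate_amounts(orders):
    """Return (min, max, sum) of the 'amount' property in one pass."""
    lo = hi = None
    total = 0
    for o in orders:
        total += o.amount
        lo = o.amount if lo is None else min(lo, o.amount)
        hi = o.amount if hi is None else max(hi, o.amount)
    return lo, hi, total

orders = [Order(5), Order(1), Order(9)]
print(aggregate_amounts(orders))  # (1, 9, 15)
```

No database, let alone a local SQL server installation, is involved.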
"the problem is that they cannot implement even the simplest things when it cannot be done by using existing libraries and frameworks they already know"
If that's the bar, then I'd estimate that roughly 80% of programmers in the valley right now can't step over it.
This debate is filled with so much heat -- and so little light -- because everyone carries around their own subjective definition of competence. I've personally never met anyone working in a company who can't code at all, but I've met plenty of supposedly elite programmers who are adrift without a framework. Are these people useless? I don't know, but I know they're interviewing other programmers.
If I think about it long enough, I go in circles and I end up back at the thesis of the article: this industry is filled with some spectacularly judgmental people.
What's the debate as you see it? Can you boil it down to two sides? I'm asking because, maybe we actually agree about stuff, or maybe we totally disagree.
From what I've read from you in the past (as well as the way you're mentioned in the article), I think we probably agree.
If I had to reduce the debate to two clear groups, it's that one group believes that there's a bright-line distinction between "good" programmers and "bad" programmers, and that the questions-at-a-whiteboard model is an effective way of discriminating the groups. The other group doesn't believe in partitioning the world into two clear groups.
It isn't debatable that there are a lot of people who can't write code. I believe that trivial phone-screen coding problems are pretty effective at weeding those people out. Beyond that, it's much more about candidate/position fit, and that's a more subjective question. The problem is that programmers tend to be uncomfortable with subjectivity, preferring instead to posterize the shades of gray -- that's how we end up with the apocryphal Google engineer who can't code.
In your stated example (person hired; can't fathom basic realities of C pointers), I doubt that it reflects a complete inability to be functional in some programming role -- just not the one you were needing. Most "Rails programmers", for example, would probably be hopeless in that situation, but perfectly capable of cranking out websites with Rails. I've seen a fair number of real-world programmers whose (lack of) knowledge of algorithms and data structures makes me cringe -- and I certainly wouldn't hire them for the kind of work I tend to do -- but I have to accept that they're effective within their niche.
We agree more than we disagree. Most importantly: the big problem I see with our industry is our inability to detect aptitude in candidates who don't "show" well in resumes and interviews; in other words, the opposite of the problem of being "too selective". It sounds like we agree on that.
On the other hand: I think the counterintuitive thing about ineffective programmers is that many of them are perfectly capable of reasoning through a CS problem in Java or C++ or Python on a whiteboard. But they're --- for lack of a better term --- useless when it comes to actually getting real-world systems built.
My thoughts about why this happens aren't yet well-formed. I just want to contribute the perspective that the "bad programmer" problem isn't just about people who can't buzz a fizz.
> On the other hand: I think the counterintuitive thing about ineffective programmers is that many of them are perfectly capable of reasoning through a CS problem in Java or C++ or Python on a whiteboard. But they're --- for lack of a better term --- useless when it comes to actually getting real-world systems built.
Software engineering is a performative art. Sure, there is a heavy intellectual component, but you have to actually do software engineering to get good at it. You can't just study the theoretical aspects. It's like how you can't learn Haskell just by reading a book and some websites. And yet, someone could go through school for CS and slide by without actually doing any real engineering / programming. I had a CS graduate TA (for a CS discrete math class), at a very well regarded school, who literally did not know how to compile a C program, full stop. He was very into the pure-math side of CS...but still.
As an analogy - you can imagine that someone who has studied music theory for years, has perfect pitch, knows how all the various branches of music relate to each other, which artists influenced whom, etc...could still be shit at playing musical instruments or creating new music.
That points to another problem: Computer Science is not Software Engineering. Yet a lot of the time we insist on hiring CS people for our Software Engineering jobs. It doesn't help that there's not really a clear definition of what good quality software engineering is.
It's an interesting problem. Before I started interviewing I expected to get a ton of really good candidates and have trouble being selective for lack of being able to find anything wrong. Boy was I off the mark. The number of people who don't pass the fizzbuzz test is quite high. I think if we tried to give the "sleeper" candidates a break we'd let in so many bad ones we'd quickly get overwhelmed.
We thought so too. But the (nearly) resume-blind process we settled on quickly converged on a nearly all sleeper candidate pipeline; in several years, we hired I think just 2 people that had our field on their resume. We didn't just retain all those candidates: we were knocked on our asses by how well they performed.
Recruiting is two problems: outreach and qualification. We did novel things on both fronts. For this thread, I just want to be pointing out: the changes we made to qualification were critical, instrumental, fundamental to our success. Most of our best hires could not have happened had we qualified candidates the way we did in 2009, and the way most firms do today.
I have a close friend that joined matasano about 2 years ago. While a good chunk of the readership here is familiar with some of those changes, I think your knowledge is an example of where the 'lessons learned' should be shared as widely as possible. I'm no marketing expert but if someone can find a way to make some of your ideas go viral that would be great for the industry.
Can any of this be attributed to salary negotiation or "better" options being available to those already established in the field, such that a large commitment seemed unnecessary to them? For those who were established, where did they drop out of the pipeline and why?
FizzBuzz is easy iff you are aware of the % operator. I only know how to test for divisibility (cleanly) because I looked it up in the course of doing Project Euler. I could easily see there being developers who worked exclusively on non-mathy business logic and never had to test divisibility, which would explain floundering on FizzBuzz. (Then they should be able to reset counters every 3 and 5 iterations or something, but that's a tad more involved than the canonical solution and might be regarded by some interviewers as incorrect.)
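The canonical solution being alluded to is short enough to write out (a Python sketch):

```python
def fizzbuzz(n):
    """Classic FizzBuzz: return the output lines for 1..n."""
    lines = []
    for i in range(1, n + 1):
        if i % 15 == 0:        # divisible by both 3 and 5
            lines.append("FizzBuzz")
        elif i % 3 == 0:
            lines.append("Fizz")
        elif i % 5 == 0:
            lines.append("Buzz")
        else:
            lines.append(str(i))
    return lines

print("\n".join(fizzbuzz(15)))
```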
fizzbuzz can be modified to remove knowledge of the modulus operator. "write a function that loops through an array and prints 'fizz' if it equals the parameter 'a', 'buzz' if it is -a, otherwise print the value of the element". Or something like that. This tests writing a function declaration, passing values in as parameters, iterating over an array, writing an if statement, and printing. That seems pretty basic and fair to me.
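That modulus-free variant might look like this (a Python sketch; the function name is made up, and it returns the printed lines to make the behavior easy to check):

```python
def fizzbuzz_no_mod(values, a):
    """Print 'fizz' for elements equal to a, 'buzz' for elements
    equal to -a, and the element itself otherwise."""
    out = []
    for v in values:            # iterate over the array
        if v == a:
            out.append("fizz")
        elif v == -a:
            out.append("buzz")
        else:
            out.append(str(v))
    for line in out:
        print(line)
    return out

fizzbuzz_no_mod([3, -3, 7], 3)  # prints fizz, buzz, 7
```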
Are you a js engineer? It's hard to imagine someone could write c, c++, java, python < 3, or probably c# without cleanly understanding the modulus operator, because
System.out.println("" + 3/4);
(or the moral equivalent) prints 0 in all the above languages. Every developer gets bit by integers performing integer division.
Uhhh, the mod operator is not the solution to that problem though.
In reality you'd only use a mod operator if you were doing things like sorting into 4 columns or doing an operation on every 3rd thing. And it's not a concept that is introduced at school. Now that I try to think of it, I believe I only picked up mod when I was learning rounding in my first language and the man page happened to mention modulus at the same time.
The mod operator isn't the solution to that problem, but that problem shoves integer division in your face. If a dev sees that and isn't curious enough to understand there is a division operator and a remainder operator, just like when you studied fractions in 3rd grade, I don't know that I want to work with that person.
And every dev should have hit the remainder operator at bare minimum when they had a long running loop and wanted to print a status every kth operation, eg
for (int i = 0; i < 100000000; i++) {
    // some operation
    if (i % 100000 == 0)
        printf("** operating on count %d\n", i);
}
or when processing a big file, printing every k lines; or when running a slow operation, printing every k seconds; or ...
No, because for most of those problems there's an easy alternative, declare another counter variable and you can just go:
z++;
if (z >= 100000) {
    printf("%d\n", x);
    z = 0;
}
When I was taught maths there was no emphasis on remainders and it certainly wasn't denoted with a % sign. I vaguely remember writing something like 12r3 in primary school. The r meaning remainder.
While I learned about the modulus operator at a young age, I grew up programming in environments where "nobody" used floats because the CPU's in question didn't have FPUs, and floating point operations resulted in costly library calls.
While only "older" (I'm 39) developers will be likely to have been in that situation, it took another decade after I moved onto hardware with FPU's before I worked on anything where we actually used floating point math.
Instead we'd be working with fixed point stored in integer. E.g. for financial systems, floating point is a nightmare. Working with fixed point to whatever number of decimal points our accounting department wanted (5-6 typically) for tax calculations and the like was preferred.
So there are large number of areas where people can have worked successfully for many years without ever using floating point.
I mean, I know about the mod operator because I did Project Euler problems in middle school, but if I hadn't, nothing I've done since would have made me learn it. The extent of the math I've had to do was tracking send buffers in C, which was just addition and less-than/equal-to comparisons.
A naive implementation of % is pretty trivial, so even if they weren't aware of the operator, I would at least expect them to write an equivalent function.
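For instance, a naive remainder by repeated subtraction is only a few lines (a Python sketch, restricted to non-negative operands):

```python
def naive_mod(n, d):
    """Remainder of n divided by d, via repeated subtraction.
    Assumes n >= 0 and d > 0; the built-in % handles the general case."""
    while n >= d:
        n -= d
    return n

print(naive_mod(17, 5))  # 2
```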
But they're --- for lack of a better term --- useless when it comes to actually getting real-world systems built.
Part of the issue here is that each of us has a different understanding of what it takes to get a "real-world system" built.
That's somewhat obvious because each of us has different "real worlds", so even within the same language/ecosystem there is a huge disparity in what skills are required to be genuinely successful in different teams/organisations/applications.
To put it in concrete terms, when I worked in banking, the main impediment to building successful "real-world" systems was getting clear, unambiguous requirements that could be translated into something implementable. Some of the best developers in that organisation were excellent at their job because they could take the incomplete and inconsistent desires of a banker, and turn that into a coherent and complete "world view" about the system they were required to build. We were primarily a Java shop, and they could produce an adequate application within the frameworks available to them, but I shudder to think what they would have said if you asked about "coupling and cohesion" or the java memory model, or how dynamic proxies work, or even how they should choose between making a field a short, int or long.
So, for my current (startup) organisation, I wouldn't hire those people - despite them being instrumental in producing some of the most satisfying and successful (ROI) systems I've been involved in. This team needs people who can do hard engineering, and if my former colleagues applied for a role here they'd end up looking like the proverbial "inept programmer" that we're debating.
This is a reflection on their skills, but it's also a sign that similarly named roles ("software developer") in different teams require and cultivate different skill sets.
Although I read the answer from the OP, reading the original message made me think of something slightly different. The debate is almost impossible to characterize because nobody is arguing apples to apples. When we talk about incompetence, almost nobody will agree to its meaning.
Thanks to the Dunning-Kruger effect, people who are incompetent do not realize they are incompetent. I remember reading a follow-up paper to this study (I wish I could find it again!) which concluded that when people have their mistaken beliefs pointed out to them, it actually reinforces those mistaken beliefs! This means that when people actually engage in debate, the two sides entrench and become more confident in their positions.
I have actually argued that this may be what leads large corporations to have a kind of "talent inversion". The people with the least talent have the most confidence and volunteer for the craziest projects. The projects fail, but due to politics the failure is hidden and spun into a success. The original developers begin to believe their failure was a real success and any debate to the contrary only strengthens their belief -- which fuels their confidence. This confidence allows them to be promoted where the cycle continues and multiplies.
This leaves you with a strange situation. Everyone will agree that incompetence is a big problem in the organization. Unfortunately, everyone will point to completely different people when asked who is incompetent.
Let's say that you want to improve the level of talent on the team. In other words, you want to hire only people who are more talented than the average person already on your team. I think this is not an unusual situation to be in. If talent is normally distributed on the team then half of the team is below average (I hope I got that right because I suck at statistics ;-) ). But thanks to "talent inversion", you may have a higher proportion of people in senior ranks that are in the "below average" section.
Now you can have a debate about what kind of people to hire: If person A thinks that they are at the top of the talent scale, they can't expect candidates to be as talented as they are. Probably they think it is completely reasonable to hire people who are considerably less talented than they are. Person B may actually be at the top of the talent scale and may be arguing strenuously that candidates must be considerably more talented than Person A.
Clearly such a debate will have a lot of heat without a lot of light.
I think the difference is that at medium-large companies there are plenty of technical career paths that don't require a person to actually code - business analyst, QA, documentation, compliance, project manager, etc.
So people at a place like Raytheon who simply can't code (or don't like to code) end up doing something else that they are better suited for.
Early stage tech startups just don't have a bunch of ancillary positions like that. Founders and early hires had better either be great at code or great at biz dev.
Also it's worth mentioning that it's not just a tech/engineering phenomenon. It's very common for people in other professional fields (law, medicine, etc) to get a little ways into their career and realize that the job is not at all what they expected or wanted or are suited for, and there are alternate career paths there as well.
This is a great point. Again, we're getting a weird look at things because we're posting on Hacker News, not Defense Contractor News or Accountant News. Software companies are definitely a significant group, but there are lots of companies whose programming division is just a small part of what they do. They're big enough that they need their own programmers and can't outsource it to someone, but they have all sorts of other stuff they're working on.
If you can't code, you're probably still useful for other stuff. Businesses love to take people they already know and find better roles for them. It's way better than firing badly utilized talent and having to interview for some other candidate with that skillset.
I wouldn't go so far as to say, "There are no incompetent employees," as there are definitely worthless people in the workforce. But it's not as clear as a lot of people might think.
> Not at Raytheon, or Northrop, or Lockheed, or Boeing, or any of the other companies I worked with during that period.
All these companies require degrees, check GPA, and typically have managers that came from technical backgrounds. While not perfect, this combination reduces the chances that a total dunce slips through. Also, all of the above companies tend to be looking at engineering/physics folk over CS/IT, although in the past 15-20 years a lot more CS.
I work at one of the above and for a long time only EEs were hired for any software positions. In regards to math ability that's a fairly high bar. And math competency is pretty well correlated with the ability to program at all.
Not saying that someone couldn't slip through. But all of the mentioned factors tend to reduce the pool of potential flops vs some of the now-standard hiring practices (from the article).
EE/CE here. I took the intro to data structures class and stopped at that. Been kicking myself for years because that's seemingly the only thing people care about; not actual competency (at least in the context of interviews).
Thing is, I'm a decent programmer, maybe even above average. My career so far has consisted largely of being the only or one of a few technical people and doing whatever needs to be done to make X happen. I built a poor man's version of the mail sorting machines the USPS (and other postal organizations) use to photograph, OCR and sort mail. I remember everyone's minds being blown when Outbox (R.I.P.) showed people pictures of their letters, and I built a system like that by myself for a previous employer in about two years while also completely rewriting the website. 65kLOC of VBScript/ASP to 20kLOC of python/django plus a whole bunch of new features.
If you believe sloccount (I don't really) in two years I did about 4 person-years of work that it says is worth a substantial fraction of a million dollars and a substantial multiple of what I got paid.
But I don't remember what a red/black tree is.
Knowing data structures and algorithms isn't the only way to be a competent programmer. To get my EE degree I had to take calc 1,2,3, diffeq, discrete math, linear algebra, senior level statistics, and these were all just pre-reqs to my engineering classes.
I think a lot of people get hung up on computer science == programming. There's a lot of overlap, sure, but huge amounts of programming doesn't have anything to do with computer science.
Yea. I and many of my coworkers are similar to you. BS&MS in applied math. Did some of the little programming courses but not the full CS path (though I did last summer work through all of this just to catch up some background http://ocw.mit.edu/courses/electrical-engineering-and-comput...).
The reality is different industries have different standards. I'd probably never pass a SV interview because I'm not ever going to memorize many of the CS fundamentals. I work with EEs, Aeros, MEs that code (robotics types), math physics and chem PHds, etc. Sure there's a few CS guys here and there usually that double majored in phys or math.
On a given day, what's more important to our engineering problems is my understanding of advanced linear algebra topics, not a sorting alg (a lot of work in CV recently, for example). We really don't bring anyone into an interview that hasn't studied the full gamut of math required in a math/phys/engineering degree. In such an environment, the interview questions are more about how you solved problems and never the memorization of some sorting alg. Oddly enough, when you weed out everyone that doesn't have a full math curriculum first, you don't seem to find many people that 'can't code'. Go figure.
> the interview questions are more about how you solved problems and never the memorization of some sorting alg
How you solved a problem is probably the only meaningful way to screen candidates. Sadly there are a lot of people who think that a whiteboard coding session is "how you solved a problem" when few -- if any -- meaningful problems get solved in an hour or two.
When I get really stumped on something, the first thing I do is get my head out of the problem. Go for a walk, work with my hands in the shop, get lunch, anything but keep trying to solve the problem directly. This strategy used to make me really nervous in college with short deadlines, but now that's less of a concern.
I have yet to step back from a problem and not figure out the solution (at least where one exists).
I believe part of this is your own frustration, and I do not blame you for venting.
Another part is the inherent error in using education as a proxy for ability to contribute.
My biggest concern, however, is that we assume there is a hierarchy of programmers. That the person who surpasses John Carmack is automatically more valuable than someone else, because it's a vertical pecking order.
In reality, it's likely that given the constraints of a position, Carmack++ is not a good fit.
We see someone build a faster sqrt algorithm as ability. What about the programmer who does his best to navigate internal obstacles adroitly -- why is that not praised just as much? It's certainly a difficult accomplishment -- why reserve praise for raw ability alone?
That tends to bug me, as someone with no formal education beyond high school, though I have worked a lot to gain knowledge related to software development, both conceptual and practical. There's no "in" when you don't have the minimum requirements at such companies...
I have a friend who is a mid-level manager at one of the listed companies, who came in without a degree and was promoted because of proven competence to where he is now. He is spending several hours a week working on his degree because he's been promoted to the point that none of his superiors will let him be promoted further without one, despite his handling the job better than his peers.
On the flip side, I've worked with people with a degree in CS (or similar), who couldn't develop a good software system if their life depended on it. The other side of this is, that many people don't continue to learn and only stagnate once they become familiar with one system. Software development tools don't stop coming out... there will always be new languages and platforms, but I have always seen a lot of resistance... I'm one of the few 40+ y.o. developers I know who keeps an eye on what's coming up in the community.
Yea, hopefully my post didn't indicate that I agreed with the process. You certainly listed plenty of counterexamples of how it can miss the mark.
But my point was about statistics. The HR warlords for all those defense/aero companies that have set these limits have done so because it segments the pool of candidates into one with a lower chance of a bust (or so they've decided).
I'd like to see data; I'm sure some HR or consulting group somewhere has it. Or maybe it doesn't exist. A simple logistic regression problem with a high number of variables (degree or not, age, major, self-study, continuing ed, etc etc).
edit: also - all my comments are speaking from the engineering side of these companies. not IT. So the actual work content is a bit different than the original article may be concerned with.
I didn't mean to imply that you agree with the process... I think my own comment is more along the lines of, this is one of those rules people should be able to break now and then... I absolutely agree that there is far less risk of hiring someone without the necessary skill/ability for a given job when you restrict yourself to degree-holders than when you don't. The challenge is then to not lock out those who don't fit that mold, either by experience and/or referral.
The OP's point is, unless I misread it, not that there are no incompetent engineers who can't code, but it is ridiculous to, like, ask some "fancy" algorithm questions, brain teasers, etc. that are mostly not related to the candidate's working experience at all.
For candidates who are fresh graduates, it looks like the most effective way is to ask about algorithms and data structures. But for a senior engineer who has at least 10 years of experience, can't the interviewers find better questions to ask to verify that the candidate is a good coder?
I was interviewed once. I was asked a question. I knew the question could be solved with a "fancy" algorithm, since I read the original paper in graduate school. It was elegant and beautiful, very creative. After so many years, I couldn't remember any details, and I had had no opportunity to use it in my work. I am not a genius, so I couldn't figure it out in 20 minutes during interview. Does that mean I am a fake?
> In all my years immersed in the tech industry, I have never once heard a firm talk about the idiots lurking in their own offices. They always seem to be elsewhere. For everyone.
It certainly sounds like the author doesn't think there are any actual incompetent engineers out there.
Personally, I know that to be false. I have worked with incompetent engineers, who literally couldn't code FizzBuzz.
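For readers who haven't seen it: FizzBuzz is the classic screening question (print 1 to 100, but "Fizz" for multiples of 3, "Buzz" for multiples of 5, "FizzBuzz" for both), and it fits in a few lines of Python:

```python
def fizzbuzz(n):
    """Return the FizzBuzz output for a single number."""
    if n % 15 == 0:      # divisible by both 3 and 5
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

for i in range(1, 101):
    print(fizzbuzz(i))
```

The point of the exercise is precisely that there's no trick: anyone who can program at all should finish it in minutes.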
I think the author is saying we should do interviews like nearly every other industry.
Look at past experience, and ask the candidate to talk about that experience to verify that they aren't lying.
If someone has 5 years experience working at Google, and they can talk coherently about technical aspects of the projects they worked on, there is no reason to ask them to code on a whiteboard.
The way it currently works is this: I see you have a CS degree and 10 years of experience at a well known company.
Excellent. Now you have 2 eggs and a 100 story skyscraper. Tell me the minimum number of egg drops you'll have to perform to discover the minimum height that will break the egg.
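(For the curious, that puzzle has a tidy closed-form answer: with d drops and 2 eggs you can distinguish d*(d+1)/2 floors, so you search for the smallest such d. A sketch in Python:)

```python
def min_drops_two_eggs(floors):
    """Minimum worst-case number of drops needed to find the critical
    floor in a building, given exactly 2 eggs.

    With d drops available, you can cover d + (d-1) + ... + 1 floors:
    drop the first egg from floor d, then d + (d-1), etc.; when it
    breaks, scan the remaining gap linearly with the second egg.
    """
    d = 0
    covered = 0
    while covered < floors:
        d += 1
        covered += d
    return d
```

For a 100-story building this gives 14, since 14*15/2 = 105 >= 100 while 13*14/2 = 91 < 100.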
Every engineer, no matter how experienced, has to answer questions as if they just left school yesterday; no other industry does this.
>Personally, I know that to be false. I have worked with incompetent engineers, who literally couldn't code FizzBuzz.
99 times out of 100 you could catch that engineer by talking to him about technical aspects of past projects he worked on.
"Look at past experience, and ask the candidate to talk about that experience to verify that they aren't lying.
If someone has 5 years experience working at Google, and they can talk coherently about technical aspects of the projects they worked on, there is no reason to ask them to code on a whiteboard."
I always argue for this, and about 98% of the time people look at me like I'm some sort of crazy man and our company is going to go down in flames. Everyone has an anecdote, "well, there was this one time ....", yet I'm convinced that there's roughly zero real value added by all of the hoops and hurdles most interviews provide in terms of accurately identifying strong candidates.
I always try to make the argument that if someone can talk intelligently about a wide range of programming knowledge in an actual conversation - for instance, knowing that data type X would be a logical segue from the current point - and do that throughout the interview, personally I think the likelihood of them being an idiot is slim. However, most people I argue this to are always afraid of it, afraid that some interloper will sneak through unless we put them through a gauntlet of questions about things they'll never do in their day to day life.
This is why I like to give laptops to engineers and ask to make something simple, that might require logic, but doesn't require any sort of specialized knowledge. They can even look up documentation.
Like make a table view that has word counts for a string. Or make this simple puzzle game. Or make a simple address book app.
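The word-count task, for instance, is mostly a dictionary exercise; a minimal sketch of just the counting logic in Python (the original suggestion wraps this in an app's table view, which is omitted here):

```python
from collections import Counter

def word_counts(text):
    """Count occurrences of each word, case-insensitively."""
    return Counter(text.lower().split())

# Rows for a hypothetical table view, most frequent words first.
rows = word_counts("the quick brown fox jumps over the lazy dog").most_common()
```

The point of such a task isn't the algorithm; it's watching whether the candidate can turn a plain-English spec into working code at all.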
Just an FYI -- working on someone else's laptop can often be an unpleasant experience (weird keyboard configuration, different software installed, etc.)
My current opinion on this would be to offer a laptop so you aren't disadvantaging candidates who don't have one too badly, but let people use their own if they have one (and let them know ahead of time that it will be an option and what they should be set up for.)
We tell candidates up front that they'll be doing some live coding and welcome them to bring their own laptop if they would like. We have a backup around with just about every reasonable IDE and editor installed for our languages but even then I agree, it's definitely not the same using someone else's computer.
I program on my desktop with two monitors, I wouldn't be comfortable on a laptop without at least two extra large monitors hooked up to it already. And a normal keyboard, don't give me that laptop keyboard crap.
Wow... I understand that you'd be much more productive with a large monitor and good keyboard, but to the extent that you couldn't perform in a coding interview? Give me a break...
In the past, I've used CollabEdit for "phone screens" and the whiteboard for in-person interviews. I try to ask some "coding" questions (identify a better data structure / algo for a particular scenario and then implement it in code) and some problem-solving conversational questions. Seems to work pretty well, but I'm always on the lookout for better approaches. Part of the problem is that the software field is so broad that it's hard to get a sense of a person's abilities with such a limited amount of time to ask questions.
Sure, we'd all want an ideal setup but ideal isn't possible in an interview situation - both sides need to compromise a bit for practical reasons. And really, while we say "laptop" if someone wanted to lug in their desktop with two monitors and keyboard we wouldn't stop them. In fact, it'd probably earn them points for being extra nerdy ;)
Would you rather a laptop w/ standard editors and IDEs or would you prefer a google doc, collabedit or a whiteboard like many places do?
That's a bit like saying, "My daily driver is a Mercedes Benz SL55 AMG, so I can't drive a Hyundai Elantra, not even for a block or two." Or like saying, "Oh, I normally wear Air Jordans, so I can't walk with flip-flops."
Moreover, if you're that useless without your dual monitors and a full-size keyboard, would a whiteboard-based interview be any better?
Hmm... this just popped into my mind as I read your comment.
Could you use this specific scenario to see how the candidate reacts? If he starts to complain that the right tools are not on the laptop, or that he is unfamiliar with the environment, does it signify that he is focusing too much on the tools rather than on solving the problem?
Maybe his time management skills need some work, i.e. limited time to solve the problem and too much time worrying about the environment.
Well, we don't do the puzzle questions, and I agree that those are not useful to determine the level of software engineering experience a candidate has.
However...
99 times out of 100 you could catch that engineer by talking to him about technical aspects of past projects he worked on.
That does not work for us.
We must put each and every candidate in front of a keyboard, and bang out at least a little code.
We had one candidate, recently graduated with an MS in CS. Nice personality, good talker in general. The candidate seemed to understand CS and could explain how to solve problems.
However, even with repeated prompting, I could not get this person to even type in a loop to start to solve the problem presented.
We've interviewed multiple candidates with a BS in CS or similar who also spectacularly failed at doing even basic coding tasks.
Some people are really great talkers. They really do sound like they know what they are doing. But that does not mean they can actually do anything.
As someone else said, I'm not talking about someone fresh out of school with a few months internship experience.
>That does not work for us.
How do you know, did you try it? I said 99/100, sure there are some rare people who may slip through, but there are people who slip through a coding challenge as well.
How do you know that your coding test is better at detecting secretly terrible engineers than talking through past projects (for experienced engineers)?
>Some people are really great talkers. They really do sound like they know what they are doing. But that does not mean they can actually do anything.
That sounds like a problem with the questions you're asking. Some people can throw out buzzwords and talk at a high level, but if you really dig down into the implementation details of a project you can weed those people out.
How is this relevant to the parent? You're talking about someone with a CS degree, which means they have no experience. How can you ask them about the technical aspects of what they did on a project? "What parts of the system did you write? What problems did you hit? What solution or bug were you proud of solving?"
Nearly all the candidates that we've interviewed have at least a little job experience, such as an internship. And even the ones that don't have all worked on team-based projects as part of their coursework.
So we end up asking them all those sorts of questions.
Working on team-based projects as part of coursework does not mean they've ever written a line of code in their life.
Going through a CS degree without learning to code is easy. Depending on your CS course you may hardly have touched any code beyond extremely entry level courses where you're being hand-held the entire way, if you pick the "right" subject.
CS is not software engineering.
Similar with internships.
For both those situations, I can understand that you want to verify ability to actually write code.
You're just agreeing with ansible - no one here is arguing that a CS degree means you can code. Rather ansible is relating his experiences with people graduating from those courses and not being able to code.
What you seem to be implying, is that this is obviously true for graduates, but couldn't possibly be true for people with real experience. (But your implication has no evidence to support it)
Exactly why do you believe that it is completely plausible for a recent CS graduate to be able to explain the technical details of their final year project when in reality they contributed zero code to that project, yet it is entirely implausible for someone with 3 years of experience to be able to explain the technical details of their most recent development projects when in reality they contributed zero code to those projects?
> yet it is entirely implausible for someone with 3 years of experience to be able to explain the technical details of their most recent development projects when in reality they contributed zero code to those projects?
Here is the problem: it's not entirely implausible, just very unlikely.
The type of interview I'm talking about isn't going to catch every single bad candidate, just the vast majority. But whiteboard interviews don't catch every bad candidate either, and they have an extreme number of false negatives.
To your point about students, they have much less work experience to talk about. The point of the interview is to look at their experience and talk about it to verify they aren't lying. In most cases the student doesn't have enough experience to tell me anything.
That being said, if a student can talk me through technical implementation details of a final project, then I think that's a pretty good indication that the student can code.
I would say the false positive rate for that test would be very close to the false positive rate for a white board interview with drastically lower false negatives.
If you're Google and you have a limitless number of people who want to work for you, you can get away with taking a vastly higher false negative rate for a slightly lower false positive rate. However, if you're almost every other company out there, you might want to reconsider.
I am not sure if your comment is supposed to reply to mine.
I do think there are incompetent engineers, and I have interviewed about a dozen of them. It is very easy to filter them out.
The challenging part is to evaluate how good the candidates are, and if they would be good for the team and for the applied positions.
In start-ups with <20 engineers, you would like the candidates to be able to wear many hats. In a big engineering team, the requirements could be different. For example, besides challenging and creative work, there could be some tedious, even boring, work. The latter is also important, both for the whole system running smoothly and for the long term health of the system. So if a candidate is sufficiently competent, very detail-oriented, disciplined, and team-oriented, she/he would be an excellent fit for the job, even if she/he is not technically very strong and couldn't figure out challenging technical issues. An excellent engineer, on the other hand, might get bored, grumpy, and leave in 6 months. It is expensive to restart the hiring process.
I worked with a guy who, after having been at the company for 6 months, called me over to his desk to help with a "problem." He wondered why he was getting a syntax error. Okay, fair enough. I looked at his screen, and he was putting the arguments to the function OUTSIDE the parentheses. The guy had clearly just been copying and pasting code until "it worked" the entire time he had been there. I almost palmed my face right off of my head.
To be fair, I have done that before when just messing around on my own late at night. After working for as long as I had been, I felt like reading my own code was like wading through sludge and my eyes just didn't catch the error. I was still, however, well aware that putting the arguments outside the parentheses is not how this works, not how any of this works.
That reminds me of how I deal with C's pointer syntax. "Well, it's either * or & or a combination of a low number of the two, let's try a couple of combinations that seems vaguely reasonable... and some that don't, just in case". I don't know who's most at fault: Me or C.
In case it helps, I've noticed what I'd consider a wart in C that seems to be a common stumbling block with this - including when I first learned C a couple decades ago: *, when used on types, has the "opposite" effect from when it's used on values/expressions.

int* foo;     // A star on a type converts the type /into/ a pointer-to-int /from/ a plain int.
int i = *foo; // A star on a value converts /from/ a pointer-to-int /into/ a plain int (a dereference).

I found it much easier to keep the two straight once I segregated their use cases like this. There's a similar duality with & in C++ using references, although not quite as cleanly:

int& foo = ...; // An ampersand on a type converts the type /into/ a reference /from/ a plain int.
p = &foo;       // An ampersand on a value converts /from/ a reference (or plain int) /into/ a pointer-to-int.

Regardless of "fault", I hope this helps :)
rule of thumb: "*" means "look up the area of memory this value points to". "&" means the reverse - "give me a reference to this area of memory".
The problem with randomly trying things with pointers is that sometimes you can accidentally fix the issue but introduce a subtle bug that's dependent on the data being stored.
I'm with you.. I'd have to read a refresher on getting started with C if I were to even do something trivial with it... I've been in higher level languages so long (C#, JavaScript, etc)... I find it helpful to understand the concepts sometimes, but would be unlikely to reach for something more low-level than go these days.
I'm sure there are several variations, but I have a sort of mnemonic to remember this - * looks like an arrowhead seen head-on, and so it "points" to a value. & is, then, a reference.
I've been fortunate enough to work on small teams my entire career, where someone like you describe simply would not last. I don't think I've come across a person this bad, though. Even the 'bad' programmers I've worked with were able to make working code, but it was not designed well.
You should work on your reading comprehension. What you're saying is precisely what the author is talking about: everyone has a story about terrible engineers, but they're almost always "elsewhere". Co-workers tend to become "stupid" and "incompetent" after you leave. And yet they maintain employment.
It's an idea, not an iron-clad rule, that perhaps "incompetence" isn't as rampant as so many people suggest.
I read a really good comment by, I think, Nolan Bushnell about engineers. He said he got asked all the time about how to avoid hiring unproductive engineers, and he found that baffling, because an unproductive engineer just costs the company their salary. More important is ferreting out and eliminating negatively productive managers, engineers, and technicians, in that order.
Algorithms? Why would you care if a programmer knows the details unless that's the job? More important is how they approach and attack a problem. Sometimes a half-broken shell script is fine. Or leveraging a database because, even though slow and inefficient, it will do it right. Half the secret to success is... um... seriously, just watch this video and you'll know it's true.
An anecdotal case of the worst candidate I interviewed. He was a friend of our VP. He had an engineering degree from UC Berkeley. Over 20+ years, he had been a consultant for firmware development at 28 positions. The non-technical portion of the interview went great, and I said I wanted to ask a few coding questions. He was really resistant and used all kinds of excuses. I told him to humor me and code something simple, like drawing how a linked list works and writing the insert function in C. At that point I was totally caught off guard. He knew not a single line of C syntax and just made things up. He didn't understand the concept of pointers at all. Later, when I talked with my co-workers, they had the same feedback. He didn't know how to program at all. We wondered how he had consulted at 28 companies over the last two decades!
I think his average stay was maybe 9 months which is not unreasonable for consultants. My guess is they were all charmed by the CV and didn't interview him thoroughly enough.
Or your analysis is wrong and he didn't need to know a single keyword in C to be fantastic at his job. I.e., if we re-ran history and you had spontaneously accepted an excuse and let him skip your technical interview, you might have been extremely satisfied with his work and input over the next 9 months.
We have a couple of things to analyze here: 1) could this have happened? 2) in this case would your satisfaction be false, i.e. he wouldn't have been doing good work for you over the next few months?
Or is there a chance that he had some skills that didn't require knowing any coding, while being able to contribute valuably to you...
Or maybe he's old enough that he's used to thinking in assembly. I learned firmware on 8051 assembly, I've used C mostly since then but always with a reference, and only the last few months finally am getting into C++.
That doesn't mean I don't understand how firmware works. I know what's going on deeply under the hood, and I understand systems architecture. I know how to debug the stupid SPI interface when you're talking to an Analog Devices ADC that just won't respond for no clear reason. I know how to read the datasheets for parts and pick reliable ones that can be sourced readily and have sane interfaces.
I have no idea what a linked list is. I do know pointers, but not super well, I just know that they're a spot in memory that gives you the address of another spot in memory, like the program counter. I know my limitations, I'd never try to interject in dealing with a system that has a full OS on-board, and I'd refer questions about encrypted bootloader based firmware upgrades to someone who knows what they're doing. But I still know what Mode 2 SPI is, and the typical noise levels on different kinds of shielded cable. That's not nothing.
shrug
I think the best way to determine if someone is a decent engineer is to get them talking about previous projects. If all they can do is talk about it at a high level and they don't seem to have any real knowledge of the implementation details, they didn't do the work themselves. I once had a guy who claimed to be an instrumentation engineer, but didn't know (even vaguely) how you would go about measuring temperatures in a process you're dealing with. I could be wrong (often am), but anyone who has done instrumentation but has never encountered a thermocouple... well, that would be a very unusual history.
> Or maybe he's old enough that he's used to thinking in assembly
"I'm sorry, I don't actually know C, can I show you this in assembly instead"?
> I have no idea what a linked list is.
A cool thing about the linked list is that it's probably the simplest data structure in all of computer science. I can explain what they are in less than 30 seconds, provided you have even the faintest grasp of how computers actually work.
> I could be wrong (often am), but anyone who has done instrumentation but has never encountered a thermocouple... well, that would be a very unusual history.
This is exactly what I'm talking about: A good programmer who can't understand and reason about a linked list, that's very unusual.
Haha, thanks for the schooling. I looked it up and it turns out I studied them in Scheme back in the day. I just didn't know the term, so I'd (probably) come off as incompetent in an interview where asked.
> This is exactly what I'm talking about: A good programmer who can't understand and reason about a linked list, that's very unusual.
Well put then, I say rather chagrined ;-)
My intended point though was that sometimes you can't tell how good of an engineer someone is by asking them about things you know. Better to ask them to talk about what they know. But that's not very controversial I'd hope!
Your entire comment seems to hinge on the assumption that the GP wasn't hiring for a C programming job. That's pretty disingenuous.
Or do you really find it plausible that people who don't know a single keyword of C can be "fantastic" C programmers? Or that it's a good idea to hire very, very nice people who can't program for programming jobs because they might have other skills that might be valuable?
Sorry, I'm still not in industry, but isn't there a review system for past employees, and especially for past consulting work? This would be extremely valuable for future employers, problems with lies and uncertainty notwithstanding.
It looks to me like a few Amazon-style 10-minute reviews could save many, many hours of HR rumblings and expenses.
No, people are too lawsuit-crazed these days. Many companies will give no facts about a past employee other than the dates they were employed. Even "would you hire this person again?" can be dangerous.
At one of my past employers, my manager (who sat next to me) told me that he wasn't allowed to say anything whatsoever about past employees - when anyone called to confirm past employment he had to give them HR's phone number, where specially trained operatives would presumably answer the one and only question allowed.
We do all development in 95% C and 5% assembly. While not with him, I have given people chances at other times and they fail miserably if they don't have basic C programming skills.
We've all worked with them. A senior engineer I worked with previously, claiming 25 years of experience, did not understand why I rejected his code: it declared a pointer, set it to null, and then passed it to another function with a memcpy-style interface that puts data into a supplied buffer. It compiled and was therefore correct, to his mind. After several incidents like this, and lots of proclaiming "it doesn't work" when encountering a bug, which he would then proceed to make someone else's problem, I learned to think of him as a very poor tester.
I didn't perform that interview but I was left wondering what the hell he'd managed to do for so long. A long list of experience on a cv can be meaningless if every post has ended up being a net drain on the team while others work around the problem.
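For anyone who hasn't hit this pattern: a null pointer is not a buffer. A hypothetical reconstruction of the rejected code (function names are mine), next to what the caller should have done:

```c
#include <string.h>

/* Stand-in for any memcpy-style interface: writes len bytes
 * into a caller-supplied buffer. */
void fill_buffer(char *dst, size_t len) {
    memset(dst, 'x', len); /* undefined behavior if dst is NULL */
}

/* The rejected pattern: declare a pointer, set it to NULL,
 * and pass it as the destination. It compiles; it is not correct. */
void broken(void) {
    char *buf = NULL;
    fill_buffer(buf, 16); /* crash, or silent corruption, at runtime */
}

/* The caller has to actually own the storage being written into. */
void correct(char out[16]) {
    fill_buffer(out, 16);
}
```

"It compiles" and "it is correct" are very different claims, which was exactly the disconnect described above.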
I've interviewed, and been an interviewer, in interviews with the most obnoxious technical questions, interviews where the hardest technical question was equivalent to "can you code?", and interviews that were in between. In no case was there a discernible correlation between the technical difficulty of the interview and the abilities of the eventual hires.
This industry's biggest hiring/talent problem isn't "poseurs." The biggest problem is the persistent and incorrect belief that applied mathematicians (CS majors) can easily learn the skills needed to engineer complex systems as they go (or, worse, that said skills are trivial or even irrelevant compared to the ability to recite textbook DS/algorithms trivia), and that they are also able to adequately determine the capabilities of other similarly trained persons to do the same, mainly by relying on textbook test problems (albeit sometimes with a twist).
In other words the (undisputed, in my opinion) existence of people who can't write a simple for-loop doesn't justify the absurdities seen in modern software interviews.
Having been involved in a large number of interviews as an interviewer, my experience tells me the issue is that many highly-competent engineers are incompetent interviewers.
To illustrate my point, here is an imaginary example of the extreme scenario. I once spent a large amount of spare time on graph theory, so I could easily fail at least 95% of candidates by asking algo questions in graph theory. The questions are easy to describe, so in that sense they are perfect for interviews. But I would have done my team a big disservice by turning away potentially very good candidates, and I would have harmed our company's reputation.
This has always mystified me. The key to any interview I do is a pair programming session. Before I hire somebody, I'd like to see what they'll be doing after I hire them. Crazy, I guess.
I have never understood what application Mensa-quiz puzzles and algorithms trivia questions have to the day-to-day work of making things go. It seems like a smart-person pissing contest. Maybe that's what the bulk of their work time consists of? I hope not, but I wouldn't be entirely shocked.
Come on -- you're one of the commenters I make a point to read on HN. You can't possibly have missed that a huge chunk of valley interviews are, exactly, a fucking smart-person pissing contest.
A previous employer loved them some graph problems for interviews. I got in an argument with one of the senior engineers, and demanded I be shown where in our codebase we used graph algorithms. The answer: nowhere. But you had to know your way around them to get hired there.
Now when I interview, I refuse to study them. I think this serves me well in helping me avoid jobs I wouldn't enjoy anyway.
An interviewer once essentially demanded I invent Morris traversal on the whiteboard in front of him. I'm not sure how that related to my ability to be an ML engineer.
Heh. Thanks for the compliment. I have actually never sat for a valley interview, so I didn't want to judge based on second-hand data. But I'd believe it entirely.
I think in some (maybe many) cases it is a "pissing contest." In other, and probably the majority, of cases I'd hope it is a case of simple inexperience coupled with arrogance resulting from the relative immaturity of the field. And I mean "immaturity" in more than one sense. The industry itself is young, especially compared to "traditional" engineering and sciences, and it is also filled with a high proportion of relatively immature people, at least in comparison to the slice of academia I was in.
Been through dozens of interviews in the last 5 years and never encountered this. I think this would work well because it draws out a conversation. An interview should be about discovering how a person solves problems, not about either the problem or the solution.
In fact, a long time ago an accountant used this very technique to decide that I was a smart software guy. He then hired me even though he knew nothing about computers and software.
If an engineer can contribute in a pair programming session then you know they have something to contribute in your development team later, even if you don't pair all the time.
I bet there is a little bit of interviewer insecurity going on: "look how hardcore we are, we act like this is normal and we solve problems like this every day."
When I run into an interview like this, it tells me I don't want to work with these people.
"The biggest problem is the persistent and incorrect belief that applied mathematicians (CS majors) can easily learn the skills needed to engineer complex systems as they go (or, worse, that said skills are trivial or even irrelevant compared to the ability to recite textbook DS/Algorithms trivia) and that they are also able to adequately determine the capabilities of other similarly trained persons to do the same mainly by relying on textbook test problems (albeit sometimes with a twist)"
Thank you for this golden nugget. It's like thinking you can build the best basketball team by getting the most athletic-looking players, except much worse. I guess the conventional wisdom is that good software engineers are like mathematicians, but in my experience they are more like craftsmen, like a cabinet maker or ship builder. I suppose it's contextual to the software problem, but analytical, deductive aptitude, while helpful, seems to pale in comparison to what I consider the craft (consistent style, naming, testing, pacing, working together and communicating, effective editor and shell use, knowing how the fuck to use git, etc.), which has very little to do with analytical, deductive aptitude.
Actually I think engineering requires the same analytical ability as the science that forms the basis for it. It's just applied in a different context and requires a different path to develop into a good talent.
Terrible $whatever make it into every field imaginable. The tech industry is not unique at all. Bad doctors exist, for example, despite the professional certification frameworks in place. I bet almost everyone has dealt with bad customer support people or bad secretaries.
It seems to be something that is clearly inevitable. We've been trying for decades to construct the perfect interview process to weed out people, and while we've had some fair success, people always manage to slip their way through.
I wonder if we spend sufficient time on the remedial end of things? Can we improve our practices for handling bad software engineers? Can they be saved? When do we get to the point where it's appropriate to cut and run?
Mostly what I look for in candidates I interview (for sysadmin/sysengineer roles) is signs of aptitude. I have a baseline knowledge I'd expect from them, but after that I'm more interested in gauging how flexible they are, and how they adapt to circumstances and change. e.g. "I see you introduced Chef to your current place to handle configuration management, why did you pick Chef vs other solutions? What did you compare it to? What hurdles did you come across implementing it?" and diving deeper and deeper in to it. The deeper you dive the more apparent it should be what kind of role they played in the whole thing and what they've learnt.
Me? I suck at whiteboard coding exercises. I'm a sysadmin, and one who does a reasonable amount of coding, but put me in a room under interview conditions and ask me to write something to solve a problem and I'll struggle. Worse it always seems like I get stuck with a developer asking me to solve some ridiculous thought exercise that bears little resemblance to anything I'd do in the real world, that I'm assuming ties in to their favourite algorithm or something.
> It is also, regretfully, not the case that all applications one receives to an advertised position of Senior Ruby on Rails Programmer would be from people who had ever opened a command line.
It always amazes me how many "Geek Squad" candidates you'll see who never even mention Linux on their resume, yet go and apply for Linux sysadmin roles.
It strikes me that in the same day (albeit one filled with hiring articles) we get an article about the success of Greyston Bakery, which will literally hire anyone and train and support them as much as needed to be successful—and this, which is along similar lines but is about software development, which, obviously, requires special aptitude that not just anyone can succeed at, and must be protected from those who would trick us into getting hired. /s
Somehow, it feels warm and fuzzy to support the social justice of the little bakery that could, but when it comes to our own companies, our true colors shine.
Continue to take the systematic approach. It is still the best way. Of course, of course you should hire as effectively as possible, and try to have a baseline of aptitude, but there are always going to be outliers and bad fits. It's far more effective to optimize your system for the success of all employees than to focus on weeding out the 'bad ones.' Sure, there will be people that won't fit your system, but they made it through the hiring process for some reason—figure out their true aptitude and transfer them to something they'll be effective at.
The article may be wrong about the existence of fakes and bad programmers (anyone who has filed through a stack of resumes knows they exist), but it's right on the unreasonableness of the fear and the weight we give it.
I say it all the time to our CEO: if hiring is really a huge problem we want to focus on, how did we get all these great people? Hm.
The real problem is organizational. Stop worrying so much about hiring, optimize your process and be done with it, and focus on what happens afterward instead.
> It seems to be something that is clearly inevitable. We've been trying for decades to construct the perfect interview process to weed out people, and while we've had some fair success, people always manage to slip their way through.
Much as successful security involves "defense in depth," I suspect that good HR involves much the same. It's not reasonable to build a strategy on some sort of ultimate impenetrable barrier.
Most organizations have two actual levels of membership that often exist outside of the formal rules. The reason it works this way is to sequester the "net negative" members and protect the contribution of the "net positive" members. This must be what Valve is aiming for with its hyper-"flat" lack of structure.
I asked someone with a PhD in statistics to, "Explain regression to me like I know some math but have never used statistics." For those of you who don't know, regression can be taught to people with either a single course in calculus or a single course in linear algebra. It is also a, maybe the, fundamental statistical tool. I'm generally satisfied with people drawing some points on a 2-variable graph, explaining that we're trying to calculate the least-error fit, then telling me that the error is specifically squared error. If the candidate can discuss penalties and outliers and GLMs and exponential family distributions and IRLS and sampling distributions and Gauss-Markov, then great, but I'm satisfied with a really basic understanding. This guy couldn't get any of it. I simply don't understand how you can have a PhD in statistics and not be able to explain in 60 seconds roughly what regression is, how one interprets the betas, and at least one basic inference method.
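For readers following along, the 60-second answer being asked for is ordinary least squares: given points, fit a line by choosing the coefficients that minimize the squared error.

```latex
% Model: a line plus noise
y_i = \beta_0 + \beta_1 x_i + \varepsilon_i

% Least-squares fit: minimize the sum of squared errors
(\hat\beta_0, \hat\beta_1)
  = \arg\min_{\beta_0, \beta_1} \sum_{i=1}^{n}
      \bigl( y_i - \beta_0 - \beta_1 x_i \bigr)^2

% Matrix form, for the linear-algebra route
\hat\beta = (X^\top X)^{-1} X^\top y
```

Each \(\hat\beta_j\) is read as the expected change in \(y\) per unit change in \(x_j\) with the other predictors held fixed, and one basic inference method is a t-test on whether \(\hat\beta_j\) differs from zero.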
FWIW I've come across many PhDs (I've worked in PhD-rich environments for the last 15 years) who are surprisingly poor at fundamental level stuff in their field. They're so far beyond it and so specialized that they just don't really remember how the basic stuff works.
Part of my standard phone screen approach is to open up an etherpad or other collaborative coding tool and ask the candidate to perform a Fizzbuzz variant (variant enough that memorizing Fizzbuzz wouldn't save you). This is entirely intended for failing people who can't code at all.
Yes, I've had people fail this. Yes, I gave them an extra day post-phone-screen to complete it. No, they did not complete it.
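For context, classic FizzBuzz prints 1..100 with multiples of 3 as "fizz", multiples of 5 as "buzz", and multiples of both as "fizzbuzz"; a variant just changes the rules. One hypothetical shape such a variant could take (the divisors 4 and 7 are my arbitrary pick, not the actual screen question):

```c
#include <stdio.h>

/* FizzBuzz variant: multiples of 4 -> "fizz", of 7 -> "buzz",
 * of both -> "fizzbuzz", anything else -> the number itself.
 * Writes the answer for n into out (at most cap bytes). */
void fizzbuzz_variant(int n, char *out, size_t cap) {
    if (n % 4 == 0 && n % 7 == 0)
        snprintf(out, cap, "fizzbuzz");
    else if (n % 4 == 0)
        snprintf(out, cap, "fizz");
    else if (n % 7 == 0)
        snprintf(out, cap, "buzz");
    else
        snprintf(out, cap, "%d", n);
}
```

The point of varying it is that the candidate has to reason about the conditionals rather than replay a memorized answer.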
While I loathe the alpha male dance of software engineering interviews, I fully support the Fizzbuzz test, having encountered those that fail it.
That story could have been taken directly from my own brain. People like this get hired not infrequently and it's always frustrating to get one on your team.
I want to work where you have worked, then. Rant from the aircraft industry follows, feel free to ignore.
Engineering incompetence is widespread from my experience (flight test engineering). I wish, as a flight test instrumentation engineer, that I had not had engineers from various disciplines ask me how to interpret the contents of their test monitoring screens. I am responsible for providing the accelerometers, strain gages, data bus monitoring, and so forth that the various teams request and making sure that the data from those sensors is decision quality and properly telemetered and recorded. But it is NOT my responsibility to interpret that data. And yet, some of these other engineers, who are responsible for real-time monitoring of safety of test parameters during flights, clearly have no clue what they are looking at. One engineer even confused hydraulic pressures with engine temperatures during one test I was involved with and was about to raise a fault against my system because she thought the data was faulty. That's just one instance -- sadly, I have encountered many more during this career.
Yes, these people are trained in their disciplines and on the content of their monitoring screens, but for some, it just doesn't seem to stick. How can they, in good conscience, respond with a "GO for test" when asked if they know they do not understand their monitoring screen?
I've never worked with anyone who flat out couldn't code. I've worked with some who needed more help than others, but I'm fine with that since I've been that person myself with some things. But never with someone who just couldn't do it at all.
On the other hand, I've certainly had interviews with people who are just awful at it. So much so that I think the interview process is by far the greater obstacle to finding competent people than some huge mass of incompetent people out there.
Think about it this way- Almost no one has ever had explicit training in interviewing. Everyone learns by trial and error and rumor and superstition. It's one of the most cargo cult aspects of any given business.
I had a "software engineer" work for me who didn't know if from while. Literally. He was copy-pasting stuff until it "worked". One day he calls me over because his program is hanging. He had something like "while (x) print something;". I asked how that would work if x never changes in the loop. After a bit of blankness, I said "isn't this just supposed to print once, like maybe use an if?"
"Oh yeah, if. That's the one I wanted."
...
Worked with another person of similar skill, who went on to become a Microsoft MVP. Do not underestimate how far drag-n-drop and copy-paste can take someone with modern tooling. Especially if they're good at, and inclined to, touting themselves.
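For readers who have never seen that failure mode: with no assignment to x in the body, the while version spins forever, while if runs the body at most once. A minimal sketch:

```c
#include <stdio.h>

/* What was written -- x never changes inside the body,
 * so if x is nonzero this loop never terminates:
 *
 *     while (x)
 *         printf("something\n");
 *
 * What was meant -- print at most once: */
int print_once(int x) {
    if (x) {
        printf("something\n");
        return 1; /* printed */
    }
    return 0; /* did not print */
}
```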
Say what you want about Google (it seems to be common here), but you will never work with anybody like that at Google. Really never ever.
You will work with managers, PMs, and lawyers who have much more coding skill than that.
Not to say there isn't bad code. Actually the problem at Google is people write too much code... But there is a base level of competence you can expect.
> you will never work with anybody like that at Google. Really never ever.
Well... at a previous company I worked at, we hired an ex-google software engineer. He had a good resume, and was one of the best interviewees I've ever had. He totally nailed algo questions and even some arcane functional stuff I threw at him he hadn't seen before (yes, those kinds of questions are pretty dumb, but I was a bit more naive back then).
But, this guy ended up being a terrible programmer. I mean, he seemed pretty darned bright from that interview session. Amazing even. But he was unable to ship code, let alone write good, elegant code. He just wasn't wired for it.
Ironically, he ended up getting booted from his team and moving into the biz side where he made a lot more money than the rest of us (trading).
Yeah, I'm not saying there aren't terrible programmers. I'm just saying that everybody can write a 1 to 3 line "for" loop and navigate a directory structure :) That's the level of incompetence the parent was describing.
Definitely not true. I'm a current Google employee, and one of my teammates (a senior staff engineer, no less) does NOT know what a unit test is, what an API is, what Borg is, or how MapReduce works. After suspecting this for a while, I asked him to try to explain some of these concepts, and he either had no answer or his answers were hilariously wrong.
(He did come from an acquisition, though, and doesn't write any code. He's more of a manager, despite not being on the manager track.)
I don't know what Borg is either, but then I realized it is likely an internal Google thing; unit test, API, MapReduce -- these are all ubiquitous concepts to programmers outside of Mountain View, so the inclusion of Borg seemed a little out of place.
I am currently working with Google. We had people who couldn't code on our team. I don't know if they were officially contractors like me or Google proper.
I knew a guy who could go to the board and draw a FSM, construct a context-free grammar, or construct a proof for anything the professor asked for.
And in Algorithms, he could go to the board and walk through any algorithm we happened to be talking about.
In essence he was perfectly suited to white board interviews.
But when I worked with him in Software Engineering his code was unmaintainable, poorly commented, and he was very abrasive and unpleasant to work with.
Yup. The difference between a computer scientist and a software engineer.
It's a sad state of affairs that our industry is currently conflating the two. Or maybe the industry has only now reached a stage where the two need to become separate?
I mean, you wouldn't expect a theoretical physicist to design an experiment or, worse, a bridge.
I absolutely would expect a theoretical physicist to at least be able to competently contribute to the design of an experiment, especially as it pertains to his specialty.
The bridge analogy is spot on, in my opinion. It mirrors my above comment, and I couldn't agree more.
That may have been the case for 2006 Google, not 2015 Google; there's ~18000 engineers now.
I helped a classmate who didn't understand basic programming concepts, batting 1/20 in full-time interviews, before getting and accepting a Google offer.
That was when I realized how far Google's hiring standards have fallen.
Do you work at Google? I agree with the grandparent - I haven't met a single person like that here. I'm sure false positives exist but they are rare - you need to do well on a bunch of interviews that involve both coding and algo questions.
Either your friend learned how to code or won't last long.
I knew somebody like that who got hired at Google. Bad coder, got fired from two companies in a row, and would have gotten fired from where we worked too, except I went out of my way to stop it from happening.
That turned out to be a mistake, but that's another story. Point is, he went on to work at Google.
This also happens in places where people are hired 'in bulk', and where technical questions can be answered by rote. Even when your standards are higher, it's still easy to hire people that are just slightly more incompetent, just by not doing a coding assignment.
This is why, at least around here in the Midwest, it's easy to stereotype other programmers by just looking at the last few companies they worked for. Came from XYZ? I won't even suggest an interview, because their architects would barely be junior devs here. Came from ABC? You don't need a screening, because you'd not have lasted two years there if you aren't at least decent.
This is the real reason hiring through networking is so prevalent around here. If someone competent vouches for you enough to bring your resume forward, chances are they'll at least not embarrass themselves in a simple interview that asks for a little bit of coding.
The reluctance of accepting a mistake in hiring is also an issue too, but I see more of that when we add both time and changing standards. For instance, I know of a big company that started with a really low talent level. They somehow managed to hire better people than they had, as time went by, but promotions have a lot to do with seniority. So you have a team of 'architects', that are supposedly in charge of things, but that, really, are worse at their job than the people that they are currently hiring. So what happens there? They have this nice cycle of hiring developers, having the good ones see that they will have to spend their time there arguing with architects that were never that talented, and whose skills are now outdated. But the architects have been there so long, they are part of the scenery. They'll never quit, as they'd never pass another interview for their experience level somewhere else, but they'll never leave their spot open unless they are fired. And management will not fire them, because they are old buddies. That's how organizations decline.
I know in South Africa, where I live, you have a third reason. And fourth.
3. There are regulations in place that disallow you from firing employees easily. Even if they're incompetent, they can only be fired outright if they break their contract in some really big/obvious way. So you're forced to go through the motions, giving warning letters, eventually having a meeting to discuss it, etc., and only after all that has been done can you dismiss the employee.
4. Affirmative action quotas. Theoretically sound and noble. But in practice it just means that if there is a shortage of said "group", then you are invariably forced to hire sub-standard individuals, and keep them. Sad, but true.
Funny aside from the above. We once had a junior who was hired fresh out of undergrad, with the idea that he'd be trained up to speed. He was trained by the company (while getting a salary) for, I think, about a year before being passed off as a real junior developer into the company. Said junior had the nerve to chuckle at and mock me because I had an "IT" degree, instead of his fancy CS degree. In reality, he couldn't code a coherent piece of code to save his life. We spent countless wasted hours (this was after he finished the internal training) trying to at least get him to contribute something, anything, even non-coding related. In the end, once he got his nine months/one year of experience, he jumped ship. Probably to trick another company into paying him to sit around "learning".
"Never attribute to malice what is sufficiently explained by incompetence". Nepotism is not necessary.
In one company I worked for, the interviews were done by the self-proclaimed CTO (who was one of the founders of the company), who was actually a dentist with no tech knowledge whatsoever. Of course, in this case the interviews were just generic chats, with no technical questions.
I was once offered a job as director of engineering at a big financial firm in the UK based on a couple of chats over the phone with the CEO and some tech guy (don't remember the title). With the CEO, I just chatted a bit in general about my past and my future; with the tech guy, I mostly explained my past and talked maybe 5 minutes about tech (design patterns or something), and that was it.
No code, no hard technical questions on what to use or when, nothing. I know a lot of people who didn't have the skills for that job but whose attitude and bullshitting techniques would probably have gotten them the job. And this was a very successful company handling millions in revenue and working with the major banks.
However, the question we should be asking is: do we need a question more complex than FizzBuzz to determine whether or not someone is an STE? If so, why?
And if not, then why do most developer interviews consist of a series of coding questions that are more complex than that? Because as much as we like to say "so we can see how you think", really, virtually all interview coding questions are FizzBuzzes.
I had one job that, days before the interview, sent me the spec for an archival format they used, and asked me to implement an extremely basic archive generator and email them my code. At the interview, they asked me to explain what my code did. I went into detail about specific implementation decisions, things that worked for this restricted domain but would not work for a more general version, and ideas for future development.
Best job I ever had.
I think the answer to your question is that it's impossible to determine whether someone is an STE with 15 minutes in front of a whiteboard, because nobody codes like that. You will get false positives from people who can slap together a simple algorithm but don't understand why slapdash hacks aren't good enough for production, and false negatives from people who are brilliant coders but need a little while to get into the headspace, or have their flow disrupted by having to use a damn marker instead of a keyboard and their favorite text editor. You cannot identify an STE in a one-hour interview. If you want to see if someone can code, have them actually code.
Oh, here's another big problem I don't think anyone has mentioned yet: people who think that the stuff they, personally, have memorized represents the core knowledge of computer science that everyone should know, and everything else isn't really important. This is the guy who is unimpressed by absolute mastery of regular expressions, but sneers in disgust when you don't know all the arguments for pushd off the top of your head.
I was once asked "Can you do a quick fizzbuzz implementation on the white board for me" by an interviewer. I stood up, looked at the whiteboard and said "So it's, what.. multiples of three are fizz, five are buzz and both are fizzbuzz right?" and he said "Yep, nevermind, let's move on."
There was a lot more technical, and a lot more practical stuff to that interview as well, but as someone with both dev and dev management positions on their resume, I appreciated that the interviewer just wanted to know if I'd prepared or at least been paying attention for the last few years with the easy opener. Would you even consider someone in this career field who didn't know what fizzbuzz was?
I'd say that after 7 years as a professional software engineer, I had never had to do anything harder in an interview than show up. The companies I applied to had prior coworkers and managers who wanted me on their teams. Then I ended up at a company where the management was so dysfunctional that it had the overtones of an abusive relationship. They laid me off. At that point, I was a Sr. Software Engineer, during the depression, with a 2-week-old newborn, out of work, stressed, not sleeping; I had never whiteboarded a class, had no idea what fizzbuzz was, and the last design-pattern conversation I'd had was 2 years previous.
I failed the interviews with some pretty decent companies. I got better.
My takeaway from that experience is that the majority of the interviews I took were not there to determine whether I was actually good at my job. They were there to determine whether I was good at interviews.
"Would you even consider someone in this career field who didn't know what fizzbuzz was?"
Yes, of course. Why would not knowing what fizzbuzz is disqualify someone? The fact that they haven't heard of it or read the article on it means absolutely nothing. Not being able to do it, on the other hand, would obviously disqualify someone.
I've heard this claim many times, and yet I've never encountered such people in my career. I even interviewed candidates for a while, and none of them were as bad as people say.
Some wrote C or pseudocode in a Java interview, but the pseudocode made sense and they could explain it.
I wonder if people with such experiences interview candidates without a degree in CS (it's free where I live, so it acts as a good filter). Basically, unless you are in the 3rd-5th year of a CS degree, have graduated, or have worked as a programmer, you will have trouble even getting an interview.
I'd say somewhere between 1 in 3 and 1 in 6 people that graduated with me (from a decent university!) could not actually program at the end of their three or four years.
That said, the same goes for at least two of the professors...
That is pretty scary. But more importantly, it is really sad that the example programmer's manager doesn't know he's incapable of basic functions. I've seen programmers who were in a new place (new language, new environment) struggle, and I have helped them get organized, and if they weren't able to adapt, helped them move on to a company that used a more familiar environment. But that does take paying attention, and sadly some managers don't really pay attention.
I'm a bit of a fan of the mindset a few consulting companies I know have (only in this particular regard):
You are a warm body. We need a programmer. We will teach you to program well enough to make the client happy, regardless of your prior experience. If you don't learn, we'll bench/fire/dismiss you.
It's very cynical, but at least it's better than "oh no they can't program whatever shall we doooo" I see elsewhere.
To your point, a good manager should be helping grow the skills of their team, even if a member who technically shouldn't be there has landed in their midst. The industry is littered with stories of people who, given proper support, turned from a liability into a huge asset.
As long as they're honest about it upfront, and provide honest feedback through the process, I don't really see that as cynical at all.
Having someone take a risk on you and be prepared to invest in teaching you will necessarily come with obligations to deliver, and even if it doesn't work out, chances are you're better off than when you started (hopefully you'll have learned something, even if the worst case is just discovering that you're not cut out for that work).
We could use a lot more of that to help people get past the initial challenge of moving into industry after graduating, especially given that so many areas mostly have CS-heavy courses rather than software engineering ones.
Is there any chance the bad programmer was new to whatever language you were working with? Stuff like this:
"The fooBar subroutine was one statement long."
reminds me of when I first started to learn Clojure. I was so used to "=" or ":" for assignment that I had a difficult time seeing where the assignment was happening. Especially the rules about destructuring, which took me at least 6 months before I was comfortable with them. Also, the recursive rebinding in a Clojure "loop".
I don't mean to defend the bad programmer that you are talking about, but I can imagine an experienced programmer having trouble recognizing assignment if they were new to the language and the language had some unusual system of assignment.
My wife ended up with an employee like that one time. He was absolutely incompetent at even basic development tasks.
More troubling, the place she worked at the time had a very complex technical interview process where candidates had to show at least some level of competency in the core languages and concepts the company was then using -- and he had passed it.
But once work started, he was completely incapable of even trivial development tasks...somebody who knew nothing and had access to stackoverflow would have performed better.
Eventually, after about six long months, he just had to be let go. But the mystery of how he made it through the technical gauntlet remains.
I thought I had escaped this for the most part, but recently, while working a contract that had my company in close proximity with another, I uncovered perhaps one of the best bullshit salesmen I've ever encountered. He was a master at moving goalposts, organizing meetings, and producing lots and lots of activity, but no actual technical output of any sort. He knew all the latest technobabble buzzwords and even taught the subject as a professor at a local college. His pedigree seemed great.
Finally, during a forced meeting, he was asked to show something, anything, he had produced, and he pulled up a half-formed, invalid JSON file he was writing by hand for another project.
We've been able to separate our work from his group, but it's unbelievably aggravating that he's still getting paid.
> We've been able to separate our work from his group, but it's unbelievably aggravating that he's still getting paid.
How common is this?
I had a similar situation. I didn't challenge the developer directly, but when asked to justify why our project was falling behind, I sent a summary of the VCS log to my manager. The developer hadn't committed anything in five years! Management promised to sort things out, but it's been over a year and she's still employed.
Throwaway account, because this is one of the reasons I won't be employed there much longer...
I suspect it's more common than anybody would think. A disturbingly large number of stories I've heard about regarding "consultants" (from numerous points of view, including consultants themselves) seems to indicate there's a huge quality variance in the field.
wait, wait, wait. In your first two paragraphs, by "software engineer" do you specifically mean a coder, a programmer, someone who is paid basically to write lines of code? (As opposed to: software tester, architect or manager-type role on technologies they might not know personally, front-end designer who happens to accidentally be able to do some javascript etc.) You mean someone with nobody below them, who is not doing software testing, debugging, front-end design, or the like, but paid to put out code, but cannot write a for loop that does a character count in the specific language they are paid to code in? And who is gainfully employed "coding" in that language, for months, years?
I know, you think I'm insane, right? I doubted my sanity, too. Not a software tester, nor an architect, nor a manager, nor a houseplant, nor a figment of my imagination. A software engineer.
Actually, come to think of it, I think it's almost exactly accurate that "He had the same seniority in the same role that I would have today if I had not quit that company 4 years ago." (This incident would have been from 5 or 6 years ago.)
>that Z should be allowed to "continue to devote his full attention to work which he is well-suited for."
So what was this work? What did the person saying this have in mind?
(I cannot possibly imagine anyone saying that with a straight face if they meant that he is totally incompetent but is the boss's son-in-law. If your quote/paraphrase is anything near accurate, the person must have genuinely thought that Z was making an excellent, albeit different, type of contribution. He took time out of his day to tell you that. What could he have been 'well-suited for'?)
> Both of these are very difficult things to accept. They may be even more difficult to accept if one is extraordinarily smart/diligent and one studies/works solely in organizations which apply brutal IQ/diligence filters before one is granted even a scintilla of the admission committee's time.
It's irrational to base a network security plan on the notion that one can concoct an impenetrable barrier. Very few things in the real world are built like this, because it's very hard to get that to work. Most successful plans are built on a notion of "defense in depth." Valve is probably using their structure to create a "free market" within its workforce, allowing natural leaders to create subgroups, elites, and further compartmentalization.
Even if Valve, being comprised of very smart people, has a magic interview plan that can select for real technical talent 100% of the time, all of the other possible workplace pathologies will be impossible to filter out 100%.
Could you at least give us some hints about these supposedly utterly incompetent software engineers? Since I've never encountered anything close to that level of incompetence, I'm assuming there is some variable that predicts it and that I have managed to avoid that variable. Is it a certain type of company that employs these engineers? Is it possible that the issue is not even with the employee's competence at performing his or her job, but rather just a bad job title that doesn't reflect their actual role at the company?
I've seen people leave University with all As in their degrees who couldn't parse a string into a number, and were pretty sure that that was in fact impossible.
> A coworker, having overheard the conversation, stopped at my desk later, and explained to me that, if I had pressing engineering issues, coworkers X and Y would be excellent senior systems engineers to address them to, but that Z should be allowed to "continue to devote his full attention to work which he is well-suited for."
What the hell would he be suited for in that case? Sweeping? Mopping?
I used to think that working with harder languages, e.g. C++, Scala, ... [0], would keep such people off the team, but no, some managers always manage to cherry-pick them into the projects.
Now, to place this in context--was this at a company in Japan, with (by you even!) well-documented social norms about how technically incompetent employees are dealt with, or was this with a company with more conventional engineering culture?
This is not a Japanese-only thing. It is a thing so common in America that there is a named test to detect it -- FizzBuzz [+] -- and this test, or one like it, actually catches many people out.
[+] Cf. any of the following, and note that they're from the nation of, ahem, conventional engineering culture:
Not that it invalidates your larger point, but I feel I should point out that I invented FizzBuzz while primarily interviewing European candidates in London :)
I've been interviewed by people who thought they were great but didn't know what they were doing. And I mean at serious companies like Pebble. 22-year-olds who can't even properly phrase interview questions.
Who's the terrible engineer? Me, with a patent and 25 years of innovation in key major products like the PlayStation Network? Or the guy who interviewed me who gave me a thumbs down for being "terrible"? Or maybe it was just because I have grey hair in my beard?
I see a lot of attitude about how there are "bad engineers" that seems to come from younger people who are very ... proud ... of their skills, but are more bluster than brains in my opinion.
I don't think they're terrible- I think they have potential. But they have the wrong attitude.
"Me with a patent and 25 years of innovation in key major products like the playstation network?"
You might be an excellent engineer, but this is a terrible pitch for yourself.
A software patent is meaningless to most software developers. Most software patents are nonsense. Many of us are listed as inventors on plenty of silly patents, and that isn't very impressive. To me, the mention of a patent in this manner is a negative signal.
25 years of experience is nice, but just mentioning it is hard to differentiate from the case of 1 year of experience repeated 25 times. Just mentioning that work was done on something like the PlayStation Network is in itself not very impressive: you need to spice it up with some actual details.
Not trying to put down the parent comment by MCRed, but my experience is very much in line with what you say, and in fact getting a patent is even easier if you work at a large company (such as Sony). Those technical writers in the patent divisions of orgs are amazing at their job:
1) Submit a thin, whacky idea, with loose documentation to the patent department.
2) Hold a 1 hour phone call with technical writers.
3) Two weeks later you get emailed a 10 page patent application of your idea complete with diagrams asking you to review it.
>A software patent is meaningless to most software developers.
This might seem so if you don't leave the HN bubble (which, along with tech media as a whole, has no idea how patents really work and so harbors notions like "most software patents are nonsense").
But really it's impossible to generalize a statement like this to any significant portion of our industry. Right here grandparent is an engineer who obviously doesn't agree. There's another comment just below that actually thinks one patent in 25 years is low. The overwhelming majority of software engineers I personally know are proud of their patents. Anytime somebody posts a new patent to their LinkedIn profile, I see nothing but a cascade of congratulatory comments.
Not to say that all the patents out there are awesome, but the notions most people here have about them are far from commonplace.
I am also listed on some. But I have actually prosecuted a couple, and I find most tech communities to be entirely clueless about patents.
Posting that Spolsky article shows that no, you don't really understand how patents work. Spolsky has some rudimentary understanding, and I really appreciate what askpatents is trying to do, but he misunderstood that patent as well. When that article was posted on HN the top comment pointed out that the patent application was talking about something else. Indeed, following up on the application, I see that it has been granted with only slightly narrower claims: https://www.google.com/patents/US8933971
You may think you know what your patents cover, but have you actually read the claims? If you do, you'll find the claims are about as narrow as the invention is trivial, so narrow that nothing will likely infringe it. If that's what you meant by "nonsense", then I'd agree. But the fact remains that the vast, vast majority of programmers never do anything worth filing even those kinds of patents.
I know what the patents filed on my behalf entail. I know the claims. I collaborated on their "invention".
The fact it was granted a patent is not very note-worthy. It is not representative of anything useful. It doesn't represent deep thought, a lot of research, or anything of extraordinary value.
>The fact it was granted a patent is not very note-worthy. It is not representative of anything useful. It doesn't represent deep thought, a lot of research, or anything of extraordinary value.
1. Having read a ton of patents across multiple fields, that could be said of most patents ever (not just software patents). Most patents come out of engineering work, not high-level R&D.
2. That is still a huge step above what most engineers do their whole careers. Think of the average software developer - what percentage do anything approaching "real" engineering as opposed to churning out CRUD apps and web UIs dictated by business needs?
I interviewed a "Senior Architect" from a large company, and he had a masters in compsci. Very impressive sounding, much more than me. I asked a simple question on searching a sorted array and literally got a laugh. He was incredulous that anyone would be expected to know basic, and I mean basic, data structures and algorithms.
People dismiss these questions with a "I'm not gonna be writing a stdlib." No, but the concepts appear everywhere. Someone who doesn't understand them is likely to, I dunno, use a SQL DB and then put a separate index on each column to "make it fast." Whereas someone who understands the basics will stop and think a bit before assuming platforms are pure magic.
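For what it's worth, the kind of "basic" sorted-array question at issue above usually boils down to binary search, which is only a few lines; a minimal Python sketch:

```python
def binary_search(items, target):
    """items must be sorted ascending; returns an index of target, or -1."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1  # target can only be in the upper half
        else:
            hi = mid - 1  # target can only be in the lower half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```

The same halving idea is exactly what a database index does under the hood, which is why the concept matters even if you never write a stdlib.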
(And BTW, I've seen enough people with patents to know that just having one is more a measure of going through the process than necessarily having technical invention capability.)
>Who's the terrible engineer? Me, with a patent and 25 years of innovation in key major products like the PlayStation Network? Or the guy who interviewed me who gave me a thumbs down for being "terrible"?
Possibly neither, possibly both. Having been involved with something that launched at a company I've heard of doesn't necessarily mean you aren't a terrible engineer.
As someone who's in their late 30s and has accomplished a lot, I can sympathize with the experienced dev's attitude above. But having also interviewed a lot of candidates at scale, many of these tests are just filtering tests to reduce the number of interviews team members must do. We always have the majority of our team members interview candidates so that everyone has buy-in, but if everyone had to interview every single resume that looked good on paper (I'm assuming we all at least agree on filtering out bad resumes), this would get cumbersome.
It is frustrating as you get older to deal with the fear of ageism. I haven't experienced it yet, but I'm starting to expect its arrival.
I actually think that testing you on problem solving is the right way: not necessarily knowing Big O notation, but at least being able to attempt an answer to difficult questions and explain your thought process, which is very telling of intelligence and ability. There is definitely a lot of cargo-cult behavior going on at companies.
> I can't tell a "stakeholder" who did real engineering work apart from a "stakeholder" who got their name attached to the patent for political reasons.
Let's say the dude invented something important. Are you going to have a better follow-up question?
If your point is that the commenter is delusional, at once over-estimating himself and under-estimating others, you should come up with a better way of communicating it.
A proper punk-nosed-kid would certainly be more clever.
It adds nothing to the discussion. If the guy did something major (case#1), the discussion is over. It just makes the person asking the question look like an ass.
The other issue is that anyone who would have a reply of that caliber most likely wouldn't reply...why bother...? Thats the (#2) second case. Again, the discussion is over.
The third best case is that the guy is bluffing, and doesn't bother to answer the question. This is case (#3) but isn't really any different in motivating the discussion than case #2. Game theory says the bluff plays to look like the win, right? So just don't answer (and mimic #2).
So there are no rational replies to this question.
It's notable, and obvious, that this guy has never actually tried to hire a developer. He has the luxury of thinking that these "secretly terrible" people are rare and misjudged. He's wrong, though.
(That's not to say that industry standard interviewing techniques are perfect -- they're definitely not, and they're often awful -- but "lots of professional programmers simply flat-out can't program" is a true fact, and a hiring process needs to deal with that reality.)
This phrase is seriously abused and it needs to stop because our field has anything but standardized and well-known competencies that we expect people to know before they commit anything.
Compiling main() with an assignment and a print is technically programming, but has extremely little to do with what is actually being developed in real products. You aren't going to give anyone a job if this is all they can do. As soon as you get out of the absolute basics (IMO, if/for/assignment/operators), your demands are simply arbitrary with what you think will indicate that someone can do the job.
You might only consider someone who can pass FizzBuzz as a 'real programmer'. Someone else might only consider someone who knows and can implement advanced patterns and quickly absorb an unfamiliar API as a 'real programmer'. Programming-Genius-32 might only think someone's a 'real programmer' when he can implement every algorithm from MIT's curriculum from scratch.
And if I had to make a wild guess, I would guess that people create their own standards of what a 'real programmer' is based on things they have already solved or things they read from someone else. Then they go make hiring decisions based on this standard. It is very likely that someone can look down on many of us and call us fake programmers because they're just so much further ahead.
Convincing someone that you are a real programmer of a certain skill level is a social exercise, not a technical one.
No, I mean it literally, as in can't do a loop/if/assignment-level operation. ("Write a function that returns the sum of the two largest numbers in an array" is the literal question I would ask that the majority of candidates -- all with years of experience doing software dev -- were unable to answer.)
They simply flat-out could not program. Full stop.
Congratulations, you are now better than 2/3 of applicants I saw.
I'd follow up by asking whether there's another way you could have done this (the obvious loop and store method); if you didn't come up with it, I'd explain it to you, paying attention to how well you seemed to understand it. I'd go on to ask about what characteristics make one of those solutions better than the other, hoping you'll mention the performance characteristics of sort vs. simple iteration, and maybe also the maintainability/extensibility of the two methods.
And then I'd move on to my next question with cautious hope.
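For reference, the sort-based answer alluded to above is nearly a one-liner; a minimal Python sketch, assuming the array holds at least two numbers:

```python
def sum_two_largest(nums):
    # Sort ascending and sum the last two elements.
    # O(n log n), versus O(n) for a single scan, but hard to get wrong.
    return sum(sorted(nums)[-2:])

print(sum_two_largest([7, 9, 2, 8]))  # 17
```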
Another way to do it, assuming that the list can be mutated: find the largest number in the array, store it somewhere and remove it from the list. Do this a second time. Then add those two numbers.
It avoids sorting the array, but you have to search it twice.
Optimisation: just scan the array once, and keep track of the two highest numbers.
a = [7, 9, 2, 8]
highest = None
nexthighest = None
for item in a:
    if highest is None or item > highest:
        nexthighest = highest  # the old highest becomes the runner-up
        highest = item
    elif nexthighest is None or item > nexthighest:
        nexthighest = item
print(highest + nexthighest)
And my beef is with the huge variety of interview techniques (sorry if it felt like I was calling you out there, I wasn't trying to) and that candidates have to prepare for a whole range of them. One answer gets a spectrum of responses and there is rarely any good answers for everyone. (Maybe I'm just stating the obvious here)
Reading the original problem, "Write a function that returns the sum of the two largest numbers in an array", I took the liberty of assuming that the input list would always contain at least two items.
>Compiling main() with an assignment and a print is technically programming, but has extremely little to do with what is actually being developed in real products. You aren't going to give anyone a job if this is all they can do.
I do not think that is even necessary to being a programmer. It is possible that someone has always worked on projects with an already existing build system, and only knew to run "make" (or equivalent) to compile. These people could be competent at actually programming, but not know how to compile something outside of an existing build system.
You're missing one key thing -- by the time we see the candidate, they've given us a resume that says that they can and have programmed some actual things in some actual languages. How do you explain their inability to then do some trivial thing (more trivial than fizzbuzz, even) in a language they've already claimed skill in?
I could be wrong, but these simple tasks seem to help distinguish "hung out on a team that did those things" or even "has been stuck in meetings 38 hours/week for the last 5 years and is startlingly good at powerpoint", when those are not what we're looking for.
He also makes the assumption that secretly terrible people don't exist in other fields. In my experience, there is a small set of people in every discipline who are just leeches, who contribute almost nothing but somehow stay employed. There are plenty of crappy lawyers still practicing, for example.
I see little value in being purposefully hostile, but I also see little value in the hand-holding, spoon-feed-me approach he seems to want. We should encourage people and mentor junior team members, but I am not interested in extending "love" to some person I just met.
The first is called "Rob's Pairing Interview", or RPI. We go through a very simple problem. Takes 30-60 minutes. Nothing fancy is required, nobody is getting an algo pop quiz. Just a basic smoke test to see if you have, at some point in your life, programmed at all.
The second is pairing interviews. You come in and pair with us. At work. On real software.
When I got hired I kept waiting for the shoe to drop. It was just like the interview. Exactly like the interview.
The shoe never dropped. That's just ... how we work. The pairing sessions were a 1:1 example of daily life. And in a pairing situation there are no Secretly Terrible Engineers. And only a few Secretly Awesome Engineers, which is why we do it that way.
"can't program" is such a sweeping generalisation as to necessarily be null and void. I don't really think there is such a magic binary quality. Most everyone can learn programming and it's just a matter of patience and guidance. The only hopeless cases I've seen is folks who are poor coders but are convinced otherwise.
That's great, but if I'm hiring a person with ten years of experience whose resume is full of "senior developer" stuff, I don't think that either of us is interested in them doing an unpaid internship while we teach them the basics of programming.
I would be deeply suspicious if that person was in fact a terrible engineer. More likely than not they'd be in a wrong place, doing the wrong sort of job.
I've just gone through a rather prolonged search and hiring process, which involved hiring and then having to (regretfully) release some candidates (near the end, we were bringing them on as contractors so as not to create too many expectations). After dozens of interviews and resumes and a few attempts at hiring, we finally got a great candidate fit.
But here's the thing -- only a very few of those candidates actually were "flat-out" incapable of programming, and it was pretty easy to figure that out. Of the others, most of them were capable of programming, but not capable of developing, which seemed to be the dividing line.
Part of my interview consisted of data-modeling exercises (from workflow documentation) and slightly more-than-basic algorithm exercises (in which I wasn't looking for good code, so much as to understand how the candidate attacked the problems). I wasn't trying to bigfoot the candidates, but it was amazing how many couldn't work their way through relatively simple modeling or algorithm design problems at all.
And yet these were professional programmers. They had degrees, sometimes graduate degrees, from good universities; they had spent years in the field; they had solid references from good employers. I refuse to believe that so many practicing programmers could simply be incompetent and never raise any kind of concerns.
Which leads me to think that there's something wrong with our discipline. (I remember an old USENIX symposium where one of the talks was, "If computers had blood, they'd call us doctors." My reply at the time was, "If computers had blood, they'd call us serial killers.") If we built bridges, we'd assign a different engineer to each leg of the bridge, start at both sides simultaneously, never bother to check flood gauges or soil samples until halfway through the project, and try to swap out the cable suspension design two weeks after opening it to traffic.
In other words, we take perfectly capable programmers and turn them into useless cogs. Programmers who can't see the forest for the trees or the trees for the leaves. Who never are exposed to systems design problems, who use HashMap as the solution for every problem because we need that code yesterday and there's no time to discuss more elegant solutions (seriously, at least half of those candidates were obsessed with HashMap, for a problem where a hash table made no sense), who are shoved into a tiny chunk of the code of a massive enterprise monstrosity and expected to keep making changes and bugfixes until the code bears no resemblance to a reasonable solution to the problem.
Bad programmers who don't learn aren't the problem. Good programmers who don't teach are. Not brilliant programmers, not rock stars or code ninjas or whatever the fuck we're calling them this week, just good, solid programmers, the sort we say we can't find. If we don't give good programmers the time, space, and incentives to teach by praxis -- if the only way to be a good programmer is to bootstrap your way up -- what right do we have to complain?
Virtually no teams are made of 100% incompetent people. If you've got a 20 person team with two worthless people on it, it's barely even noticeable from the outside.
The real problem is when you have someone who is just competent enough to deliver mostly-working but terrible code. Which the rest of the team then has to maintain.
If the person is fresh out of college, of course, this is understandable and a senior person will likely be mentoring them, and reviewing their code. But when you have two such people who gravitate to reviewing each other's code, you have a serious problem that may not be immediately apparent.
You are correct, for the most part. Even on a smaller team, if you have stronger and weaker programmers, the end result tends to be a functional product. However, I think it tends to create stress on the team, as the stronger programmers feel like they have to pick up the slack and be extra mindful for mistakes made by their counterparts.
> doctors are asked trivial questions just a handful of times in their careers during their bar examinations and boards.
This is incredibly inaccurate. Physicians have to take a series of comprehensive, extremely difficult licensing exams just to graduate medical school and get a residency. Then, to be board certified, you have to take and pass comprehensive and expensive board exams in your specialty every couple of years, depending upon the specialty. Furthermore, they keep increasing the frequency of the required examinations. Doctors take tests to prove their knowledge their entire careers.
Maybe the point was about specifically "trivial" questions. The article's author may have been attempting to draw an analogy between the questions we ask in interviews, and asking a doctor to state the normal range for human body temperature (e.g.), which probably only happens once during early-stages of training.
I think that is a possible interpretation of this sentence, but I'm not sure that I agree with it in the larger context of the piece.
In the preceding paragraph, he states that applicants are expected to know "the latest RB tree research." In the following paragraph, he states that applicants are asked "technical minutiae." These examples to me don't seem to imply he thinks that "trivial" questions are being asked in interviews. The entire piece seems to argue that questions which are too difficult are being asked in order to unfairly weed out engineers who cannot answer them, which seems to me to be the opposite of trivial.
To contrast, he explicitly used physicians as an example where occasional questions (which are 'trivial') have to be answered early in training in order to be licensed, a statement which I submit is false. There doesn't seem to be a corollary either stated or implied that physicians are otherwise asked nontrivial questions when they are not asked trivial questions.
I was a little unhappy to read this line as well. I thought this blog post was insightful and addresses an important topic, and I do think that the difference between the interview process in medicine and programming is worth investigating.
For starters, yes, you're absolutely right. Doctors must pass very difficult exams, and the statement is too inaccurate to go unchallenged.
But please consider the next statement as well: "These fields have the benefit of well-defined bodies of knowledge and monopolies over their skill, so in that regards I guess they have a huge organizational advantage to prevent incompetents from walking through the door."
I think there's so much of importance to consider here that I hope we don't stop after rebutting an exaggeration that I will fully acknowledge is very inaccurate.
Of course, it's always going to be difficult for a mid-career programmer to comment intelligently on interviews for mid-career physicians and vice versa. But my impression is that while physicians do take rigorous exams (and must continue to do so throughout their lives), the career and study path is much better defined, at least in part because physicians are in far greater control of their profession than programmers are.
For instance, when a physician interviews, is the candidate possibly facing back-of-textbook questions about organic chemistry? Do they need to keep that book around and study it for 40 hours (or more) when they change jobs? If there are additional training requirements as the field evolves, how well defined is the study and exam path?
The impact of this particular blog post was weakened a bit by the inaccurate depiction of what it takes to be a physician. But let's still consider the issues. In other HN discussions about this topic (interviews), someone wrote that they'd happily take a brutal "Bar" exam for software if it meant that they could finally be free, for once, from the highly unpredictable exams they must take every time they interview.
A final thought: I was a math major (with some CS coursework) and I have an MS in Industrial Engineering. I interviewed with a company that does software in this area. During my interview, I was asked (from memory here):
Prove the dual of the primal is the primal of the dual;
Now, with matrix notation;
Various binary tree operations;
Print all permutations of a string;
Find the long term state probabilities in a Markov chain;
Write a query testing to see if I know outer joins;
Swap two integers without creating a third integer (asked over lunch);
Refactor code with lots of if-then (to demonstrate that I know what the factory and strategy patterns are);
Detect a cycle in a linked list;
…
That wasn't all, because I interviewed at three companies over the course of a week and a half, but I think I'll stop there. At one point or another in my life, I've taken exams on all of this. I wasn't sure of what would and wouldn't be asked, and of course, I don't walk around with all this loaded into exam-ready memory. My guess is that a physician wouldn't have equivalent coursework loaded into short-term memory either, and wouldn't be grilled on it in interviews, but what do I know?
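Incidentally, the linked-list cycle question near the end of that list does have a well-known answer, Floyd's tortoise-and-hare. A minimal Python sketch (the `Node` class and names are mine, just for illustration):

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def has_cycle(head):
    """Floyd's tortoise and hare: advance one pointer by 1 node and one
    by 2; they can only meet if the list loops back on itself.
    O(n) time, O(1) extra space."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

The point stands either way: knowing this trick off the top of your head says more about recent exam prep than day-to-day ability.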
I don't know what the solution (if any) will be, but I think it's a good start to recognize that 1) it's horrible, and 2) other professions - even ones that require massive amounts of education - don't apply the process so arbitrarily and capriciously (for instance, I had no idea I'd be asked about strategy patterns or optimization math proofs. The next week, I was asked about database transactions and locking).
It's painfully obvious that the author has never tried to hire developers and, in fact, has never worked on an engineering team.
This line, in particular, is blatantly false:
> In all my years immersed in the tech industry, I have never once heard a firm talk about the idiots lurking in their own offices.
Firms might not talk about it (nobody wants to say they have bad employees). But people certainly will. Nearly every developer I know has horror stories of working with incompetent colleagues who literally couldn't program. And there are even more incompetents churning out job applications—FizzBuzz exists for a reason.
So it's clear to me that we need some sort of clear technical test before hiring people. But I also think FizzBuzz could be sufficient for that test. There's no need to administer an algorithms exam if all you're trying to do is find out if someone is an incompetent engineer or not (there are plenty of competent engineers, including myself, who would have trouble writing a trie on the spot).
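For anyone who hasn't seen it, FizzBuzz is roughly the following (a minimal Python sketch; the exact wording varies by interviewer):

```python
def fizzbuzz(n):
    """Classic screening exercise: for 1..n, print 'Fizz' for multiples
    of 3, 'Buzz' for multiples of 5, 'FizzBuzz' for multiples of both,
    and the number itself otherwise."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

That's the whole test; the claim is that a surprising fraction of applicants can't produce even this.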
This article is juxtaposing our fear of hiring an incompetent developer, vs. our indifference to the skills of developers once they join our organization. That's what he's saying when he says that everyone's afraid of secretly terrible engineers.
If each company spent a (comparatively minuscule) amount of resources on consistently training their current developers, then the team would be constantly improving.
We think that miraculously, good developers become good, sometimes by heroically hacking on their own, sometimes by going to a prestigious school, never by practice within a big company. Then, once we hire them, their skill level freezes. This, the thinking goes, is why it's important to hire good people.
This is the same illusion that keeps the US public school system focused on hiring good teachers and even more focused on firing bad ones, without focusing on improving current teachers' skills. We, as an industry, need to focus on practice within the organizations we control, we need to focus on making everyone better, and we need to kill with fire the idea that hiring good people is all you have to do to have a good team.
This is also the same problem with the entire H1B situation and why Google, Apple et al. have made some pretty poor justifications of the price-fixing collusion: because they've avoided investing in ongoing education to such an extent that the only thing they've been able to imagine for getting talent is poaching it from other companies (leading to the ongoing upward price war).
They could easily solve their problems if they actually wanted to train people, and promoted those who would/could mentor more aggressively.
We're not talking about only hiring the top 0.1% programming talent at the expense of the top 1%. We're talking about the people that either can't program, or are very, very bad at programming.
Dealing with those isn't a "comparatively minuscule" task, it's an extraordinary drain on productivity and morale.
> We think that miraculously, good developers become good [...] never by practice within a big company. Then, once we hire them, their skill level freezes.
This is patently false. Are you familiar with the concepts of "experience", "promotion", "pay raise" or "bonus"?
The thing is, he's very smart. He talks like a smart person, and he's very knowledgeable. And he can code well, and destroyed every question when we interviewed him. He even works long hours, to the point where he broke up with his girlfriend.
The problem with him is that every feature he has worked on is buggy and ends up causing more work to fix because his implementations are terrible. This is something that doesn't come up in interview questions.
I don't understand, because he's a smart guy. But he worked on 3 features in 2014, and all of them are disasters. A year later, we still run into bugs caused by poor implementation.
The thing that really boggles my mind is that people don't realize it, and think he's a guru. Even in a company filled with people that I think are smart, they can't see through his ruse and realize that for the most part, the work he does is not good, and the decisions he makes are poor. He even got a raise and a promotion. It's infuriating, but I guess things like this happen all the time.
Some people have a very asymmetrical perception of code they write vs. code other people write. Every bug in their own code is "just a small edge case, whoops, let me fix that," and every bug in someone else's is idiotic and the library is useless. Just like how some people can't hear that they suck at music, because they can't be as objective about their own stuff.
It is actually a hard, underrated skill to acquire. I hope I have it.
I also work with someone like this. They get jobs easily because they nail interviews and talk intelligently about problems. But when it comes to implementing actual code to solve real world problems, he can't scale it intelligently at all. Duplicate logic all over the place, hard coded values, refusal to research and use frameworks' best practices...it's all there. It's incredibly frustrating, and because he's so knowledgeable, it makes him even more stubborn to constructive criticism. Many times his arguments boil down to "I don't feel like doing it that way", in the face of valid arguments. The facts remain, his code is difficult to maintain, difficult to scale, and buggy.
Can it be he just doesn't care about the work assigned to him so he just half-asses it? Not saying it is a good excuse, but I have seen good developers that are indeed good, when moved to projects they have no interest at all, become like your co-worker.
If I were in your place, I would go further than that and try to figure out what he is missing. If his programs don't work, how does he do so well in interviews? Writing buggy programs is, after all, something we all do from time to time. Ensuring that you don't repeat such mistakes is absolutely essential.
I have worked with a few Secretly Terrible Engineers over the past couple of years. One of those was hired before we had instituted any kind of standardized technical screen, another was senior enough that we decided to skip the technical screen (and regretted that decision later), and a third passed the technical screen but was unable to get anything done, in large part because he spent most of his time on Reddit and HN. That of course indicates that passing a technical screen is not a guarantee of success, but it does seem to be at least one good filter.
Our technical screen consists of six questions designed to probe a variety of different areas. I consider it fine if someone fails one or two, as I don't expect anyone to be strong in all areas; some people may be really good at actually coding but not know what big-O notation is, or someone may be fresh out of college and not have much hands-on experience yet, but if they do well at the algorithms and data structures questions it indicates they were paying attention and absorbing their classwork.
But people who don't know the big-O complexity of basic operations on basic data structures they use every day, and can't put together a coherent class hierarchy for modelling files and directories, and don't know how to interpret a system call that they see in a stacktrace, don't know what TCP is, and can't write a simple program on demand with access to Google, StackOverflow, in any language they want, using any tools they want, are definitely going to be more work to bring up to speed than they are going to be worth. And I've talked to people who have degrees in CS and several years worth of experience who have failed every one of these questions.
Mainly because a lot of the things you just listed are incredibly stupid requirements.
If I spent my life doing ruby sites that save basic data into a mysqldb, why does it matter whether I use something that takes 100,000 operations or 10,000 ones, when the web page still opens in 0.5 vs 0.51 seconds?
Or who gives a damn about system traces when I can put echoes in a PHP page and just view source?
Who gives a fuck about TCP when the browser abstracted that all away?
And when you say write a program, you clearly mean a console app, which, unless you've written one before (and junior programmers may never have, because someone else always wrote the helper and utility apps), is pretty daunting the first time, if only because you don't even know about main. Once you do, it suddenly becomes pretty easy. Imagine asking a front end developer to make a console app in their interview.
You do a particular type of programming. You seem to want to judge everyone else by your niche without understanding that a lot of the skills you list as essential are actually almost completely irrelevant to huge swathes of programmers. And they haven't learnt about them, because they don't need them.
The poster you're replying to isn't narrowing to a niche, he/she is simply trying to catch a red flag. You're right to be concerned that there are many interviewers who will climb way down the ladder of abstraction to their own particular niche and then crucify a candidate. But this isn't one of them. The poster even explained that it was only really a negative if all were wrong. I use a similar process as a quick pre screen.
Sailing through all questions allows us to proceed at a relaxed pace. Having nothing to say for one or two questions tells me nothing (it could easily be a fault of my question rather than the candidate). But no coherent answer to any question? Red flag.
Exactly. I've had several candidates that I have passed who on the TCP question have said "I don't know, networking is not my strong point." That's great! Knowing when you don't know something and should just move on is a real strength.
As long as someone does fine on most of the rest of the questions, having one or two gaps in knowledge is not a problem, I can tell from the other questions that they'll be able to pick it up if they need it, or ask for help when they need it.
This sort of attitude is part of why I suspect that, even with all the incredible advances in computer hardware, all of our applications seem to be just as slow and bloated as their equivalents were 20-30 years ago.
> Mainly because a lot of the things you just listed are incredibly stupid requirements.
Do you know the job I'm hiring for? Perhaps I'm hiring for something a little more complicated than a front-end developer over a CRUD app.
> If I spent my life doing ruby sites that save basic data into a mysqldb, why does it matter whether I use something that takes 100,000 operations or 10,000 ones, when the web page still opens in 0.5 vs 0.51 seconds?
Because sometimes, those 100,000 operations actually wind up pushing it from 0.5 to 5 seconds. Or even worse, something that works with 1,000 records in .5 seconds, once you go up to 100,000 records takes over an hour, because you're doing something that takes O(n^2) time, so a hundredfold increase in number of records leads to it taking ten thousand times longer. If you don't even know what big-O notation is, or how to think about an algorithm like that, you may not even realize why everything has suddenly gotten so slow; or even worse, you may have designed and implemented a large system with that O(n^2) baked into the very fundamental design, so it would require a huge rewrite to eliminate it.
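To make that concrete, here's a hypothetical Python sketch of the same dedupe task written both ways (the function names are mine, just for illustration):

```python
def dedupe_quadratic(items):
    """Remove duplicates, preserving order. The `x not in out` check
    scans a list, so this is O(n^2) overall: fine at 1,000 records,
    crippling at 100,000."""
    out = []
    for x in items:
        if x not in out:
            out.append(x)
    return out

def dedupe_linear(items):
    """Same output, but membership is checked against a set, whose
    lookups are O(1) on average, making the whole thing O(n)."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both produce identical results on small inputs, which is exactly why the quadratic version can ship and only blow up later as the data grows.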
> Who gives a fuck about TCP when the browser abstracted that all away?
It is not always abstracted away as cleanly as you would like. For example, here is a bug that occurred when Firefox sent an HTTP request in several smaller packets rather than one larger one, and that caused it to break when talking to IIS:
How would someone impacted by that be able to even figure out what's going on, if they don't know anything about what HTTP is built upon?
> Imagine asking a front end developer to make a console app in their interview
Hey, it's a good thing this isn't my front-end developer interview; in that, I give people a basic REST API to program against, and have them write a simple front-end for it, using any toolkit they like. No console apps at all in my front-end interview.
This is my back-end systems interview, for people working on a network-attached storage system, where we frequently need to debug complex problems in a distributed environment and solve compatibility problems with third-party proprietary software in which the options for debugging are slogging through syscall traces or stepping through it in IDA, and where we've had performance problems where tweaking the kernel's TCP retransmit timeout for systems we knew were on the local network made a huge improvement.
> You do a particular type of programming. You seem to want to judge everyone else by your niche without understanding that a lot of the skills you list as essential are actually almost completely irrelevant to huge swathes of programmers.
Or maybe my interview is tailored for the job that I'm hiring for? As I said, this is my back-end, systems interview, and I have a different one for front-end developers.
And as I said, I don't expect every developer to pass every question. I do still ask the TCP and algorithmic complexity questions to front-end developers, but if they do poorly on those while being able to write up a front-end to spec based on the API I give them, and being able to describe intelligently how they would design a more complex interface with richer interactivity, then I know that they can do the job assigned and we have sufficient other people who can do review to ensure that they don't write themselves into a corner complexity-wise or help them debug weird problems due to something in the networking stack. Or if they do well at the TCP and algorithmic complexity, but are relatively new to front-end development, I know that they have a good base of background knowledge and analytic mind, which means they should be able to pick up the rest of the skills they need without too much difficulty.
But if someone does poorly on everything, or more than a couple of the questions, it means that we would basically be flying blind hiring them; we would need to train them up on everything they need to do, and in the end, there's no guarantee that they'd actually be able to perform.
You ask TCP questions to front end developers? It sounds like you are exactly the sort of interviewer this article is calling out.
Do you not think that this sort of specialised stuff is really easy to learn? That you should be asking questions to discover programming ability, not pointless reference knowledge in quiz questions?
I've debugged complex IIS bugs and found insane bugs in IMAP implementations that were caused by a change in the way Outlook handled the protocol.
I did not need to know that before I debugged it. That sort of reference knowledge is rapidly discoverable while you do the debugging. Most of that stuff I will never need to know again.
When I say "TCP questions", I mean that I ask the standard "what happens when you type google.com into your browser and press return," and expect to get them to mention TCP at some point. If they don't, I specifically prompt them, asking what protocols HTTP is built on top of, and how it finds the address of the server to contact.
I am not looking for detailed knowledge of receive windows, sequence numbers, or anything of the sort. I consider knowledge that TCP exists, and DNS exists, to be sufficient. And as mentioned repeatedly, even if they know nothing of this question, this is one of many. I want to see if they are the type to dig into the stack and learn more than just their particular domain, but the basic programming exercise and portfolio are the most important things that I look at.
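The level of answer being looked for can be sketched in a few lines of Python (using localhost so it runs offline; the request string and names here are just for the sketch, not anyone's actual interview answer):

```python
import socket

# Rough sketch of the layers behind "type a URL and press return":
# 1. DNS turns the host name into an IP address.
host = "localhost"  # stand-in; a browser would resolve e.g. google.com
addr = socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP)[0][4]

# 2. The browser opens a TCP connection to that address, and
# 3. speaks HTTP over it:
request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
```

Naming those three layers (DNS, then TCP, then HTTP) is the "pass" described here; the details of each are the stretch goals.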
The browser's abstractions are leaky, and given that I tend to work with small teams, an understanding of how the sausage is made is important. And I don't think you can really understand HTTP without understanding TCP at at least a high level. So if I'm hiring for a senior frontend person, I'd expect them to know how it works, and in discussing things (not "technical questions", I just talk to interviewees and go from there) I'd expect some demonstration of things that can make me feel confident that they won't be unduly thrown in day-to-day work by something involving it.
No they're not! Virtually no javascript developer needs to know about TCP.
Ajax doesn't go wrong. Most people abstract it even further using jquery and probably don't really know how XMLHttpRequest works.
What you've just said is absolute bullshit. Complete and utter horseshit. It's an artificial barrier you've erected to filter out people who aren't like you.
How wrong would you have had to write your javascript that you'd need to start knowing the specifics of two levels of abstraction down?
I am completely flabbergasted that a professional programmer would claim they needed a high level of understanding of TCP to write javascript.
I think you're misinterpreting the use of "high level" here. It's not that the candidate needs "a high level of understanding", it's that they need "a high level understanding". "High" here is in reference to the level of abstraction.
A high level understanding could mean something as simple as understanding the request-acknowledge nature of TCP, as opposed to the 'fire and forget' mechanism of UDP.
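That request-acknowledge vs. fire-and-forget distinction is visible in a few lines of Python (this assumes local port 49999 is closed, which is likely but not guaranteed on any given machine):

```python
import socket

# UDP is fire-and-forget: sendto() succeeds even with nobody listening
# on the destination port (49999 is assumed closed here).
u = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sent = u.sendto(b"hello", ("127.0.0.1", 49999))

# TCP performs a handshake before any data moves, so connecting to
# that same closed port fails immediately with "connection refused".
t = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    t.connect(("127.0.0.1", 49999))
    refused = False
except ConnectionRefusedError:
    refused = True
finally:
    t.close()
u.close()
```

That handshake-first behavior is the whole of the "high level understanding" being asked for; nobody is quizzing candidates on sequence numbers.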
As to the claim that virtually no javascript developer needs to know TCP, that becomes less of a truth the closer you get to experimenting with tech like websockets, which is increasingly becoming part of javascript developers' jobs.
I think you're latching onto a narrative that you already made up about the poster before you even began this debate.
There's a large leap between what the poster is saying "I ask a wide range of questions to catch red flags if they get all wrong", and what you're saying "the poster is saying that javascript devs NEED to know TCP! Obviously an elitist filtering mechanism!".
Exactly this. A "pass" on that question consists of mentioning HTTP, DNS, and (perhaps with some prompting about what protocols underlie HTTP) TCP. A strong pass involves being able to tell me that DNS is usually UDP based, and being able to describe what distinguishes TCP from UDP. A very strong pass involves being able to describe in detail recursive DNS queries, talk about the fact that DNS can use TCP rather than UDP for queries or replies larger than what fit in a single packet, being able to tell me a bit about how TCP congestion control, sequence numbers, acks, Nagle's Algorithm, and the like work.
I should point out that I recently talked to a college junior who got a "very strong pass" on this question as a front-end intern candidate. If she can do that as a front-end oriented candidate with only a couple of years of experience in school, I don't think it's unreasonable to expect a professional with several years of experience to have at least the basic knowledge that TCP exists.
Some people are stronger in one area than another; so I cover a number of areas, and expect them to do well on a majority, but not all, of the questions. Even if you fail this question, just saying that you don't know anything about networking, if you do well at all of the others we will bring you in for the in-person interview; we have recently hired someone who simply said "I don't know" on this question, and was a bit fuzzy on the big-O questions, but did fine on on all of the rest.
Yes, this. Thank you for a reasonable reply to an unreasonable post.
(And as to the rest of his very angry post, I'm not a front-end developer, I do platform work and what some folks call devops. I can tell you exactly how XMLHttpRequest works, in detail, with diagrams. I can also go on at length about the DOM, the travishamockery of CSS, as much JavaScript minutiae as one would like...I think a basic understanding of the world around us is important to being good at things. Being "a front-end programmer" doesn't mean you should be so incurious as to not know how things you don't touch every day but directly inform the things that you do actually work.)
Except when it does. Did you miss the bug report I linked, in which it went wrong in one way when Firefox sent something as two separate packets, which caused IIS's HTTP implementation to break, and then when that was fixed the fix caused other regressions:
And this is only one of many such bugs; I've seen others in practice as well. It requires no special use of JavaScript to hit such bugs; just an unfortunate interaction between the particular browser and back-end server, of which there are many possible combinations, especially when you add transparent proxies into the mix.
By the way, when I'm looking for a frontend developer, I'm also not looking for someone to do the very most basic CRUD app plus some JavaScript for interactivity. I'm looking for someone to be able to do a rich, desktop-style app that pushes the boundaries of what can be done with HTML5 video. This is an area where you will be running into browser bugs on a regular basis, and need to know when to stop trying to solve the problem yourself and push the bug report upstream to the browser vendors, and in order to be able to do that it helps to have a knowledge of what is going on at that level. For this application, we accept that we will only be running on the very latest releases of a few select browsers, until interoperable support for the kind of stuff that we're working on becomes more widespread.
No, we gave him plenty of interesting work. And as we discovered, the above described interview was actually a fairly low bar; when faced with problems slightly more challenging, he would fall apart and give up in frustration. And he was very touchy, so wouldn't accept help when he hit those kinds of situations; he would just spend all of his time reading Reddit and HN instead of getting the help he needed. He was essentially impossible to work with.
We had overlooked his personality quirks when hiring him, as we didn't want to filter out someone who was technically sharp but simply personally quite weird, but it turns out that in practice we just could not work with him.
That's possible, but it's equally possible from the information given that she (*he as it turns out) was just lazy and unmotivated or was in some other way actually the problem. Both types of people exist in great numbers.
I think one thing the author fails to realize is how much worse it is to hire a terrible engineer than it is to not hire a good one. I've seen a terrible hire cause more than $1M of damage in the year it took for him to get fired.
While I think that these sorts of people are rather rare within the population of engineers at large, they appear much more often in the interview process as they are more likely to need a job, and probably have to interview many more times before they find their way into a company.
That said, I do agree with the author that extremely deep technical interviews are also unreasonable. Variants of FizzBuzz are generally enough to weed out the terrible engineers and it's pretty hard to tell the good from the great until you've actually worked together.
> I've seen a terrible hire cause more than $1M of damage in the year it took for him to get fired.
As I have said in these discussions before, that sounds like a process problem. What does it say about the competency of management and the maturity of processes if it took a year to figure this out? I don't think you can lay most, let alone all, of the blame on the engineer and the hiring decision.
It took a year for management to figure it out. It took much less time for his coworkers to figure it out. You're definitely right. "Blame" for the damage done falls mainly on management inaction.
It's easy to blame this on hiring when in reality it's a complex combination of employee oversight, team building, scheduling, budget, "culture", etc...
I’ve worked with secretly terrible engineers. I’ve seen lots of code samples submitted by secretly terrible engineers. I’ve been told that putting asserts in the code is bad practice because it means that the code is making assumptions, and I’ve seen my co-workers exult because they were successfully able to change the simple implementation of an interface without having to change the client code. I’ve heard software project managers claim they’d never seen a project fail for technical reasons, and I’ve seen them openly promoting waterfall development. And if you look at the kind of abysmal nonsense that fills sites like w3schools and developerWorks, not to mention the questions on StackExchange, it’s obvious that there’s such a huge volume of Openly Terrible Engineers out there that they frequently don’t know any reasonably good engineers at all.
I’ve also had the privilege of working with a lot of really super awesome people. They didn’t work at the same companies as the total incompetents, for the most part. And they weren’t all geniuses. A lot of them were just regular schmoes who worked every day to get a little bit better.
In almost 20 years in the field, I’ve also actually been the “secretly terrible engineer” — although I’ve been pretty successful in most of my jobs, I’ve also had a couple where I just stunk.
Different people do well in different environments, and software development is a wide enough field that it has a wide variety of environments in it.
One solution to all of this is for the applicant to do a few things to avoid being confused for a terrible engineer. The last time I looked for a job, more and more companies were asking for an engineering design portfolio. Instead of waiting for the employer to ask you to talk intelligently about things you haven't learned yet, show up to the interview with a portfolio of things you can discuss on a detailed level for hours on end.
I've only been working professionally for a little under 4 years, but I've encountered a handful of terrible engineers - engineers who only know one solution and apply it to every problem that presents itself, very smart, technical people who know enough to hack something together and leave the difficult, last 20% for other people to solve, etc.
Another issue, especially in larger companies, is you have to allow a bad engineer to fail in order to justify firing them. In a fast paced environment, sometimes you can't allow a project to fail simply to provide cause to fire an engineer. And even after the failure, HR will require you to implement a performance improvement plan, etc.
The problem with the portfolio approach is that many incompetent engineers can easily point to code that they "wrote." Often it was part of a larger team carrying them, but sometimes they wholesale copy it.
A similar problem applies to pure project interviews: the total incompetents are also the ones most likely to get someone else to "help" them with it.
Yep, there also needs to be a way to detect bat-boy syndrome (where the team goes to the world series, but the guy you interviewed was just there to lug the equipment around).
I've always heard that you learn more in an hour of playing (as in a sport) with someone than you will in ten hours of conversation. I like the idea of being paired with an engineer to solve a problem because it lets someone see how you think, what your process is, and, most importantly, how you learn.
>The problem with the portfolio approach is that many incompetent engineers can easily point to code that they "wrote." Often it was part of a larger team carrying them, but sometimes they wholesale copy it.
Have them walk you through it, or ask specific questions about it.
>A similar problem applies to pure project interviews: the total incompetents are also the ones most likely to get someone else to "help" them with it.
The best interview I've ever seen is where they pair you with another engineer and give you an hour or 2 to solve a problem. The other engineer is there to help you and you actually work as a team.
I think the article starts to highlight the real issue, when it compares with other industries, but then ducks out at the last moment. So I will try my best to fill in:
In all industries, in all professions, there is a range of abilities among those working within them. From the truly gifted, to the hard-working reliable folk who mostly deliver consistently, to those who need support and guidance to deliver well consistently, to those who deliver sometimes, to those who frankly cannot and will not deliver despite support, time, and guidance.
I have met incompetent doctors, unreliable engineers, lawyers you would not trust to compile a shopping list, and programmers you would not ask to develop any kind of program beyond hello world.
I have also met people I could always learn from, and they too remain fixed on the idea that they are always learning and improving, on a journey of knowledge and experience without a fixed endpoint.
A final point: no one is truly useless; you just cannot find a good match for their skills and experience within your team at that point in time.
The article started ranting, so I stopped reading it after getting through about two thirds of it. But one difference between software and the other mentioned professions is that software developers can earn $140K just by doing, by performing. And they can often start soon out of high school. And it doesn't matter where they learned it. It's about doing.
I don't know much about dance, but maybe software engineers should be compared to dancers, and the technical interview should be compared with an audition. Actors and dancers still audition for roles, years after they have made it. And the audition is still as far removed from day-to-day work as a technical interview is from the job. I guess the reason that it's done is because the work is like a performance.
The article spends most of its time criticizing interview techniques that focus on knowledge of particular facts, which is fine - I agree that's generally not a good predictor of job success. But somehow the article concludes that since bad interviews can exclude good candidates, there's no such thing as a truly bad candidate.
The existence of bad interview techniques at some companies doesn't preclude the simultaneous existence of bad candidates in the market.
Yes, good people can fall through the cracks, and otherwise-talented people may not be good at being interviewed. Those are unfortunate realities that you should be aware of and vigilant about if you're involved in hiring. But there are also bad candidates out there, and they're capable of doing a huge amount of damage (anyone who has worked on even a medium-sized engineering team can verify this) so the ugly truth is that being conservative with hiring makes a lot of sense.
It is not. That's the point of the comment you're replying to.
There exist good interview questions, where you give lots of facts to the candidate and make it clear that they can ask if they don't remember the order of arguments to reduce(). (In fact, a good way to put them at ease is to admit that you don't remember either, and are just looking up the documentation :-P )
But if, given those bare facts, they can't write a legible and correct binary search even over multiple iterations, or reason about O() complexity of an algorithm they either just implemented, or just came up with, they are probably an STE.
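For reference, here is a minimal sketch (in Python) of the kind of binary search being described, with the O() reasoning in the comments; the exact task and language in the interview are obviously up to the interviewer:

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1.

    Each iteration halves the search interval [lo, hi], so the loop
    runs O(log n) times -- the complexity question the candidate is
    expected to be able to reason about afterward.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1   # target can only be in the upper half
        else:
            hi = mid - 1   # target can only be in the lower half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))   # -> 3
print(binary_search([1, 3, 5, 7, 9], 4))   # -> -1
```

Fifteen-odd lines, with the docs open and multiple attempts allowed; that's the bar being discussed.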
In my experience, these people fall into one of two categories: new graduates who just barely scraped through their college classes, and people misrepresenting their past jobs in QA or product management as programming experience. (That isn't to say that non-CS-majors shifting into coding can't be great engineers - I know several! But when interviewing one, you can't make the assumption of basic competence, and it's a red flag if they play up their non-programming work rather than talk about actual coding experience/studies.)
This is bizarre to me. Yeah, I've worked on teams with people who couldn't ship code, and they got fired. But I'm looking for work now, so let me tell you what it's like. There are companies providing automated coding tests that require you to pull out bizarre algorithms that I haven't needed in 15+ years of shipping code and creating millions of dollars of value for employers.
After weeks of searching, I'm finally at a second-stage interview with a company where they intend to hand me a spec, put me in a room, and see if I can come out a few hours later with a functioning prototype. And it's a relief; that's a sane test. Can I produce or not?
I was talking to a team a couple of years ago. This team had a bad reputation at the organization, and I was supposed to come help them.
So I did a day of light training. Then I offered some free books -- with the caveat that they email or ask me for them.
Nobody asked for the books. Out of 20 folks, 2 of them bought the books (instead of getting them for free). They didn't want the rest of the team to know they were interested!
I firmly believe two things: 1) just about anybody can become a programmer given the right environment, and 2) due to social issues, teams get "out of whack" and suck really big time. And when #2 happens, the other programmers always blame it on technical skills.
At a recent conference, I saw a guy talk about "mob programming". Mob programming is where you take the entire team, say 3-7 guys, and work on one story at a time. There's one terminal. One guy is typing and everybody else is planning and navigating.
He said it created a huge training opportunity for everybody because everybody was involved in every little detail all at once. People got the chance to compare notes. Poor programmers got better. Good programmers learned cool stuff from each other.
But the really interesting part was when he described doing this in teams where not everybody was a programmer. He said out of several experiments, in every case within a few hours the noobs were picking up how to program in that language. Within a week they were able to actively and helpfully participate along with everybody else. It wasn't just that weak programmers got better, it was that most anybody could learn how to program.
I don't think most people in the industry are ready to hear that.
The really horrifying thing about FizzBuzz is that it actually has useful power as a screening tool. No, I don't know how the hell these people accumulated years on their CV while being literally unable to even approximately code a FizzBuzz. But they're out there.
(I suspect Joel Spolsky's theory that it's the same 100 terrible guys applying for all the jobs.)
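For anyone who hasn't seen it, FizzBuzz really is as small as its reputation suggests; a minimal Python version:

```python
def fizzbuzz(n):
    """Classic screening exercise: for 1..n, multiples of 3 become
    "Fizz", multiples of 5 become "Buzz", multiples of both become
    "FizzBuzz", and everything else stays a number."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
# -> ['1', '2', 'Fizz', '4', 'Buzz', 'Fizz', '7', '8', 'Fizz',
#     'Buzz', '11', 'Fizz', '13', '14', 'FizzBuzz']
```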
It's the same for sysadmins, by the way. We just went through a hiring round for a good Windows sysadmin. (Us Unix guys could all do Windows to pointy-click level, but needed someone who knew its evil little ways when something went subtly wrong.) Holy crap, you would not believe how many clearly useless bastards have literally twenty years as a Windows admin on their CV. We got someone who's great (and picked up the Linux side in short order), but it was slightly trepidatious.
No question in my mind that completely inept engineers are everywhere. It's mind-numbing how many people will respond to an opening for an experienced coder and literally cannot code in any capacity.
In a previous life we were hiring C# WinForms guys, and I would always ask them to code "Explorer". Some people couldn't get past creating the solution in Visual Studio, others couldn't figure out how to get the directory listing at C:\, but some got remarkably far along in just an hour or so, with directory & file listings, navigation up/down the tree, context menus, rename, copy/paste, etc..
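To give a sense of how small that first hurdle is, here's a rough sketch of the directory-listing step (in Python rather than C#, purely for brevity; the WinForms equivalent is a few calls to `Directory.GetDirectories`/`GetFiles`):

```python
import os

def list_directory(path):
    """Return (directories, files) found at path -- the starting
    point of the "Explorer" exercise, before any UI work."""
    dirs, files = [], []
    for entry in os.scandir(path):
        (dirs if entry.is_dir() else files).append(entry.name)
    return sorted(dirs), sorted(files)

dirs, files = list_directory(".")
print("Directories:", dirs)
print("Files:", files)
```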
I never ask trick questions; I just come up with cute apps. I think my new interview question forever henceforth will be to code and launch "Magic." I'll be back when it's time to get lunch, and help yourself to the espresso. Write it in whatever language you want, using whatever tools you want, on your laptop, with a full internet connection. If you want an external keyboard and monitors, please use them.
The only rule is you can't claim someone else's solution as yours. You can use Stackoverflow, or any other resource you want to build up the parts, but you have to cite any code which carries a license.
Find a task which you can describe at 30,000 ft in 2 or 3 sentences, then spend 5 minutes sketching out ideas (not code!) on the whiteboard of what the components are and how they fit together. Then, probably best to leave them to it and not stare over their shoulder making them nervous.
Quite a few times, after getting a technical screen, the interviewer's attitude was "You loser! Why are you wasting my time interviewing here?" But I know that I'm really productive and really know my stuff.
When my employer was interviewing, the two best candidates (by my analysis) failed their technical screening.
I don't see any easy solution.
It also seems weird that employers complain of a talent shortage, but they always are looking for every little excuse to reject someone.
You know those college groups and side projects where there was a group of people, and only a couple or a few actually did anything while the rest attached on for the easy layup? Yep, those people end up doing the same thing in the workplace, and probably in life.
If you were the deliverer in the college group or project, then you will see the same thing in the workplace. Usually it's people who don't want to be doing it at the time; they might be good but uninterested. I imagine it is extremely unsatisfying to just sit around and not contribute. I work to create and ship; otherwise, why do it?
A bad part of this is that it isn't always the shippers and doers who get the credit or the notice, either; people like this jump up and take credit to justify their contributions.
Ugh, you're right, and this is exactly why I hated group projects in school. 4-6 people taking credit for the work of 1-2, and everyone gets the same grade.
Pair the candidate with an interviewing engineer, give them an hour or two to solve a problem as a team. The interviewer isn't an adversary; their job is to assist the candidate like they would if they were working on a real problem.
Repeat this process if necessary.
Get everyone together and talk about past projects the candidate has worked on. Have the interviewers evaluate the candidate, and make your decision.
You know whether the candidate can code, and you've removed the adversarial nature and unnecessary stress from the interview process. Most importantly, you got to see them work in a much less artificial environment that's closer to what they'll be doing day to day.
> Lawyers and doctors are asked trivial questions just a handful of times in their careers during their bar examinations and boards.
I get the very real sense that this person has never sat a law exam.
A law exam is based on applying case law to a set of facts, which corresponds to the day-to-day work of millions of lawyers. What the author calls "trivia" is literally the law.
In your introductory course you might be asked what precedent is, but after that it's cases, statutes and IRAC: Issues, Rules, Application and Conclusion.
I think the biggest issue with this article is that most places aren't trying to hire people who aren't bad; they're trying to hire people who are good. You might use FizzBuzz as a weed-out question, but then you have to determine whether the person is good or not. I don't think there's a pervasive fear that terrible engineers will accidentally be hired; there's a desire to weed those people out and then sift through what's left for the best.
This article is incorrect from beginning to end. As a simple counterpoint to the idea that "I have never once heard a firm talk about the idiots lurking in their own offices":
Here is the well-documented _policy_ held by major firms (not just tech companies, mind you) to fire the bottom 10% of their employees. http://en.wikipedia.org/wiki/Vitality_curve
By the way, a form of that is used at Microsoft and Google, and probably others as well. The idea that Silicon Valley and computer programming are the only places where people are on the lookout for "Secretly Terrible Engineers" is absurd; it's just as absurd as the comments I am reading where no one has ever worked with one of them. As a hiring manager I turn away over 95% of candidates, and even with a very selective hiring process you get people we need to let go because they can't perform at the level I expect, even after passing a rigorous interview. So to say they don't exist is stupid; it's like claiming that no one ever gets fired for not performing their job well enough.
It strikes me that in the same day (albeit one filled with hiring articles) we get an article about the success of Greyston Bakery, which will literally hire anyone and train and support them as much as needed to be successful—and this, which is along similar lines but is about software development, which, obviously, requires special individual aptitude that only a chosen few can succeed at, and must be protected from the pretenders to the throne. /s
Somehow, it feels warm and fuzzy to support the social justice of the little bakery that could, but when it comes to our own companies, our true colors shine.
Continue to take the systematic approach. It is still the best way. Of course, of course you should hire as effectively as possible, and try to have a baseline of aptitude, but there are always going to be outliers and bad fits. It's far more effective to optimize your system for the success of all employees than to focus on weeding out the 'bad ones.' Sure, there will be people that won't fit your system, but they made it through the hiring process for some reason—figure out their true aptitude and transfer them to something they'll be effective at.
The article may be wrong about the existence of fakes and bad programmers (anyone who has sifted through a stack of resumes knows they exist), but it's right about the unreasonableness of the fear and the weight we give it.
I say it all the time to our CEO: if hiring is really a huge problem we want to focus on, how did we get all these great people? Hm.
And if hiring is really the solution to all our problems, then, well, what do we do with all these idiots standing around here?
Ha. That really is the question. The real problem is organizational and systematic. Stop worrying so much about hiring, optimize your process and be done with it, and focus on what happens afterward instead.
Something getting lost in the responses here is the "stress" factor of these interviews. When encountering a difficult architectural or algorithm decision, experience has taught me to develop slowly, prototype often, investigate previous work, and even step away from the issue. Meanwhile, engineering interviews require you to code immediately, in isolation, and then be judged on the first solution you crap out of your brain.
In a recent interview, my solution to a question was bad; I knew it was bad, and I told my interviewers it was bad. It was difficult for me to come up with other solutions on the spot, under the pressure of judging eyes and limited time. Immediately after getting off the phone I was able to see better ways to approach and solve the problem, but too late: those 45 minutes were more important to them than my many years of provable, directly related experience.
I was on the interviewer side of the developer interview equation once. This guy came in, mid-40s, claimed to have a long and storied career designing radar imaging systems and the like. Even claimed to have a patent to his name.
He was a fucking idiot.
We gave him our standard simple programming task, which requires a trivial amount of quasi-low-level byte-banging in C(++). But even a fresh-out-of-college kid who only knew Java could get it with a bit of prompting and a few hints.
This guy came nowhere near a workable solution, but he could generate mountains of chatter and BS about what pattern to use and such.
I had never in my life encountered such a flagrant BS artist trying to land a developer role. I doubt that most "secretly terrible engineers" even approach his level. But the experience was a teachable one.
I started as a stock analyst in 1981 at a fairly elite firm (PaineWebber, usually top 5 and always top 10 in the equity research rankings). We didn't get computers and spreadsheets -- Visicalc! -- until I'd been there a few months.
After a while I started publishing annual growth rates, based on formulas like (EndValue/StartValue) ^ (1/5) - 1, formatted in percent. Multiple colleagues asked me to teach them this amazing trick.
And by the way -- when writing this comment, I made a mistake in the formula above until I rechecked my work. :)
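The "amazing trick" above is just the compound annual growth rate; as a sketch (figures here are illustrative, not from the comment):

```python
def annual_growth_rate(start_value, end_value, years):
    """Compound annual growth rate: (end/start) ** (1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# A stock that doubles over five years grows ~14.9% per year,
# not 20% -- which is why simply dividing by the years misleads.
rate = annual_growth_rate(100.0, 200.0, 5)
print(f"{rate:.1%}")   # -> 14.9%
```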
What a terrible article! Writing code is not about good vs. bad, incompetent vs. incompetent, great vs. garbage. It's about communication between people.
The best programmers I've known are never happy with their work. We ALL write terrible code! Anyone who says different is lying.
Good coders are those who welcome critique, who push themselves through self-set goals and challenges. My fav quote from Dan Kubb is "if I'm not embarrassed by code I wrote 6 months ago, I'm not pushing myself hard enough".
And then someday you'll have to visit a psychotherapist. Never being happy is destructive; being happy, and greedy for more of that happiness, is not. Attain a level of expertise where you can reliably predict task durations, then go lead a team and teach every team member the same. Then be a CTO. Then the world will be a calm, happy, steadily progressing place.
Totally agree! I hate questions asking about your "great" achievements, code that you're "proud of"...
We're not proud... It's always possible to iterate on the code you wrote and make it better (and again, and again) - but the question is: is this the top priority for today?
I think one thing that really bugs me about this article (and a lot of the stuff coming out of the valley in general) is the unqualified use of "engineer" when describing developers. Most software development seems to lack the rigor of what (at least used to) fall under the domain of engineering (EE, ME, ChemE, etc.); I also don't think the culture is this antagonistic in other engineering fields, especially where people are licensed as professional engineers (ME, Civil, etc.).
Side point: The author throughout nearly all of the article uses engineer (without any further qualification) and programmer interchangeably. A vast range of engineering has nothing to do with the design of software, and engineering is not programming.
It's an interesting point, since I don't know how much other engineering fields are beset with this problem. The article features a photo of a collapsed bridge, but obviously its topic is software. Do civil or aerospace engineering firms have the same problem of incompetent coworkers who manage to pass all their interviewing hurdles? It seems hard to imagine -- I don't know how I'd fake the math on static and dynamic loads without actually knowing what I was doing. Why is software so different, lacking basic shared and provable prerequisite skills?
Similar to the MD title for doctors or JD for lawyers, engineers have the PE (Professional Engineer) title. It requires a bachelor's degree from an ABET-accredited school, 4 years of experience, and passing two day-long exams. It's pretty rigorous.
Also, based on my experience with engineering interviews, they rarely have what I would call "hurdles." It's not a gauntlet that they have to pass; usually it's just a relaxed conversation about their skills and experience to make sure they fit the position and will work well with our team.
"Why is software so different, lacking basic shared and provable prerequisite skills?"
It shouldn't be; I think we generally don't have enough people that yet recognize this as a Real Problem(tm), and therefore don't know how to do good filtering/interviewing up front. I think it may be another generation or two before there's really enough pushback to start demanding the same level of scrutiny other industries may have.
Now, a bad doctor, lawyer, engineer, etc. might slip through the cracks now and then, but there's pretty strong proof that at one point they did Know Something(tm) about their field; it came more through professional licensing than standard higher ed, for example, but it was there.
I have so little faith in the value of most higher-ed degrees as higher-ed seems primarily profit and ratings driven, vs capabilities driven. Professional orgs like bar associations actually have some economic incentives to keep people out of practicing law. Universities seem to have no incentive to fail people out, and every incentive to continue grade inflation.
> Universities seem to have no incentive to fail people out, and every incentive to continue grade inflation.
Ostensibly, the idea is that if you don't weed out the idiots, you are going to end up with people with degrees from your university who are garbage, and this hurts your brand. You see this happening with for-profit colleges, where grade inflation is particularly egregious. "Well, we've given a bunch of DeVry grads a chance, and they've all done terribly. We're not going to be considering DeVry grads anymore unless they can really, really prove that they're any good."
This should, in turn, get back to prospective students, who won't go to a college if it can't get them a job.
Unfortunately, you're now at three steps that are pretty opaque and difficult to measure, and the rewards for lowering standards are immediate and vast.
> You see this happening with for-profit colleges, where grade inflation is particularly egregious
I see it happening with people from all sorts of colleges and universities. It's gotten to the point where, if and when I meet a young graduate who has good communication and thinking skills, I consider them an anomaly, or strongly suspect that they already had these skills before college.
In my experience (biomedical), this concern does not exist in other engineering fields. That may be because incompetent engineers are just accepted, or because they don't exist--I don't know. Other engineering fields also generally don't engage in extensive quizzing during interviews.
There are a series of licensing exams for the (non-software) engineering fields: one typically takes the Engineer-In-Training exam at the end of undergrad, then can take the Professional Engineer exam after a few years of working under a licensed PE (exact regulations vary from state to state).
These exams are more common for civil than electrical engineers; if I remember correctly, there are regulations that require a Professional Engineer to sign off on blueprints.
I interned at a big international company, doing finite element analysis, a method for doing structural analysis on computers.
As an intern, I didn't get the formal training on the software suites used, I came in with a bit of experience on another software, and a bit of theoretical knowledge from school. I managed to pick up enough in 4-6 weeks using tutorials, manuals, and the engineers around me that I think I was as productive as you could expect an entry-level engineer to be (fwiw, I had a job offer on the table before finishing that internship).
I clearly remember two guys there. One was a contractor, late-20-ish, who I think had started not long before I did. The other was middle-aged and had, I think, been working for the company for quite a few years. Both were quite incompetent. At first I thought it was me: if I couldn't explain something to them, there must be something I was getting wrong, right? By the end of my term I had quietly picked up enough from the other engineers to see that these guys were incompetent. It was never openly admitted, though. But these guys were given only the most basic, boring, tedious work, requiring absolutely no actual engineering analysis.
Don't get me wrong, these were 2 guys out of 150-ish people doing FEA there, and all the others were absolutely brilliant and I learned a lot. But there are incompetent people everywhere, I think. In my graduating class there was definitely a 5% or so whom everyone else knew were never going to be doing engineering work. These people often end up as production supervisors and such, and never get their PE.
Even in grad school they are tougher to find, but there are definitely people graduating with MScs and PhDs who couldn't engineer their way out of a paper bag. They are generally recruited by bad profs who do all the actual thinking and just give them narrow grunt work to do. There are safeguards in place at the university level to prevent that, especially for PhDs, but it does happen, unfortunately.
I should probably have taken that job offer, but I really wanted to see what academia/research was like.
Really? As a non-software engineer, the title and bridge picture suggested this would be relevant to me, much more so than an article about programmer interviews.
Except if you're somewhere where it does matter. Take Canada for instance, you cannot call yourself an engineer without being a licensed Professional Engineer.
I strongly disagree; it seems rather perverse to label many developers and programmers as engineers. Most of what occurs in development and programming, especially given the lack of rigor involved, could scarcely pass under the heading of traditional engineering, and should not be confused with it.
We send out a coding test to candidates that pass a phone screening. We provide a template file and a run file that contains tests.
I'd say 1/3 of the entries we receive (from people who have passed a semi-technical phone screening!) return an answer along the lines of 'if input == test1, return answer1', and another 1/3 submit code that also fakes the correct results; they just pad it out with a bunch of filler code to make it look like they'd done more.
We rate each entry out of 10; it's insanely frustrating how many 0's I hand out.
I assume that they assume we just check that the tests pass and that's that. (Do any companies do this without checking the code? The sheer number of people who try suggests it must work sometimes!)
Presumably you also have other tests that you run the code on too, that are not sent to the candidates right? That would save some time, say if the code passes all 10 tests sent to the person, and fails all 20 tests that you keep for yourself, then that person's resume would go in the trashbin with little time wasted.
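That suggestion can be sketched in a few lines; here `grade`, the hardcoded "solution", and the sample task (squaring a number) are all hypothetical names for illustration:

```python
def grade(candidate_fn, public_tests, hidden_tests):
    """Run a candidate's function against public and hidden test cases.

    A hardcoded 'if input == test1: return answer1' submission passes
    the public set but collapses on the hidden one, so it can be
    binned without anyone reading the code.
    """
    def passed(tests):
        return sum(candidate_fn(x) == expected for x, expected in tests)

    return passed(public_tests), passed(hidden_tests)

# Hypothetical task: square a number. A hardcoded "solution":
def hardcoded(x):
    return {2: 4, 3: 9}.get(x, 0)

public = [(2, 4), (3, 9)]      # sent to the candidate
hidden = [(5, 25), (10, 100)]  # kept by the company

print(grade(hardcoded, public, hidden))   # -> (2, 0)
```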
Where I work we have a "programming test". You get a simple problem, two hours, your own laptop (or one provided to you, as you like), full access to the internet, and a room by yourself.
The problems are doable in about 1/2 the time.
Still not quite like a real situation where you bounce ideas off other folks, take a break and go for a walk when you're stuck, etc., but it's closer than the whiteboard coding nonsense.
But I have seen folks coming highly recommended with "architect potential", "top of the heap", etc., who turn out to be 100% useless. And I mean useless as in "it's more work overall when they work on something than if they hadn't worked on it in the first place".
The good developers generally have no trouble finding jobs, and therefore don't need recruiters' help. With the best and the brightest taken out of the pool, the recruiters are mostly left with the merely mediocre and the bad ones. This in turn has collateral effects: any developer coming in via a recruiter already carries the stigma of "not good enough to find a job on their own".
> Lawyers and doctors are asked trivial questions just a handful of times in their careers during their bar examinations and boards.
Of course, this obviously means there are examinations and governing boards that certify laywers and doctors as fit to practice. I doubt most people in the industry would prefer that we start moving towards that.
Or maybe not. I dunno. Those of us who went through traditional CS programs would certainly benefit from such a system greatly. But the lack of formal entry barriers is part of what makes the industry fantastic, and perhaps the very fluid and sometimes irritating interview process is just part of keeping those entry barriers away from tech.
I've been seeing the same basic topic come to the top of Hacker News over and over. I thought that with the advent of GitHub, Stack Overflow, and easily shareable web and app projects, vetting software developers would be easier. Are these resources not being utilized at all in the hiring process? Am I to believe that a software developer with several years of professional experience, references, and some sort of work portfolio can't code at all? I don't get it.
Many, many dev candidates do not have a Github account with any code on it, nor do they have an SO account with any answers on it. And past work often belongs to the employer (and was written by a team anyway), so.
If you want to say "don't hire those people," you're being far more restrictive than those who want to do fizzbuzz-style weed outs.
Github-centric candidate filtering is great for the tiny percentage of programmers who get paid to work on open source, and terrible for the much larger percentage who can't write open source code even on their own time without risking a lawsuit from their employer. The only way to fix that is unionization or getting an awful lot of laws passed in an awful lot of states (for US devs, anyway), thanks to employer/employee power imbalances and incentives to defect from any strictly voluntary resistance movement.
Wow, most programmers aren't allowed to have any sort of hobby involving programming without risking a lawsuit from their employer? That seems pretty crazy; I'm surprised places like California permit that.
Any ideas what companies tend to have such onerous conditions in their employment contracts? I've never seen something similar but I've only ever worked for smaller (less than 500 people) companies.
I wouldn't say "most", but I once had to fight to get an exemption in a startup's proprietary information agreement. We were asked to sign this after most of us had been there six months, and I had to threaten to walk off the job if they couldn't be flexible.
The way it was explained to me was that they needed to be able to claim ownership of anything I did for the company or using the company's technology. Since that could include things I did outside of work for the company's benefit or my own, they needed to be able to claim ownership over 100% of my "works of authorship" in accordance with "provisions of the Pennsylvania Labor Code" which they refused to elaborate on.
I finally got wording added to my agreement for open source software, provided I didn't use any proprietary startup magic to build it.
Psychology is kinda tricky - I keep my "screw-around" and "play-with-a-little-bit-of-this-and-a-bit-of-that" code on a (not visible to the outside world) BitBucket account, as
(a) I'm never sure the code is high quality (idiomatic, proper variable naming, documented), and
(b) I'm not sure I'd be able to complete it to the point where it's a decent-looking project rather than some random half-done stuff.
That doesn't seem right, at least in America. All the companies I've worked for have required me to sign an agreement giving them ownership of any work or inventions produced using their technology or on their time.
If that was the default state, why would everyone require those agreements too?
No side projects even? I used to be one of those big-corporate programmers with no side projects to show off, and while I was decently good, I wouldn't hire my old self to work on my startup. Also, an interesting idea: ask candidates to find any 3 unanswered questions on Stack Overflow and answer them. It would at least provide a useful exercise for the candidate and the community at large.
We need to move beyond the algorithm bravado to engage more fundamentally with the craft. If people are wired for engineering logic and have programmed in some capacity in the past, they almost certainly can get up to speed in any other part of the field. Let them learn, or even better, help them learn.
What you want are people who are able to perceive the functioning of the entire group, and act in ways that optimize that. Bands of highly talented musical savants that don't like each other and don't "gel" can still sound worse than a group of friends with "pretty good" talent who have become really good at playing with each other.
After having interviewed and taught, I now realize that interviewing and teaching are involved and deep disciplines, that take significant insight in order to do them well. (Including some personal insight.) Most programmers haven't been concentrating in the areas that would make them good at either. In fact, my experience suggests to me that a great many programmers use the occasions of teaching and interviewing to vent aggression against "the stupids" of this world.
I'll pile in here to note that I worked with someone back in my junior year of college on a group project. This person insisted that our entire project (which was quite large) needed exactly four classes, since we had identified four types of users. "J., we also discussed how Smith over here was going to build a database access layer, which means at least one more class. What you are saying does not make any sense." He refused to see reason. Not only did he graduate, but he is now overseeing a team of engineers at a corporation where he was first an engineer for 3 years. I pity them.
I also worked with an individual who could not code to save his life and had to "work with" (read: copy) a friend to complete even the simplest assignments. This person went on to work for a major US financial institution. In fairness, he only works in IT, but I am sure his software incompetencies can't be good for the company.
Most fields have reached the same conclusion: colleges do not adequately prepare candidates for work. If tech employers never train or improve their employees, and the employees don't take responsibility outside of work, then it's possible for starting programmers to be passed from job to job without ever learning core concepts.
There needs to be an understanding that a Computer Science degree often does not teach job skills, and that addressing the problem early with extra reading or training is critical. There is a tendency for people to start CS majors with no aptitude for or interest in the subject beyond the money involved.
When I was in college, I took CS electives as an English major undergrad and routinely had CS graduate students attempting to cheat off of me during exams. Anecdotally, a lot of tech leads seem dumbfounded by the results of programming candidates with CS versus unrelated degrees.
I think that article makes a number of faulty assumptions, but one stuck out to me:
>We need to start with the assumption that engineers are smart learners eager to know more about their craft
I have encountered many people in the field that do not support this assumption. Code reviews for these people always end up with them defensive - you will see a lot of remarks along the lines of, "isn't it good enough though?", "it works with the one input I tried", "yeah that would be better but that would require me to do more work to change something, vs just hit merge on the pull request", etc.
You can try and help improve their workflow, give tips, point out faster or better ways of doing something, only to come by to help them 2 weeks later and see them still stubbornly doing something the slow way.
They treat code they don't write as a black box, or "owned" by the person that wrote it. You will encounter, "Hey, I was having a problem with X, and I traced it down to this function in the Foo subsystem. I think I heard you wrote the Foo subsystem. Could you take a look at this?" When you try to help, you realize they navigated to this point and then didn't spend 1 minute trying to read the code and see what's going on. You might take time and give a good overview of it, how it works, offer to answer further questions, etc. It goes in one ear, out the other. Three days later the exact same question comes up.
I've seen a programmer get thrust into the node.js world without prior experience. I offered a few suggestions for learning resources, and offered to loan a book. He said, "Yeah, I really try not to read books like that or spend that kinda time outside the office. I'll figure it out."
I could go on and on with anecdotes, but I think the reality is there are people who are very into the craft, and WILL be quick learners (of that which they don't already know), and there are plenty that aren't. For those of us that do strive to improve and get better, working with the latter is a terrible demotivator.
A step further, I think that the latter group will slowly contribute to technical debt, or just over time make your code base harder to work with. You can't micromanage forever, and having to harp on the same things time and time again in code reviews is draining for both parties. Eventually you start giving in, and letting sub-standard work through, because to do otherwise will make you seem like a nagger, nit-picker, etc. Then one day you have to edit something they touched, and your own task takes 10 times longer due to tons of coupling, faulty assumptions, things they didn't bother to learn about how the overall component works, etc.
The "hazing" style of interview surely does suck, and I myself don't go for standard library trivia, or ask algorithm questions that would never come up on the job. But more and more, I feel that they are used throughout the valley for lack of a better way to defend against someone joining the team who's going to be a net negative. I think the thought might be that it's extremely hard to detect the person who can't be bothered to read through the code base and figure out how things work a little before giving up and asking someone else to hold their hand - yet if one can make it through one of these tough interviews, there's a higher chance they aren't that person.
Another aspect is that I feel like once people get IN, few tech companies are willing to make corrections. Someone can consistently write bugs, submit code to the repo that it's clear they didn't even test against the base cases of input/usage, give estimates that are WAY long for simple tasks, not pay attention in meetings, etc. - but many companies are afraid to fire, or to give harsh feedback. From what I've read, Netflix is one company that does not seem to operate this way.
Perhaps we'd all be more trusting in our interviews if we didn't know that the false positive would be so hard to correct.
> When you try to help, you realize they navigated to this point and then didn't spend 1 minute trying to read the code and see what's going on.
I'll take the other side here...
I track my problem to a particular part of a submodule. The person that owns it didn't document it well, didn't write an overview of how the thing works, and generally has made "just reading the code" a poor and error-prone substitute for asking them directly about context they've chosen not to share with anyone else.
The person becomes annoyed at my continued questioning instead of "reading the code", and eventually complains about my lack of motivation. The code, meanwhile, never gets fixed, and is a ticking timebomb of technical debt.
Whose fault is this? Mine for not "reading the code" and breaking encapsulation, and changing my code to match undocumented behavior? Or the other person's, for failing to behave professionally like an engineer and identify that they are knowledge-hoarding?
Who's to say, really. It doesn't matter...I've got another gig lined up.
I think what you described exists, it's just not what I was referring to. Quite the opposite, I've seen this happen when there WAS very clear documentation, possibly a link to a wiki or readme file in a comment that points to an overview, and yet the person clearly hadn't bothered to read any of it.
This description is uncanny. Having this person on your team is far worse than someone who flat-out can't code. You can fire someone who knows nothing. Everyone knows what the problem is, and it's easy to talk about. If your team is competent, they'll recognize incompetence.
It's almost impossible to fire someone with the more sinister qualities described here. You always hope they'll catch on: they're smart and they've successfully written working software in the past (this is why you hired them). You end up waiting forever for them to 'click' with the team. It doesn't happen. They're a constant drain and their mistakes compound until your code is unmaintainable.
Note that I'm not picking and choosing any of OP's complaints. They come as a complete package. I've worked with a person who had every single bad quality listed. As I read the OP, I was thinking we must be talking about the same person. I couldn't have described it better.
There is a large pool of incompetent developers going from one job interview to another. Finding the decent developers amongst this herd is painful.
It's not that there are loads of awful developers lurking in companies (although there are a few). It's that awful developers pollute the waters when hiring.
Sure, you can ask companies to give their new hires books to read and pointers on what to learn. However, it's even better if the new hires already read books to improve their craft. On their own initiative. There is still plenty of stuff to learn when you start at a new place - the new codebase, for example (and possibly the domain). So it is good if you are already up to speed on the programming part.
This is an example of the three dimensions of programmer knowledge: Programming, Domain and Codebase. I recently wrote more about this in "Programmer Knowledge" here: http://henrikwarne.com/2014/12/15/programmer-knowledge/
It's probably a systematic issue with how programming work is paid for. There are lots of 'software companies', but there are orders of magnitude more contracting and consulting and IT companies - companies that charge work based on what some sales guy can get through the front door. After that point, it's about filling in seats with whomever can charge the most.
Software companies need people who can code, as well as do other things. Other types of companies that do 'IT' need people who can look good in a seat. This causes lots of incompetent people to get jobs for many years.
The idea that there are lots of great engineers out there who just need mentoring and guidance to excel sounds great, but the reality of the industry seems to be that people don't stay in one place very long.
I think there is an opportunity there, but you would need to use contracts or something to make sure the time and effort you are putting into making this person great isn't just a gift you are giving to the company they move to next year. Also, I don't think startups are at all well placed to do this, given their time pressures. Google, Apple and Facebook could totally be doing this though.
I think it is far more useful to build a process for software engineering that continually reviews itself and makes things better. A fizzBuzz test is useful to figure out how a person approaches a coding problem and something like that should be part of hiring new developers, but most whiteboard interviews take the whole thing way too far and approach it as an exam that you have to pass.
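For reference, the FizzBuzz test mentioned above is tiny. A minimal C++ sketch (the function name `fizzbuzz` is mine) might look like:

```cpp
#include <string>

// Classic FizzBuzz as a single function: multiples of 3 map to "Fizz",
// multiples of 5 to "Buzz", multiples of both to "FizzBuzz", and
// everything else to the number itself as a string.
std::string fizzbuzz(int n) {
    if (n % 15 == 0) return "FizzBuzz";
    if (n % 3 == 0)  return "Fizz";
    if (n % 5 == 0)  return "Buzz";
    return std::to_string(n);
}
```

Printing `fizzbuzz(i)` for i from 1 to 100 gives the classic output; the interview value is in watching how the candidate structures the conditionals, not in the answer itself.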
Fact is that everyone has their 10X moments and their doldrums. Only by working on processes and team building, can you create an environment that lifts your engineers to the 10X level more often.
I read this article, and I think to myself: "My goodness. This sounds just like graduate school."
In CS grad school, impostor syndrome is everywhere. Nobody talks about it. Everyone -- from the professors to the masters students -- is strongly affected by it.
The more you advance, the more taboo it is to talk about it, even though it doesn't go away. I understand the successful people just "get used to it" after a while.
Or maybe it's just me. If it is, wouldn't that be a certain kind of delicious irony?
> Lawyers and doctors are asked trivial questions just a handful of times in their careers during their bar examinations and boards.
True, but they are also asked lots of tough questions when they go to interview for jobs. It's not like you walk into a law firm, say "I passed the bar", and they hire you. Comparing the legal/medical licensing requirements to the software engineer interview process is apples-and-oranges.
I'm not sure I've ever interviewed anybody who can't program at all, but a large number of people I interview are shockingly bad.
My go-to question to detect this is to remove the odd numbers from a dynamic array of integers in-place, in their language of choice.
The ideal candidate writes an O(n) function, a less ideal (but still good) candidate writes an O(n^2) function, and if they're struggling I say that they can return a copy of the array with the odds removed (this is worse, but they could easily make up for it later).
Not only can most candidates not do any of these, but most don't even know how to tell if a number is odd. I had one suggest that maybe he needed a loop (I don't know what he was planning, because I immediately told him he didn't need a loop to know if a number is odd).
This is for a game programming position, and most of these are people who have (or claim to have) programmed games. Every game has a loop like this somewhere. Most have many! I feel like all I do is iterate over arrays and remove things from them, and yet, this is one of those questions that over 95% of people don't get. It boggles my mind.
How is returning a new dynamic array worse than an O(n^2) implementation? The former would be perfectly fine in production; the slight increase in memory use is negligible. The n^2 solution would lead to me wanting to fire said person.
Edit:
In addition, if the O(n) in-place solution that I am thinking of is the same as what you are thinking of, the downside is that the remaining even elements do not keep their original order. So overall it is hard to imagine ever wanting to use this solution over the standard approach of creating a new array with just the even elements.
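To make the reordering concrete, here is a sketch of that kind of unordered, allocation-free O(n) removal (the name `remove_odds_unordered` is mine): each odd element is overwritten by the current last element, which is exactly why the surviving evens end up out of order.

```cpp
#include <vector>

// O(n), allocation-free removal of odd numbers. Each odd element is
// overwritten by the current last element, and the vector shrinks by one.
// Fast and in-place, but the surviving even numbers lose their order.
void remove_odds_unordered(std::vector<int>& v) {
    std::size_t i = 0;
    while (i < v.size()) {
        if (v[i] % 2 != 0) {
            v[i] = v.back();  // the moved-in value may itself be odd...
            v.pop_back();     // ...so re-check index i on the next pass
        } else {
            ++i;
        }
    }
}
```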
You can easily make an O(n) solution without reordering them by using two pointers; one to read numbers and the other to keep track of where to put the next even number.
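A sketch of that two-pointer approach (the name `remove_odds_stable` is mine): the `read` index scans every element while the `write` index trails behind, compacting the evens toward the front in their original order. This is essentially what the standard erase-remove idiom (`std::remove_if` followed by `erase`) does under the hood.

```cpp
#include <vector>

// Stable O(n) in-place removal with two indices: `read` scans every
// element, `write` marks where the next even number belongs. Evens are
// compacted toward the front in order; the stale tail is then cut off.
void remove_odds_stable(std::vector<int>& v) {
    std::size_t write = 0;
    for (std::size_t read = 0; read < v.size(); ++read) {
        if (v[read] % 2 == 0) {
            v[write] = v[read];
            ++write;
        }
    }
    v.resize(write);  // shrinking resize does not reallocate
}
```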
> How is making a returning a new dynamic array worse than doing a O(n^2) implementation
Avoiding unnecessary allocations can easily mean the difference between missing a frame and not. Keep in mind that we frequently have to deal with bad mallocs and OSes/hardware without virtual memory capabilities.
The main reason I'd prefer O(n^2) vs O(n)+allocation is that I asked for an in-place solution, and there are definitely situations where you'd need to be able to do this (because of pointers to the array base, or because the caller doesn't have access to the allocator to release it, or you're using an arena and then allocating a new array means wasting the memory of the old one...).
Another reason that's a bit more subtle is that the O(n^2) version is much more likely to show up when profiling, unlike the allocation, which will likely just make everything else slightly slower due to fragmentation.
I can think of one that I worked with briefly, but not at my current company.
Ten years ago, today's interviewing style wasn't standard and we saw them more. They seem pretty rare today, so perhaps we should be focusing on other problems in the interview process rather than fighting the last war?
Some percentage of the population are incompetent, so they end up in every business in every field, everywhere on the planet. In other words, it's reality. Why does it still surprise people?
> “We expect every employee to be ready on day one.” What a scary proposition! Even McDonalds doesn’t expect its burger flippers to be ready from day one.
There's no other explanation for the buzzword bingo.
Rather than working against the interviewer I work with them. Rather than picking a pitiful problem I pick real ones. The only way to know who you want to work with is to work with them.
More succinctly: there should be no term "software engineer", unless people need certifications to become them, and face legal consequences when their engineering product breaks and kills people.
Okay, but real engineers can rely on the quality of their building materials, like mortar, brick, concrete and iron, with constant quality through decades or even hundreds of years.
Contrast that with the building blocks the 'software engineer' has to work with. Microsoft might upgrade some DLLs your product depends on, overnight. Will your product still work? Your application probably relies on components written by others. Are they infallible? Can you even inspect the components for defects?
I sometimes have the idea that software is built upon quicksand, and the building blocks can and probably will change before the project is completed.
I don't think real engineers can rely on the quality of their building materials. They do calculations to specify what quality of materials and assembly is sufficient to surpass a safety threshold, then contractors use the specs to procure materials and assemble them to meet the threshold.
Real problems occur when an engineer specifies a particular quality and a lower quality is used. Or the engineer designs for a set of conditions, and that set is exceeded.
I think technical interviews will be obsolete in the future; your technical competence will be evaluated via GitHub or other public projects, so interviews will only be about culture fit. I don't know if I will ever be in a position to hire anyone, or even if I'll ever work as a programmer, but this is how I would interview: send me a link to your GitHub. If I like what I see, I want to know if we would get along well, and if we do, you're hired.
You see this a lot in these kinds of discussions, but I can guarantee you will overlook a lot (most?) of excellent developers as many can't open source code (due to employment contract) or just don't really want to be coding 24/7 when their jobs fulfil and challenge them already and they have family/friends/sports/whatever to do outside work.
This seems to be a trend among younger developers from what I've seen, who assume every developer will have a GitHub account to show (much like young non-developers nowadays can't fathom people who don't use Facebook or don't have a smartphone ;)
You're effectively eliminating 99% of your potential candidates then, as most developers do not do public open source work, including many talented ones. I can't imagine any sane company doing this, yet there are some that do indeed.
And in Anglo-Saxon legal systems, code you write at work belongs to your employer, and so, possibly, does anything related to your job that you do outside of work.
I would be breaking my contract with my employer if, at an interview, I showed you, for example, the ML code I wrote to optimise our clients' AdWords campaigns.
There are too many egos, creative flakes and the like for this to ever be called a stable work environment/career. Especially in the agency world!
Just speaking from my experiences: I was given trials and then told I rock, then many months later was let go because we no longer clicked and/or my side projects hurt their egos.
For me this field and my side projects have been soul crushing!
I'd love it if developers/designers could be part of a union, one that provided a shield from all the bullcrap noted above, thus providing better job security and work/life balance.
There are two separate points here. One is the false-positive problem: people whose social skills and connections get them a series of good jobs despite a lack of merit. And even in companies like Google those people can make $500k and up if they become managers, PMs, or non-programming celebrity engineers whose code sucks but who are good at making executive types (who value confidence over ability, because they can judge the former) listen to them. The other problem raised is the false-negative rate that comes from finicky and stressful (for some; I always liked hard technical interviews) interview tactics, due to our internalized low status.
These problems may be related but they're superficially opposite and this should probably be two separate essays. One about our industry's severe false positive problem (especially in management and PM roles) and another about why our extreme but mostly cosmetic selectivity (doesn't know MongoDB? Worked at Oracle? Forty-seven? No hire!) is so stupidly ineffective.
"If people are wired for engineering logic and have programmed in some capacity in the past, they almost certainly can get up to speed in any other part of the field. Let them learn, or even better, help them learn."
I couldn't agree more with this and the article in general. So many people who have the capacity and potential to continue to be good engineers are overlooked because they don't know a specific framework or language or because they worked with unpopular tools in the past. It seems to be a trend to do absolutely zero training these days and expect the engineer to not only learn on his own, but to teach himself outside of the job. Some companies even expect extensive open source work from candidates while at the same time requiring incredibly long hours. The idea that candidates might be doing other, non-programming things outside of work, want time to do those, and might not be interested in pursuing their craft 24/7 is looked down upon. I can't think of a single profession that's so demanding.
The irony is that most jobs in the field are incredibly easy for someone who is even moderately skilled and hardly ever use the skills being tested for. When these things are eventually needed, all one needs to do is a bit of research to come up with a good solution, but somehow this is looked down upon. Most programming jobs do not require knowing how to build algorithms; in many of them, if you're writing a search algorithm (or any other kind) from scratch, you're simply doing it wrong and wasting time. Why would we expect all programmers to know how to write certain algorithms or solve certain unfamiliar problems off the top of their head when that isn't what they've focused on in years, if ever? You wouldn't expect a cardiologist to be an expert in neurology, yet many software companies are really only seeking the equivalent of neurology experts when they actually need a variety of expertise. Sure, a general background is good, and most good developers will have a broad understanding of many topics, but expecting a deep understanding of every topic is simply ridiculous. The companies with such expectations end up hiring the same type of expert while ignoring a ton of talent.
Let's not forget that in the end, this is a job. Despite all the rhetoric about loving the work and other such nonsense, 99% of programming positions are just another job that no one would do if they didn't pay well. It's not in the workers' best interests to be emotionally invested in their work, as this only benefits the employer to the detriment of the employee. Thus, most things will still be learned on the job, either with the support of, or more commonly in spite of, the employer. People work the job because they want to provide for themselves and their family. If the job is enjoyable, it's mutually beneficial, but that's not something that can generally be expected. Yet many employers expect employees to both love their job and love the profession enough to put in a ton of additional time outside of work learning and coding for free, because they are too cheap to provide proper training and their working conditions are generally too abysmal for them to retain workers long-term.
The solution here is to hire people with a good general background and invest in their education on the job. To do so, employers need to find people who can actually see the potential in others and who know how to interview, not people whose idea of a good interview is asking whiteboard questions and insisting the candidates talk them through the problem instead of letting them think it through and actually solve it. This requires employers to actually respect employees once they are hired so they can be retained long-term and their training is worth the investment. This would be an even bigger change than changing the interview process. Employers would have to stop asking employees to work for free (salaried overtime). They would have to stop being childish themselves and literally yell at or threaten employees. Employers would have to afford their employees basic human respect, something that is severely lacking in technology companies, especially in Silicon Valley. Those changes would indeed be radical, and employers who implement them quickly find out what the rest of us have known for years: there is no shortage of talent or good engineers.
> In all my years immersed in the tech industry, I have never once heard a firm talk about the idiots lurking in their own offices. They always seem to be elsewhere. For everyone.
When you point one finger, there are three fingers pointing back at you. Hey author, you are one of them!
The industry has many destructively untrue beliefs about hiring practices. That there exist at least some people who are not presently capable of doing productive engineering work is not one of these beliefs.