But have you also hired people who did not interview well and weren't great at the work? The goal of interviewing is to avoid false positives, not to avoid false negatives.
"The goal of interviewing is to avoid false positives, not to avoid false negatives" - why is that? I can rectify a false positive, but if I miss greatness I might fail. And if you hire people whom others overlook, you can also end up with people who are very loyal.
Rectifying a false positive is very expensive (cost of time they're not producing but drawing a salary, eating dev time for training, etc. + cost of firing), and firing people is a morale issue as well. Maybe this is from the perspective of a company that has no shortage of applications, but rejecting an applicant that would've been good costs basically nothing (dev time + travel if they got through phone screens).
I didn't realize there was dissent on this topic. Care to elaborate? I initially picked this up from http://www.joelonsoftware.com/articles/fog0000000073.html ("An important thing to remember about interviewing is this: it is much better to reject a good candidate than to accept a bad candidate.")
I think part of the problem is that the technical grilling isn't necessarily reducing false positives after a certain point. You may simply be selecting people who have front-loaded data structures in an "exam ready" way and rejecting candidates who have a fine background in this area but aren't exam ready. Do you really reduce false negatives by doing this? I almost have to wonder if you increase them, since you may simply be selecting for people who know how to prepare for an interview (and who aren't so busy with important work that they can afford to drop everything to front load the information).
Think of it like this - you have two candidates. The first is exam ready for data structures and algorithms. You want merge sort? He whips out mergesort. You want red-black trees? You got it.
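To make "exam ready" concrete, here is roughly the kind of answer candidate #1 produces on demand - a textbook top-down merge sort, written from memory without hesitation (sketched in Python for illustration):

```python
def merge_sort(xs):
    """Classic top-down merge sort: O(n log n) time, O(n) extra space."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # Merge the two sorted halves, taking the smaller head each time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])  # at most one of these
    merged.extend(right[j:])  # has anything left
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

Nothing about being able to reel this off proves the candidate can build software; it mostly proves they rehearsed it recently.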
A second candidate is clearly aware of what run time is, and what trade-offs happen in different sorting algorithms. He's clearly aware of what can happen when a binary tree gets out of balance, and is aware of the types of more balanced trees. However, his knowledge is not exam ready. He can clearly code, but he starts to fade on the edge cases and in the implementation details. He mentions the chapter in a well respected data structures and algorithms book he'd read should this issue come up.
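The thing candidate #2 understands - what happens when a binary tree gets out of balance - is easy to demonstrate even without exam-ready recall of red-black rebalancing rules. A plain BST fed sorted keys degenerates into a linked list, so lookups go from O(log n) to O(n) (a minimal sketch, again in Python):

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Insert into a plain BST with no rebalancing."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def depth(root):
    if root is None:
        return 0
    return 1 + max(depth(root.left), depth(root.right))

# Inserting keys in sorted order: every node hangs off the right child,
# so the "tree" is really a chain of length n.
root = None
for k in range(100):
    root = insert(root, k)
print(depth(root))  # 100 -- vs ~7 for a balanced tree of 100 nodes
```

Knowing *that* this happens, and that red-black or AVL trees exist to prevent it, is the engineering knowledge; reciting the rebalancing cases from memory is the exam-ready part.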
My question is, do you really protect yourself against false negatives with a no-hire on candidate #2? Candidate #1 may simply be someone who has figured out how to hack the programming interview. He might not be an especially good developer, not especially good at working with clients, with teams, with pushing through and getting it done. Or he might be very good at this. I'm just saying that I doubt that being "exam ready" is much of an indicator beyond the obvious coding ability and awareness of candidate #2.
However, this process clearly creates a huge number of false negatives - by bouncing #2, there's an excellent chance that you're passing on a superb candidate, and for what? A reduction in false positives that may be close to zero?
One thing - I think part of the disagreement on this board may come from the different definitions people have of intensely technical interviews. Some would consider #2 to have passed. My own experience is that interviewers often do expect an "exam ready" preparation on data structures and algorithms, in addition to a few other branches of computing, which is probably why I take a more cynical view of these interviews.
I don't see the relevance. The fact that you're asking someone to code a mergesort or a red-black tree tells me that you don't know how to interview, not that interviewing doesn't give you useful information. Candidate #2 sounds like they passed the interview - what did I say that made you think I'd reject them?
Yeah, if you're really bad at interviewing, it's not going to give you good information. I'm not disputing that. Interviewing, as it's done where I work (and everywhere I've interviewed, which admittedly isn't very many places), has been very effective at reducing false positives.
I really want to address this in more detail than I have time for at the moment. I am a dissenter on this topic, though, so I would like to ask you to elaborate on your view of the trade-offs.
> cost of time they're not producing but drawing a salary
How many false positives do you know of who didn't produce anything? What is the actual chance that someone is going to be a net drag if they don't work out? Of course it can happen, but I get the impression from the more paranoid that it is likely or expected. I don't see that.
> eating dev time for training, etc.
This, I feel, is highly dependent on company and codebase. At my previous employer, I worked on real-time radar signal processing code which was very complicated and math-heavy. Someone who did not know what they were doing in that code could have easily cratered the productivity of one or more software or systems engineers. Simpler codebases would not be as bad.
> + cost of firing
Depending on your local laws and corporate structure, I can understand this issue. It takes an act of Congress to get fired from a defense contractor, though they have enough small layoffs to prune the dead wood once in a while without a full-blown firing. I would like to know what the issues are for a smaller company, though, since I really don't know the process. I get mixed signals as well: the employers and managers here speak of the difficulty of actually getting rid of people, while most of the individual contributors I know are worried about losing their jobs at the drop of a hat.
> Maybe this is from the perspective of a company that has no shortage of applications, but rejecting an applicant that would've been good costs basically nothing (dev time + travel if they got through phone screens).
If your company is finding the people it needs more or less when it needs them, then I understand this view. However, if your company is also one of those that complains about how hard it is to find "qualified" people and how understaffed you are, then your false positives are costing your business far more than that, though the true cost is very difficult to quantify (as is the true cost of a false positive).
> How many false positives do you know of who didn't produce anything?
Well, none. We haven't had any false positives since I started (and this is my first full-time job).
Depending on how quickly you catch the mistake, I would stand by my statement or call it an exaggeration. If they're still in training when you catch it, they've probably produced nothing. If it takes longer than that, then they've probably produced something (although if they're contributing enough bugs, it is entirely possible for them to have negative net productivity). But even at best, they're producing less than their salary (why else are you firing them?), so you've lost something.
> This, I feel, is highly dependent on company and codebase.
No arguments here.
> I would like to know what the issues are for a smaller company
From a US perspective, most of the cost is lawyers. It is very easy to say something wrong and end up with a wrongful termination suit, so you end up being very careful, which is expensive in both money and time.
> It seems like the employers and managers here speak of the difficulty in actually getting rid of people, while most of the individual contributors I know are worried about losing their jobs at the drop of a hat.
My experience is that people are worried about being laid off, not fired. Maybe I'm out of the loop, in which case I don't really have an answer to this one.
> if your company is also one of those that complains about how hard it is to find "qualified" people
We're not. We're pretty happy with our current growth rate (recently it's a bit high, if anything).
Maybe this reverses if you're having trouble finding as many people as you want to hire. I would understand accepting more false positives if that's your situation.