I can talk in a pretty good amount of detail about how exactly our process worked, if anyone has any questions.
And, to head off a concern a reviewer gave me: from 1997-2005, I was a full-time software developer; I shipped shrink-wrap boxed software on Windows and Unix in the 1990s, then appliances deployed at tier 1 ISPs. I'm a "software person" more than a "security person".
Have you ever read The Checklist Manifesto[0]? I may be reading too much into this post, but the lessons you learned from this interview process have frighteningly close parallels to the lessons in the book. I doubt the book had any influence on your interview process, seeing as it was published after the interviews were formalized, but it seems like it might still hold new lessons.
For example, a good portion of doctors absolutely hated using checklists. Yet, when pressed, readily admitted that it prevents simple mistakes and that they would prefer to have them rather than not to. Another is that entries that address more human concerns, e.g. "Have everyone introduce themselves", have a place on good checklists.
Extremely true. This is about systematizing the right things about interviewing, and making them solid. Checklist Manifesto is all about that, a great simple way to create systems that work.
At a high level, this is all about Deming ( http://en.wikipedia.org/wiki/W._Edwards_Deming ) and TQM concepts -- if you want to achieve a high-quality output, measure the things that matter, and understand the variation present in the system. Once you have a stable system with good data achieved by good methods, you may then begin improving it. Attempting to improve a complex system without knowledge results in unpredictable changes -- we call that tampering. Simple but beautiful.
So, in essence, this is an extremely natural and correct application of quality management principles to the hiring process. Stellar.
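Deming's distinction between common-cause and special-cause variation is usually operationalized with control charts: establish limits from a stable baseline, then investigate only the points that fall outside them. A toy sketch of that idea (the data and the "weekly pass counts" framing are invented for illustration, not from the thread):

```python
import statistics

def control_limits(samples):
    """Shewhart-style control limits (mean +/- 3 sigma) from a baseline.

    A later observation outside these limits suggests special-cause
    variation worth investigating; reacting to points inside them is
    what Deming called tampering.
    """
    mean = statistics.mean(samples)
    sigma = statistics.pstdev(samples)
    return mean - 3 * sigma, mean + 3 * sigma

# Hypothetical baseline: weekly counts of candidates passing a work-sample test.
baseline = [12, 14, 11, 13, 15, 12, 14, 13]
lo, hi = control_limits(baseline)
print(lo <= 30 <= hi)  # False: a week with 30 passes is a special cause
```

The point is the discipline, not the arithmetic: you don't react to a 15-pass week when your limits say 9-17 is normal noise.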
How do you avoid wasting time on candidates that are a poor fit? When we advertise for an open position we get a ton of unqualified candidates.
In your post you say, "At my last firm, we had the “first-call” system. Every serious applicant got, on average, 30-45 minutes of director-level time on the phone before any screening began." You seem to continue on from that point. What happens before first-call?
To me, recruiting is two problems: outreach and qualification. You're asking about outreach. How we did outreach is a whole 'nother blog post, and one I'm bound to write soon.
What surprised me about recruiting was how much more important qualification is than outreach. Without good qualification, it almost doesn't matter how good your outreach is, because you're filtering for the same highly visible easily accessible candidates every other firm is looking for. The best-priced talent is buried. What you need is ground-penetrating radar. Clever outreach schemes helped us, but what really let us make our best hires was a qualification process that allowed us to confidently ignore resumes.
We didn't just maintain our team's level of quality by ignoring resumes. We transcended it. We never could have run a resume-based process that would hire people who could use lattice math to break crypto. To hire those people, we needed to select for aptitude, not experience.
I haven't said it anywhere yet, but in case the subtext isn't clear: the person who can implement an attack for which edit/compile/debug takes 4-6 hours also does a pretty great job on every other facet of the software security job. There are aptitudes that are very good proxies for other aptitudes!
Maybe I wasn't clear with my question. Say the company indicates it is hiring and gets 100 interested people. How do you filter them down to serious applicants without looking at resumes? I would argue that isn't so much outreach as it is qualification, as the people are already interested.
Maybe my premise of 100 candidates is incorrect based on how you did outreach?
At Matasano (slash NCC), we get a lot of candidates. We look at resumes so that we have something to break the ice with when we talk to the candidates. We do triage interns using resumes, because we get hundreds over the course of a few weeks, but never for full-time candidates.
How do we figure out which ones are serious? We talk to them, usually give them homework, and then give them work-sample tests. This is not particularly expensive for us, and gives us repeatable, relevant, apples-to-apples metrics on how qualified they are to do the work. This is, as Tom insists, surprisingly easy for us. (Not to say it's easy, but nothing about hiring is easy.)
tptacek mentioned in some other comments that Matasano runs ongoing "Capture-the-flag" contests/games (such as [1]) that basically self-selects interested and interesting candidates for them.
Serious is "testable", in the sense that you can ask them a specific question about the job to see if you get a response. We warm people up on the phone when they're putting special attention into our positions, and give everyone the opportunity to answer work samples (whether they seem serious or not). The point of warming them up is to balance out the effort — we're asking people to spend a few hours on this stuff, it's fair to give them time.
Do you really need to "filter down" your candidate list, though? Given the apparent importance of hiring processes, it seems to be like it may well be worth it to simply go through the process with all 100 of them.
One possibility is to make a very simple but non-standard request in the posting - for example, to submit the answer to an extremely straightforward coding question or some such in their cover letter. That would filter out a lot of the 'shotgun approach' applicants immediately.
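For illustration, such a posting-level filter can be as small as this (the snippet and its framing are my invention, not from the thread):

```python
# A posting might embed a snippet like this and ask applicants to state
# its output in their cover letter -- trivial for anyone who can read
# code, but it screens out 'shotgun approach' applicants who never read
# the posting closely.
def f(n):
    # sum of the even integers below n
    return sum(i for i in range(n) if i % 2 == 0)

print(f(10))  # an attentive applicant writes back "20"
```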
Very good post. Thank you for sharing it. I've been interviewing applicants quite a bit lately, so have been thinking about these issues.
Two questions for you.
What is your experience hiring fresh college grads? Is the process different? Since one of the goals of the process seems to be to stay consistent rather than arbitrarily adjust it for every candidate, do you think it makes sense to adjust it for someone who just graduated? They might have been learning computer-science-type stuff and data structures, done a few group projects, and certainly been practicing the typical interview puzzles: fizzbuzz and the big-O running times of all the data structures.
How do you account (or do you account at all) for culture fit? That is a fuzzy and slightly dangerous area. Is there a score in that regard? There are some people who are brilliant but do not work well in a team, or rather would not work well in one particular team. This is of course full of danger, as "culture fit" often becomes an opaque discriminator for racial, sexual, and other stereotypes. "He's just not a good culture fit" is such an overloaded euphemism that I feel kind of sleazy just saying it.
Everyone has exactly the same process, with the exception of interns. Interns have an abbreviated interview (shorthand: we switch from assessing aptitude to assessing enthusiasm), and any time one came back looking for a full time job, they got fast-tracked. I'm not saying that was the best policy (it did keep us from collecting some extra data), but by the end of an internship you had a really good idea that you wanted to work with someone.
The lede I buried in this post is that we hired more-or-less resume-blind. I had final call on every hire and went through some effort not to look at resumes. On our first-calls, I'd ask some questions that would give me background hints about candidates, but that was primarily to tune my spiel (I didn't want to explain blind SQL injection to someone who'd spent 3 years as a pentester already, for instance).
If I had it to do over again at Matasano, I'd have done it formally resume-blind. Only Dina would have had access to resumes, and interviewers would be forbidden to ask resume questions. The process would have worked better if resumes were firewalled out completely.
There was no culture fit score; in fact, if you gave subjective feedback ("passion", "confidence", "work ethic") your feedback was likely to be discounted. We were proud of the culture diversity we managed to scrounge out of our candidate pool; to have team members with kids who needed to keep strict hours, and team members who'd roll in at 11 and still be there at 8 working on a pet problem. We had drinkers and non drinkers, college students and people with 15+ years of dev experience.
You could get booted from the process for being overtly an asshole (I, for instance, might not get hired), but that happened very rarely and with crystalline clarity.
It's interesting that you make an effort to not read resumes. Do you explain that to people you interview during the first call? The reason I ask is that, if a hiring manager asked me things that are already written on my resume (e.g. what did you study in college?), I would definitely be annoyed, as my default expectation is that they should read the resume before the interview.
Don't get me wrong, I understand and agree with your reasoning. I'm just curious how transparent it is to the other side.
We weren't super transparent about it, but we tried really hard in a lot of other ways to take hiring seriously as a customer service problem, so maybe we came out ahead anyways.
The way we do it now, the initial call person reads your resume, but we categorically do not reject based on the initial call (or the subsequent tech phone interview(s)). So, we get the advantage of being able to talk to someone about their experience and helping them make a plan to get hired, but also the advantage that our process is actually resume-blind.
This is true, and at many companies the hiring process is so broken that the interviewer might have only an hour or less of heads-up that they will be interviewing someone. I have done that maybe twice, and since then I refuse to do interviews on candidates without more prep time.
On the reverse of that, I have sometimes avoided topics the candidate was familiar with because I didn't see relevant experience on their resume. That was a major miss, and I've tried to avoid using resumes to shape interview topics since. I see the resume now as more of a signal of what the person is interested in (since most of us carve huge swaths of our experience out and tailor them for the job we're applying to anyway).
When I was at EA, I would get an email saying I had to interview someone that afternoon, here's their resume, and here's the title they're interviewing for. It was ridiculous.
It's funny, because to change it, you have to admit that the way you were doing it before was not effective. I've come across many people who admit they are bad at being interviewed, but very few who admit to being bad at interviewing others. It's incongruous with the results, because these are the same people who have hired folks that we all agree don't exhibit the level and characteristics they were supposedly screening for.
Good job coming up with something different. I hope it's really working out as well as you say.
I was horrible at interviews. I was a sport interviewer. I had favorite tricky questions. "Design the fastest possible traceroute" was one for a long time. "Describe the most important feature of TCP NO WRONG, IT'S CONGESTION CONTROL NITWIT". I trust my own interviewing ability least of all, which is why I sink so much time into avoiding them.
Guaranteed delivery is extremely easy to provide. TFTP manages it. But think about a bunch of ethernet-connected browsers sharing a 1mb/s link; that's kind of a miracle.
Hmm, I once took it upon myself to do exactly that (mostly using non-blocking sockets, skipping hostname lookups, etc.), but what's so "tricky" about this question? Assuming the position involved network programming, I'd consider it a good way to test someone's networking knowledge. Is the "tricky" part knowing how routers often drop ICMP replies if they exceed some threshold rate?
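For what it's worth, the core trick in a "fast" traceroute is firing probes for every TTL concurrently instead of waiting hop by hop. A minimal sketch of the probe-socket setup, assuming UDP probes (collecting the ICMP Time Exceeded replies needs a raw socket and root privileges, so that part is omitted):

```python
import socket

def make_probe_sockets(max_hops=30):
    """Create one non-blocking UDP socket per hop, each with its own IP TTL.

    The classic traceroute sends a probe, waits for that hop's reply, then
    probes the next hop; the fast version sends all probes at once and
    matches the ICMP Time Exceeded replies to hops as they arrive.
    """
    socks = []
    for ttl in range(1, max_hops + 1):
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.setblocking(False)          # never stall on any single hop
        s.setsockopt(socket.IPPROTO_IP, socket.IP_TTL, ttl)
        socks.append(s)
    return socks
```

Rate-limited ICMP generation on routers (the threshold you mention) is indeed the practical ceiling: past a point, parallelism buys you nothing because the routers simply won't answer faster.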
> I was horrible at interviews. I was a sport interviewer...
That's nowhere near as bad as the interviewer who has the primary goal of demonstrating how smart he is (to the person he is interviewing), with the actual hiring process being secondary.
I am painfully aware of my limitations as an interviewer. I wish there was institutional support for interviewing as a skill distinct from more specific job requirements. I've worked at big companies, small companies, been a founder, and I've never been comfortable with engineering interviews. So, more power to tptacek in his crusade.
I'll admit that I'm bad at interviewing others. I'm not trying to humble-brag or anything. I'm just bad at it. Part of it is that I haven't had to go through an interview in over 10 years, so it's difficult to empathize with a candidate. I do my best, though. The other part is that I'm happy at what I do, but I'm not the best. I sometimes can't keep up with candidates when they're writing on a whiteboard and make a claim that such-and-such works. I squint and can make a judgement on that claim, but really I need to cover x, y, and z during our hour together and we're already 20 minutes into it and I don't want to get dragged into the weeds.
So, I'm not a fan of giving interviews. I wouldn't mind shadowing someone better than I, but my first inclination is that that would be a terrible idea since it would make candidates even more nervous.
This is a good post, Thomas. From this I think I could make an efficient test for the kind of person who'd make a good performance engineer. Developing a more general work-sample for developers is a harder problem, but still feels doable.
This may not be an issue for you yet, but what about plagiarism? And can you expand on your hints about how you changed your pipeline at the front of the process? You don't interview every resume that comes your way, right?
Every serious applicant, regardless of their resume, got a 30-45 minute phone call. You post hiring posts all over the place and you get a lot of unserious submissions, but an easy way to screen them out is to respond with a question, like, "Thanks for writing! We'd be happy to talk to you. Can we ask what has you interested in doing software security?" The answer doesn't matter, as long as there is an answer.
I did a lot of calls where I knew a couple minutes in that we were unlikely to hire the person, but (a) I got surprised by outcomes enough not to shirk on those calls, and (b) those calls are a very small price to pay for finding buried talent.
I also advocate for a work sample based hiring pipeline and the plagiarism question often comes up. My response is always:
1) People generally don't cheat. If you have a high percentage of cheaters, it's how you are filling your pipeline that you need to address, not the filter.
2) What most people think of as "plagiarism" is actually very common in the real work of software developers. Very frequently you see/mimic other people's work to solve problems. Instead of being freaked out about a skill that is central to the job, why not use it to evaluate the candidate? Did they "plagiarize" the right thing? Did they do it effectively? Did they do something backwards where a simple Google search would have found a thing to copy?
3) If you are big enough for plagiarism to be a real problem and have addressed points 1 & 2, it is relatively easy to detect mechanically (and there is a surprising amount of research in the field as CS professors invariably write both an automated grader and then an automated cheater detector).
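On point 3: production tools like MOSS use winnowed token fingerprints, but even a toy n-gram Jaccard score catches near-verbatim copying between submissions. A sketch of the idea (my own illustration, not from the comment):

```python
def fingerprint(code, n=5):
    """Set of character n-grams over whitespace-normalized source text."""
    text = " ".join(code.split())  # collapse whitespace differences
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard similarity of two submissions' n-gram sets, in [0.0, 1.0].

    Identical submissions score 1.0; unrelated ones score near 0.0.
    Flag pairs above some threshold for human review.
    """
    fa, fb = fingerprint(a, n), fingerprint(b, n)
    if not fa or not fb:
        return 0.0
    return len(fa & fb) / len(fa | fb)
```

Real detectors normalize identifiers and tokenize first, so renaming variables doesn't defeat them; this character-level version is only the smallest demonstration of the principle.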
I was wondering if you could share some info on the salaries. Did you offer every candidate the same salary or adjust it (based on resume/previous salary/age/experience/...)? You mentioned 100% retention - so I assume no candidate ever left for another position with a higher salary?
Being a specialised consulting firm, I imagine Matasano could offer highly diverse, very interesting work long-term (because consulting), and higher-than-average salaries (because specialised), so maybe it wasn't a problem for you to offer higher salaries than the competition, or to retain candidates at equivalent salaries.
Great article. The prelude to the interview with books given to candidates is especially interesting. I'd love to see that reading list, and would further like to suggest posting it on your careers page. I would imagine that anyone who goes to the trouble of working through your suggested materials would be providing a very strong signal of quality and investment.
You mention that every serious applicant gets a 30-minute "first call" - what percentage of applicants get this call? How do you decide which candidates get this call?
Also, what percentage of people who complete the work-sample task get an onsite interview?
I'm asking these questions because I think most software companies who filter out people based on resumes and phone screens do it mainly because there are too many applicants and they need to filter out many candidates at earlier stages. Do you think your hiring strategy resulted in more time spent on the process per successful candidate and per applicant?
As a candidate, I've soured on work-sample tests. Too many times, I've done the assignment, and not even gotten an interview (and I know I submitted a correct answer).
As someone who already has a job, committing a large number of hours for the chance of an interview seems like a waste of time.
If every employer demands a 5-10 hour work sample test before they even talk to you, that really isn't a scalable solution from the viewpoint of a candidate. I could easily put in 100 hours into works-sample tests without getting a single decent interview, so now I just refuse them.
Yeah, those are the ideal. A lot of us are already learning different things during our down-time. An enterprising company could tap that drive to learn, finding potentially good employees by 'tricking' them into learning something new from it.
Just a reminder: work-sample testing isn't the entirety of my recommendation, or even the majority. Employers need to do more than one thing to fix the hiring process.
I'm beginning to think that one way to "fix" hiring is to do stuff that nobody else is doing.
Everyone else is screening people based on resumes? Then, you can find good candidates by ignoring resumes, so you find good people with bad experience.
Nobody else hires people older than 40? Then you target people who are older than 40 but are still good workers.
Everyone wants a programmer with X experience but there's a shortage of people with X experience? Then hire good programmers without X experience, and give them a chance to learn X on the job.
Nobody else gives a work-sample test? Then giving a work-sample test is a benefit. You find candidates who are desperate and good enough to do the work-sample test. If everyone gives a work-sample test, then you get an advantage by NOT doing a work-sample test. (For example, a company with a work-sample test is now excluding me from their candidate pool, due to the large number of bad experiences I've had with them.)
One idea that seems workable is the paid work-sample test. But then you have to do some screening before giving people the test. (I.e., hire someone for a 10-20 hour mini-project before committing to hiring them full-time.)
With 100 hours of free time, I could get a personal project done and put it on the Internet somewhere. That seems like a much better use of my time than doing 10-20 work-sample tests for companies that aren't going to give me an interview even after I do it.
Just wanted to let you know that I really enjoyed reading your article and will probably crib shamelessly from it in the future☺
I will second the value of a work sample. It's so simple that it feels incredible that we didn't do them in the past. What my team _hasn't_ done is issue the exact same work-sample problem; you make a compelling argument for doing that, so I think we'll implement it.
Amazing text. Last month I screwed up at an interview just because I had no knowledge of what they were asking. After the interview I googled it for 2 minutes and was then able to quickly answer that question.
Let me first start by saying that I, too, dislike the prevailing interview process for software development jobs.
While I agreed with many of your points, I could not stop thinking about what a huge time burden implementing something like what you propose would be at scale. For a growing company with several work streams and projects hungry for talent, the interview approach you posit would never work.
Another thing that came to mind is the fact that educators and cognitive scientists have been working for decades on what constitutes a good and effective test. Here you seem to claim that somehow you and your folks have "cracked" the system and come up with a test process that's guaranteed to yield good, performant long-term workers?
Finally, I feel this approach, while intended to alleviate some of the most common pain points of the interview process, tries to reduce the candidate to a number. While a number would seem plain and objective, much like a test score, it neglects to "tell the story," and unless all you want are gifted and highly skilled human automatons, you need a way to gauge "soft skills," which one would argue are as important as core technical competency.
Actually, go look for 'tokenadult's comments on work-sample tests to learn that this approach has been known for something like half a century to outperform interviews.
This answer could possibly address the test question. It leaves the other two issues I raised unanswered. Not that you have to address them; just saying that the approach you guys implemented might work well only for a very small subset of companies that share traits in terms of size, structure, and industry sector.
How often is it that a candidate shows positive on work-sample tests but then shows negative after a face-to-face interview? Should you explain negative outcomes to the rest of the company or to the candidate?
I can think of a couple times when work-sample overrode face-to-face. I'm sure face-to-face overrode work-sample at least once (we had a scoring system, and the threshold for getting a face-to-face interview was not "perfect score"; you could be "on the bubble" and get an interview), but to my knowledge it never overrode a very good work-sample score.
Thank you for this article. I found it motivational in a way. Especially when many of us think there is not much more room to grow and we feel terrified of whiteboard interviews at the bigger cos.
Getting an internship at Matasano is HARD. Unlike normal hiring, we are limited in the number of spots we can offer, and we also have a huge flood of candidates at once. I hate that it's so hard, and that we can't provide our usual standard of support to interview applicants. There's just no way for us to scale with the seasonal demands for summer interns.
Please please please apply for full-time when you graduate! The process is a lot smoother.
Very interesting, thanks for sharing! It made me think about how we could improve our process.
Couple questions for you. How do you go about choosing which metrics go into a grade when designing the work sample test? Should the process of grading it be completely automated, so as to eliminate bias? What's the ideal length of time for a work sample test, in your experience?
We went through the phase when we gave candidates a problem and let them work on it remotely. It was in an embedded C shop that did a lot of kernel work. Basically, we'd give a short programming task (say, to write an intrusive AVL tree container in C) and 24 hours. Guess what? HALF of the candidates cheated. Meaning that when they got called in for an in-person interview, they stumbled to explain how "their" code worked. To say that I was shocked is an understatement. It's bloody obvious that such blatant cheating would surface within the first week after hiring, and yet they cheated :-| In the end we had to switch to in-person interviews only and implement a few other things to discourage cheating.
In other words - YMMV, what works for others may misfire miserably for you.
I'm not sure the conclusion here is that remote tests are bad. It's also possible that your screening methods didn't do a good enough job. (As a disclaimer, I'm a terrible interviewer, but...) I believe that by the time a candidate gets into a room with me, that my team and I should have a good idea about whether this person will work out or not. If I make them take time out of their day, and I take time out of my day, and then 10 minutes into it I realize they don't know what they're talking about, I see that as my screw-up. I shouldn't have let them get this far. You know what I mean?
Right now at my job, we don't do a good enough job of this. I get candidates that walk through the door that can't code more than "Hello World!" without screwing up. I'm not happy about that.
This post, I think, enumerates a large number of things that you can do to determine worthiness before they even walk through the door -- the remote test is only a small part of it. If all you're doing is a remote test, then maybe you need to do more, yes?
I honestly believe that the recruiting/hiring market is ripe to be flipped on its head for these reasons. I just can't think of any way to do it that can scale, though.
There are the hired.com models where independent experts thoroughly vet the candidate, but that involves a lot of manual interaction.
My big problem with Hired.com and similar sites is that there's no incentive for anyone to be honest. I have no incentive to say "not interested" to anyone, and they have no incentive to back up the salary they post. It's the equivalent of the "always swipe right" strategy on Tinder -- it's in your best interest to say "maybe" or "interested" to everything, because otherwise you don't know if you're missing out on opportunities.
What I really want as a hiring manager is for someone to show up and say "Here's Candidate A. We think they're strong because X, Y, Z, and here's the proof." Someone whose mission is to honestly find me the best candidate, who isn't just trying to place someone as fast as possible so they get their commission. Most recruiters I know are interested in placing a good candidate as quickly as possible, but there are a lot of problems with this. "Good" isn't "best", and often times the recruiters don't know the difference between Java and Javascript, and I wind up with junk resumes in my inbox.
So how do we fix this? I have no idea. But there's gotta be a way. I don't think that way is going to be scalable with a simple web application. I think you have to have a secret weapon that nobody else has, and maybe that's just a group of in-industry professionals that vet candidates on your behalf, or maybe it's some incredible machine learning algorithm, or maybe it's something else entirely.
As a job seeker, I want to get the best job I can, and as a boss, I want the best employees I can get. And I don't think those things are mutually exclusive at all.
Yeah, but I think the resale factor is exaggerated. If they send you a "good" candidate, you might go back to them, but they're just finding the first "good" candidate they can rather than finding the "best".
I worked at a company that had a remote, 3-day test. It was pretty fun, and I still sometimes try out things on it to see how they work (it was a two-parter, with part 1 being to write an evaluation function, and part 2 to solve an NP-hard search problem).
I strongly feel that looking at the code candidates wrote for that problem gave me lots of really good information.
As far as I know, nobody cheated. (All submitted code was unique, and everybody could talk cogently about what they did. Some people did great things, some people terrible, and lots in the middle.)
In conclusion, I would strongly recommend a "take home" test. Maybe give a bit longer than 24 hours. And make sure it's the same problem all the time, so you have a point of comparison.
Yeah, that's the only real problem with remote work sample tests.
Thankfully, with application security, you can mitigate most kinds of cheating by presenting a custom-written black box and requiring the candidate to attack it in some way. Unless previous candidates are leaking or sharing info, there's not much you can do to cheat on that, especially if you have some reasonable time limit (< 5 hours).
Actually, I think the opposite is true; in appsec, the final work product is a list of vulns, so if you want to cheat, it only takes 4 minutes talking to anyone else who has ever succeeded in the process before.
For a dev interview, taking something your team has built and scooping out some of its functionality seems like a test that is significantly harder to cheat on.
For my part: we ran this process for over two years and never discovered any plagiarism. Meanwhile, we drastically increased the size of our team and had total retention; from the time I took over recruiting to the time I left the firm, we lost a total of zero of those hires. All of them worked out.
The problem I'd see with using your own app and taking out some functionality is that for the first few people you throw at it, you have no data points to verify against. I.e., they do it 'wrong' (it's incomplete, it's not what you meant, whatever). What does that mean? They don't have the domain knowledge your team has, and that may be important.
I'd think a better approach would be to have a few of your devs come up with a small app, and a set of instructions, that you pass to others of your devs. They can then create a reasonable baseline you can start to measure against. You may need to be a little forgiving initially, if it's not a completely made-up example, but it probably is more valid than something that was built for production, with months of domain knowledge behind it.
I've long thought that to come up with a good work-product test a company should continually save off their own bugs to put into a broken test app, after which candidates can fix the bugs and the company can see how candidates solve the problems compared to the people they've already hired. The actual production code is the baseline.
You're right that it's definitely pretty easy to cheat if you have insider knowledge of some sort, but the odds of a candidate leaking the answers are probably pretty low unless you're a huge company and don't change up your tests.
eh, I think it's actually pretty easy to tell if a person wrote a particular piece of code last night. So cheating is pretty easy to ferret out. Because of this, I think the take-home test conveys a lot of useful information.
I have had my share of bad interview questions too. The worst part is when the interviewer starts working on their laptop, like they are so bored of this question they can't think about it any longer. Yet it is now a part of my life for 45 minutes.
At the end of every trivia question I'm tempted to ask: so is this typical of what I will be doing here? Will I be implementing rand(7) given rand(5) once a week?
I've also had a bad experience with code challenges. These are given before the first phone interview. I'm happy to do a challenge taking < 1 hour. I once did a challenge that took me 3 hours. Got rejected because they didn't like my visual design (I was interviewing for a backend position).
I've realized that code challenges require no investment on the company's part, so I don't want to do one that requires significant investment on my part.
> "At the end of every trivia question I'm tempted to ask: so is this typical of what I will be doing here? Will I be implementing rand(7) given rand(5) once a week?"
If someone is asking you to implement rand(7) given rand(5) as part of their way to qualify your ability to do a job, you probably no longer want the job.
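(For what it's worth, the canonical answer to that particular trivia question is rejection sampling. A rough Python sketch, with rand5 simulated via the standard library since the interviewer would normally supply it:)

```python
import random

def rand5():
    # Stand-in for the primitive the interviewer gives you:
    # a uniform integer in 1..5.
    return random.randint(1, 5)

def rand7():
    # Two rand5() calls index a uniform value in 1..25. Keep only
    # 1..21 (three full groups of 7) and fold it onto 1..7; values
    # 22..25 are rejected so the result stays uniform.
    while True:
        x = 5 * (rand5() - 1) + rand5()  # uniform over 1..25
        if x <= 21:
            return (x - 1) % 7 + 1
```

Which is a neat trick exactly once, and then never again relevant to the job.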
> I've realized that code challenges require no investment on the company's part, so I don't want to do one that requires significant investment on my part.
Same.
I'm happy to do one after we've talked about the role and done the "warm up" phase.
Just last week I got given a 150m challenge on an online platform with very little information about the company, role or code challenge.
It's all automated, which lowers costs for the company, but I think they're losing out on candidates with this strategy.
A little fine tuning and they can have low costs but with a better success rate.
I told them it was rude and I expected to know something about what I'm diving into before setting myself up for 150m.
I've been interviewing for the past month or so and it's been terrible from many perspectives; I'm fine with the interviews content-wise (silly questions excluded), but the entire process is such a time-sink.
I have a current position (full-time job) and really just don't have time for (with each company) 30 minutes of emailing, an hour phone call, a short coffee meeting, an offline coding challenge and then a 3 to 6 hour on-site interview. Additionally the weather in Boston has been dreadful, and getting to places just takes longer than normal.
Again, I already have a full-time position, and finding ways to schedule these into my work day just doesn't really work well. The interviews I've had have gone well, but I'm counting down the days until I find the perfect company that makes me an offer that I like, because the entire process is just eating into my life.
As a vote of confidence, this article moved me from "I would never ever interview at Matasano" to "I don't want a full-time job at a security firm, but if I did, I would probably start looking here."
Woah! Don't be afraid to plug your new firm in the blogpost :-) I doubt I'm the only one to mistakenly ascribe your new hiring policy to your old company.
No, you're not mistaken! That's how Matasano has hired for something like 2.5 years. I left the company to do hiring-related things, after feeling like I'd more or less disposed of the challenge completely at Matasano/NCC.
There's a pretty big plug coming next week, though not from me. What we're doing meshes well with the worldview in my post, but it isn't a productization of it. This is just what I really think about hiring.
Can't find a lick of information about Starfighter online - is that on purpose right now then? (Pre-"plug"/launch?) Definitely curious, give us the 2-sentence version if you can?
My biggest issue with the work sample tests is the companies that waste your time with them. I've had 3 cases over the years where I was asked to work on something, I did it quickly, with quality and exceeded the specs and I was still turned down at that point (I confirmed that the code was good in at least 2 of the cases). As a senior technical person at this point in my career I simply have very little stomach for wasting time. Have at it young'ins.
I completely agree with this, and would love to say that as the hiring side you should always give back the feedback to the candidates. I have actively pushed for that every time I've implemented work sample workflows. But it gets trumped by legal/HR. I think they are being overly cautious, but if you talk to HR folks much you will find they all have a crazy candidate lawsuit story.
But the all-day interviews have the same problem. I interviewed at some startup in SF...
I mean, this was me looking for my "job with low standards" so I was going through recruiters, right? That's how you get a job with low standards. You put your resume on dice and include a phone number. Answer the phone. Wade through all the bullshit jobs that have nothing to do with your skill-set. Find jobs you could reasonably do, and say yes to the interviews.
In my experience, this means that I'm competing, mostly, with people that don't have other options... people that can't get jobs through people they know, because they just aren't that good. For a lot of complex reasons, well, that's exactly what I needed at the time.
I negotiate with the recruiter and $150K is within their range. I mean, I'm... not 100%, right? lowered expectations. So I should be looking for less. But I've never had any problem asking for too much; at worst, they laugh and offer me something lower, right?
Anyhow, so I show up at this place and they grill me for the usual 8 hours. I mean, it's not too unpleasant; there's beer and food and everything you expect, and they seem to be really into me. The guy who is going to be my boss gives me his email and tells me to send him an email so he can get things started over the weekend.
The business guy is, well, he's a business guy, but everyone else seems like people I could work with; people I'd get along with, and the skill level was right in there; even in my reduced capacity, I wouldn't be holding the team back.
At the end of the interview, they ask what I'm looking for, money-wise. Now, like I said, I negotiated this with the recruiter, so I said $150K, but I noted I could probably be talked down some, and I mentioned a lower sum, one a friend who has worked for me for most of his career recently got.
The mood immediately changed. We said our goodbyes, and I left. I did email the boss, but total radio silence.
I never heard from these people again. I eventually got the recruiter to tell me they didn't want me, but...
Personally, I'd be less pissed if I'd spent 20 minutes on a coding problem for them at home and then didn't hear back about it.
I can relate to this comment and the others that replied to my original. I guess my overall view is that far too much time is wasted and it is mostly preventable. As a senior developer I have a detailed LinkedIn profile, a few recommendations, some side businesses and startups I have been involved in, a portfolio website showing off websites and apps I've built, a tech focused blog, a GitHub and a Stack Overflow. Right there if you just spent 15 minutes reviewing my profile you should have a pretty good picture of who I am and what I am capable of. If you need to see a 3 hour coding exam and have 8 hours of interviews on top of that I think you are just not decisive. I think a lot of it is people rationalizing with themselves that having gone through a "thorough" interview process of tons of candidates and lots of time that they somehow found the best candidate when it's a lot of noise IMO.
My favorite method, really, on both ends of the fence? Just hire the person and start them working, for pay, with the understanding that this is still an evaluation type deal. If it doesn't work out? If I'm the one being evaluated, at least I got paid for the time. If I'm doing the evaluating, seeing someone actually work is a way better indicator of, well, how well they work than anything else.
The problem with this method is that it doesn't work if you are trying to hire someone that values stability away from a stable job. You are limited to people that are unemployed or that don't value stability.
This can work well if the person currently functions as a freelance consultant or is unemployed but has worked as a consultant in the past. Also, ideally you have a specific small project for them to work on. Finally, you have to be able to pay them the market rate. If all of that fits (rare) then it's a great choice.
> This can work well if the person currently functions as a freelance consultant or is unemployed but has worked as a consultant in the past.
Sure, that's a lot like what I was getting at with the whole valuing stability thing. It can also work if the person being hired is having a hard time finding a job.
>Also, ideally you have a specific small project for them to work on.
Actually giving them a project, you know, meeting the legal definition for "contractors", is super rare, at least in the middle where I work. I do see it happen, but it's usually those "10x" programmers who get those jobs. Or it's jobs working for poor companies that pay very little.
In the middle, you find people like me, usually working through corporations (or in my case, whole chains of corporations) that end up paying the contractor on a W2... the idea being that the IRS is less likely to reclassify the worker if someone is paying payroll taxes somewhere - because the work does not meet the legal definition of contractor.
As an aside, I think most of this shell game isn't about legal liability; it's about making it clear to the other employees who is a temp and who isn't. From a legal perspective, compared to what you pay a body shop, firing someone just isn't that expensive. My belief is that the major cost in firing someone who is performing okay is in the morale of your remaining employees. Firing your contractors first allows you the flexibility to fire someone without making your full-time folk feel like they might be next.
>Finally, you have to be able to pay them the market rate. If all of that fits (rare) then it's a great choice.
See, it's not rare, and it happens at all pay grades. Different pay grades have different rituals and ways of arranging things, though. Right now I'm working as a "contractor" - I'm going through some shady body shop, but I actually sit at a desk in the office and eat the free food with the employees. They pay okay, and the expectation they set was that I'll be a contractor for a year, I mean, assuming things work out, and then if we like one another, they'll hire me full time at the end of the year.
That kind of arrangement is really common in this industry when the economy is good. They need people now, and I can be a person now, and if they like me, well, that's an easy hire. If they don't like me, or if market conditions change before the year is out, they can let me go without damaging employee morale as much as firing a full-timer. (similar arrangements, with less emphasis on you becoming a full-timer at the end are common when the economy is bad.)
I like it 'cause I'm going through a company that I mostly own, and it gets a bunch of money corp to corp which I can use to pay my people to see if my business can get off the ground, (it pays me, too, on a W2, and buys me mediocre health insurance) and if at the end of the year, it does take off, well, I quit, and hey I was a contractor, right? that was the deal. and if my business doesn't take off, I've got a foot in the door at a decent big company where I might want to actually work for a few years, you know, put something on my resume besides abject failure.
But, as far as I can tell, this arrangement is super common. Actually going through a corporation you own part of is less common, but not unheard of.
Same for me. I did a 6-7 hour homework project. I have no idea how I did, the company didn't get back to me for a month, and then it was just a call from a recruiter. Never again.
Are you familiar with the psychological research about bias in decisionmaking?[0]
It sounds like you have come to the same conclusion psychologists are coming to. Under stress, the human mind uses mental shortcuts, or heuristics, while making important decisions, and acts irrationally.
I don't often come across a post that takes on these biases so systematically, and translates them into one of the most important decisions a startup has to make, i.e. the hiring decision, and you even invent useful ways to mitigate their effects. Kudos for that!
Every company should be doing this. Overcoming bias should be a top priority at all HR departments.
Great post. I hate interviewing. I think a lot about it since I'm the type of person who performs horribly in an interview call when some person I've never met is asking me trivia questions on speaker phone...
One thing I wonder. Why are people being interviewed for technical positions in a google doc or whatever fake coding environment? Servers are cheap.
If you have to go the way of the technical phone interview, doesn't it make more sense to have both (interviewer and interviewee) parties SSH into a server and work together on some type of problem for an hour? The interviewer can even watch the candidate program or work in screen. This type of interview isn't perfect, but closed book "Programming Jeopardy" style interviews are just worse in every way.
Can confirm. They used Google Docs with me as well. I didn't find it to be all that bad, actually. Their back-to-back whiteboard interviews later on were much more nerve-wracking.
No, they passed, but invited me to apply again in a year and a half. That was a couple years ago, and their recruiters have started emailing again, so I guess I didn't do too terrible.
When they told me they passed they also told me that "some people get passed the first time and then study for the next 18 months and do amazing the next time!" and I just don't have the time for that. I already worked myself half to death in my twenties for two startups, I'm not going to study for a test as a second job for a year just to get a job at Google, although I still think it'd be fun and clearly challenging.
I recommend studying your ass off for algorithms (specifically figuring out what algorithm would be best for different real world problems, especially in regards to various Google products, including those you've never really used before) and practicing writing code on a whiteboard before you go. I studied for a solid week and a half beforehand, and it wasn't enough. I was prepared, just not prepared for what they asked me.
A common issue I have with these interviews is that computer science is such a vast field that it's impossible to have everything in your head ready to shoot off in any interview, but interviewers somehow think that if you missed a question or two you're somehow not qualified to work for them, despite your having years of direct experience at companies beforehand and being able to Google and refresh literally any topic you might encounter at your job in less than a minute.
Google seemed better than most at that, though. I still found the experience to be valuable and interesting.
> "(specifically figuring out what algorithm would be best for different real world problems, especially in regards to various Google products, including those you've never really used before) and practicing writing code on a whiteboard before you go."
"... I already worked myself half to death in my twenties for two startups, I'm not going to study for a test as a second job for a year just to get a job at Google, although I still think it'd be fun and clearly challenging. ..."
What's "good for google" right?
@cableshaft how would you rate @tptacek s observations on hiring with what you observed at google?
Google still does interviews. They try to do them better --- eg separating the hire / no hire decision maker from the face to face interviewer. But I'd like to see them try out more work samples.
I'm going through interviewer training at Google at the moment. (There's a few courses they like you to take before letting you loose on candidates.)
This is the second time I'm interviewing with Google and I felt the difference. They asked me questions more related to my work and fewer from the "Cracking the Coding Interview" book. Hopefully it will be the same on-site. I'm confident that I'm good at what I'm doing, and if Google really wants me to work on things that I'm interested in, I would be a good employee. I have zero no-commit days in my GitHub streak. I have more open source projects than many 500+ employee companies.
With all of that "Cracking the coding interview" book in my hand anyway! :D
thx for replying @eru, I don't envy the task evaluating candidates. By work samples do you mean real code you've created to solve problems?
“bring the same level of rigor to people-decisions
that we do to engineering decisions.” [0]
The weakness at google is understanding people. So I can understand the allure of HRA (HR Analytics), but I feel google is missing something by not intuitively understanding people and behaviour. [1]
"the 'rigor' that people bring to engineering decisions is oversold."
Possibly, I don't see google going down, so the engineering is sound. That's missing the real issue though.
Google, the people who lead and work there, fundamentally do not grok people or psychology. This is problematic as they attempt to diversify their workforce. I'm sure this is a known-known at google, hence the training that @eru is receiving. This is why the @tptacek article is such a good read. A tech company attempting to understand more about hiring humans who understand machines.
people were discussing stackexchange testing people with google docs as well. Writing code in it sucks; it's surprising how annoying the wrong indent / return behavior is. I also was surprised how much I clearly rely on autocomplete, though java's api is so fat who can remember all of it.
I once had a phone interview wherein I was required to write some code and then read it to the interviewer (character by character!) over the phone. I asked him whether it would make more sense for us to use etherpad (this was maybe five years ago), but he said no.
Yes, I recently had two productive phone screens, as a candidate.
The first one taught me a bit about the company and team, and gave me a good impression of the interviewer (the hiring manager), which then led to me doing a work sample. (Without the phone screen I don't know if I would have sunk several hours into the work sample.) I ended up getting an offer but not taking it.
The second one also gave me a good impression of the company, team, and interviewer (a senior engineer) and led to me doing a full day of interviews. Which led to me accepting an offer. That one used Collabedit in addition to the phone.
I know they're not perfect, but it's much cheaper to talk on the phone for an hour than to fly someone in for face-to-face interviews. Given way more resumes than resources for in-person interviews, I'd rather narrow the field using phone screens than narrow the field based solely on what's in the resumes.
I saw the part in your article about work samples, and I agree that they're great, but they are significant work for the candidate. So I think it makes sense to have some kind of phone call first, to see if there's enough mutual interest to make it worth the candidate's time.
There's a semantic gap getting in the way between us. Phone calls with candidates: very good. See "Warm up candidates" in the post we're commenting on for my thoughts about that.
A phone screen is a call whose purpose is to select out candidates. It's a call whose outcome can be "stop talking to this developer". They're very common, and I think uniformly evil.
Understood. I know that at least my second example was a phone screen: if that interviewer didn't like me, the interviewing process would have ended.
I don't know how you avoid that. If you have 1000 candidates per open position, you probably can't afford to do in-person interviews with all of them, or even all the ones with decent-looking resumes. What's the alternative to phone screens? An automated remote work sample system, maybe?
The whole point of work-sample tests is to minimize time wasted on subjective interviews, and to collect objective facts instead. You definitely don't want to do thousands of in-person interviews! Interviews are terrible.
For full time devs? Rarely. For interns? Absolutely. We had many applicants whose programming experience was closer to web design, and weren't conversant in basic data structures. I don't think it was inherently evil to tell them to come back next year, after they've taken a course in algorithms.
I use Google doc because I don't care if the syntax is perfect. Those IDEs do and when someone is nervous, they make simple mistakes and can't see them right away. It spirals from there.
I ask people to do the best they can to get the syntax right, and I will ask for corrections if it's way off.
I just got an interview with HackerRank and it sucked. It has some absurd code completion and a browser is not a good place to code in.
I tried to copy/paste in/out from vim and it was a disastrous process. I could barely use my previously implemented BST implementation (which I was told to use).
I screwed up the test - of course it was not HackerRank's fault - and retried in vim after time ended. I could code it in a fraction of the time it took me in that IDE. Time limits never work for the candidate.
I do that on all remote interviews. Code a simple-ish thing that doesn't require that you've read about it before and show me how you'd do it. Complete with the build & run command.
I think google's reasoning for using google docs is that they would rather use whiteboards but can't for phone calls. Google docs are similar to whiteboards in that they provide no highlighting or code completion.
As somebody who agrees in theory with tptacek's posts and now this blog post, what I find very funny is how I perceive typical technical hiring, with the interview gauntlet, to still be way better than what some other industries must do. How the hell does anybody hire a teacher?
Teacher here. Good schools will ask you to teach a sample lesson on a particular subject and prep your own materials. Principals and other teachers will observe and give feedback. It's what I did to work at my current school, and we're one of the top performing schools in our state.
How would principals/teachers know if and how much the sample lesson transfers knowledge to students? It seems like this would filter out disasters, but can it do more than that?
That's when you look at other factors like past performance, including (but certainly not limited to) performance on his / her pupils' standardized tests. Like any other hiring process, it's not always perfect, but you also learn along the way which characteristics make for great teachers (responding to feedback, ability to engage with students, timeliness, etc.).
Except that it's the kind of work-sample test that you can't do in the evening, and is hard to complete if you already have a full-time job (except if only other teachers are the audience).
I also agree with the sentiment of this post and have always been surprised how poorly prepared most engineers are to fairly conduct interviews. I'm reasonably convinced it's contributing seriously to inequality issues and the "tech labor shortage", and I'm absolutely convinced it's the reason I don't job hop more than I do (hrm, maybe that's part of why the system still sucks?)
That said, my wife happens to be a teacher. The process from what I can tell in private schools is pretty similar. Usually a recruiter ("placement agency") sends a bunch of resumes (CVs) to a hiring manager (department lead or school head). They pick some based on how they feel that day and conduct phone screens. Successful applicants then come in to do the equivalent of a whiteboard interview -- they are evaluated on their ability to teach a classroom of students they've never met a lesson plan they've probably not seen more than a day or two in advance (although at least they get that much warning!)
edit On reflection I hope that teachers are at least a little more capable of objectively evaluating candidates than your average sysadmin/software engineer (how often are you asked to assess someone's ability outside a handful of interviews?), but believe me, the sample lesson causes as much stress on the candidate as a whiteboard interview. It also stresses the teacher whose class is being "donated" to candidates and, depending on the length of the process, can derail what's usually a carefully planned out quarter/semester/etc.
A recent interview for a teaching position involved being given the name of a class to teach, then generating sample classroom materials (a simple worksheet, PowerPoint, etc.) and a sample lesson plan. The interview then consists of teaching that sample lesson plan to several interviewers.
A big difference is that the benefits of quality programmers manifest over long projects.
With many other industries, you can say "Do this" in a shorter, observable time frame and have it more aligned with what they'll be facing on the job. Contrast this to dumb interview coding challenges, for instance.
I myself am a pretty mediocre developer. But fortunately for me, I realized that tech interviews are complete and total BS.
So what I did was spend a whole bunch of time studying whiteboard algo questions, and I became really good at interviewing. I've gotten a couple awesome jobs that I had no business getting (and usually get fired after about 6 months, because I'm a shitty dev, but whatever, on to the next job).
Hi, I see you're a new user here and I hope you come back to check on this and it's not just a throwaway account.
You saw the flaw in the system and studied so that you could exploit it to improve your own situation. Some people may be mad about that, but I'm honestly not. I'm sad that you put effort into practicing interviewing and sound like you're not putting effort into practicing your actual job. You say "I'm a shitty dev", which is a fixed mindset type of thing to say. You identify yourself as your current capabilities, as opposed to a growth mindset where you might say "I'm working on improving my dev skills".
It sounds like touchy-feely pop-psych, but think about it: you saw that getting better at interviews would land you better jobs, so you worked on interview skills, and you got better jobs. Now if you took your current job (and your next one) as a chance to work on your dev skills, maybe you won't get fired after 6 months. Maybe you'll grow in to absolutely deserving the awesome job, and then grow beyond that.
Please think about it. Paraphrasing an Adventure Time character, being a shitty dev is the first step to being a kind of good dev. And we need more good devs and fewer firings.
I really hope companies read and learn from this. It is absolutely maddening how bad hiring currently is. As someone who has been on both sides of the hiring process it's just frustrating. If you are being interviewed you are basically put in front of a firing squad. If you are interviewing you are the one pulling the trigger. Maddening.
I'm taking this and showing it to the ceo of the company I work for currently.
Now, tptacek:
What can I do to make this better? How can I contribute?
The same kind of process works for pure dev jobs. So, you’re a Rails shop? Take a Rails application you’ve actually built and deployed. Carve out some functional areas from the application: remove the search feature, or the customer order updater. Bundle up the app with all its assets, so that a single “vagrant up” gives a candidate an app that runs. Have them add back the feature you removed.
How would you compensate for candidates that are great developers but are new to Rails, or whatever specific language/framework your company uses?
1) If you think that the language/framework is intimately important to the hiring process (I don't, but I'm not judging) then you don't compensate for that. It's an important data point.
2) If you don't think it's important, you don't worry about matching "idiomatic" practices. You don't judge on obvious library mismatches, etc. If you don't feel confident doing that, don't have the challenge interact with any language-specific code. Let them do it in any language you feel confident judging.
It sounds like the candidate gets to do this at their own pace, from home. If so, a good dev shouldn't have trouble learning enough Ruby/Rails to figure this out.
Are you asking how do you judge if someone performs under time pressure? If so, my first answer is don't. If that is a serious part of your real world work, you are doing it wrong.
The less flippant answer is, you give them a deadline but it is both A) flexible in the face of their "real life" (ie if they tell you they are going on vacation next week deal with it) and B) it deals with realistic time horizons (ie a week, not 4 hours).
Sorry my post was not complete enough - what I meant was that if you have two equally good candidates, but one happens to have experience in Rails, that person is going to perform the coding test more quickly than the other candidate i.e. get a more complete / correct solution in a given amount of time. So I hope in this situation there are no time limits?
I never want to have time limits in these tests. Generally I prefer to tell the candidate "This isn't timed and we don't care how long it takes you. It tends to take x hours (where x better be less than 4) and most candidates get back to us in a week or so. If you can't do that, it's completely fine; let us know when you anticipate having it done. This is just so we can judge where you are in the pipeline."
Do you mean time pressure in the way that conveys that the company has poor project management? If so, you should probably find some way to explain that up front (set expectations) and non-Rails people might self-select out.
I found it a funny coincidence that the current top two links (this and a comment on the Holman post[1]) take opposing viewpoints on programmers as soldiers: "Infantry are what most people are." and some are "Commandos" vs Tom's "Engineering teams are not infantry squads. They aren’t selected for their ability to perform under unnatural stress."
This is great! As a former commando turned programmer I've long been wondering how best to express the commando mindset in programmers' terms. The 4 elements of the commando spirit are:
Courage
Determination
Unselfishness
Cheerfulness in the face of adversity
These sure come in handy in some programming environments :-)
I have a friend who is a good programmer, and likes learning, and is easy to work with, but he is terrible at technical interviews. He stresses about them to no end. This sounds like a much better process.
I know no end of people in the exact same bucket. I used to think I wasn't in the bucket at all, as I've been through many difficult and time-consuming interviews but always got an offer, until I interviewed at Microsoft and Google (though I still maintain my failure at those two companies was not entirely my fault, but I digress).
After going through the march of those two big companies, I found the process incredibly annoying and not useful at all; nothing of substance was really ever discussed. It was almost always academic, or some sort of trick question where I get to hear the interviewer talk for half an hour about how they came up with a superior technique.
I've interviewed a lot of people. Interviewing software developers is such a pain in the ass.
Reading this, I am filled with both hope and dread.
The hope is that as more companies figure this out, there will be more and more companies with interesting, challenging and rewarding work, as opposed to companies that are only a paycheck, and a soul-sucking one at that.
The dread is that I would never make the grade at the former kind of company.
Every time I sit down to solve a problem with code, I feel dread. Every time I even think about strapping on my skates, I feel so much dread that my stomach hurts.
I think that dread cannot be decoupled from hope. And that this is especially true for folks with higher developmental potentials.
All of the good developers I meet already have jobs and they don't want to leave them. So I'm experimenting with "taking what I can get", i.e. subcontracting out a few hours a week. The hypothesis is that X great developers for Y hours a week will provide higher quality results than Y mediocre developers for X hours a week.
It seems to be working out so far. I don't have a lot of data to go on, but I haven't had to completely throw out code yet.
Yeah, sure, I can buzz your fizz with my eyes closed in brainfuck. But that's just measuring language facility.
The real tests are more topical, and more complex: is there a way to make this code run faster?
There are two types of good candidates that will get that question:
1) the kind who have run into the problem before and know the answer stone cold, and tell you instantly
2) the kind that have no experience in that particular area (say, search algorithms) but are Google-proficient, such that if you give them a day or two they will talk your ear off about them.
In a work situation, the two are really equivalent. After a day, 2) is indistinguishable from 1).
What about kind 3, the ones that can derive the solution from scratch, in their heads? I thought that was the point of most of those questions. Like "given two linked lists, what's the fastest way to determine if they share any nodes (not values, but actual nodes)"? MS gave me that one in a phone screen and I'd never written a linked list before. But if you actually know what a linked list is, this should be simple for you to derive relatively quickly (not Googling for a day).
If a candidate needs a day to learn up about something trivial they should be able to derive in minutes, then that kind of thing might compound and end up producing a huge difference in actual productivity ("all other things being equal").
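For what it's worth, the linked-list question above has a short derivable answer: once two singly linked lists merge at a shared node, they can never diverge again, so they share a node if and only if they share the same final node. A sketch of that idea (the minimal `Node` class here is my own, not from the original question):

```python
class Node:
    """Minimal singly linked list node."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def share_a_node(a, b):
    """True iff lists starting at a and b share any actual node
    (identity, not value). Walk each list to its tail; after a merge
    point the lists are identical, so it suffices to compare tails."""
    if a is None or b is None:
        return False
    while a.next is not None:
        a = a.next
    while b.next is not None:
        b = b.next
    return a is b
```

Runtime is O(len(a) + len(b)) with O(1) extra space, which is the kind of answer the interviewer is presumably fishing for.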
You end up having no way to evaluate candidates. You have no way to distinguish the following -
1. Those who had seen the question before, decided not to tell you they had, and faked figuring it out.
2. Those who had never seen the question before, but were able to figure it out.
You can, of course, distinguish them from the following, but you can't distinguish the following from each other.
3. Those who have never seen the problem before, could figure it out, but do poorly in the pressures of an interview.
4. Those who have never seen the problem before, and could never figure it out.
And it is -possible- you get the honest candidate -
5. They've seen the problem before, they -tell- you they've seen the problem before, and go on to solve it easily.
#1 looks the best, and you've learned nothing about them (and they have nothing in their favor other than having seen the problem before). #2 looks okay, but pales in comparison to #1, despite having actually demonstrated ability. #3 looks poor, but you have actually learned nothing about them, and they may in fact be absolutely amazing, except not good with interview pressures. #4 looks poor, and is indistinguishable from #3 (unless they do so badly that it's clear they have no idea what they're doing, rather than just being off in the weeds somewhere). #5 you now know is honest, and that they've boned up for your trivia challenge, but nothing else.
You haven't measured anything you set out to measure. You wanted coding ability and/or ability to reason out a problem; all you got was whether off the top of their head they were able to solve this particular problem. If that was what the job entailed, parroting back answers from Cracking the Coding Interview and the like, then you'd have a good test, but that's probably not what you need in a developer.
> But if you actually know what a linked list is, this should be simple for you to derive relatively quickly (not Googling for a day).
Actually I think this is the worst kind of question to ask, because it measures: can the candidate code under pressure? Not "we need to get this done by launch/before client meeting" pressure, but right now pressure.
If a company wants literal coding ninjas who can reason about computation while in the middle of lightsaber battles, sure. But in the much more likely event that they're hiring people to sit for days on end and work on a large project, then this is an unnecessary obstacle.
OK, so make it a conversation. Have them draw on paper, or talk it out.
I'm probably on the wrong side of reality, since plenty of people make lots of usable stuff while using vastly suboptimal approaches. OTOH, if someone doesn't know very basic stuff, it seems unlikely they're gonna Google it and become wise. We're not talking about anything advanced; these are basic algorithms and data structures. Some people just aren't going to be able to handle the concept of memory layout, so might as well figure that out sooner rather than later, right?
This is an even worse idea than just asking for white-board coding, but it seems to come along every time. Not only are you putting the candidate under pressure, where thinking is difficult, now you expect them to walk you through their thought process and narrate what they're thinking, disrupting the thought process. I personally need to think things on my own without having the pressure of having to tell the interviewer what I'm thinking. Not only does it alter my thought pattern and remove a lot of the ability to think, but the extra pressure from wondering whether I'm saying the right things or taking too long to think before stopping myself so I can please the interviewer pretty much removes my ability to solve any sort of complex problem I haven't solved before. Not to mention the anger under the surface at such a daft and counter-productive approach to problem solving. This might work on TV and might be useful when collaboratively brainstorming, but in an interview, the only thing it's testing is the interviewee's ability to deal with social pressure.
> Not only are you putting the candidate under pressure, where thinking is difficult, now you expect them to walk you through their thought process and narrate what they're thinking, disrupting the thought process.
I expect a capable developer to be able to explain to themselves why they are choosing (or avoiding) an approach. If they cannot do that, would I trust them to be able to explain their methods or WIP problems to other developers?
When I do interviews, I care less about the solution the candidate comes up with, and a lot more how they ended up with the particular solution they used. The process of iteration and dead-end elimination is fascinating. This means that the best questions have N+1 different solutions. Some will be less optimal, but that's life. I also find discussing the effects of different solutions fruitful, on the less common occasions when the candidate ends up tossing one approach and settling on something altogether different.
And by the way, I really dislike the idea that you're supposed to do an interview in less than 1h. For anything remotely realistic, that doesn't even let you scratch the surface.
I don't know if the majority of people are good at interviews, but that is completely orthogonal to whether the majority of people who can code can code in interviews.
Thank you, tptacek, for writing this post. You have identified several important problems with interviewing and laid out the fundamentals for a reasonable alternative to the standard software interview.
It makes me sad that I will probably never see a single one of your suggestions implemented by any potential employer that might want to hire me, specifically.
This is because there is one more major problem that you can't fix from where you sit. The people creating the interview protocols do not solve their problems like software professionals. They don't collect and analyze data rationally. They don't control the variables. It never occurs to them that there are a dozen flaws in their system that they just haven't discovered yet. They don't read HN.
They don't realize that software developers can write the kind of software that runs on people, too.
The only reason I ever designed a hiring pipeline is because I was asked to be one of the people in the interview pool. Others in the pool and I decided we were doing a bad job and started iterating on the problem (much like we would for software). Management was thrilled with both the initiative and the results.
That was years ago and working on hiring pipelines has become something I just do. I've never had an employer push back or make that hard. Quite the opposite, they are usually happy to have help.
Try fixing the problem in your current job and see. You might be surprised.
I have never been hired by any employer that uses a hiring process that would reject me as an applicant. There's a bias embedded in there, I think--one that may also occur in other people.
I also encounter the problems endemic in the hiring process far more often at other companies than I do in my own. The reason why I start interviewing elsewhere is often because my current company has stopped hiring (or started firing). I have occasionally tried giving those other companies feedback, but the response has always been, without a single exception, "We know what we're doing; don't tell me how to do my job."
I have neither lever nor fulcrum for this problem. My frustration is not without reasonable cause.
> It makes me sad that I will probably never see a single one of your suggestions implemented by any potential employer that might want to hire me
I was implying that you could see this at your current/next employer by implementing it there. If you fix it there before the time comes to move on you personally may not reap the benefit but someone else might and the lessons learned by you and your colleagues get spread further out. If nothing else, you learn how to maneuver hiring pipelines quite well by designing them.
> The people creating the interview protocols do not solve their problems like software professionals.
As soon as you are asked to become part of the hiring pipeline whether it be resume sorting, phone screening or interviewing, you become one of the people creating the interview protocols. You can apply whatever tools in your arsenal to fixing it. I have found using standard software development methodologies to be very compelling and useful in this context.
> The reason why I start interviewing elsewhere is often because my current company has stopped hiring (or started firing).
This is a problem that is more worrisome than involving yourself or not in designing hiring pipelines. It implies a reactive approach to your career, and that is likely to lead to suboptimal results, and can be actively detrimental to your job prospects in bad job markets. I would look to the root cause of that behavior and see if it lends any insights.
Work-sample questions are not free from their own evils. They are expensive to devise. At big companies, a significant portion of candidates dump their interview questions onto the Internet, so that's a big challenge.
Second, work-sample questions are also expensive for candidates to answer, so you can probably ask just one question as part of the whole interview, and that's about it. Any conclusion is then drawn from a sample of 1 as opposed to a sample of 5.
Third, work-sample questions require engineering skills that a candidate may or may not have developed well at that point in time. At many companies, the emphasis is on how to solve problem X from an algorithms perspective, and engineering is just something you pick up along the way if you haven't already. The key is the ability to figure out the computer science part, rather than the language and tooling part.
Overall, I think many companies blindly try to follow the model of Google, Facebook, etc., and fall on their faces. If you expect your developers to write CRUD apps or do mostly plumbing work, there is no point in asking questions like "find the common ancestor in a binary tree." At other companies, developers are expected to solve computational problems, and engineering/plumbing/CRUD is a small part of the solution pie. There it makes a LOT of sense to ask candidates to solve problems that require deep computer science, and to ask as many of them as possible, so that your sample space is large and the outcome carries more confidence.
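For concreteness, the binary-tree question mentioned above is usually some variant of lowest common ancestor. A standard recursive sketch (the minimal `Tree` class is mine, not from any particular company's question bank):

```python
class Tree:
    """Minimal binary tree node."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def lowest_common_ancestor(root, p, q):
    """Return the deepest node whose subtree contains both p and q
    (node objects, both assumed present in the tree under root)."""
    if root is None or root is p or root is q:
        return root
    left = lowest_common_ancestor(root.left, p, q)
    right = lowest_common_ancestor(root.right, p, q)
    if left is not None and right is not None:
        return root  # p and q sit in different subtrees of root
    return left if left is not None else right
```

Whether asking this predicts anything about CRUD-app work is, of course, the whole debate in this thread.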
We had three work-sample exercises at Matasano. All told, they took candidates mid-single-digit hours to finish (many geeked out, or golfed on them; we did our best to keep this from happening --- you can read about this on our hiring page).
This sounds onerous, but it is less onerous than the normal dev hiring process, which involves an onsite interview that eats the whole day. We did on-site interviews, but they took just 2.5 hours to complete; candidates would be out at lunch. Essentially, we were shifting some of the time candidates would spend sweating in a conference room to their couch instead.
There are bad work-sample tests. "Bad" comes in a variety of flavors. I think bad work sample tests will out-predict the median interview. But there's a lot of room to optimize and improve this process, sure.
My concern here is that it seems to signal a lack of respect for the candidate if the company expects them to complete a task with no matching employee time. Interviews are even; both sides clearly have skin in the game; a work-sample test could be given to any number of people at basically no cost.
This is somewhat alleviated by the knowledge that you do a phone screen with each person prior to that stage, but it still leaves me with something of a bad feeling.
We actually pre-pay on that. Candidates get an initial call with a very senior person to start. That call includes coaching on how to get through our interview process, and concludes with us sending free educational material (books, etc.). This coaching and material is not specific to getting a job at Matasano, you can just as easily use it to get a job at one of our competitors. We invest hundreds of dollars in each candidate before we ask them to do our challenges.
It totally depends on the work sample they ask you to do. If no thought went into it then you are correct. If, however, a company devises an interesting project to work on it show they also put some energy into this.
Most importantly though I like programming, but I don't like speaking up in front of a group of strangers, so 6 hours of programming is much more enjoyable to me then even just 1 hour of interviewing.
Interviews aren't even. How many hiring discussions on various parts of the internet have talked about putting tens of hours into studying for the "undergrad CS pop quiz" interview in vogue at so many companies? How many of those same discussions have talked about how hiring is a "distraction" from the "real jobs" of the engineers involved at the company?
The process is already badly asymmetric in addition to being artificial. At least the work sample ideal addresses one of those.
The software developer job interview doesn’t work. Companies should stop relying on them. The savviest teams will outcompete their peers by devising alternative hiring schemes.
I remember talking with a manager after an interview, listening to him go on about how his company was empirical and data-driven. Later, he mentioned that one interview activity was valuable because it resembled something on the job; I asked him what his empirical data was, and he had to admit he'd been caught out.
I started drooling over the part about writing exploit code that doesn't involve doing anything illegal or taking data that doesn't belong to me. Pesky morals :)
I posted this before but someone decided to downvote, whatever, I think this is my version of hiring!
1) start with basic textbook questions like what is authenticity and authentication or XSS
2) catch what the interviewee said and build questions (e.g. I said something about private key so interviewer asked me about pro and cons of asymmetric and symmetric encryption). Oh yeah - know your shit because they are going to catch you! It's okay to say "I don't know." Being straightforward earns respect. My interviewers didn't penalize me much (well I just graduated from college...).
3) the next couple of interviews again start with an introduction, then a deep dive into what the team does and what the team is building at a high level, then proceed to ask about my interests. Here I would talk about my ideal projects, show them at a high level how I would go about implementing my idea, and the challenges I'd face (and also why I'd have to build it myself: are there any existing solutions, and are they inadequate?). Take caution with your words; know the things you say aloud.
Somewhere in those 4-6 interviews, add a programming session if you haven't done so (for me, I skipped that and went straight to onsite because of an internal referral).
I didn't get an offer probably because I didn't quite know what I really want to build. My idea was too generic and probably too "child play." It was a really intense and yet fun interview. This interview process allows interviewer and interviewee to see if they are a match or not quickly and pleasantly. I always look back at this interview and believe that the rejection is just and great for me and for the team.
Terrific post, thanks! I am somewhat familiar with this hiring process thanks to tptacek's HN contributions (which incidentally led me to recommending a friend to Matasano). I would like to have a better feel for the context of the hiring problems involved here. I am wondering if the company's particular niche makes the hiring process unusually difficult. Or perhaps the company is in a growth phase with a corresponding shortage of employees? With such low turnover rates, how many additional employees are needed? Is it possible that the company is in a position to profitably hire every qualified applicant it receives? If so, the company is essentially losing money on every hire it doesn't make. In addition to a radically improved hiring process, what about offering remote work and/or extremely high starting salaries? Well-advertised high salaries and location flexibility might allow the company to out-compete many alternatives (e.g., Google, FB, Bootstrapped startups).
When I was at Matasano, there was always enough inbound work (sometimes more than enough) to keep the consultants utilized. So yes, a company with a full pipeline will lose money on every qualified hire it doesn't make. Perhaps this is not the case for other industries, but for infosec consulting, there seems to be more work than there are qualified workers. Matasano ran under its own steam and did not take funding. Offering extremely high starting salaries would have killed it in its early years. It was more cost effective to create a process for finding people who could actually do the work rather than rely on the 4-page CV candidates. The expensive ones.
The net result was a win-win: you get bargain candidates and they get 4-page CVs.
My policy (I'm co-in-charge of recruiting at Matasano/NCC, and a lot of folks report to me) is this:
1) Hire based on current ability, not potential. Hiring based on potential is a minefield that our whole process is designed to avoid. ("Enthusiasm," "cultural fit," talking a good game without being able to walk the walk, etc.)
2) Be aggressively fair in giving out promotions and raises. Keeping someone's salary low because they're bad at asking for raises is not a good long-term strategy, especially in consulting.
> Keeping someone's salary low because they're bad at asking for raises is not a good long-term strategy, especially in consulting.
Yes, in consulting the distance between money made and each individual employee's efforts seems shorter than in a big tech company. So keeping salaries in line with impact of contributions should be easier.
I do not know. I suppose even if I did, it's one of those things nobody is supposed to discuss. I overheard talk here and there about raises, but since it didn't pertain to me, I ignored it.
Funny, last week I wrote some notes about doing just this :). It would be hard to build a general purpose platform, but I think if it focused on just replacing the initial technical phone screen, it could be both reliable and useful.
There's a common set of skills that most any software engineer should have. I think those skills can probably be checked via a web tool, and in higher fidelity than you can get over the phone.
As a candidate applying to N jobs, I'd much rather take a web screen once instead of doing N technical phone interviews. And as an interviewer, I'd much rather not have to ask FizzBuzz again.
I envision something akin to the HTTP pipeline in a web app that is composed of several "microservices".
One stage of the pipeline is left intentionally out. The candidate then is required to build an actual service. They can use any language they want. Even any cloud host they choose. All that's provided is the entry points, exit points, wire protocol, data encoding, and the spec to be implemented. After passing that, advanced topics like logging, telemetry and performance provide grounds for further discussion to assess development "philosophy".
But the real genius would be selling this work-sample platform-as-a-service to enterprise customers. Who could be prompted into building a catalog of microservices based upon their APIs and real data. And out of that hopefully some innovative, hackathon-esque mini-products could arise ;)
I definitely never thought I would become an "HR guy". But I really like this idea. Will include a contact email in my profile if anyone is interested in discussing it further off HN as well...
I've done a few online tests recently, and I was thinking of how to do a better one.
1) The main idea was to have the results of unit tests visible live. I've done a few tests where I found out I didn't get through even though the test rig said everything worked: it compiles, it gets the examples right, but there's some hidden test that failed. You'll never know why, even though it's probably nothing surprising.
So just make it explicit. A bunch of lights for each test, a description of what the test is, and there you go.
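A minimal sketch of that "lights for each test" idea (the function and case names here are mine, not from any real testing platform): run each described case against the candidate's function and report every result, instead of hiding failures behind an opaque verdict.

```python
def run_visible_tests(solution, cases):
    """Run solution against each (description, args, expected) case
    and print PASS/FAIL per case, so a failing hidden test is never
    a mystery to the candidate. Returns the list of booleans."""
    results = []
    for description, args, expected in cases:
        try:
            passed = solution(*args) == expected
        except Exception:
            passed = False  # a crash counts as a failure, but is still shown
        print(("PASS" if passed else "FAIL") + "  " + description)
        results.append(passed)
    return results
```

For example, `run_visible_tests(lambda a, b: a + b, [("adds small ints", (1, 2), 3)])` would light up a single PASS.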
2) That solves little problems like "given some set of numbers, how do I find some ridiculous property of those numbers?" type questions.
What you really need is something that tells you how people deal with complexity. I have a large problem, how do I solve it? There's always more than one right answer.
Here you might be better off doing some kind of subjective voting. People who are looking for work might find it comforting that other people in their situation are judging them. Or gratifying to be able to judge other people's skills. Perhaps there's some incentive structure that cleverly aligns this.
Overall a really good article. I feel however that if I'm interviewing for a job, I've already lost. It probably means that I haven't been 1) working my network and nurturing those important professional connections, 2) contributing in meaningful ways to Open Source projects (in order to generate interest) or 3) impressing people with real-world solutions. I've never enjoyed a job that I had to seek out or interview for, your mileage may vary however.
Relevant: a startup called Interview Club[1] presented at the Launch Festival recently. Another startup called PreHire[2] won the hackathon (although their product is more for hiring customer service reps).
> Because here is the thing about interviews: they are incredibly hostile experiences for candidates.
Minor nitpick from a grad student here: interviews are nowhere near as hostile as academic peer reviews for papers. In an interview, the interviewers aren't actively trying to find any holes in your logic (no matter how small). To me, technical interviews are a walk in the park compared to conference submissions.
Er, I sure as hell am looking for flaws in your logic / code. I feel like I'm pretty nice about it. I don't use it to put people down. It allows me to ask follow up and leading questions to see if they too might spot it after a bit more thought.
I think of hiring someone as a process akin to testing experiments in my startup. There are so many variables when people try to work together. Hiring processes that mimic that strategy should be the best bet IMO. This article is a little low on practical advice although I realize it's focused on the issue.
There are a lot of 'hiring is broken' posts and this is a decent one, but I don't think I'm alone in feeling that none of them convincingly identify the problem, let alone solve it.[1] So I'll do that here.
Hiring isn't broken because people use the wrong interview questions. It's broken because firms are engaging in a zero-sum battle for talent. But if what you're doing is worth a damn, it should be worth a damn when ordinary people do it. If what distinguishes your company is that you've managed to hire smarter people, maybe your company isn't so great.[2][3]
The proper hiring criterion is pretty simple: do the people who will be working with the candidate like her? In Blink, Gladwell presented good evidence that formal interview process isn't any more objective than this. And when you think about it, what would "objective" even mean?
There's also an issue with the labor market -- in particular, the difficulty of firing. You're much better off hiring liberally and firing candidates that don't work out. If all firms did this, the labor market would be more efficient and firing wouldn't be so bad.
Apple is a good example. Their profit per employee is second only to Netflix among large tech companies, and this includes a significant fraction of retail personnel. Apple's profit per engineer is probably higher than Netflix's, and that's remarkable considering they're over twenty times bigger.
Does Apple achieve this with some optimized interview format? Higher salaries? Onsite massage? No, they do it with better products (and a command and control governance model...).
It means that the software industry isn't nearly as innovative as it thinks it is. This is actually so obvious it's the subject of popular jokes ('like airbnb for pet food' etc). It's also obvious when you consider the ratio of effort spent building tools to products. All the latest frameworks but... Where are the apps? Where are the apps that are even half as feature-rich as a typical Windows application from the '90s? Google Plus still can't keep which things I've +1'd straight.
[3] Or worse, that you've managed to deprive your competition of smart people. Google has done this -- hired people without having work for them to do, on the reasoning that something might come up and at least they won't be working for the competition.
I think hiring is broken for a different reason: workers, whether in software or any other industry, are viewed as less-than-equal ("second-class citizens"). Except in rare cases the entire hiring process in software especially, from start to finish, just about everywhere I've ever interviewed, seems to me designed to find reasons to reject candidates, and as a side-effect is (perhaps unintentionally) also designed to demean and diminish human beings.
I agree. But why are they so choosy, and why do they complain about a lack of qualified candidates when their rejection rates are so high? I think it's the zero-sum game I mentioned. Firms are trying to outhire each other, like talent is some magic wand that can replace compelling products. There's a lot of snake oil in the industry, like SEO (now called "data science"), meant to substitute for things like vision, leadership, and customer service.
firms are trying to outhire each other, like talent is some magic wand that can replace compelling products.
Where do you think the compelling products come from?
There are plenty of companies that take the view you seem to be advocating: that there are employees who "create" and other employees who just turn the wheels and implement those ideas. Your average Fortune 500 company is just like that. The engineers working there pretty consistently complain that the software dev team isn't respected and isn't seen as producing things of value or providing a competitive advantage.
Those companies believe that the value is in the product/idea, and that the execution is easy and ought to be cheap. Which is fine, as long as you never need to do anything that requires significant technical skill. Because such environments force out anyone who thinks they have more to offer than simply churning out cookie-cutter, mediocre code.
There are ways to build successful teams that don't require that everyone is top 10%, but the notion that you just need compelling product ideas and a bunch of average developers to implement them is pretty commonplace, and rarely produces exceptional products.
I don't know if it's exactly a zero-sum game, but the rate at which talent is created doesn't seem to be amazingly high given that there are over 7 billion people on Earth. Certainly the talent pool seems to be a small fraction of the population.
Top universities, which for the most part feed the tech industry, face a similar problem at an earlier stage in the pipeline. It's been noted that less prestigious universities tend to draw their faculty from a very narrow selection of elite universities.
The education and labor markets are linked, and inefficient in ways which perhaps reflect the same underlying cause: scarcity of people with apparent ability, and self-interest wanting to profit from oligopolistic control of this incalculably valuable limited resource.
I think part of the problem is that hiring bars are unrealistically high. Pulling numbers out of my ass for illustration, it's like companies that control 10% of the software engineering jobs all only want to hire the top 5% of software engineers. They can't all get their way, so the top 5%[1] bounces between these companies whilst the companies whine about "talent shortages".
[1] Should be noted that it isn't the entire group, since some will choose to work for organizations outside this group because the work is more interesting or fulfilling.
I don't know that there is a "right" level for the bar. There might be a game theoretic way of looking at the problem which explains how we wind up with the levels that we do.
In relative terms, the bar for, say, the NBA is much higher than for top universities and tech companies. They set the bar high enough so that, for all practical purposes, minor league teams cannot come close to matching their combined level of play. Also, if you don't compete regularly against NBA caliber players, with NBA quality coaches and trainers, it would be hard to reach the same level of skill even with comparable native ability. I don't think the league is that concerned with missing out on marginal players, as long as on balance it gets those correct. The disaster would be to miss out on the Michael Jordans or Lebrons, Bird and Magic. So they scout and pay through the nose even for the slim chance of that kind of greatness, with the tolerable outcome being a good role player.
At some level, we're all looking for greatness. Nobody wants to see a league entirely made up of role players doing somewhat above average things.
The analogy breaks down in that it's a lot easier to accurately measure (though perhaps not predict) athletic ability. I do think that investors in tech tend to heavily favor obviously stellar performers over "good but not great" ones.
That is because people do not nurture, train, and help their talent enough; they want candidates to turn up already matching some mental model they have of the role. There are huge numbers of people once you stop trying to find exactly the thing in your head.
>just about everywhere I've ever interviewed, seems to me designed to find reasons to reject candidates
I'm sure that's intentional. Joel Spolsky wrote an influential guide to interviewing in 2000 that lays out this philosophy; the latest version (updated in 2006) is here:
The gist of a large part of the article is that interviewers should have a preference to saying "no hire", because bad hires are toxic and hard to get rid of.
But then he writes this:
----
Of course, it’s important to seek out good candidates. But once you’re actually interviewing someone, pretend that you’ve got 900 more people lined up outside the door. Don’t lower your standards no matter how hard it seems to find those great candidates.
----
Which is terrible advice if your standards are, in fact, unreasonable.
>Where are the apps that are even half as feature-rich as a typical Windows application from the '90s?
I hear you. One reason for this is scalability. You can build richer web apps if you aren't shooting for scale.
I worked on two startups with conventional LAMP stacks and heavy user sessions before joining Yahoo. At Yahoo there was really no session object[1], due to scalability. A few cookies could be set to customize behavior, but adding new cookies was seriously frowned upon. This made me appreciate sessions as a means of adding nice little features.
In principle you can solve that by storing the session in a scalable key-value store, providing the latency is low enough.
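A minimal sketch of that idea: a session store that talks only to a key-value interface, so the backing store can be swapped for a scalable one. The class and key names here are illustrative (not from the thread), and a plain dict stands in for the real store; in practice you'd point this at something like Redis and set an expiry on each key.

```python
import json
import uuid


class KVSessionStore:
    """Session storage over a generic key-value interface.

    A dict stands in for the store here; a production deployment
    would inject a Redis/Memcached-style client and use key TTLs
    instead of the ttl_seconds field, which is only a placeholder.
    """

    def __init__(self, kv=None, ttl_seconds=3600):
        self.kv = kv if kv is not None else {}
        self.ttl_seconds = ttl_seconds

    def create(self, data):
        """Start a new session; returns the session id for the cookie."""
        sid = uuid.uuid4().hex
        self.kv["session:" + sid] = json.dumps(data)
        return sid

    def load(self, sid):
        """Fetch session data, or None if the session doesn't exist."""
        raw = self.kv.get("session:" + sid)
        return json.loads(raw) if raw is not None else None

    def save(self, sid, data):
        """Overwrite the session data for an existing session id."""
        self.kv["session:" + sid] = json.dumps(data)


store = KVSessionStore()
sid = store.create({"user": "alice", "theme": "dark"})
print(store.load(sid)["user"])  # prints alice
```

The point is that only the opaque session id travels in the cookie; all session state lives behind the key-value interface, which is exactly the shape that scales out.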
[1]Generalization only valid for the properties I was exposed to at Yahoo.
Do you think you've learned lessons that could be applied by individual contributors in large orgs who are called on to interview? If so, it'd be great if you added that to your blog backlog. :)
Great, if you want to hire people who "can code", but have no social skills. I am personally done working in environments where people think they "can code" but have not enough of a social skill to at least empathise with your point of view on naming conventions.
Don't get me wrong, I can code; I'll write you anything you want, AND IT WILL WORK! I'll still know there may be a better way, but this is the way that works, and I still think I don't have enough years left to argue with social incompetents who think that fighting over tabs versus spaces is a good idea and a pocket protector makes you sexy.
More often than not, when talking about naming/indentation conventions, consistency matters more than who's right. The entire code base should look exactly the same (no exceptions).
heh we just made a day labor site for devs, in part cuz we are trying to circumvent bogus contracting and hiring practices. very beta. http://daylabor.modulha.us
I think this method is good in theory but fails in practice. Recently, I applied to three different companies that asked me to do a code sample that took from about two to eight hours each. I generally don't work for free, but I had some spare time and decided why not? I did all three and turned them in. Out of three, one company liked my sample but in the interview that followed, decided that what I was asking was too much (it's what I'm currently making actually). One company never even looked at the code and after two weeks, emailed back saying they had forgotten and decided to hire someone else. The third company was pretty responsive and I'll be going in for an in-person. A seven hour in-person interview that will be more coding and such (at least on a laptop so it's better than most places). So one out of three. I think this is a huge problem with this approach and basically, most companies are not responsible enough to follow this approach. They drop the ball constantly and wonder why they can't find good engineers when they're literally everywhere. Put into the hands of a respectable company, I would definitely prefer this in the future rather than the typical white board bullshit and trivia slinging.
What you've just described sounds like the exact opposite of a rigorous, objective, work sample based hiring approach. Crappy companies have crappy hiring processes. It sucks to deal with that.
When we started doing work sample hiring, we put a process in place to handle many applicants, make sure we were responsive, and work people through the funnel as efficiently as we could. I don't think it's possible to take advantage of it any other way.
We got complimented several times on how reasonable the process was (by people we turned down!). This both made me feel good, and caused me existential "the world is broken" sadness. It's not a high bar to be decent to people.
No, it's not a high bar to be decent to people, yet many companies (and people) still aren't. The problem then is that prospective employees have no way of knowing that a company is decent versus another company that is not. Will I do another work sample for another company? I'm thinking likely not, but I haven't ruled out the possibility. Note though, that the company must offer something of great value for me to even consider it. In my case, that's remote work. I would never consider it for a non-remote company. Period.
Still, a company could have a great, responsive process in place. But my experience, and that of others, is that it's not worth my time and effort to do work for free. Let's be clear: this is what is being asked of the candidates. It therefore starts to undermine any company using such a process even though I don't think it's an inherently bad process.
Just ask "what is your process and criteria for hiring?" It's a fair question, a company should have a good answer. A thoughtful process is a good sign.
I've recently had a very similar experience with work sample/coding tests and I completely agree. It strikes me that most people setting these tasks are pretty clueless about how difficult the task is and haven't tested them out on non-candidates.
How they're implemented has varied wildly between companies, but most of them were used as an initial screener rather than a major deciding factor, despite being one of the most time consuming parts of the process for the candidate.
Like you, I was able to go through with most of them, but I felt like they were a massive waste of my free time, and companies were often very slow to follow up on it. I think a lot of the time it just isn't worth jumping through these hoops for a company I don't know anything about.
My best test I took was around an hour and involved building a really basic CRUD app in Django as part of a larger face to face interview, followed by a short discussion. It worked well because they set up an environment for me, the data model was fixed, and there was a list of short requirements I could complete in order, so I was able to just dive into the work, and spend time on the kind of things that were relevant.
Another one basically asked me to take a large dataset, build a web application around it, and deploy it, which would probably have taken me a week to complete but "should" have taken around 3 hours. They tried to present it as a really interesting problem I would enjoy working on, and then asked me to keep all my work confidential.
Most of the tests I've done are still closer to general problem solving/comp sci tests, which is still pretty far removed from the actual work, but at least they're not a huge time sink when implemented properly.
I realize that I "should" like code samples, on the going ideology and considering the real problem of non-Fizzbuzzing programmers, but I don't find them useful and only do them when I'm unemployed (which hasn't been an issue for a few years). My problem with them, based on experience, is that they only provide downside. Used a language that the team doesn't like? Too bad. (Once I was told that I shouldn't have used Scala for a code sample, because the company-- a shitty Java Shop-- had recently fired someone who was a Scala advocate.) Didn't test your code using the style that the company implicitly expects? Points off. On the other hand, I've had the experience, in the past, of submitting samples that I was told were among the top 3 the company had ever seen... and still been offered a junior or mid-level role (this was in my mid-20s). So I don't see the upside. Why spend 12 hours of my own time on a code sample, only to get the same junior-level position I could have gotten anywhere else without homework?
If a kick-ass code sample could have turned a junior-level engineer position at 0.01% into a Director-equivalent (I like being a full-time coder and don't need reports, but I generally prefer that management see me as an equal, so I can get my job done, because a 10x engineer disempowered is a -3x engineer) with above-market compensation and legitimate equity, then I'd say... yeah, these things can spot early talent that nothing else does. But, as far as my experience has led me to think, code samples just provide another reason to reject someone. ("This guy used the TestEase framework instead of EasyTest. What, does he think it's 2013??")
I'm a good technical interviewer. I'll say it: I'm very good at it. The first thing I do is that I explain my methodology, rather than hiding what I'm after, so the candidate is at ease and not surprised by difficult questions. I tend to prefer a "rapid fire" approach to collect as much data as possible, so as soon as she's demonstrated that she knows what she's talking about, I'll move on to another question or topic... and I disclose that this isn't to be rude but so I can get the best read possible. I tell the candidate that, unless she is a 0.1% outlier, I will ask questions where she doesn't know the answers, and that's OK. It's not a percentage-based test where you need to get 80% right or it's a no-go, but it's a binary search on the competence spectrum. Generally I'll take something on her CV that I'm familiar with, research it a little bit, and start with a mid-level question that I'd expect 75% of the people I'd want to hire to get. (Statistically, you get your best results by adaptively asking questions at a level of difficulty where there's a 50% chance of success. But I aim for ~75% because most candidates aren't used to difficult interviews, and I don't want anyone to freak out.) That means that there's a 25% false-negative among good candidates, so if she doesn't know the answer (and it's OK to say "I don't know"; I'm in the top couple percent of my field and still have numerous blind spots) then I'll give her another question, also of moderate-high difficulty. (I don't start giving easy questions unless I've decided on a "No", and that's usually to put the candidate at ease, because I might still be wrong, and I want the next interviewer to get a clean read.) A good interviewing process provides multiple paths to success, not multiple ways to trip up and fail. 
I prefer to ask hard questions rather than easy ones because, as far as I'm concerned, 5 hard questions is multiple paths to success (if you get 1 or 2, you're solid; 3-4 means you're excellent) whereas 5 easy questions is multiple paths to failure.
I know I come off as an asshole on Hacker News (because it's turning into this festering pit of pro-corporate Wrongness on the Internet, and someone has to do something about it) but I try to be as nice as possible during interviews. I want to be tough, intellectually, but I want the person at ease as much as possible (which is why I explain, ahead of time, that I've been a notoriously difficult interviewer at every company where I've worked) so I don't get a false negative. I generally give a series of hard questions (not all related to each other) that seem relevant to her experience. I might give four, and a candidate who can competently answer one of them, I would generally say is worth considering. If she gets 2-3 out of 4 hard questions right, then she's probably a good hire and I'll recommend her.
I don't think hiring engineers is as hard as people make it out to be. I do like to see code, but I can usually tell the socially adept non-coders and charlatans from people who can actually program, even without looking at a line of code. I tend to prefer to hire for general intelligence than a specialized "hole" that someone is trying to fill, and the main question in my mind isn't, "Did she get all of my questions right?" but "Would I want to work with this person?" and "Would she add something to the team that's not already there?" I won't remember, 90 minutes later, whether she answered every single question right, and if she screwed up and said "Lasso" when she meant "elastic net", that's not terribly important... probably a mistake, and knowing the ideas is more important than the vocabulary. Besides, I care more about hiring the person who has the curiosity and drive to learn new things than hiring the one who already knows what we're "looking for", because the latter changes in most companies by the quarter, while a candidate's general ability doesn't.
Bad hires tend to come from two sources. One is when non-technical people override the technical interviewers, either formally or informally. It's best to have an environment where people can speak freely and, preferably, independently (i.e. each interviewer is on the spot to give feedback before discussing the candidate with others, but also feels safe giving an honest read). The worst thing that can happen is when an executive says, "This person is amazing", before any feedback has been given. In many companies, that means that the decision is already made and the engineers are just there to ratify it. That produces a lot of underqualified hires. The second is stack-ranking, which encourages teams to keep an "insurance incompetent" on hand, so that the middling and top players can safely focus on work, knowing that they won't end up in the bottom pool. In companies with stack-ranking, however, I would generally avoid interviewing as much as possible. (In fact, this may be one reason why companies that use stack-ranking tend to decline.) If stack-ranking is in play, you need to focus solely on (1) your main project, (2) getting credit for what you do (the politics of performance, which is more important than performance itself), and (3) being perceived as a top performer without actually trying to outrun the bear (outrun the other guy, and he gets eaten; outrunning the bear is impossible). If you're in a stack-rank company, then interviewing is that class of work that's good for the company but won't improve your Perf score, so you should avoid doing it as much as possible.
This article seems to put forward a good theory on interviewing someone who you want to lock in room with a problem and have code come out the other end. Unfortunately, most of us work in the real world, where that's a very limited portion of the engineering job.
The author says that selecting for "people who have the social skills to actively listen to someone else’s technical points, to guide a discussion with questions of their own, or to spot opportunities to redirect a tough question back to familiar territory" is a mistake, because some of them can't code. But I know how to turn someone who can't code into someone who can. I don't know how to turn someone who fails at dealing with humans into someone who succeeds. And, at least in my job, engineers spend a whole lot more time dealing with people than with code.
> I know how to turn someone who can't code into someone who can.
If you can do that reliably, you should be making zillions of dollars.
My experience is the opposite: if someone is very impressive in an interview but a zero on the team, they're an intractable management problem.
In any case: there is a difference between being cripplingly antisocial and being able to deftly handle an interview. The social skills required to flip the script on an interview are distinctive, and they aren't the "team player" skills. In fact, as I re-edited that paragraph and came up with "controlling the conversation", it occurred to me that they might signal a candidate who has problems on teams.
> If you can do that reliably, you should be making zillions of dollars.
I disagree to a degree; I've met many people who were simply fantastic at figuring out problems but didn't know much software development, and they ended up being awesome software developers.
It's certainly doable, and I don't think it's so difficult that someone who can do it should be making an obscene amount of money, but such people are not cheap either; that's usually a really good development lead.
I think Thomas is saying that, if the process is repeatable, then the person who possesses a black box which ingests smart person on the left end and outputs a programmer can rent that box to industry and charge, well, billions of dollars for its use. (Fermi estimate; cost of college education to credential the number of engineers AppAmaGooBookSoft will hire this year was on the order of 10^5 * 10^4 = a billion dollars.)
I think you're more wrong than right, and it's selection bias that gives you confidence here.
There are people from all sorts of backgrounds that can become very good developers, but their common feature seems to be that they were already the sort of person who could become a good developer, not any methodology used to train them.
For example, I'm confident most talented physics Ph.Ds[1] can make decent developers of numerical code, but that says more about them than it does about my skill at training.
Given a random person, hell given a random person who is already decent developer in a different domain, my confidence in their ever becoming very good at that domain is much lower.
[1] The problem then being that this pool may be smaller than the one you are trying to populate.
> Given a random person, hell given a random person who is already decent developer in a different domain, my confidence in their ever becoming very good at that domain is much lower.
I don't think anyone was saying a random person, but someone who wants to get into the field, or who is already in the field but perhaps not in the right spot when you pick them up.
I can't imagine anyone would mean a random person here; that wouldn't make sense, as development is a skilled position. Not everyone can do it, and the OP was talking about interviewing developers, not random people.
I'm not sure I follow. But in any case, Matasano isn't a cryptography firm; it's a software consultancy. The set of things we did to boot up a specialty practice in cryptography are a different blog post.
>This article seems to put forward a good theory on interviewing someone who you want to lock in room with a problem and have code come out the other end. Unfortunately, most of us work in the real world, where that's a very limited portion of the engineering job.
It's a better representation of the job than writing code on a whiteboard.
The question shouldn't be "is it perfect?" The question should be "is it better than what we have now?"
The idea that someone has no social skills because he doesn't ace a highly contrived situation, one with no correlation to the way he will communicate on the job later, is pretty weird. (Do you hire engineers/developers, or high-stakes, high-pressure negotiators, e.g. diplomats?) So is the idea that you should optimize your hiring process for what people do "most of the time". You want to optimize for what is MOST IMPORTANT about the job. If I needed someone who visits customers, I wouldn't select for people who can drive Formula 1 cars efficiently, even if it happens to be the case that they spend most of their work time driving.
Business depends on solving problems. Code is a medium for implementing solutions. I can count on one hand the number of times I've needed to put my CS degree to use, but knowing whom, when, and how to ask the right questions has been worth far more.
I'm not sure how you got from "less than half of engineers time spent on communication overhead" to "no communication whatsoever". Something is terribly broken if the people primarily responsible for execution take more time to communicate about a task than to actually do it.
Depends how you measure. Writing documentation is dealing with people. And even most code is meant more for people to understand than for the benefit of the computer.
Software development (please, not "coding") is a skill like other skills. It's not magic. Being able to practice medicine or law is a skill. Being able to architect a building is a skill. Being able to effectively develop software is a skill.
You judge someone based on their skill level. You have to know what require('fs') is in Node, but you also have to understand truths like the concept of technical debt and so forth.
Not everything is broken or "needs a new paradigm". Know what programming is. What high skill means. If you're any kind of leader - TEACH someone younger or less experienced. When you need to judge, for hiring or otherwise, measure against those truths about what good software development is.