
> Developer hiring is broken

This seems to be a pretty popular opinion. It totally might be right, but it's not my own experience, so I have a serious question, because maybe I don't know what's happening out there with most hiring today - what are the broad-stroke outcomes that demonstrate that hiring isn't working? Are there statistics that show that hiring has problems? All of the reasons given in the article are claims made without evidence and without objective comparison to hiring in other industries. When looking for jobs, I've never been evaluated on IQ or code alone; it has always come with communication and personality and culture-fit evaluations, among many other things. When hiring, my own team does everything this article claims isn't being done. So I might be completely unaware of the major trends out there... how can I see those trends from where I sit? Are people who should be getting jobs not getting them? Has there been high unemployment? Are companies not able to hire people? If hiring is broken, what are the problems it's causing?

> It's a disguised IQ test

It is amusing to me that many blog posts and comments around here argue for exactly that under the same banner of 'hiring is broken'. Lots of developers are frustrated about being evaluated by how well they communicate and not by their code. Lots of people here complain about in-person interviews and tests and argue that take-home coding should be the norm, or that coding tests should not be used at all.


These positions are entirely consistent, it's just a "pick your poison" situation.

Whiteboard interviews are a skill unto themselves, with only a glancing relationship to day-to-day job content, artificially high-pressure, overly performative, etc. You measure hours invested in Leetcode, not suitability for the work.

Take home projects probably collect good signals, but present a high and (crucially) asymmetric burden on the candidate. No one wants to burn a weekend or vacation days pouring effort into something that the employer can just glance at for 5 seconds and throw in the trash. Also, many candidates would prefer to reuse their time investment over arbitrarily many interviews instead of starting from scratch on each company's assessment.

Interviews focused on personality, culture fit, communication skills, and "gut feel" overemphasize the interviewer's personal beliefs. Likable candidates aren't especially likely to be good, and good candidates aren't especially likely to be the kind of people the interviewer wants to have a beer with.


>Likable candidates aren't especially likely to be good, and good candidates aren't especially likely to be the kind of people the interviewer wants to have a beer with.

See, I would much rather invest time and money into somebody that I like being around and who I think is capable, rather than have some jackass who is really, really good off the bat.

Of course there are biases, but that's true whether you hire for skills, personality, or both.

I think a lot of programmers think that your skill level is the only thing you should be judged on, but I think that's mostly because there are a lot of very unlikable people in this field and they'd rather hide behind their perceived skill level than attempt to be a semi-likable person.

There is a sweet spot: somebody who has skill but is also not terrible to be around, and can learn quickly. The problem is that most interviews swing too far one way or the other to properly assess for both.


Yes, but likeability in an interview measures something only loosely related to likeability in the long run. Someone who is a little shy can make a great coworker/employee/friend/whatever, but you won't find that out watching them squirm in front of a whiteboard. An interview only directly measures the ability to win over strangers, and that might be right for a sales position, but for most engineering positions it's a big extrapolation to guess how the candidate will perform in a team when comfortable.


Pretty good description of all the gripes people have.

Hiring software devs is a damned if you do, damned if you don’t proposition—somebody is always gonna bitch.


The only unambiguously wrong position is that the system works well and isn't worth improving.


I'm a software engineer working at a company in the hiring space. I've done over 400 technical interviews in the last year alone, and ... this is a hot take, and not the view of my employer, but I think developer hiring processes are fine (At least, at most mature companies.)

There's a problem in the market at the moment: there's a huge amount of pent-up demand for senior developers. The market has responded with bootcamps and the like, and we have a ton of junior devs with very little knowledge and experience pouring into the workforce. The sad truth is that most devs fresh out of college or a bootcamp aren't valuable enough to be worth hiring at large tech companies like Google. Most people who apply to any role you advertise won't really be able to program effectively. (And if they can, they won't know the conventions of the language they use, they won't know anything about security, they won't be able to read code others write in order to debug it, etc.)

Programming is hard. It probably takes about a decade of practice to become good at it, and I don't think schools have figured out a replicable way to take someone off the street and teach them programming yet. (For example, most fresh grads have never had to read the code another programmer wrote. Imagine teaching people how to write without teaching them how to read!)

I think there are lots of angry junior folks out there saying "Hey, I can write a simple for() loop in Python, and lots of people are hiring - but I apply to lots of jobs and keep getting knocked back! The hiring process must be broken!". And a few angry senior engineers out there saying "Why do I have to keep writing fizzbuzz? It's like I have to prove over and over again that I can program at all!".
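
(For reference, fizzbuzz really is that small - the whole exercise fits in a few lines; a minimal sketch in Python:)

    # FizzBuzz: print 1..100, but "Fizz" for multiples of 3,
    # "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
    for i in range(1, 101):
        if i % 15 == 0:
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)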

Of course, nobody wants to admit that the reason they failed their 6th whiteboard interview in a row is that they aren't very good at system design. And the reason for that is that their college didn't teach them any system design skills, and they have no experience, and they never learned how to use words to communicate technical ideas. And ArbitraryProductCo doesn't have the runway to train you up.

Of course there's the occasional person who is really skilled and somehow still manages to fail programming interviews all the time. But if the goal of a technical interview is "make sure we don't hire anyone who wouldn't be effective at the job", I think they're fit for purpose. I think the real sin is that we're afraid to tell people they aren't very good at programming yet, and we use technical interviews as a scapegoat.


> The sad truth is that most devs fresh out of college or a bootcamp aren't valuable enough to be worth hiring at large tech companies like Google.

... which is really unfortunate, because those big companies are the ones that have the most resources to hire, train, and mentor juniors. At the opposite end we have small companies that can literally go bankrupt when their hire doesn't work out and is unable to deliver working software. Even if the hire works out well enough, they're not learning as much as they could in a well-resourced team with enough serious talent & seniors to mentor them.

> I think the real sin is that we're afraid to tell people they aren't very good at programming yet, and we use technical interviews as a scapegoat.

I don't know if it's useful to tell people that they aren't very good. It's a serious chicken & egg problem, because everybody wants to hire seniors who can hit the ground running and nobody wants to train the juniors. The juniors really don't need to be told that they aren't good enough; they need a place where they can get good!


> have the most resources to hire, train, and mentor juniors

In theory. I'm a pretty senior programmer now as in "been doing this professionally since the mid-90's" and as far as I can tell, modern project management principles are explicitly designed to make sure that I spend as much time as possible cranking out code and as little time as possible helping newcomers out. That may not be what the "scrum manifesto" says, but it's how the project management professionals charged with executing it interpret it.


Yeah? Being managed poorly doesn't mean they don't have the resources to do it differently. It just means they're not willing to expend those resources.

My point is, if we look at the other end of the spectrum, we have small companies that literally cannot afford to mentor juniors while paying them a salary (and tying up the seniors' productive time). Here it's not a question of how your company chooses to manage things, it's a question of whether they can afford to spend 20% of the budget on something that may turn out not to produce anything of value. Even if they're willing, it's a big gamble and can really put the company out of business in the worst case.


> I think the real sin is that we're afraid to tell people they aren't very good at programming yet, and we use technical interviews as a scapegoat.

So does this mean that programming education is broken? That companies should invest more in training? That bootcamps should revamp what they teach? That there should be industry standards for what programmers at different levels should be expected to know?

The most maddening thing about interviews imo is the opacity. There's always uncertainty about where exactly things went wrong, and how it should be fixed next time. It's almost a meta-engineering problem of its own. And it's a metaphor for the lack of software engineering standards.


> So does this mean that programming education is broken? That companies should invest more in training? That bootcamps should revamp what they teach? That there should be industry standards for what programmers at different levels should be expected to know?

Programming education is broken. I did one year of computer science at one of the top universities in the world (switched into mathematics after that), and I'd see that course as useless if not actively negative. The only way of learning that I've seen really work for anyone (myself included) is more like a craft apprenticeship, working closely with someone more experienced. We shouldn't be surprised that that produces widely different approaches.

Frankly the field isn't mature enough to have standards. If you tried to set a professional exam based on today's best practices, in five or ten years the answers would mostly be wrong. We still don't know the right way to write software. Million-dollar systems still come with laughably simple bugs.

What does the interview process look like for a craftsperson? That's probably the best we can expect from a field as unsystematic as ours. The one thing that strikes me is that in creative fields it's normal for people to show a portfolio of past (client) projects, whereas in software past employers usually keep all copies of your code. I have no idea how we'd go about changing that norm though.


> What does the interview process look like for a craftsperson?

Welcome to my shop. Here's some wood. Make a chair!

In most of the interviews I conduct, I get the candidate to write some code following a spec we've written. And I get the candidate to debug a program with some failing unit tests. About half of the candidates I interview fresh out of school have no idea how to get started debugging a program they didn't write. You need to do that just about every day at work, and it's usually not even mentioned at school.
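
(To make that concrete, here's a hypothetical, minimal version of such an exercise in Python - a function with a planted bug and a unit test that fails because of it. The candidate's job is to read the code and work out why the test fails:)

    # Hypothetical interview exercise: the test below fails. Why?
    def mean(values):
        """Return the arithmetic mean of a non-empty list of numbers."""
        total = 0
        for v in values:
            total += v
        return total // len(values)  # bug: floor division truncates the result

    def test_mean():
        assert mean([1, 2]) == 1.5  # fails: mean([1, 2]) returns 1

    if __name__ == "__main__":
        test_mean()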

But I've worked as a teacher too. I will move heaven and earth for my students, but in my darkest moments I think taboo thoughts. Maybe IQ is a real thing, and maybe some people just don't have the neural hardware to ever be good at programming. If that's true, we do those people a huge disservice. We steal years of their lives and tens to hundreds of thousands of dollars training them in a skill they can never master. I'd love to see the stats on how many people graduate from a CS program and try, but never find, reliable work in our industry. I worry the numbers might be damning.

A few months ago a candidate I interviewed asked for feedback at the end of the interview. He wanted to know what I'd recommend he practice so he could improve. I said he should pick an open-source project on GitHub - preferably something that's not too big - and look through the issue tracker. Pick a simple-looking bug, try to fix it, and submit a PR. His whole manner changed after I said that - the idea of doing that was so incredibly daunting to him. But that right there? More or less, that's the hiring bar. As an interviewer I'm trying to answer the question "If I hire you, can you do the work?". Read, think, understand, modify, write code, communicate it to your team. That's the work, and that's the bar.


We maintain a queue of beginner-friendly tasks, and someone familiar with the codebase sits down with the new hire to orient them to what's needed, why, and the relevant files/modules to make the change. Then they're available for questions and patient with a laborious code review. You do 3 or 4 of these before you're expected to be independently productive.

Approaching an open source project cold is a bit higher than the bar.


Both of your "taboo thoughts" seem blatantly obvious to me, and not especially taboo. Intelligence obviously exists and some people have more of it than others, even if the metric "IQ" doesn't perfectly capture it. And yeah, some people are not cut out to be programmers. Why are we pretending otherwise?


I find that there's a borderline self-contradictory aspect of hacker culture. While we romanticize the concept of the 10x programmer, which is an individual of genius, our culture also stresses the concept of autodidacticism. Teach yourself to code. Just use MOOCs. Build it yourself. Hack on an open source project. So there's this tension between valorizing the inborn superhuman and the belief that it's possible to reach greatness - though perhaps not 10x greatness - by pulling yourself up by your bootstraps.

In a way, this is mirrored by the organizations themselves. Startups who make the right moves, and work hard through grit and 60-hour weeks, can become unicorns. Also some startups are led by visionary founders and cannot fail.

So all of this, buttressed by real-world labor demands, creates incentives for people to try to become programmers. Even those who aren't "cut out to be programmers." Our increasingly cutthroat and unequal society also incentivizes people to shift to programming as a safe career choice. "Learn to code."

I don't know how we can stop "pretending otherwise." Tech companies continue to complain about the engineering talent shortage. Bootcamps and online courses continue to promise people that they can become that talent. There aren't any agreed-upon industry standards by which to exclude people who truly aren't fit for it. FAAMG has infinite money and power in the industry to continue their entrenched practices. Most startups cargo cult the leading megacorps' processes. So instead, candidates are encouraged to continue grinding Leetcode and apply, apply again.


> I find that there's a borderline self-contradictory aspect of hacker culture. While we romanticize the concept of the 10x programmer, which is an individual of genius, our culture also stresses the concept of autodidacticism. Teach yourself to code. Just use MOOCs. Build it yourself. Hack on an open source project. So there's this tension between valorizing the inborn superhuman and the belief that it's possible to reach greatness - though perhaps not 10x greatness - by pulling yourself up by your bootstraps.

Those seem like orthogonal aspects. If we stressed the idea that you had to do (say) a 4-year degree at a great university, or something akin to the bar exam, would that be any more compatible with the idea that some people are 10x better than others? If anything I'd say the opposite: we'd expect most of the Harvard graduating class to be on roughly the same level, it seems a lot less wild that some self-taught people could be 10x better than others.

> So all of this, buttressed by real-world labor demands, creates incentives for people to try to become programmers. Even those who aren't "cut out to be programmers." Our increasingly cutthroat and unequal society also incentivizes people to shift to programming as a safe career choice. "Learn to code."

This happens in every field though? You get people who are desperate to become a doctor and apply to med school year after year, despite being completely unsuited to it. You get people who insist they're gonna make it as an actor/musician/comedian and spend decades working crappy day jobs so they can live where the action is, when really they'd be better advised to pick a career they're good at.

> I don't know how we can stop "pretending otherwise." Tech companies continue to complain about the engineering talent shortage. Bootcamps and online courses continue to promise people that they can become that talent. There aren't any agreed-upon industry standards by which to exclude people who truly aren't fit for it. FAAMG has infinite money and power in the industry to continue their entrenched practices. Most startups cargo cult the leading megacorps' processes. So instead, candidates are encouraged to continue grinding Leetcode and apply, apply again.

Well, if we told people outright that programming is a matter of IQ, and gave an actual IQ test rather than an IQ-like test in interviews, that might help some people realise it's not for them. You're right that what catches on in the industry is largely a function of what the most successful companies pick, but ultimately that list of top companies is not static and we'd hope that companies with better hiring practices will (eventually) rise to the top.


> If anything I'd say the opposite: we'd expect most of the Harvard graduating class to be on roughly the same level, it seems a lot less wild that some self-taught people could be 10x better than others.

That's not the point I was trying to make - I'm saying that in programming we prize both talent born of nature and skill honed by nurture. (Though admittedly that may exist in many other disciplines.) Because of the latter emphasis on grit, hacker culture encourages self-improvement and going beyond the capacities one started with. That dogma of self-improvement goes against the notion that some people are not cut out to be programmers.

Though of course, this could also be a marketing ploy for recruitment on behalf of management: "Anyone can code, you should learn to. But we only hire from the best." By encouraging an increase in talent, they have a larger labor pool to choose from (and potentially undercut wages), while plucking out the few that can pass their interviews.

> This happens in every field though?

To some degree, but the details vary. Medicine or law used to be seen as safe, secure careers into the (upper) middle class, but doctors are limited through the AMA, and currently law is a notoriously difficult and costly profession with dwindling prospects. Entertainment and the arts are universally known as a risky proposition. We're talking about software, which has had the reputation of being the current surefire path to a stable, even successful, career for at least the past two or three decades.

> Well, if we told people outright that programming is a matter of IQ, and gave an actual IQ test rather than an IQ-like test in interviews, that might help some people realise it's not for them.

Leaving aside the legality of using IQ tests to exclude candidates, that opens up the questions of whether there is a direct correlation between writing good software and IQ, why programming, of all STEM fields, should focus so heavily on IQ, and why all of those other technical and engineering professions don't need to resort to IQ tests for hiring.


> Read, think, understand, modify, write code, communicate it to your team. That's the work, and that's the bar.

So do your students do this? Why not?


Computer science isn't supposed to teach you how to program, and it's kind of silly to judge a whole subject by the first year. In fact I think you would be making a mistake to focus on the practical courses rather than take as many theoretical and foundational (e.g. advanced algorithms, Theory of Computation, Compilers) courses as you can. A good CS student, who understands the coursework and doesn't cheat, should easily become a good enough programmer just from completing coursework in a mostly theoretical program to get hired basically anywhere. Programming is something you learn incidentally because it's intertwined with what you're doing anyway; and to be blunt, most practical programming is rather mundane and simple compared to a rigorous CS curriculum.

I see it as like if I were hiring for something as generic as "writer". It's easy to have a generic "writer" produce a small sample for you on the spot, similar to a CS interview. Of course you can always practice writing directly itself, but I would imagine someone who had completed a lot of coursework in linguistics, classics, literature, etc. would on average be very well-prepared if they were a good student. But you could still practice and teach yourself on your own if you wanted to


> Computer science isn't supposed to teach you how to program

Perhaps, but it's still the closest thing the industry has to a "programming education"; I think it's the first thing employers look for, rightly or wrongly.

> it's kind of silly to judge a whole subject by the first year

How many of my limited days on the planet am I supposed to sink into something before I'm permitted to pass judgement? At some point Stockholm Syndrome would take over.

> A good CS student, who understands the coursework and doesn't cheat, should easily become a good enough programmer just from completing coursework in a mostly theoretical program to get hired basically anywhere. Programming is something you learn incidentally because it's intertwined with what you're doing anyway;

That's not what I saw happening (unless you count the official lectures/colloquia as "cheating"; certainly I saw cases where the meat of the answer to a supervision question was spoon-fed to us directly). The people who could program at the end of first year were the people who could program at the beginning or who "got it" immediately. I never saw people struggling with a new concept but then gradually being taught it (which is something I did see happen a lot in the mathematics course), and a frightening proportion of the students I was friendly with were coming out of that first year knowing seemingly nothing, certainly not being able to program or talk coherently about algorithms or computability. I suppose it's conceivable that those students were somehow getting something out of the system design type courses, but it seems implausible.

I understand there was a shake-up in that CS department a few years after I graduated, so maybe I went through it during a bad time. But the students who graduated there in the meantime aren't going to get a do-over.

> I see it as like if I were hiring for something as generic as "writer". It's easy to have a generic "writer" produce a small sample for you on the spot, similar to a CS interview. Of course you can always practice writing directly itself, but I would imagine someone who had completed a lot of coursework in linguistics, classics, literature, etc. would on average be very well-prepared if they were a good student. But you could still practice and teach yourself on your own if you wanted to

I'd suspect the overwhelmingly important part of writing is actually writing; I only know one person who I'd call a great writer, and hanging out with him the thing you notice is that he writes the way other people check their phone. All the things you list can enhance writing, certainly, but if you don't actually write then any amount of knowledge of linguistics or classical literature is meaningless (at least in terms of how it affects your writing ability).


>How many of my limited days on the planet am I supposed to sink into something before I'm permitted to pass judgement?

More than the first quarter of it. I took at least a year of math classes, and I don't feel qualified to pass judgement on the math department.

There's a huge difference between intro and upper-level classes. Just like there's a difference between Calc I and a proof-heavy upper-level math class.

That being said I think the overall pedagogy is better in the math department, but then again they've been doing this for a lot longer.


What's that saying? It takes about 10,000 hours of study, practice, and application to master anything, regardless of what it is.


> am I supposed to sink into something before I'm permitted to pass judgement

I don't have a medical degree, but I still trust medical science. Why? Because it achieves positive results, and people I trust for other reasons trust them.


Ok, but is that true of CS teaching? The best programmers I've worked with have mostly not had CS degrees (tended to have degrees in maths, physics, or that sort of area).


>The best programmers I've worked with have mostly not had CS degrees (tended to have degrees in maths, physics, or that sort of area).

There's no way you were rigorous enough in tracking this for this statement to be useful to anyone. The people who don't fit the mold stand out.


> There's no way you were rigorous enough in tracking this for this statement to be useful to anyone.

Sure, but has anyone claiming the opposite done rigorous analysis? Is there any evidence that having a CS degree makes for better programmers than not?

> The people who don't fit the mold stand out.

I'm not thinking about people who stood out as particularly unusual. Most of the time I didn't find out which field someone's degree was in until months into working with them.


>Sure, but has anyone claiming the opposite done rigorous analysis? Is there any evidence that having a CS degree makes for better programmers than not?

I'm not making that claim, you're the one making a claim that people with degrees other than CS are better programmers without evidence.

I'll only make the claim that a CS degree made me a better programmer. Specifically the upper level theoretical classes. I can verify that there are many problems I've solved because I realized that the problem I was working on had already been solved 50 years ago.

I also worked about a decade as a professional programmer without a CS degree, before I went back. Personally, I am a better programmer for it.

Would I have been an even better programmer had I taken another few semesters of math classes instead of CS classes? Who knows? Absent any other evidence though, the simplest explanation is that domain specific knowledge is likely useful.

>I'm not thinking about people who stood out as particularly unusual. Most of the time I didn't find out which field someone's degree was in until months into working with them.

The point is that the more unusual someone's background is, the more likely you are to remember it. Particularly if there is some confirmation bias involved.


> Absent any other evidence though, the simplest explanation is that domain specific knowledge is likely useful.

Disagree; surely the null hypothesis for any given training programme is that it has no effect.

> The point is that the more unusual someone's background is, the more likely you are to remember it. Particularly if there is some confirmation bias involved.

There isn't anything unusual about professional programmers having a degree in maths or physics rather than CS. At least in my experience it was pretty close to an equal split.


>Disagree; surely the null hypothesis for any given training programme is that it has no effect.

That's not what's under test here, though. It's training program A, which includes domain-specific knowledge, versus training program B, which does not.

>There isn't anything unusual about professional programmers having a degree in maths or physics rather than CS. At least in my experience it was pretty close to an equal split.

Look at the number of graduates: the only way that is true is if almost every single physics or math graduate goes into programming. The fact that you think it's true is just further evidence of bias.

According to the Stack Overflow Developer Survey [1], about 8% of professional developers with degrees majored in math or the natural sciences vs. 63% in CS, software engineering, or computer engineering. There could be some sampling bias, but that's a huge difference.

1. https://insights.stackoverflow.com/survey/2019#education


I'm not talking about a maths or physics degree specifically. I'm talking about having a CS degree or not. The stack overflow survey seems to have shifted towards CS recently; the 2015 results (earliest I could find) imply 52% of professional developers had CS degrees at that point, which was about halfway through my career so far, so 50:50 for people I've worked with sounds about right.

I suspect SO surveys are heavily biased towards younger developers, but even taking those 2019 numbers at face value: about 20% of professional developers have no degree, and of those with degrees it's about 75% CS/information systems/sysadmin/webdev, 17% maths/physics/engineering, and 8% other. So a typical 15-developer team would be 9 with CS degrees, 3 with no degree, 1 with an engineering degree, 1 with maths/science and 1 other. The non-CS folk are not exactly rare unicorns.
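
(Spelling out that arithmetic - a back-of-the-envelope sketch in Python, rounding to whole people:)

    # Check of the 15-developer split above, using the survey numbers as
    # read: ~20% no degree; of those with degrees, ~75% CS-ish,
    # ~17% maths/science/engineering, ~8% other.
    team = 15
    no_degree = round(team * 0.20)         # 3
    with_degree = team - no_degree         # 12
    cs = round(with_degree * 0.75)         # 9
    eng_maths = round(with_degree * 0.17)  # 2 (1 engineering + 1 maths/science)
    other = with_degree - cs - eng_maths   # 1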


>I'm not talking about a maths or physics degree specifically. I'm talking about having a CS degree or not.

Well then why did you say this:

>There isn't anything unusual about professional programmers having a degree in maths or physics rather than CS. At least in my experience it was pretty close to an equal split.

You said explicitly math and physics degrees vs CS degrees. And you previously said the best programmers tended to have math or physics degrees, or something similar.

This isn't me being pedantic, it was the entire context of the discussion.

The point is that you are prone to confirmation bias as evidenced by your belief that it's close to an equal split. Your mental model is overrepresenting people with physics and math degrees likely because it confirms your belief that they are better programmers.

>So a typical 15-developer team would be 9 with CS degrees, 3 with no degree, 1 with an engineering degree, 1 with maths/science and 1 other. The non-CS folk are not exactly rare unicorns.

That's not the point; it's that you are more likely to remember the background of the 1 person on a team who has a Math degree, because she is relatively rare compared to all of the people with CS degrees. This is a well-known and well-documented phenomenon. And it's one of the primary reasons that anecdotal evidence, even a large amount of anecdotal evidence, is so often wrong.


> You said explicitly math and physics degrees vs CS degrees.

I was giving those as examples of degrees that are normal and don't stand out. We don't think there's anything particularly odd about a programmer with a maths or physics degree. That's all I was saying.

> you previously said the best programmers tended to have math or physics degrees, or something similar.

A category which would include engineering, at which point we're at 20-25% of professional programmers with degrees by your numbers (which I still think are significantly biased).

> The point is that you are prone to confirmation bias as evidenced by your belief that it's close to an equal split. Your mental model is overrepresenting people with physics and math degrees likely because it confirms your belief that they are better programmers.

My mental model is that it's an equal split between CS degrees and not, and per your own sources that's accurate. You're fixating on a couple of specific examples of non-CS degrees that I mentioned when that's completely beside the point.


I'm not fixating on anything. This entire chain started as a reply to a post you made where you switched to math over CS, and specifically to this statement.

>The best programmers I've worked with have mostly not had CS degrees (tended to have degrees in maths, physics, or that sort of area).

That's the entire context of the discussion. The assertion that CS majors have worse outcomes (with respect to programming ability) than math, physics or similar majors.

>A category which would include engineering, at which point we're at 20-25% of professional programmers with degrees by your numbers (which I still think are significantly biased).

If you are including math, all natural sciences, and all other engineering degrees you get 17%, not 20-25%.

>My mental model is that it's an equal split between CS degrees and not, and per your own sources that's accurate. You're fixating on a couple of specific examples of non-CS degrees that I mentioned when that's completely beside the point.

There is no other logical way to parse this statement

>There isn't anything unusual about professional programmers having a degree in maths or physics rather than CS. At least in my experience it was pretty close to an equal split.

than that you were talking specifically about math and physics.

I get it, you don't like that there are numbers that contradict you, so you are grasping at straws trying to find alternate interpretations to reconcile your statement with the numbers. You obviously don't like being wrong. I don't either, that's fine, but no one other than us is reading this far down. There's no point denying you farted when there are only two of you in an elevator.


"The best programmers I know didn't have a degree in CS" has become a pretty tired cliche at this point...


As someone who dropped out of the process 1/4 of the way through you're not really in a good position to comment on the effectiveness of completing it.


1/3, and I dropped out of it because it wasn't teaching me (or any of the other people I met there, including those who didn't have my prior experience) anything - I got a first in the first-year exams if that's the kind of thing you care about. Absolute best case is that that degree has two years of useful content, and I rather doubt it.


When I was in university doing a CS degree, if you had dropped out 1/3rd of the way through, you'd have pretty much only gone through a handful of weeder intro CS courses that were more math than programming. So yes, if you had dropped out at that point, you would not have really gained any useful programming skills.

I don't entirely disagree with you, there were certainly some later classes that were similarly not useful in the long run, but it wasn't a total crapshoot. My senior year included a year-long group project on a team of ~10 people that basically took us all the way through the lifecycle of a project -- from inception, to design and architecture, to polish and QA, to 'releasing' it. It was a very useful course that placed you into a startup-like atmosphere.

But I think this kind of confirms that apprenticeships may be far more useful to the programming field than college degrees. If my CS program did not include that senior course, I would probably pretty vehemently agree with you. And anecdotally, at my current workplace, one of our best programmers is a kid we hired basically right out of high school who has since grown in skill considerably thanks to the attention of more senior engineers.


> So does this mean that programming education is broken?

Of course it is broken; it rarely mentions naming or debugging, and never emphasizes reading code. The real fundamentals are not there, and you learn them on the job.


I disagree. Programming education exposes you to the stuff that you would probably never discover on your own, like complexity theory and circuit design; the practical stuff like debugging and SQL is best learned by practice anyway.


Isn't it the companies' role to turn those junior developers into senior developers? Where else are they going to come from?


I don't know! Is it their job?

The attention of senior engineers is the single most valuable, expensive and scarce commodity of a modern software company. The incentives are absolutely not aligned for most companies to make it worth their time to hire junior engineers.

But obviously that's a problem - because, as you say, where else will senior engineers come from? And I don't think we have a good answer here. The old school answer was apprenticeship - a company trains you up, and in exchange you work for them for a time. But most companies are loath to do that because you're likely to quit and go somewhere else for a pay bump as soon as you have enough skills and experience that you're worth hiring.

For my money this, right here, is the real problem with modern programming / career development. Whiteboard interviews are the least of our problems.


But most companies are loath to do that because you're likely to quit and go somewhere else for a pay bump as soon as you have enough skills and experience that you're worth hiring.

This is only true in bad companies. Good companies understand that people move on, and don't assume that someone is hired into a role forever. Once you realise that hiring is an expensive process that you will always be doing, you can optimize it appropriately.

Some really good companies even use it as a selling point in job adverts. There are plenty of senior developers who actively want to teach and mentor juniors, and will be happier working in companies that encourage that process. It's a good way of retaining those staff.


> Good companies understand that people move on, and don't assume that someone is hired into a role forever. Once you realise that hiring is an expensive process that you will always be doing, you can optimize it appropriately.

Yes, they understand this, so they simply do not hire people who need to be trained for a year or two to be effective hires.


After the Y2K-bust, where are these good companies?


> The old school answer was apprenticeship - a company trains you up, and in exchange you work for them for a time. But most companies are loath to do that because you're likely to quit and go somewhere else for a pay bump as soon as you have enough skills and experience that you're worth hiring.

One problem is that companies with the means to fund this - FAANG and the like - are also the ones incentivized to gatekeep the most, to maintain their elite status and the engineering superiority that leads to their business edge. Startups, being naturally less risk-averse, are less likely to engage in formalized apprenticeship programs for juniors.

What happens instead is that you get the half-hearted approach of hiring college students to be interns or co-ops. Not all students are able to intern. The ones who do have a leg up once they graduate. Hence the stellar reputation of U of Waterloo grads.


There is the part where you give the junior a pay raise after he or she learns enough to get a raise elsewhere.


If you're going to end up paying what it would cost you to hire the trained person anyway to avoid having your staff leave for higher wages, that heavily reduces your incentive to put a lot of resources in to training. Training + market rate compensation is more expensive than just market rate compensation.


But is training + market rate more expensive than hiring + market rate?

Also, it seems to ignore the fact that training people is actually an incredibly useful way to hone your skills as a senior dev, and that having to teach forces you to crystallize, simplify and explain thoughts and processes that you possibly never challenged before.

Tbh, I think I'm more productive and learning more when I have a decent intern to coach in my team than otherwise. So it's really a win-win situation.


While this is true, it seems to me that most companies don't actually have both choices. Instead it's "training + market rate compensation" or "you will forever struggle to have enough senior devs to accomplish your goals". If you can't attract talent, you have to grow it and pay to retain it. Simple as that, no?


Even if your junior-turned-okish developer goes away, if you did a good job and parted on good terms, you now have a working advertisement in another company. And maybe a couple years down the line they'll come back.

Investing in your workforce for the long term: when you're big enough I think you should do it. Having a "reserve" is also useful. You don't have devs doing nothing: you have devs learning, teaching and ready in case a business opportunity appears.


I think you also have to account for the cost of acquiring senior staff. If you train them and pay them, your pipeline becomes more predictable than plucking them ad-hoc from other companies. That makes planning (your roadmap) more predictable.


Market rate for a junior + training while he/she is a junior is not more expensive.


A lot of companies struggle to match internal comp changes with what they'd actually offer on day one, for some reason.

But there are bigger problems. I've experimented with training people in various ways over the years.

Reality is programmers move around a lot. They are attracted by interesting new problems where they feel they're learning. Staying at one firm for 20 years isn't likely these days. There's nothing wrong with that, but it means if you spend a few years training someone then after that time period they may leave anyway, even if their comp is reset to be competitive, because the new place can offer them equal comp + new problems.

Another issue is that a lot of training junior-to-senior is about imparting experience, wisdom, beliefs etc. At some point they can code and it's about the decisions being made, rather than inability to do them. A characteristic of junior devs that are growing their skills is they tend to latch on to trends harder and quicker than senior people who have maybe seen it before, and can differentiate their CV without buzzwords. If a junior comes to work one day and says "We need to Kubernetize all our things" and you say "Actually, our server count is stable and low, we don't need to use Kubernetes, but we do need this new customer-facing feature implemented" then it's quite possible they'll get frustrated, want to argue with you. Of course replace Kubernetes with Haskell, Linux distro of the day, Rust, Go, whatever seems hip and new.

It can just end up being draining for everyone. Of course debating these issues can be a form of teaching, but often juniors don't see it that way. The student wants to become the master faster than sometimes makes sense.


> if you spend a few years training someone then after that time period they may leave anyway

If it takes years of training till the person is useful and worth it, then there is something wrong with the way training is organized. We give juniors easier tasks than we give seniors, and we train them, but we also expect them to be useful from the start, basically.

It really should not take years until their work covers their salary + training.

> If a junior comes to work one day and says "We need to Kubernetize all our things" and you say "Actually, our server count is stable and low, we don't need to use Kubernetes, but we do need this new customer-facing feature implemented" then it's quite possible they'll get frustrated, want to argue with you. Of course replace Kubernetes with Haskell, Linux distro of the day, Rust, Go, whatever seems hip and new.

There is no difference between this and a senior wanting to change things. This sort of conflict is normal, and the final decision is not made by the junior.

Most people actually can handle not being given their way all the time. If they have no zone for their own autonomy or decisions, then they will get frustrated. But that zone should be smaller than massive architectural decisions.

Moreover, a portion of people pushing toward new technology and being willing to experiment is something a company should have to stay healthy. Otherwise you will all stagnate.


A lot of it is about opportunity cost. Do you take someone fresh out of school, or someone who's been around the block a few times if the latter costs only 50% more? The cost of even a junior but basically able programmer can be quite high if their work has to be redone (heck, the cost of a non-junior but bad programmer can be high), so you end up needing to assign tasks that aren't that important or for which large slips can be tolerated. It can be tough to find a supply of such tasks sufficient to keep people fully occupied.

There is no difference between this and a senior wanting to change things

Seniors are more likely to have been through several jobs and environments, and to have learned that tooling choices aren't that big a deal at the end of the day. They've also already (hopefully) got some accomplishments under their belt and don't need to redo other people's work to have something to show - as a consequence, they're more likely to pursue higher-risk strategies like trying new things.


They can make that decision if they want, or they can wait a bit and hire a more experienced dev for more money. I wouldn't say it's their "role" - nobody's forcing them to do so, and if it doesn't help them, I don't see why they would feel obligated to do it.


I feel like this is a bit of a self-reinforcing cycle:

“Why would we train them up, they’ll just leave anyway” -> “screw this place, I am going to leave when I get the chance” -> leaves -> go to step 1


Two things that have never happened in more than 200 interviews:

1) Anyone with >0 years of experience outperforms a Waterloo internship candidate on the coding or algorithms round.

2) Anyone interviewing for a senior position performs well on the other rounds and fails only system design.

I'm pretty sure it's large companies that can afford to play long-term big-picture strategies with talent, and small ones that have such low-rent concerns as which languages and frameworks you know.


> huge amount of pent-up demand for senior developers

But is leetcode really the way to find them?


Isn't Google known for analyzing the hell out of their candidates for predictors of success? They dropped the GPA requirement, the college degree requirement, the brain-teaser questions - but they kept the leetcode questions. It's probably the least bad of everything they've looked at.


Google talks a lot of shit about rather missing out on a good candidate than hiring a bad one.

What removing a metric tells you is that they regretted hiring people who looked fine on those metrics. They have no way to determine if people who failed on those metrics were better than the people they hired.

The problem is that if you measure someone by a metric you can’t actually disregard it in your decision process. Even a double blind affects the subject’s behavior.


> And a few angry senior engineers out there saying "Why do I have to keep writing fizzbuzz? It's like I have to prove over and over again that I can program at all!".

You didn't address this part of the problem. This friction is one of the reasons the job market is so distorted.


Oh sorry I wasn't clear. The reason is that successfully completing a simple programming exercise puts you in the top 15% or so of resumes that get sent in for most programming roles.

When you apply to a company, they have to assume you aren't very good, because most people who apply aren't very good. (Because people with strong skills get snapped up, and people with weak skills spam their resume everywhere they can.) Figuring out who's worth spending time interviewing is a hard problem in itself. And there's no silver bullet here - people lie about their work experience all the time. There are so many user accounts on GitHub with copies or forks of random people's code, with basically no changes. I suspect they exist to support lies on resumes.

I don't think it distorts the market, but it is annoying. A recruiter I talked to a few years ago said she thinks it's crazy we don't use an agency model for programmers like actors do. The idea there is that good programmers pay a small percentage of their salary to a manager, whose job is to find you the best roles that suit your skills, negotiate pay on your behalf, and so on. She tried to set up a business doing just that, but she couldn't get enough clients to make it work. Good programmers balked at the idea of paying a few % of their salary to someone in an ongoing way to look after their career. Having to prove your skills in each and every interview, and form those social connections on your own? That's a choice we make.


> (Because people with strong skills get snapped up, and people with weak skills spam their resume everywhere they can.)

It's extremely saddening how prevalent this prejudice is.

But what if I am still looking for work and companies literally don't reply to my applications? I might be the next John Carmack, but if nobody gives me a chance (for reasons outside of my control and unrelated to my proficiency) then according to you I suck. :(

It's very broken to assume that skilled people get snapped up immediately so whoever is available must be mediocre (or bad).

It is an awfully imprecise assumption!


> It's extremely saddening how prevalent this prejudice is.

Prejudice?

Look, companies wouldn't be advertising roles if they didn't think qualified candidates (like you?) were out there. But it's a numbers game; of course there are more unqualified people looking for work than qualified people at any given moment. Qualified people don't get snapped up immediately, but they aren't usually actively looking for work for long. And they are often picky about which places they apply to - for good reason. A highly qualified candidate might apply for 3 roles, get 2 offers and accept 1 of them. A weak candidate might apply for 50 roles. If those are the only two people sending out resumes (or, in general, if the pool has 50% strong candidates and 50% weak candidates), still only 6% of resumes will come from strong candidates.
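
(The arithmetic, spelled out - a quick sketch in Python with those made-up numbers:)

    # 1 strong candidate sends 3 resumes; 1 weak candidate sends 50.
    strong, weak = 3, 50
    print(strong / (strong + weak))  # ~0.057, i.e. about 6% of resumes are from the strong candidate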

That's not none. And I really hear you about how frustrating it is when companies don't even bother to reply - I mean, that's pretty rude. But ... what behaviour do you expect? What would you do with a pile of 100 resumes if you expected only 6 of them to be strong candidates? Should they bring all 100 people in for interviews, just in case there's a young John Carmack amongst them who has terrible resume-writing skills? (I've interviewed 2 people who fit that description out of the 400 or so I interviewed in the last year. They definitely exist. But finding those people is prohibitively expensive for most companies.)

Flawed? Yes. Biased? A little. Prejudiced? That seems like a stretch. Can you think of a better system? That conversation seems more interesting than just complaining about it.


What I extract from your reply is what I am actively trying to teach myself: DON'T. TAKE. ANYTHING. PERSONALLY.

Yeah?

BTW I am not exactly young -- 40 y/o with 18 years of professional experience (not claiming anything about quality). I was just objecting to your general premise that if somebody isn't snatched up immediately then they must be mediocre, because I've witnessed programmers many times better than me (whose sole efforts turned entire companies around) slog around jobless for 6 months, unable to move beyond the 2nd interview even when EVERYBODY told them they liked their expertise and demeanour and that they were a good cultural fit.

But you are very likely correct that it's a numbers game and that various circumstances prevent companies from actually actively looking for the gems.

So again, I do my best not to take anything personally.


> you are very likely correct that it's a numbers game

On the surface it's a numbers game. But 2/3rds of getting hired is a confidence game.

https://en.wikipedia.org/wiki/Confidence_trick

See also

https://en.wikipedia.org/wiki/Affinity_fraud

Some programmers are really good at what they do but they just suck at playing the game. Or more likely they suck at it worse than the average hiring manager.


Interesting reads, thank you.

As for confidence, I guess I should learn to fake it already. I am a realistic, down-to-earth guy who doesn't deny it when he doesn't know something -- nobody can know everything. But that's likely not the point; more likely it's about projecting an image of "nothing can give me a true pause"?


It's not about faking it; think of it more like an actor playing a role.

It's a skill; some people are naturals and some aren't, but everyone can get better at it.


Imagine a scale from 1 to 10.

A 10 means "The interview process is fine. It judges people fairly and objectively, and works well for both candidates and companies".

A 1 on the scale means "The whole interview process does a disservice to almost everyone it touches, and reflects badly on our industry as a whole"

Where do you think we are?

Personally I think we're at about a 7. Which is to say, I agree with you. I've interviewed people like that and it breaks my heart every time to hear their stories. But I think there's a silent majority for whom the interview process works as intended.

- About 0.5% of people I've interviewed had amazing skills but were unable to explain those skills on a resume, and wouldn't be able to get their foot in the door at most companies they applied to

- At a guess there's probably another ~3% or so of candidates who just for one reason or another don't come across well. They don't look or sound like they know what they're talking about, or something else is going on for them. But they actually have great technical skills when you get down to it. I suspect those people struggle to find work in most normal hiring processes.

- (I have other criticisms too - like how we don't give people feedback after interviews)

But that's ... I mean, it matters, but I think the cohort of "unappreciated gems in the rough" has to be in the small single-digit percentages. There's a lot of blog posts complaining about hiring in general that hit the front page of HN, but I don't think they're a fair representation of the state of software engineering hiring across the board. Great people getting overlooked are the exception. The awful truth is that most people, most of the time, are judged fairly in job interviews based on their skills. It's just that programming is really hard and almost everyone in the world is terrible at it. It's so hard to learn that you can go into massive debt and spend years studying it in school, and still be mostly unemployable out the other end. In fact I suspect most people fresh out of school struggle to find work, because they just aren't very good yet.

So no wonder these posts get upvoted on HN. People have a really good reason to feel angry and let down by the system. The story that you're a gem and you're being passed over by the soulless corporations is a much easier pill to swallow than the idea that you're being overlooked because you aren't very good at programming. And your degree means nothing, and from here it'll take you years to get good at programming, if you ever manage it. And nobody wants to take the time to teach you on the job. And nobody thought to tell you any of this while you were in school.

I don't know whose fault it is - if anyone's. Companies are doing what's in their best interest. Schools are doing their best to teach CS to everyone who wants to learn it. People think going to school to learn programming means they can find useful work out the other end. I'd like to think that most do, eventually. But there's a chasm in the middle that nobody talks about. A chasm between knowledge and programming jobs. Many people never find their way out of that gap. We don't even tell people it's there.


Where is the evidence for any of this? Like, truly, I hear this line about how there is approximately nobody who is competent, and I honestly think it is BS. What evidence is the claim based on? Is it that you've seen how well people do in interviews and noticed that only a small number of them perform well? How would you falsify the hypothesis that those people are fully capable of being excellent employees and are just not good at playing this particular interview game?

I think people with your mindset have just repeated this over and over again to the point that we all take it as axiomatic. It also strokes our ego ("I can do something that basically nobody can even learn how to do!") so it's easy to maintain the farce.

This is the second time in this thread that I've seen the claim that the anger is driven by people who feel let down by the system. That's not me; the system has worked really well for me. But I still think it's a hall of mirrors that could use serious improvement. Not a 7, more like a 3 or 4, and held back both by people who think it works really well and by people who think it's a bad solution but that there isn't really a better one (which I'm more sympathetic to).


FWIW, I appreciate you sharing this perspective. I concur, more or less, with your position here after having interviewed many hundreds of people over my career. I've found internships were the best way to identify the quality people out of a class. It seemed like of every 100 new grads I interacted with, maybe 3 or 4 were skilled and engaged in learning more. But there are also some standouts, and I really wish more companies did internships with larger cohorts and allowed their senior folks time to mentor. Some of my best experiences in my career were mentoring interns, and I made some solid life-long relationships that way as well.

I've often posited among friends in the industry that it's partly due to apathy and mismatched expectations. College students go into CS because they expect a good-paying job, not because of any specific aptitude or interest. CS isn't the study of software engineering, nor of the act of programming. They graduate with no appreciable skill and no real desire to learn, and can't find work. We can argue about fairness until we're blue in the face, but the reality of the situation is that motivated students who are actually interested in learning have vast free resources available to them to learn, and free ways to demonstrate their competence. Anyone not taking advantage of these resources is going to be subpar compared to others in the labor market.

Pretty much anyone who went to college in the last 20 years should have been able to anecdotally identify the stand-outs in their classes, and in general, on the other side of the pipeline, those are the only folks having an easy time getting hired. I dropped out of college, even though I was one of the stand-outs, because I wasn't learning anything I hadn't already taught myself. I've had less difficulty in my career than some of my former classmates I keep up with, and my salary is multiples of theirs when they're employed, even though we're all the same age and they're technically more credentialed. It comes down to the fact that most people are just not very good at programming or at systems design. In fact, I'm not a good programmer; my skill set is very much in the arena of systems design.


> I've often posited among friends in the industry that it's partly due to apathy and mismatched expectations. College students go into CS because they expect a good paying job, not because of any specific aptitude or interest.

This is not unique to CS. I feel this is the general mindset among most people who go to college - "I get a degree and I get a job" - and it's the reason so many young people come out of college with a 4-year degree and no job.

Just getting a degree doesn't mean you are smart, curious, or driven. I have friends who got liberal arts degrees who are smart, curious, and driven and went on to have great careers and friends with business degrees who don't care and are unsurprisingly "underemployed" by their degree, but not by their personality and the way they live their life.


Or what if I’m being picky? I made a mistake of expediency in accepting my current job, and it’s not the first time. Several people I trust have been trying to talk me into applying to more places and turning them down if I don’t see what I’m looking for on their end. But companies tend to overbook the interview schedule, and one of the first things to go is your opportunity to ask questions.

Which is, really, as good a way to learn about someone as most of the techniques we try. You only have ten or fifteen minutes to ask the company questions. What did you think was worthwhile to spend that time on?


Not sure what part of my comment you replied to exactly?

I also am kind of picky although I am not sure how wise that is in the current crisis.


Yeah that's a fair question.

I think it depends on the context. To be picky, you could just send out a lot of resumes, and I as a prospective employer wouldn't have any clue how many you've sent, as long as you are still at your old company the whole time.

However if you have a good emergency fund, your old company folds, and you are being picky, people start to suspect you are unemployable after a bit.


> When you apply for a company, they have to assume you aren't very good, because most people who apply aren't very good.

Is this a problem for other professionals? Why not? When an accountant with a successful career spanning 15 years applies for a new job, does the company "have to" assume they know exactly nothing?

It does distort the market. It weighs against the desire to switch jobs, which reduces the pool of willing job switchers. When I get an email from a recruiter, I'm not just thinking "would that job be better for me?", I'm also thinking "am I willing to practice and perform the whiteboard code ritual right now?".


> A recruiter I talked to a few years ago said she thinks it's crazy we don't use an agency model for programmers like actors do. The idea there is that good programmers pay a small percentage of their salary to a manager, whose job is to find you the best roles that suit your skills and negotiate pay on your behalf and so on.

Such things do exist (at least in some form) but I haven't had a good experience. Same old story: like everyone else, they're looking for seniors, turning down juniors, and complaining about shortage of talent.


This is the issue I have faced, and probably why I am leaving this whole mess behind. I still love engineering, but it's time to move to a mentor/consulting role. I've been in this game for over 10 years, and just can't be arsed to do any more testing. I once had someone ask me what a "cookie" was...

Storytime: I used to work at a huge Japanese multinational (you can probably guess which one), and I rose through the ranks pretty quickly. After two years (which, given their turnover, made me pretty ancient), I left the firm for greener pastures.

A few years later, they started a new sub-division, and a personal contact recommended me. It was a decent pay bump, and despite the negative press, I hadn't minded working there during my tenure. I figured I'd be a shoo-in. It was for a non-technical role (PM), but they wanted someone with a tech background, which I had. They then wanted to give me an online engineering test, just as a formality. It was a quick test, and I solved it with little issue.

Fast forward 2 weeks and I get a rejection email saying that my code wasn't up to snuff and they'd be moving on. It took everything in me not to write back saying, "Hey, go to your main service and log in. Ok, yeah, so the interface and client backend to your SSO and the user management services that millions of your users interact with each day -- I wrote all of that. Oh yeah, your absurdly complicated coupon system -- I wrote a lot, if not all, of that too."


Another observation: I don't think there is a single company I've successfully worked at in the past, including my current company, where I could pass their interview process today. Companies are being extraordinarily picky now, and the interview bar just keeps going up and up.

Companies keep complaining that they can't find talent, while their interviewers are asking candidates to implement red-black trees on a whiteboard in 5 minutes.


"It took everything in me to not write back saying..."

Sometimes it is worth writing back. Because people often don't realize there are flaws in the process they are using. Either they will ignore the feedback and it's business as usual or they will take it and work towards improving the system.


If the whiteboard warriors were good at system design, Google would be the top video-conferencing tool over Zoom.


> working at a company in the hiring space. I've done over 400 technical interviews

TripleByte?


Sorry but if I wanted to be clear about where I work, I would have done so.


definitely TripleByte


> I've done over 400 technical interviews in the last year alone

Have you gotten any actual work done in the last year?

> in the hiring space.

Oh wait. This is your job.


A few things I do:

a) Open time. I give them a couple of problems in the morning, and they can take as long as they want to think about them before the interviews. This takes away the 'deer in headlights' situation that so often happens to people - all of us. I'm good on my feet, but 1 time out of 3 I just don't see it immediately. But if I have some stress-free time, I usually do.

b) Don't test their knowledge of algorithms. Who cares if they've practiced certain things a lot? Doing homework is a small measure of what you want. I give them fairly basic problems that have nice side questions to ask, and we just have a discussion.

c) I try to test for knowledge where they should have it: if they've been on tools & build or dev ops, they should have a good grasp of things there. Another way to say it: you're looking for strengths, not weaknesses. I keep an open mind and think 'how can we leverage this person's talent?' Maybe they're not good over here, but better over there.

d) Fairly simple code, idiomatic, etc. - not rocket science.

e) Mature communicator, and by that I almost literally mean not too crazy. I think most of us are 'abnormal' and that being a 'decent communicator' is a little bit rare. Companies are full of weird dynamics; people have to be a little bit resilient.


Unfortunately, I don’t think there’s much data around hiring outcomes because companies really don’t have a reason to share that information. For a lot of people, it comes down to personal experiences.

I’ve definitely interviewed with companies where I was put in really unnatural situations that I struggled to shine in. For example, I bombed an interview in which I had to do a live coding challenge where I had to talk through what I was thinking while coding, which the interviewer insisted on so they could understand “how I think”.

I thought it was an incredibly cumbersome and useless request because I never talk when I code, and coding in front of others in an interview is tough enough as it is. I asked if we could discuss the solution once I completed the problem, but they declined that idea.

Needless to say, I didn’t get the job, and the assessment felt like a huge waste of time. I think a lot of people see companies doing stupid stuff like this and get frustrated, because candidates are often capable but are placed in high-pressure situations that don’t enable the abstract, creative thinking that programming requires.


> For example, I bombed an interview in which I had to do a live coding challenge where I had to talk through what I was thinking while coding, which the interviewer insisted on so they could understand “how I think”.

FWIW, there is limited time to collect data about a candidate in the context of an interview, and being able to observe a candidate’s thought process can help a lot with that. Obviously, this biases hiring towards candidates that are more observable, all else being equal.


Not sure where you interviewed - maybe it's not relevant - but in a lot of jobs you will not always be in a position to do the work 100% isolated and by yourself; sometimes there will be other people in the room.


And most of them will not be constantly and actively evaluating your skill.


It varies among the FAANGs, but in my experience Google was the worst with the algorithms-above-all approach. I actually had one interviewer cut me off after about a minute when I was describing an interesting problem I worked on, saying “yeah ok that’s great, let’s get on to the algorithm question.”

I’m sure it varies between interviewers, but the lineup I had at Google clearly didn’t want to dedicate more than a minute or two to non-algorithm questioning.


It's because background/experience questions are useless.

Whenever hiring comes up on reddit or HN, I see a lot of posts that can only come from people who haven't been an interviewer much.

First rule of hiring: most of the people you talk to want your money. The candidate is selling themselves to you. They know firing people is hard and there are almost never consequences for misleading interviewers.

What happens when people are asked to talk about their prior projects, experience or really anything that isn't a highly controlled and repeatable coding question?

1. They pass off things they saw or read as their own experience.

2. They claim a team's accomplishments as their own.

3. They massively exaggerate or try to BS you in other ways.

4. They give uselessly vague answers, not necessarily deliberately.

Maybe you don't do these things, but back when I bothered asking these sorts of questions I did encounter such answers pretty frequently.

Companies have converged on live coding because that's something concrete, real, and largely un-bullshittable. Yeah, toy programs in interviews aren't "real" programming, but it's a lot closer to real than listening to someone ramble in a disorganised way about "their" previous project, only to realise at the end that you still don't know what that person actually did and what was done by others.


Your recruiter was rating you on your performance on the algorithm question; cutting you off was actually them trying to help you spend enough time on it.


Good point. Maybe I'm unusually generous, but when I'm in the interviewer seat, I generally try to steer candidates into territory where they can succeed. I want to give that "hire" recommendation! But I need to pull enough signal from the noise to be able to support that recommendation, and ultimately I need to measure that signal using the measuring stick defined by my employer. If I didn't need to conform to that measuring stick, I'd just do interviews that were casual conversations about projects. But I need to ask certain types of questions to measure certain dimensions.

The "tell me a little about an interesting problem" question is usually there to gauge your ability to summarize a cool problem and solution down to to an appropriately sized story. Sometimes when I ask this question, candidates just launch into stream-of-consciousness expositions, talking just-in-time as the details come to their brains. At the 2 minute mark, I start thinking come on, don't do this to yourself. At 5 minutes, I will gently try to hint to the candidate to try to summarize and wrap it up. Some people just try to fill every silence with words and I have to finally firmly yank them back and cut it off. I hate to have to do it, but if the candidate can't time-manage the answer, I have to time-manage the questions.


I get that, but it just made it obvious that nothing else mattered but the whiteboard (or in this case, luckily, a Chromebook with a text editor).


Is Google having trouble finding qualified engineers though? If so, what is evidence of this?


I’m sure they aren’t for the most part. They have so many people trying to get in that they can afford to have a lot of false negatives though.


It was for your own benefit. The intro is just to get you relaxed; most interviewers don't even consider that part in the evaluation. So the less time spent there, the more time you can spend on the algo question.


It's counterproductive, though, if the interviewee is cut off and gets more stressed out.


I would say the issues with interviews are

* they're not cheap (although not that expensive, all things considered), assuming you spend, say, 10 person-hours on two calls and an on-site.

* they produce false negatives. on-the-spot, exam-style algorithm interviews don't reflect day-to-day work, and they filter out people who might be great at the job but aren't used to the specific type of stress that interview creates.

* they can produce false positives. failure to check for interpersonal skills in leet koders can leave you wishing you had never hired them.

* they're painful for the candidate. ideally your role is the best fit for a candidate. people only have so much emotional and logistical bandwidth. they may take an offer after interviewing with a handful of companies before they ever find you.

now, I don't have numbers for those last three points. I don't know if anyone does. but we can look at most hiring processes and see that these problems exist; we just can't tell if they're big enough to consider the process "broken".


> we can look at most hiring processes and see that they are a problem

I’m not seeing a compelling problem with the outcome. The question is: are companies not able to hire good people? And: are good people not able to find jobs?

Interviews are also expensive for a company. And I don’t really see a clear or compelling argument that interviews somehow cost the candidate money. How much money? If the candidate doesn’t have a job, then what exactly is the money or time cost to the candidate? Doesn’t the eventual job with income outweigh all interviewing costs? Isn’t the cost somewhat irrelevant if it’s not egregious, and there’s really no choice if you want a job? Isn’t the alternative, choosing not to interview, potentially way more expensive?

If you’re going to frame it as expensive, I think it’s more than fair to counter with you need to average the cost of the interview over the time period that you’re employed. If you interview for 10 hours, and you work there for 5 years, the cost of the interview is 20 seconds per day.
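
(A quick back-of-envelope check of that arithmetic, sketched in Python; the 10 hours and 5 years are just the round figures from above:)

    # Amortize a one-time interview cost over the length of the job.
    interview_seconds = 10 * 3600        # 10 hours of interviewing
    tenure_days = 5 * 365                # roughly 5 years in the role
    print(interview_seconds / tenure_days)  # -> 19.7..., i.e. ~20 seconds/day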

> they’re painful for the candidate

I agree that interviews have a time cost and that candidates have a finite budget. That means someone looking for a job should invest wisely: figure out how & when to decline an interview, and study companies before interviewing.

This is subjective, and it’s not a solvable “hiring problem”. This is something the candidate has to fix for themselves. Nobody is going to give you a six figure salary and ask you to work with them for many years without vetting you. It will always take time, and it will always be somewhat uncomfortable and emotionally draining. Some people prepare for interviews, and some people like to interview. If you view it as a skill, one that you can learn and improve at, maybe it will become less painful for you. But don’t expect that companies are ever going to do that for you.


I think we all like to complain when we don't get the job we want.

It doesn't matter what the process is; some form of song and dance is needed.


I think the thing people are "on about" is that we have become so cynical about the current status quo, that there's no oxygen left to have a real conversation about what constitutes a true "false negative".

It would require some sort of longer-term data gathering around the rejection / ghosting process.

For example, Google claims to be very objective and data-driven about their hiring process, but once a candidate is rejected there is no further data.

How would one solve such a problem?

If we agree that it's just window dressing around a "song and dance" (I prefer to call it "the rituals of gate keeping"), there is no problem to be solved.

But I still feel like giving in to cynicism is the wrong answer.


It's a really interesting point you raise. Road not taken, etc. I wonder if google quantifies the impact of hires over the long term. How productive are you? How much does your contribution make the company, and if you weren't here, would google notice?

If google could know when it was about to deny a speculative false negative, it could hire them exclusively into a part of the company where everyone is a speculative false negative. Then you could just watch them compete with those who passed and try to measure something objective.


I’ve been rejected from Google, and a year later they reached back out asking if I wanted to try again. The recruiter even told me over the phone that most people they hire these days have already been rejected at least once before. So they openly acknowledge that a lot of times their best bet is to get people back in the door who they know were close. I’m curious how they determine the timing. Why wait a year vs 6 months vs 1 month?


Similar situation; I was rejected around 2007-ish. Many years later when they reached out the recruiter told me that they changed their process and they believed their newer process was bringing in better hires.

For what it's worth: Someone who works well in an established company isn't always the right person to hire in a young company.


Would you put more weight on complaints despite the complainer getting the desired outcome? I got the job I wanted and I still think it's a terrible system.


Once you're inside the company there is data about your performance.

My point is that there's no control group and therefore no way to test the process in an empirical way and possibly improve it.

But the idea of finding a way to measure the performance of people who didn't get hired seems ludicrous, so we're back to being stuck in the status quo.

That's what I'm on about. I would love to see an article that talks about this rather than the screeds and diatribes that show up on HN so often.


The trouble with the idea of false negatives in hiring is that you can't really define what they are.

Every job is likely to legitimately reject good candidates.

Most hiring is for a specific position. If you get 10 viable, or even excellent candidates, you can often still only hire one of them.

The one that gets it will take the job in their own direction. Almost by definition, the winning candidate is therefore the best fit.

Whatever roles the other candidates end up in will go in their own directions. Even if, a year on, you can objectively compare the performance of the one you hired against the one you didn't, their performance was within the context of the role they did win. They may not have been as good a fit for the role you didn't offer them.


If the interview sucks I decline before an offer is made. Why work for someone with a bad interview?


Yeah this approach doesn't make sense in my experience. The quality of my jobs has not been at all correlated with my opinion of the interviews to get those jobs. I think one of the reasons the system still lumbers along is that it's still a good trade to go through the hazing in order to get a well compensated and challenging job. But I think it's still worth expressing dissatisfaction with the system in hopes of seeing improvements.


I think the real issue is that people assume that the success of the company depends on the quality of the employees. This isn’t true; what dominates is the match between consumer and product. And you simply can’t predict the market that well; that’s the benefit of capitalism.


In software, the product is built by the employees. It would be weird if the quality of the product (abstractly and in product-market fit) were entirely divorced from the quality of the employees.

I see it slightly differently; I think that 95% of the success of companies is driven by key moments or decisions by 5% or less of the work. (That’s usually by a similarly small slice of the workers as well.)



