>if I were going to work exclusively on the front-end
This is the important clause.
Imagine looking for a backend developer. A candidate with twenty years of experience shows up. Looking at his resume, you see the chronological buildup of terms: HTML, XHTML/DHTML, JS, jQuery, Prototype, MooTools, YUI, Ember, Knockout.js, Angular, Om.
"Well, this is an impressive resume, but we're really looking for someone with more experience on the back end."
And then you cut the interview short, because you need to finish up that letter to Congress about the talent shortage.
I don't know if there's an everyone-wins answer here, but this sort of thing is why devs go into management.
Surely one is capable of customizing the interview based on the candidate?
I'm operating under the assumption here that the candidate has already been working professionally as a front-end developer. Do you really think it's unreasonable to expect someone who's professionally doing exclusively front-end development for a minimum of 40 hours a week to do at least somewhat well on these questions?
I didn't mean to imply that one should completely rule out people who've worked in other domains. But if a full-time exclusively front-end developer is applying for a front-end position, I do not find the idea of asking questions about front-end development to be unreasonable.
For what it's worth, in your example, if I needed someone to work through some very hairy distributed computing problems and someone with 20 years of experience applied who had never once in their entire career worked with distributed computing, yes, I'm going to take that into account. Why shouldn't I? No, that doesn't mean I'm going to immediately rule them out, and I certainly don't think there's a talent shortage, but if someone else applies with 20 years of experience in distributed computing, I'm going to lean towards them initially.
I totally understand the preference for experienced candidates. I'd make the same choice all things being equal.
But I think I'd probably have trouble hiring, and I'd start to think there's a talent shortage.
What we're doing, as an industry, is eating our own seed corn. We're eager to front-run on experience and knowledge, but unwilling to sit around and let them cook. And, frankly, this is somewhat justified, as devs aren't famous for sticking around.
What we should be doing is satisficing rather than optimizing.
If you (and now I mean you you, not the indefinite) were going to work exclusively on the front-end, I highly doubt you'd know the answer to these. Hypothetical: you're fired today, I hire you tomorrow to work on my front-end stuff. Is the first thing you do to start looking up JS trivia?
No! Hell no! Because you're a responsible person. You'd work on...what needs working on. Oh, this page is loading slowly, time to learn how to use Chrome's profiler. Hmm, this div is behaving oddly, time to refresh on CSS.
And yeah, you'd get bitten by something on this list. But...so what? There's always something like that, no matter your experience level, and if you do happen to be that mythical person who's absolutely mastered a stack, surprise! We're switching (back) to Backbone tomorrow.
Would any of this make you unsuitable for the position? I don't think so.
Back to the example, yes, if you've got a hairy distributed systems problem, and someone applies with 20 years of experience in distributed systems, yes, you hire them. But if and when they call and say, "Well, Google offered me more millions than you offered me hundreds of thousands, so..." and you're left with candidates with 20 years of experience divided between COBOL, webmastering, AJAX, Rails work, and open-source contributions to Julia and Battle for Wesnoth...then yeah, sorry you didn't get the unicorn, but that second guy will be just fine. Get the distributed-systems guy in as a consultant for a week or two if you have to. If he really is that good, you might not need to hire anyone at all!
This whole thing is the result of developers overvaluing intelligence and fearing not being the expert on something; of employers buying the idea that programmers are superpeople, then treating it as a betrayal of trust when they don't know every small detail of every technology they've ever used; and of both parties' refusal to think about what they need rather than what they want. The tell-tale sign is that "hiring is hard," yet there are still technology companies around. Sure, hiring is hard, if you insist on "top 1%" people rather than merely good people. And developers are just as bad: not everyone can be paid above average.