As someone who conducts interviews frequently on behalf of Google, I'm in a position to categorically reject your claim that someone with {PRESTIGIOUS_COMPANY_X || INDUSTRY_REPUTATION} is given an easier path. Please note that I haven't interviewed anyone with 30+ years of experience (yet) and I only interview candidates for SWE roles (not Research Scientist roles), so view my experience through that lens, though I still think it's applicable to the majority of the crowd here on HN.
Everyone (regardless of background) has to sing for their supper, so to speak. Sure, having a good pedigree can make it easier to get scheduled for an interview, but that in no way means you're slated to have interviews with people who will handle you with kid gloves. On the contrary, the whole process is designed to squeeze out bias and cover a large amount of breadth. Interviewers have questions they've calibrated themselves on, and if they get picked for the loop they usually just ask those questions. This is true from the phone screen all the way through the on-site.
You may feel that established engineers cross-pollinating from other competitive companies don't have to jump through as many hoops but that's only because they've been through this rodeo enough to know how to prepare for it well. Many are hobbyist competitive programmers and people who build stuff on the side so the interview questions don't blindside them completely.
I wasn't implying that the interview was easier for candidates with recognizable track records. Perhaps two sigma was not restrictive enough. I was merely suggesting that if a candidate has domain expertise and it is well known, they may not be subjected to the same leetcode hoop jumping as the bulk.
Do you really want to remove bias for experienced candidates with a track record of substantial and nontrivial contributions?
> I was merely suggesting that if a candidate has domain expertise and it is well known, that they may not subjected to the same leetcode hoop jumping as the bulk.
To put it bluntly, if you are such a candidate you are expected to know your algorithms cold and to field your domain-specific questions. So if you're a well-known compiler designer you would still need to know how to wield algorithms/datastructures for parts of the interview loop and then get into detail about compilers in the domain-specific parts of the loop.
That being said, it totally depends on the position you're interviewing for. If you're interviewing for a Director/VP role on an engineering ladder, then some of the interview will need to be repurposed to getting signals about leadership and impact. That doesn't mean that you don't code at all, though; it just means that the focus/weight will shift toward other dimensions (in addition to evaluating your system design and coding skills).
> you are expected to know your algorithms cold and be expected to field your domain specific questions
...why?
You're holding candidates to high standards, fine, that's great. Compiler designers even need some deep algorithms and datastructures knowledge; you can't ensure your hash tables are thread-safe unless you know exactly what they are, how they handle collisions, and so on. All that's totally fair to ask as part of their domain knowledge.
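To make that hash-table point concrete, here's a toy sketch (hypothetical names, not any production design) of the kind of thing a candidate would be expected to reason about: separate chaining for collisions, plus a single coarse lock for thread safety. Real concurrent maps use striped or lock-free schemes, so treat this purely as an illustration of the concepts.

```python
import threading

class LockedHashMap:
    """Toy chained hash map guarded by one coarse-grained lock.
    Illustration only -- production concurrent maps use finer-grained
    locking (striping) or lock-free techniques."""

    def __init__(self, buckets=16):
        self._buckets = [[] for _ in range(buckets)]
        self._lock = threading.Lock()  # one lock serializes all access

    def put(self, key, value):
        with self._lock:
            # Collisions: different keys can hash to the same bucket,
            # so each bucket holds a list (chain) of (key, value) pairs.
            bucket = self._buckets[hash(key) % len(self._buckets)]
            for i, (k, _) in enumerate(bucket):
                if k == key:
                    bucket[i] = (key, value)  # overwrite existing key
                    return
            bucket.append((key, value))

    def get(self, key, default=None):
        with self._lock:
            for k, v in self._buckets[hash(key) % len(self._buckets)]:
                if k == key:
                    return v
            return default
```

Knowing why the coarse lock is correct but slow under contention, and what chaining costs when buckets get long, is exactly the "know how they handle collisions" depth being asked for.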
But the criticism of this sort of interviewing (and Google in particular) overwhelmingly describes interviewers judging domain experts on whatever algorithms puzzle they give every candidate from every domain. Why do we care whether a compiler design expert remembers how to implement Ford–Fulkerson on a whiteboard? Is there any reason to think that's predictive of talent? Might it even be anti-predictive because it favors generalists?
Google (and FAANG, and everyone else in that league) hires great people. I don't doubt that. But it was also hiring great people back when it posed random brainteasers and emphasized GPA, and Google has publicly said those things turned out to be totally uninformative. If you filter down to a list with more highly-qualified candidates than you can hire, I think there's a tendency to come up with random extra hurdles to avoid resorting to a coinflip that might work just as well.
Without getting too pedantic, it can be difficult to define what encompasses "expertise". You could have worked on some backend payment system for years and conclude that you're a payment "expert". But that means different things to different people.
Not all current "experts" are necessarily hired into corresponding roles needing the same expertise. To put it more concretely with an example, it's entirely possible for you to be a payment system expert but have a recruiter reach out to you for a general SWE role (you're welcome to decline). The slate of roles that do require said expertise is limited. If there are 10 qualified candidates, they could all technically pass the hiring bar. But if there are only 7 positions that require that exact expertise, then what happens to the other 3? Well, they're broadcast to "matching" teams that have available headcount. This "match" could be approximate, but it underscores why getting a signal beyond just the domain expertise is important.
We could of course debate whether something "advanced" should be asked, but that would require us to agree on what constitutes "advanced". In general, it's difficult to pinpoint some threshold and unanimously agree that that's the level of difficulty of generalist SWE questions that a domain expert should get asked. For what it's worth, there's additional intervention by committee(s) that look at this on a case-by-case basis to make sure that everything's on an even keel. Concretely, if you're an ML expert interviewing for a role that specifically demands ML expertise and you were asked some advanced question about data structures (e.g. something needing the Hungarian algorithm) and you're upset that you bombed it, then don't be; the committee(s) would weight that interview's score less if they expected something along the lines of Maps/Queues to be asked instead. These are all made-up examples, but I hope they convey why there's a need to extract signal beyond just core expertise. Google doesn't just artificially create hurdles for sadistic pleasure ¯\_(ツ)_/¯