Asana Engineering Interview Guide (asana.com)
227 points by pspeter3 on March 28, 2016 | 141 comments



Do these companies not recognize that moment of cognitive dissonance that surely -must- come when, faced with "how do we determine if someone will be productive coming here?", they decide "I know; let's ask a bunch of questions we don't expect them to know! And then, let's provide a study guide so that they -can- cram before the test and come in and show they know it!"

I mean, it's bad enough to ask questions you know don't really relate to the sorts of problems you have to solve, but to implicitly admit it, -and- to also acknowledge that it isn't something you expect people to have off the top of their head by providing a study guide? What are you trying to measure?


> Do these companies not recognize that moment of cognitive dissonance that surely -must- come when, faced with "how do we determine if someone will be productive coming here?", they decide "I know; let's ask a bunch of questions we don't expect them to know! And then, let's provide a study guide so that they -can- cram before the test and come in and show they know it!"

That's exactly how the top management consulting firms operate their recruiting process. McKinsey even has a webinar on how to prepare for their interview and exactly what categories of questions will be asked. The webinar is run by current associates.

Their recruiting works pretty darn well.


Not surprising that a bullshit-friendly recruiting process works when you hire for jobs that consist in large part of producing bullshit ;)

When the goal is to get people who actually solve problems, YMMV.

BTW, that is not a dig at management consultants. I honestly think the world needs bullshit.


> let's provide a study guide so that they -can- cram before the test

Did you look at the guide? I don't see anything you would be motivated to cram after reading through that other than to brush up on the "usual suspects" of data structures, but you should be doing that before any technical interview anyways IMO.

Seems like the guide is mostly intended to give candidates an idea of what to expect during their interview.


Yes. Under "How To Prepare" -

Read through technical interview prep resources. HackerRank and Interviewing.io are good places to start. We’re also fans of InterviewCake’s Coding Interview Tips.

I.e., "here, go study and cram for your quiz". If you expect people to know the data structures and algorithms off the top of their heads (because you believe that to be useful), then why tell them to prep?

That's what boggles my mind. Technical interviews relying on data structure and algorithmic questions started with a few companies (primarily Google) thinking it was a good way to gauge how strong a candidate was. For their purposes, it might have been. But as other companies started to cargo-cult it (despite their businesses requiring far less algorithmic work), a cottage industry of interview prep sprang up (Cracking the Coding Interview, various sites intended to help you practice, etc). And since that then raised the bar (you were now competing against people who were -explicitly preparing- for these kinds of questions, and frequently would have seen the exact question they're being asked, or one akin to it), companies such as Asana decided "Okay, to be fair to everyone, we need to warn them that we're asking these kinds of questions so they should prepare well in advance for them".

Or...you could take a good hard look to decide if that's the skillset that is really important in your hires. If it is, you can probably give them legitimate business questions (which, to be fair, Asana might do), as it will cover some of the same basics, but then giving the candidate access to Google, and to querying and digging and figuring out how best to solve it, might be a better gauge of their ability to solve those problems in the real world (unless for some reason they're going to be developing without access to technical resources and the internet). Or, you might find that your business actually requires people able to solve different problems, where implementing Fisher–Yates is not the key technical deliverable, and so you can instead ask those kinds of questions, and see who can solve them based on what they know, and what they think to ask you during the interview.
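
For the record, the Fisher–Yates in question is about five lines. A minimal Python sketch, purely for illustration:

    import random

    def fisher_yates_shuffle(items):
        # In-place modern Fisher-Yates: walk from the end, swapping each slot
        # with a uniformly random earlier (or same) slot.
        for i in range(len(items) - 1, 0, -1):
            j = random.randint(0, i)  # randint is inclusive on both ends
            items[i], items[j] = items[j], items[i]
        return items

    print(fisher_yates_shuffle(list(range(10))))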


Wow, you really read way more into that than I would have. I barely even looked at the first bullet in the "How to Prepare" section. I suspect it was just there for completeness, and I think it's basically a given that people should brush up on technical interview-type questions before going into a technical interview.

For me, the real meat of that section was in the 2nd and 3rd bullets!

The 2nd bullet is telling people to reflect on their experiences and come prepared to discuss. This may be a no-brainer for some, but for those who don't have a lot of interview experience, I suspect it's easy to get caught up with the "what should I know" mentality and forget to spend a little time reconstructing the interesting projects and situations that they've navigated in their career. For most of us, only the most recent stuff tends to be fresh in our minds.

The 3rd bullet emphasizes that you should come prepared to evaluate them. Again, perhaps a no-brainer for most experienced candidates, but it's nice to see the interviewers acknowledging that this is a two-way evaluation and encouraging candidates to be sure they're making the right choice.

While it's unusual for candidates to show up unprepared for technical questions, I've seen many candidates show up completely unprepared to discuss their experiences or ask questions that will help them evaluate the position/team/company/manager.


Yeah, I have much less of a problem with those, as they're not "cram for this test" and instead "reflect upon what you already know/wish to know, so that you're prepared to voice it". That's useful advice if you're not used to interviewing, and worth pointing out.


I couldn't have said it better myself. You really summed up how the job interview got so fucked up in our field.


> other than to brush up on the "usual suspects" of data structures, but you should be doing that before any technical interview anyways IMO.

I think that most people need to do that to go through the normal interview today, but that it's bullshit and a symptom of how shitty interviews are.


I wish they (Asana) listed some of their real problems along with a few possible solutions. A candidate could provide their own solution or recommend one of Asana's. I think this approach could lead to a nice discussion where both parties get to know each other.


Have you tried this? :) https://codefights.com/bots/botasana


I like how they slip in "Oh, all you have to do is parse HTML with regex" on the last problem.


God, I really hope they grasp the stupidity of using regular expressions to parse an irregular language. Implementing a very simple HTML parser takes at least a half-dozen man hours; seeing that question honestly makes Asana seem like amateur hour, and I've had to do some pretty stupid shit in my time. Snapchat, for example, had me solve Sudoku (???).


I'm assuming it's a reference to http://stackoverflow.com/a/1732454


For the HTML parsing: I was able to pass all the test cases using a stack (push tags when they open, pop them as they close). I did zero validation, as the challenge was to take valid HTML. (I didn't verify that the proper tag was being closed; I knew I'd get out-of-bounds errors because of this, but it should work for valid HTML.)

I was using a string builder to eat the input as I parsed it, but I got an error on submitting my result: it gave "unexpected termination" even after I had replicated the error with a custom test case and caught it. I still got "unexpected termination" when submitting. Sigh. Maybe it's related to using C# for the challenge... overall not so hard, though.
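
For the curious, a rough Python sketch of that stack approach (not my actual C# solution, and assuming well-formed input with no self-closing or void tags, as the challenge did):

    import re

    # Tokenizing tags with a regex is fine for valid input, even though
    # parsing arbitrary HTML with one isn't.
    TAG = re.compile(r'<(/?)([a-zA-Z][a-zA-Z0-9]*)[^>]*>')

    def tags_balanced(html):
        stack = []
        for closing, name in TAG.findall(html):
            if closing:
                if not stack or stack.pop() != name:
                    return False  # mismatched close; won't happen for valid HTML
            else:
                stack.append(name)
        return not stack

    print(tags_balanced("<div><p>hi</p></div>"))  # True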


> I mean, it's bad enough to ask questions you know don't really relate to the sorts of problems you have to solve

What makes you think they are asking questions they know don't really relate to the sorts of problems you have to solve?


As an interviewer at Asana, my favorite questions to ask are based on actual technical problems we have had to solve.


Yep this does not surprise me. Reading the guide it's clear to me that when you say:

> Our goal is not to simulate day-to-day software development

You don't mean "not at all"; you mean "not in its entirety" (so you are leaving out the aspects you are less interested in, like reading Stack Overflow, and emphasizing the aspects you are more interested in, like reasoning about data structures). I think a lot of people on HN are getting hung up on that sentence, honestly. Thanks for the guide, though; we're considering something similar for Justworks.


Absolutely and thanks for the advice. I think we should change that sentence. Looking forward to the one from Justworks!


This generally means asking about an obscure edge case that was only challenging because you couldn't Google it, which makes it a wildly inappropriate question for an interview.


It is not really possible to cram for these interviews. No matter how much you cram, the process will usually tease out the real you, because a group of people will evaluate you and debate your performance at the end of it.

If you got in with cramming, you very likely would have gotten in without.

Don't get me wrong: one should definitely prepare for the process, because interviews are competitive. There is no worthwhile competition on the planet that you would enter without preparation. It is almost insulting to the ecosystem if you don't prepare.

But preparation is not cramming. Depending on your level of practice, you may end up memorizing some things, but that is different from cramming to "beat the process".

I make these observations based on my experience as someone who runs a successful bootcamp for interview prep (http://interviewkickstart.com) and as an early engineer and hiring manager (Director of Engineering) at Box before that.

We have seen that candidates who prepare well, with the right mindset, are at a distinct advantage over those who don't. And those are also often the right candidates. If you are working toward an important goal, I, as an engineering manager, want you to have prepared for it. An interview is just another goal, and a job here had better be important enough to you that you prepare for it.

Hope this helps; not a popular perspective, but closer to reality in my experience.


Well, they did specify that they want people who are able and willing to learn, so it makes perfect sense.


I came looking for some new or interesting way of doing tech interviews, but it's the same typical SV bullshit. Lol: "Hashes, sets, heaps, binary trees, linked lists, all the usual suspects." Yeah, I spend most of my time fiddling with binary trees and hashes at the office, said no one ever. I'm a strong believer that there are two primary ways of testing someone's coding ability:

1) Open-source, or otherwise public, contributions.

2) Real-world take-home assignments.

I've opted out of interviewing people because it's such a miserable experience. I guess I'm just not a sadist who enjoys watching some poor 22-year-old squirm in his chair because he can't remember what the upper bound on the insert operation of a red-black tree is.

What a joke.


I agree with this sentiment wholeheartedly, but bounds on balanced binary trees are worth knowing (like lookup, it's log n), since that fact is one of just two reasons you'd ordinarily ever use a tree.

However, I will offer you some Interview Candidate Self Defense, which I'll repeat from something I yelled on Twitter a few weeks ago:

If interviewers ask you about bounds for data structures, pick a fight with them over hash table complexity. If they believe it's O(1), argue it's worst case O(n). If they want you to say worst case O(n), argue that it's in the real world always O(1). It doesn't matter which is right, because these kinds of questions aren't about technology, they're about status signaling. Be the alpha nerd!

As a bonus: interviewers are evaluating you for X-factors like "confidence". You know who never got dinged in an interview for lacking "confidence"? The person who yelled at the interviewer about hash table complexity.


Adopt an aggressive posture. Maintain eye contact and repeatedly say "amortized complexity".


The Illustrated Tech Interview Survival Guide would like a word with you.


If they say average case O(1), worst case O(n), hit them with O(log n / log log n) with high probability, from a simple balls-into-bins type argument.


Could you expand on the balls-into-bins argument here? It's not quite obvious to me, but I'd really like to understand.


Items are balls, slots in your table are bins. Hash functions are assumed to distribute them randomly (uniformly). See also the Wikipedia article on Balls-into-Bins.

https://en.wikipedia.org/wiki/Balls_into_bins

This reference from the wiki is good: http://www14.in.tum.de/personen/raab/publ/balls.pdf
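
If you'd rather convince yourself empirically, a quick simulation tracks the bound's growth rate (up to constants, and assuming an idealized uniform hash):

    import math
    import random

    def max_load(n):
        # Throw n balls into n bins uniformly; return the fullest bin's count.
        bins = [0] * n
        for _ in range(n):
            bins[random.randrange(n)] += 1
        return max(bins)

    for n in (10**3, 10**4, 10**5, 10**6):
        print(n, max_load(n), round(math.log(n) / math.log(math.log(n)), 1))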


I assume you mean this tongue-in-cheek, but someone yelling at me in an interview is probably not going to score well on the "Do I want to work with them?" X-factor.

Of course, a well-informed discussion of hash table complexity would be pretty fun to hear from a candidate. It's always great when I learn something new in an interview.


I'm like 60% tongue in cheek, because the reality is, for every interviewer who will filter you for being too aggressive, two more will filter you for lack of confidence.


I have never once filtered someone for lacking confidence. I have never seen someone filtered for lack of confidence by any panel I've been on or any panel I was the hiring manager for. At this point in my career I have interviewed ~1000+ engineers.

I think you're interviewing with the wrong people, or at the wrong places. Way too many people treat interviewing as a way of demonstrating intelligence via computer science trivial pursuit. 5 minutes with Google will demolish 99% of the questions you might ask if all you care about is getting correct code on the board. The interesting part is the process and discussion the code creates, not whether or not they know the secret handshake.


That's nice, but I have literally read interview feedback from very smart interviewers at multiple companies that vetoed candidates because "they didn't appear confident". This isn't something I'm making up: it's a real problem.


I think we can both agree that interviewing is overdue for a radical reboot. Silicon Valley, disrupt interviews, please!

That said, there are places that don't interview the way you describe. I'm equally sure there are places that do (I vividly remember my Dropbox interview). My hope has always been that Darwin's Law will apply to such places over the long run, but instead I think what actually happens is that as these organizations grow the interview process just ends up being a diluted version of the same BS, which translates into "lowering the bar." Nevermind that your bar was curvier than Lombard Street.


"Silicon Valley, disrupt interviews, please!"

They did and that's the problem ;)


This is mostly good advice -- it's worth knowing both worst case and amortized average time -- but plenty of interviewers will filter you if you're being an aggressive asshole, even if you're right about facts.

Make sure you're considering what the workday environment is like (something you can usually tell from the prep materials or discussions with recruiters) and acting the way you think you'd act in that environment.


>If they believe it's O(1), argue it's worst case O(n)

In the case that all your keys have the same hash, right?
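
You can demo that case in a few lines of Python; a sketch (exact CPython timings will vary):

    import time

    class SameHash:
        # Every instance lands in the same bucket, so dict inserts degrade
        # from O(1) to O(n) each -- O(n^2) in total.
        def __init__(self, x):
            self.x = x
        def __hash__(self):
            return 42
        def __eq__(self, other):
            return self.x == other.x

    for n in (500, 1000, 2000):
        keys = [SameHash(i) for i in range(n)]
        start = time.perf_counter()
        d = {k: None for k in keys}
        print(n, round(time.perf_counter() - start, 3))  # roughly 4x per doubling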


It seems to me this would entirely depend on the implementation and how collisions are handled.


In Java 8, collisions in Maps are stored in a tree structure so worst case is O(log n)


Is it a balanced tree? If not, worst case could still be O(n).


Haha, that would be an odd set of data indeed. I have no idea how they implement it but I hope I have made a point (I feel I have). I wouldn't be too impressed with an interviewer or interviewee arguing one side or the other about the complexity of "unspecified hashmap implementation".

Too clever by half.


Buckets. :)


Right!


On the flip side, Asana clearly indicates that they don't test candidates on day-to-day development. This is good, because now I know I would never work for Asana.

Those interested in non-applicable CS problems would probably love Asana. Although I'm not sure how any of this relates to a pretty basic project management app.


Yeah, exactly! It's like fucking insane how these companies hype themselves up. All you are is a glorified to-do list, calm down, Asana.


It is a very, very fancy to do list though.

To be honest, it was more usable when it was just released. The simplicity was one of the strong points.


> This is good, because now I know I would never work for Asana.

I've actually used their product, and I'm in the same boat as you because of it.


Thanks for a dose of sanity. After a round of SV interviews this past week, I've vowed to never put myself through something like that ever again.


I agree. I've been lucky to not have been asked any questions like these here in NYC—until now. The only reason why I am still going through it is because I really like the company and people, and they did ask me fair, on-topic questions in our first interview. But yeah, studying trees when I could be doing open source sucks.


Don't get me wrong. I don't like the way interviews are conducted much, especially the "trying to trip you up" part and the focus on completely artificial problems.

But:

> Lol: "Hashes, sets, heaps, binary trees, linked lists, all the usual suspects." Yeah, I spend most of my time fiddling with binary trees and hashes at the office, said no one ever.

Actually, I spend a lot of time with stuff like that. And I'm certainly not alone. And a lot of the problems I deal with are cases where people haven't thought about algorithmic issues (complexity, constant factors like latency, cache efficiency, ...).

Now you can certainly argue that that's not the case for most SV jobs. And that you can re-train a lot of that on the job. And I'd agree to a good degree. But I think ridiculing the importance of such topics just mirrors the mistake of the interviewing side.


> Yeah, I spend most of my time fiddling with binary trees and hashes at the office, said no one ever.

There are a lot of people agreeing with you so I guess I'm just the outlier here but I don't see why you would not want people to know about hashmaps/binary trees. Sure we don't normally have to implement them ourselves but I imagine most people use them all the time.

Balanced binary trees are often used to implement Maps/Sets, so by knowing the time/space complexity of operations on them, you know how they behave. Knowing those numbers off the top of your head also makes it easy to compare structures and choose the optimal one for whatever it is you are doing.
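
To make the comparison concrete, a crude illustration (not a rigorous benchmark):

    import timeit

    n = 100_000
    as_list = list(range(n))
    as_set = set(as_list)

    # Membership in a list is an O(n) scan; in a set it's an expected O(1)
    # hash lookup. The gap is hard to miss even in a toy test.
    print(timeit.timeit(lambda: (n - 1) in as_list, number=100))
    print(timeit.timeit(lambda: (n - 1) in as_set, number=100))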

I agree that open source or take home assignments are great for seeing how cleanly someone can code and perhaps this is not emphasized enough with our current interview "culture" but I don't think the other extreme of completely ignoring algorithms is a good idea either.


> Balanced binary trees are often used to implement Maps/Sets so by knowing what the time/space complexity of operations on them you should know how they behave. Knowing them off the top of your head also makes it easy to compare them to choose the optimal data structure for whatever it is you are doing.

I don't see any virtue in knowing in-depth complexity analysis of any data structure off the top of your head. Sure, it's probably important to know that random access is faster in a hash map than a linked list, but generally questions are much more algorithmically in-depth. With that said, I know many people that would disagree with me and I think they are categorically wrong. Writing code is about building robust and scalable things. This involves planning and execution. Frankly, I don't think "tricky" algorithmic interview questions address either of those two.

And whatever, obviously to get a six-figure job (as I'm sure most people in this thread have), you need to drink the Kool-aid. But I'm not going to sit here and perpetuate this nonsense. Like I said, I don't even like interviewing people because it's expected of me to find annoying little logic or algorithmic puzzles that some poor guy or girl will struggle to solve.


I'm not even sure we are agreeing or disagreeing anymore. I don't like tricky algorithm questions. Well that's a lie, I like them but I don't think they are appropriate for interviews.

On the other hand, I don't feel like knowing the time/space complexity of data structures you would normally encounter in day to day coding to be "tricky". I wouldn't expect people to know about suffix trees or segment trees for example.

> Writing code is about building robust and scalable things. This involves planning and execution.

I completely agree! But I would argue that's why it's important to know your common data structures because otherwise how can you plan? I do find most interviews awful because of the time pressure which is completely unrealistic. On the other hand, if I'm looking at code my coworker wrote and ask them why they decided to use an array instead of a linked list, I would hope they have an answer.


> “We want every candidate who comes into an interview with us to feel prepared and confident.”

Was expecting to then see a list of specific resources that a candidate could use to become prepared and confident. Instead, the guide provides a high-level overview of "algorithms", "data structures", and "modeling" and refers me to "HackerRank and Interviewing.io" for tips on how to prepare.

This seems like it could be improved. You value learning -- so give candidates something to learn and see if they can do it.

> Our goal is not to simulate day-to-day software development — where we read docs and write lots of tests!

I feel very confident in doing day-to-day software development. I do not feel as confident speaking about abstract hypotheticals in front of a whiteboard.


Hey -- I work at Asana.

Great idea! Took a task to add more learning resources to the doc.

> I feel very confident in doing day-to-day software development. I do not feel as confident speaking about abstract hypotheticals in front of a whiteboard.

Hmm, the way we framed that is confusing. Thanks for the feedback! To be sure, our goal is to accurately assess what your day-to-day contributions would be like, and simulation is a really good way of eliminating accidental bias in that process. To that end, we do have every engineer submit actual code during the interview process, and we generally avoid white-board programming in favor of higher-level discussions (that may also use the whiteboard). But a perfect simulation isn't practical (it would be far too time-consuming), so we've tried to focus on the higher-order bits.


What's your primary tech stack?


We're in the middle of rewriting our stack to look like this:

- TypeScript web code

- Using React

- Connecting to a custom, to-be-opensourced reactive datastore written in Scala called LunaDb: https://blog.asana.com/2015/05/the-evolution-of-asanas-luna-...

- Using Amazon RDS (hosted MySQL) and Redis as our primary backing stores

- Running on top of Kubernetes, Docker, and AWS


I recently interviewed with Asana, and unfortunately would review their process as the opposite of what they're going for.

The first sentence is not far off from what their process is. My interviews with them were composed of one helpful database problem, after which they explained how they designed their production database, followed by a bunch of frivolous algorithms problems.

I did not receive an offer from them, so take what I say with a grain of salt. Asana seemed like a company that has some great engineers, and runs well in general. I would, however, like to provide some constructive criticism for their interview process.

1. Their engineers can use some training on how to interview. Several of them came into the room noticeably anxious. Hearing what the problems were was easy, but communication was very difficult when it came to figuring out what kind of answer they were actually looking for.

2. The questions they use are very academic, which to me means they are mental exercises lacking practical grounding. As an interviewee, it's very uninspiring to have to solve academic problems without getting a peek into the business's real problems.

3. The whole process wasted way too much time. Obviously I wouldn't have cared as much if they had given an offer, but I think it's important to be conscious of this nonetheless. My suggestion would be to pick topics for interviewers to cover, and cut out the repetition of the problems. Plan a process that will lead to a decision with one shorter visit; be respectful of people's time.

I hope that helps. I received more offers I'm excited about, and I hope to see the company succeed from the outside.


The funny thing is that Asana is possibly the worst performing web app out of all the ones I use on a regular basis. I have a strong suspicion there's at least one or two O(n^2) algorithms hiding there that could use the same kind of attention and reasoning abilities they expect from their candidates.


Yes! Honestly, they need to hire someone to fix the start up time of their app, because it's the slowest by far of any webapp that I use. That should be their first hiring priority.


It is one of our biggest hiring priorities and we are actively working on this.


Why not ask questions about that, or ask for solutions to it, in the hiring and interview process? You might just find folks who love, and know how to solve, that kind of problem coming to work for you.

BTW, that is a fairly trivial problem to solve for those who have done it before.


First, we're actively working on improving page load times.

Second, having discovered issues with algorithmic complexity makes me question claims that they are not important for day-to-day work. What do you think?


Algorithmic complexity is certainly important. Intuition and knowledge of (or ability to research) the pros and cons of existing algorithms are essential. However, I don't think you need to see a single line of code (on a whiteboard or not) in order to test for that.


I agree


> We’ll ask you to solve some coding questions in a language and text editor of your choice. Feel free to bring your own laptop, or we’ll be happy to provide one. Notably, we ask candidates not to compile or run their code during this exercise, and not to refer to online resources.

Every time interviewing comes up, I have this funny feeling about letting people look things up vs. whiteboarding them out. Reading this finally broke something loose:

Coding in interviews should be as hard as the hardest coding you'll do on the job. Heck, anything in the interview should be about how you do things when the job is hard.

Now, this doesn't mean under stress. That may come naturally. But the difference between good and great engineers is decided when the task is hard and information is sparse. Asana is explicitly selecting for people who are going to reason about things from first principles and try to get it right the first time. I admire that they're owning up to that.

They'll pass over people who are exceptionally good at searching / synthesizing information. They'll ding people who focus on tightening the iteration time as a way of learning faster. But that's OK. They've probably built a culture where that kind of person thrives, and they're trying to find more engineers who will thrive at Asana.


No, they shouldn't be as hard as the hardest thing you'll do on the job, because what is hard on the job has nothing to do with the real issue with an interview: a 30-60 minute window between problem statement and solution.

I have been programming professionally for about two decades. Algorithms? Easy, there are a lot of papers, and The Art of Computer Programming, for that. Designing something completely novel? I'll be looking at it for a long time before I can trust it. Performance optimizations? I'll count cycles if I have to. A little bit of time fixes all of it.

So what is actually hard about programming? Making sure my system is observable, so that when there is a problem, I know what is going on within seconds, if not minutes. Building a system that self heals. Making API decisions that I will not rue in 6 months to a year. Helping a coworker figure out a more sensible architecture without turning it into a pissing contest, or having to pull rank.

How can I actually show ANY of the hardest things I'll do on the job in that short amount of time? I can't. In practice, I get my best salaries, where I am the most valued, either the second time around at a place, or when someone on the inside or well-known third-party references tell the prospective employer what I can actually do.

I have worked with amazing people that I'd never refer to a company that does this Google-derived interview style, because what really matters in that interview is, first, ten-second likeability; second, reading the cues for what the interviewer thinks of what you are saying, so you can change course if they are not liking you; and third, the ability to think on your feet fast enough while you are getting the other two things done. Someone who takes six hours looking at a problem but comes back with an amazing, production-grade solution every time is not welcome there, because they will always get culled.

So the problem with the interviews Asana is describing is that they can cull some people, but there is no proof whatsoever that anything they are doing correlates at all with the things that really make a good developer. This is great, though, as anyone using a better interviewing system will be able to get far better developers on the cheap, while all these SV companies that copy each other chase the same style of candidate.


> That may come naturally. But the difference between good and great engineers is decided when the task is hard and information is sparse. Asana is explicitly selecting for people who are going to reason about things from first principles and try to get it right the first time. I admire that they're owning up to that.

> They'll pass over people who are exceptionally good at searching / synthesizing information.

I also admire that they are upfront about this, but the attitude in general rubs my classical engineering self the wrong way.

"[R]eason[ing] about things from first principles" is not the same thing as having those first principles memorized. That was one thing my undergrad taught me repeatedly. Engineering isn't about memorizing formulas, it is about learning what those formulas mean, how to use those formulas, and when to apply them to a task at hand. Software engineering should be no different. Expecting people to solve hard and novel problems with no resources whatsoever is ludicrous.


Much of the point of "first principles" is that there aren't very many of them. If you have any trouble at all remembering them, you can be fairly certain that you have not trimmed enough.


There is no such thing as "reasoning from first principles" for most of the engineering situations that can be described in an interview, hard or not, because everything is based on tradeoffs and very, very specific scenarios.

I will reason from first principles if the interviewer provides me with 200 pages of detail on the budget, timeframe, use cases, target users, and all deployment constraints. Otherwise we all work based on assumptions and the experience we've gained, even if the interviewer doesn't want to acknowledge that.


But which is more useful? On a day to day basis, programmers will usually have access to docs, Google and Stack Overflow, so surely it makes sense to select for people who are good in that situation rather than those who excel in the more unusual situation of having no information.


Ah, got to love the subscription pop-up with a close button that doesn't work. A timeless web design principle.

Aside from that, when will we get past the requirement for data structure knowledge for positions that center around web design? Data structure knowledge is available at my fingertips throughout the work day. Why must I memorize implementation boilerplate for an interview?


I tried interviewing with Asana and they rejected me on the basis that I didn't have a computer science degree. For the record, I've built apps that have generated millions in revenue and I have many years of experience. If they are trying to be meritocratic and community friendly with this, they should change that aspect of their culture to reflect that.


Can you blame them? That CS degree provides essential knowledge when it comes to solving some of the world's most complex problems: to-do lists.


I just accepted a position at Asana, and I don't have a CS degree. Are you sure you're correctly attributing your non-offer here? (Note I don't call it a failure. Hiring is a weighted RNG -- false negatives are common.)


Remember also that Steve Jobs, Larry Ellison and Bill Gates did not have computer science degrees. They need to keep people of that calibre out of their ranks - you know, the non achievers who don't understand computers.


Just the fact that you've built (worked on?) apps that generated millions in revenue is not a measure of compatibility/competence, nor is experience.

Might be different if they actually said they didn't want you because of a lack of CompSci degree though.


From the guide: "we ask candidates not to compile or run their code during this exercise, and not to refer to online resources"

Seriously? You don't allow referring to online resources? Do you also unplug the internet connection for all of your existing programmers so they don't refer to online resources?

The whole point of the interview is to see how you would perform at your job in the first place. Creating an artificial job for the interview, where you swap words in an array or solve some other stupid quiz, plus disconnecting you from online resources, is just plain stupid in my opinion.


The very next sentence there is "Our goal is not to simulate day-to-day software development", so no, they probably do not unplug the internet connection for the existing programmers.


Why don't they ask them to play soccer in the interview, or see how fast a person can run, if they don't care whether the candidate will perform well in their day-to-day software development?


I did an extensive set of interviews with Asana and did not enjoy the experience. After an initial screen, I was required to complete a fairly extensive take-home data science problem that involved querying an AWS-hosted database, fitting a predictive model on that test data, then writing a report summarizing which factors in my model were most important and what else I might do.

Considering that (a) some of the data in the database was erroneous, but when I alerted them about it they took a long time to respond and I had to just hack a workaround, and (b) it was a huge time sink in an otherwise busy week for me, I was very proud of my solution.

It was a simple random forest model, with no exploration of optimizing the parameters, and some simple entropy metrics for feature importance. I wrote nicely encapsulated SQLAlchemy for the database side, then modular sklearn and pandas code for the model fits and plots, including calibration curves in addition to accuracy. I put it all together into an 8-page TeX write-up with plots.

I got virtually no feedback; then, after a long time, I was asked to do a technical phone interview reviewing my submission, with two data scientists simultaneously.

They barely asked me anything about my code design. They asked a few questions about my choice of accuracy measures and entropy scores, but it was extremely hard to understand what they wanted to hear (why do companies still think that a multi-party phone call is a good use of time??).

It was a major instance of "guess the teacher's password" which was a huge turnoff for me.

Then they asked very vague, high-level questions about designing a news-feed-like interface, and how might you determine which articles are newsworthy on an individual basis. Again, super vague. I threw out all kinds of ideas about seasonality, correlation to major events, properties of your network, ... But nothing seemed interesting to them at all. They clearly wanted someone to recite some well-known stuff Facebook already used, but none of that was part of the job ad or even related at all. It was bizarre.

After that, another mysterious block of time went by with no feedback, then I got rejected with no explanation.

Later I learned that Asana has pivoted to focus on providing Agile junk for enterprise project management, so I was relieved to have dodged that bullet, but it was still a vague and time-wasting interview process, with so much magic "guess the teacher's password" nonsense.


Hi there -- I'm really sorry that you had such a negative experience. It sounds like we didn't honor how much time you'd put into the challenge, and that sucks.

I'd be happy to share more feedback with you if you're still interested, though it seems like you've moved on happily. Feel free to reach out to jack@asana.com and we can talk more.

Regardless, best of luck where you are now.


> Our goal is not to simulate day-to-day software development...

I don't understand this. Why do they not want to see how well a candidate performs the tasks that they will be expected to perform on the job? Does this not result in them selecting for the wrong set of skills/knowledge?


Plus they're using this goal to justify not letting you run your code during the interview, which prohibits a "try it and see" problem-solving approach. Asana engineering team, what's wrong with "try it and see"?


You're kidding, right?


Not kidding! "Try it and see" is a legitimate problem-solving approach.


Much more interesting is the guide that this blog post mentions at https://asana.com/eng/interview-guide.


I don't really see what's interesting about this either way. Lots of places you interview at tell you what the interview is going to be like and what you are expected to be familiar with. This is just 'write code and we'll be testing you on data structures and algorithms!'

There's some flowery language there about having a genuine desire to want to hire you unlike those 'other evil companies!'... sounds like the same shit pretty much every other company does to me.


I also had the same impression. I think one of the best approaches is to let the best engineers at the company go through the same interview process. If they can't pass the company's hiring bar, then maybe the company should rethink its hiring process.


My current company has a very similar interview style, and I later learned that I was pretty close to not being hired because I didn't do quite as well as one interviewer wanted on a coding challenge: I ran out of time. But what I also learned is that the interviewer had never solved the problem in the language I used!

I had one of our most senior engineers, who has worked on this language's compiler, try to solve the exercise. Instead of 40 minutes, it took him three hours!

If a candidate is going to interview in a language, for the love of god, have as a prereq that the interviewer is actually capable of doing the exercise in that language.


Yes! Would be interested in seeing how much coding they ask for when interviewing for devops or ops positions.

Some orgs require only scripting, some go so far as to want application engineers who also know everything about AWS for devops/ops roles.

Disclaimer: I already asked via their Google form for this information to be added to their docs.


I haven't followed the recent updates to our interview process closely but we traditionally looked for general knowledge over specific knowledge. I'm not sure how much that has changed recently though.

Edited In Response To Parent Edit: Thanks for asking! We should definitely make it clearer if there are differences per role.


"looked for general knowledge over specific knowledge" - incidentally, I was rejected because I could not answer (to the interviewer's satisfaction) how to implement an in-memory LRU cache. I made some good recommendations but the interviewers were expecting a very specific answer which exactly matched http://www.programcreek.com/2013/03/leetcode-lru-cache-java/


> looked for general knowledge over specific knowledge

This is one piece of commonly thrown-around BS regarding interviews. At the end of the day, it is all about the usual one-dimensional algorithmic questions.

Regarding the LRU cache, I think this is one of the less crazy interview questions. In one phone interview, I had to solve a DP problem, code it perfectly, and pass all the test cases on HackerRank in less than twenty minutes. I am not certain how many developers out there can pass such interviews without months of preparation. Tech interviews are basically programming contests. I am not against programming contests or algorithmic questions, but it seems the interview questions are getting crazier day by day.


So very awesome of you. Thank you for the reply!


Why is it more interesting? The purpose of the blog post is to explain the rationale for the guide itself and directly links to it. I find it just as interesting to read about why they're doing it.


I had an onsite interview for a new grad engineering position at Asana recently. It's true that they test you on technical problems you don't know the answer to; sometimes it seems like they tested me on technical problems that had no answer.

One of their interviewers, who was in charge of both conducting a one-on-one with me and explaining the programming questions to me, was very friendly but also had a hard time clarifying any of his questions for me. I didn't really understand him, and it was obvious from his facial expressions that he believed his explanations had sufficed from the very beginning. The questions were either so vague that they could be approached from many different directions, or so specific that it was obvious the right solution would have to be found in an epiphany-like "aha!" moment of mathematical understanding. I was able to answer and program most of the questions.

I have to say that I was really happy to receive feedback following my interview, which no other company did. My advice to Asana: don't let any one interviewer dominate the experience of the interviewee (especially if you have suspicions that the interviewer is overly didactic or has difficulty explaining things).

Of course, that's just my experience. I ended up finding a good position somewhere else and think Asana's interviews were some of the best ones I had the honor of failing.


> Don't let any one interviewer dominate the experience of the interviewee (especially if you have suspicions that the interviewer is overly didactic or has difficulty explaining things).

I think this is a common problem. Many companies let a single interviewer have huge influence on hiring decisions. What if the interviewers themselves have some bias they aren't aware of?


Sigh, more marketing. Sorry, but at the end of the day any company that does technical interviews evaluates candidates the same way and asks the same types of questions; an interview guide that says things like "Instead, let’s open a dialogue. Let’s solve problems together and see what we can learn in the process. Let’s have fun!" is just trying to get more people to apply.

And I'm not slamming algorithms or technical interviews, but it's time companies stopped trying to dress it up and just said it how it is.


JFYI, that's not accurate. We've been sending this to candidates for months; the genuine impetus was to try to put all candidates on a more level and accurate playing field so that we can make better hires.

That's not to say it's not self-serving; of course we benefit from making more/better hires. But it's not just marketing.


I had a really stellar interview experience at Asana (and, full disclosure, I didn't get an offer).

What really stood out was the pages of notes that the recruiter sent me before inviting me to apply again in a few months. I had just been through many on-site interviews at many companies, and no one came close to providing the level of feedback that Asana did. The document that I was sent can't have been too far off from the unedited notes that the interviewers shared with each other in their debrief. It really spoke to Asana's commitment to transparency and helping develop people - even those they didn't want to hire. I'm happy where I am now, but based on this experience, I would absolutely apply again in the future.


I appreciate the sentiment behind documents like this, but, Asana, you can do better. When you do, you'll find it gets easier to hire people, and, perhaps counterintuitively, the quality of your hires will go up.

Specifically:

It's clear from the tone that you're trying to mitigate the hostility and unfriendliness of the conventional programmer interview. Great! But you need to do more than superficially adjust your tone. What this document describes is literally the circa-1999 software developer interview, in almost every detail --- the fuzzily described coding question, the "algorithm and data structures" interview, the design review.

Everyone has heard of interviews that pointlessly refuse candidates the opportunity to consult Internet references, like every working programmer does every time they code anything ever. But your process goes a step further: you won't even allow candidates to compile their code. How can this possibly be helpful to your process?

The most alienating and hostile aspect of the conventional job interview is on-the-spot whiteboard programming. You shouldn't have programmers write code in interviews at all, if you can avoid it; instead, take a week or two and design programming questions that candidates can do at home, on their own schedule, in their own surroundings.

But if you're going to make people code in an environment that is utterly unlike the one professionals actually work in, the onus is on you to do everything you possibly can to mitigate the performance penalty you're imposing. Why not, instead of coming up with crazy rules like "no running your own code allowed", point your creativity in the opposite direction and find ways to get candidates to be more at ease with writing code with someone looking over their shoulder?

Here are some things you can do right now to make your interviewing guide actually helpful to candidates:

* Provide sample questions. They do not need to be the ones you're asking in real interviews (although: if they are truly good questions, they could be!), but they should be close enough that a candidate would have no business being surprised by the real ones.

* Provide a detailed breakdown of your interviewing process. Do you phone screen? How many times? Who staffs the phone screens? What kinds of questions get asked on them? Who delivers the in-person interviews? How long do they last? These are some of the questions real candidates have about interview processes. I don't understand why more companies don't just answer them up front.

* Provide reference material for candidates. Don't point them to other people's interviewing guides! You know your jobs better than anyone else does... right? How about instead: what are the most popular books on your team's shelves? (Bonus points: buy those books for candidates). What are some Github repositories you think represent really well-engineered software? What are some examples of really hard problems your team is still grappling with?

I hired for years and years with a battery of over-the-top technical questions and a rolodex of personal contacts. Then my team and I had a crazy idea: we opened our whole interview process up, standardized it, and took pains to make it understandable to people with no experience in our field. What we learned was that there is a huge amount of underutilized talent out there that can't (because it's locked out due to jargon and poorly described requirements) or won't (because it's denied social permission) apply for jobs unless you go out of your way to bring them in. Many of these people are better than you are. At least, they were for me. Stop bloodying your forehead against the wall competing with Facebook and Google for people who --- if they can navigate the process you describe --- can get an offer from those companies any time they want.


>> The most alienating and hostile aspect of the conventional job interview is on-the-spot whiteboard programming. You shouldn't have programmers write code in interviews at all, if you can avoid it; instead, take a week or two and design programming questions that candidates can do at home, on their own schedule, in their own surroundings.

Phenomenally good advice. I wonder, and I don't mean this to be a loaded question because I really don't know the answer: do other practitioners in other highly-skilled disciplines have to dance these sorts of dances to get hired? Does a doctor have to diagnose on the spot? Does an architect have to design a building on a whiteboard? Do they make a lawyer stand up in front of a group of partners and argue some case off the top of his head? Perhaps they do. I really don't know. But it seems to me that basically professional experience counts for little other than to keep the resume out of the trash folder.


Don't know about the architect, or a lawyer, but doctors do get real-life questions... although probably not the ones with years of experience and more letters in the title than in the name. A bigger difference though is that lawyers and doctors have to pass an exam to practice their profession, which already excludes a lot of people. Anyone can be a programmer.


>> Anyone can be a programmer.

Anyone can be a doctor or lawyer too. I know what you mean, I think, but I don't know if I buy the distinction. There is a formal pass-the-bar testing event for lawyers. For nurses and I believe doctors there are boards. Same basic purpose AFAIK. They take these tests when they are just entering the profession. How meaningful are they years later, compared with the experience gained in the interim? My wife has been a registered nurse working in cardiac critical care for seven years. If she interviews for a new job with that employment record behind her they aren't going to ask her what an aorta is, which is sort of the equivalent of what Asana is talking about doing here (and what everyone does, to be fair).

So you seem to be saying that a professional nurse, as an example, can be trusted to know the basics after seven years of work, because he or she was required to take a hard test on graduation. But a programmer with seven years of experience must prove that he or she knows what a binary tree does? It's also worth pointing out that lawyers can get people jailed or cause them to lose their property. Doctors and nurses can kill them. So can we, occasionally, but it's rare.

The most effective interviewing technique, as far as I am concerned is to send the candidate out for lunch with a few of your engineers and tell them to find out whether he knows what he's doing. You spend an hour talking software with some people and it's awful hard to fake knowing your stuff.


"But a programmer with seven years of experience must prove that he or she knows what a binary tree does?"

you'd be surprised at how many programmers with seven years' experience have no freakin clue what a binary tree does!

There is a reason fizzbuzz is a phenomenal filter.
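
For anyone who hasn't seen it, the entire exercise is this:

    for i in range(1, 101):
        if i % 15 == 0:
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)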


fizzbuzz filters for candidates who a) can do a trivial programming task in an unrealistic high-pressure, high-stakes environment, or b) signal membership in your group via memorization of hazing rituals.

fizzbuzz does not filter for programmers capable of doing original, thoughtful work in a realistic work environment.


Doctors, nurses and lawyers often have to think quickly, on the spot, in life or death (or at least life-altering) situations.

A developer (in a non-pathological work environment) has time to discuss and research a solution.

FWIW, I haven't had to implement a binary tree since my C++ course in college; I'm not a fan of pop quizzes in interviews, but if you're going to do them, then at least make them relevant to the work you do on a daily basis.


> there is a huge amount of underutilized talent out there

It's like leaving money on the table. Someone needs to moneyball the crap out of this.


> I appreciate the sentiment behind documents like this, but, Asana, you can do better. When you do, you'll find it gets easier to hire people, and, perhaps counterintuitively, the quality of your hires will go up.

I agree! (I don't even think it's counter-intuitive at all.) Thanks so much for really clear feedback.


Everyone has heard of interviews that pointlessly refuse candidates the opportunity to consult Internet references, like every working programmer does every time they code anything ever. But your process goes a step further: you won't even allow candidates to compile their code. How can this possibly be helpful to your process?

Here's how.

Some problems - particularly those involving concurrency - don't lend themselves well to the standard compile/test/run/printf mode of programming. The more difficult problems you'll run into require some semiformal reasoning about the underlying problem. When I sit down to solve these problems I'm doing more or less exactly what Asana is asking for. I'm not making my code compile. I'm not running tests - with high probability they'll pass even if my code is wrong.

Eventually I'll need to actually write code which compiles and runs. But if I skip the "think hard and make sure it's right" step (what Asana is asking for) I'll wind up with wrong code in production with errors that are nearly impossible to debug.

It sounds like Asana wants to determine if people can do this kind of thing.


>It sounds like Asana wants to determine if people can do this kind of thing.

Same for Facebook/Google/Apple/Microsoft/Amazon/etc that also use whiteboard interviews. Those types of companies value seeing candidates' thought process more than the final code of work-at-home test questions. There's a signal there in whiteboard interviews that can't be replicated by at-home tests. Whether that unique signal is valuable or worth the cost is up for debate (as the ongoing disagreements about hiring practices show).


I am not disputing that it is possible to design serious, rigorous evaluations that somehow hinge on candidates not using their normal tools to answer programming questions.

What I dispute is that anyone really does that.

It is awfully easy to rationalize any hazing ritual with this sort of six-degrees-of-concurrency-and-formal-methods logic. But there's a simple observation to make about all these questions: they do not mimic the experience of doing the work the interviewer is evaluating candidates for.

The more a question diverges from the experience of doing the job, the more process and framework and structure and rigor is needed to extract value from it. But really: most of these questions are just the same dumb fucking programming questions, just with different variants of the interviewer repeatedly whacking the candidate over the head with a paper towel roll while trying to answer.


I think one thing often missed in these discussions is that the interviews that work best for AmaGooBookSoft are not necessarily the same as the interviews that work best for smaller companies.

I've interviewed at startups that have some of the best work-sample testing that I've ever seen - but I could immediately tell these systems wouldn't work well at larger companies.

There are entire companies devoted to helping people prepare for GooAmaBookSoft interviews. If any one of these companies standardized on a single interview for all candidates it would immediately turn into a situation like standardized testing in high school where students study to the test instead of learning the core material.

I guess my advice is that if you're a startup, take advantage of this! You can do things that don't scale. Interview in ways larger companies can't. Don't assume that the way these companies interview is the best for you.


That's what I'm disagreeing with - some small but important parts of my work are, in fact, "reason through this code because tools can't help you."

As another example, I ask people to set up numerical/statistical algos without worrying about specific numpy details. I trust people to translate a log likelihood formula to python, but I want to see them actually derive that formula. This isn't some hazing ritual - it's important to actually understand the statistics to avoid getting stuff wrong.

Again, the tools won't help you - if you set things up wrong, you just get a bunch of false positives.
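
To give a concrete flavor of the exercise (a hypothetical mini version, not my actual question): derive the log likelihood for i.i.d. Bernoulli data, log L(p) = sum_i [x_i log p + (1 - x_i) log(1 - p)], and only then translate it:

    import numpy as np

    def bernoulli_log_likelihood(x, p):
        # The derivation is the part I care about:
        #   L(p)     = prod_i p**x_i * (1 - p)**(1 - x_i)
        #   log L(p) = sum_i x_i*log(p) + (1 - x_i)*log(1 - p)
        # The numpy translation is the easy part.
        x = np.asarray(x)
        return np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))

    print(bernoulli_log_likelihood([1, 0, 1, 1], 0.75))

Get the derivation wrong -- say, drop the (1 - x_i) term -- and the code still runs and returns plausible-looking numbers, which is exactly the failure mode I mean.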

I understand this isn't very important for the average dev writing CRUD or ETL apps. I'm just pointing out a particular use case where their constraints are reasonable.


I don't think anyone is arguing that you shouldn't ask candidates to prove domain specific knowledge as part of the hiring process. The issue is that typically people use proxies for proving the candidates have the skills they need for a job, and you should design the proxies to be as close to the job as possible.

So if deriving a log-likelihood formula in a high-pressure, time-limited situation is a good proxy for the kind of work the candidate will be doing, by all means do it that way.

If, on the other hand, they are more likely to be asked on the job to go away for some time, think about the questions needing answers in a data set, design an algorithm to suss out those answers, and then persuade their colleagues that it is correct, you should design your interview process to align with that.


It sounds like they are trying to do exactly what you suggest:

We’ll ask you to solve some coding questions in a language and text editor of your choice. Feel free to bring your own laptop, or we’ll be happy to provide one. Notably, we ask candidates not to compile or run their code during this exercise, and not to refer to online resources. Our goal is not to simulate day-to-day software development — where we read docs and write lots of tests! — but rather to see how you reason about your code and input cases. For that same reason, we won’t ding you for superficial syntax errors or misremembered function names. After leaving you to work through the questions on your own, we’ll sit down together and talk through your solutions (including any ideas you didn’t have time to commit to code).

In any case, all I'm disputing is the idea that formal reasoning about algorithms (without stressing function names and compilation) is a useless skill. We have tools that make it unnecessary for many entry level devs, but that doesn't mean everyone can ignore it.


Absolutely nobody has suggested that formal reasoning about algorithms is a useless skill.


> Everyone has heard of interviews that pointlessly refuse candidates the opportunity to consult Internet references, like every working programmer does every time they code anything ever. But your process goes a step further: you won't even allow candidates to compile their code. How can this possibly be helpful to your process?

To me it sounds exactly like they want you to demonstrate some semi-formal reasoning about algorithms, and you think this is not a realistic representation of work. What do you think they mean, and what do you oppose?


Why, of all the CS topics, does everyone love algorithms and data structures so much? I would like to be asked about covariance and contravariance, LALR parsing, or some basic statistics for once.

But more importantly, I would love to be asked about a CS topic that arises every day in development: code organization. How many times do you get a new requirement that makes you throw all previous assumptions (that an architecture was naively built upon) out of the window? When it turns out that modules A and B, which were completely different and didn't even know about each other, now need each other's data? Yes, this situation arises when the original architecture wasn't thought through enough - and we've all been guilty of this at one point or another.

Now, what a developer does in that situation is much more important than whether he remembers the complexity of every kind of hash table (with all the chaining and hashing variants). Does he just hack it so it works, creating a lot of fun for the future maintainer? Does he decide to spend two days refactoring the whole system into a new architecture? Or does he find a balance somewhere in between?


I don't understand why you would even listen to them; their website is such a piece of shit - terrible performance, terrible UI. Maybe their infrastructure is good, but I can tell you they absolutely suck at frontend JS.


> We design our interview questions to see how engineers work through technical problems they don’t know the answer to yet.

That's the way to do it, I think. In my own technical interviews, I pose (relatively) simple questions that require some problem analysis, a strategy discussion, and then some coding and testing, with a goal of testing aptitude for basic programming abilities like modularization, indirection, recursion, defensive programming, unit testing, etc. I keep things interactive and collaborative and see how much the candidate can do for themselves.


The problem is that Asana seems to have preconceived ideas of what answers are supposed to look like, but the questions are vague and open-ended enough that even very good candidates can give good answers that simply diverge from the manner of thinking that the Asana question designer had in mind.

If the question has a well-defined answer, then this is good. If you are asking questions that the candidate "doesn't know the answer to" merely as a parochial side effect of the question's vagueness and the vast space of possible good approaches, then you're really only screening for people who happen to think about things the same way you already do, and thus are good at guessing what you want to hear.

I'd argue that this is very bad in terms of diversity of thought.


I use problems with multiple acceptable solutions and easy tests of correctness -- i.e., it's obvious whether or not the approach is sound and the code works. For example, write a function that determines whether two rectangles on a display overlap, given the coordinates of their corners. Solving it requires the ability to analyze the problem and describe the conditions, the ability to code a simple Boolean predicate, and some aptitude for checking boundary cases and writing unit tests. I have others; what they have in common is that they're really easy problems for great programmers, and well within the limits of a 45-minute technical interview for competent coders, and good at exposing candidates who lack some basic skills.
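
For the rectangle question, one acceptable solution (my paraphrase here, with touching edges treated as non-overlapping -- exactly the kind of boundary decision I want candidates to surface) is: two axis-aligned rectangles overlap unless one is entirely to the left of, right of, above, or below the other.

    def rects_overlap(a, b):
        # Each rectangle is (left, top, right, bottom) in screen
        # coordinates, with left < right and top < bottom.
        a_left, a_top, a_right, a_bottom = a
        b_left, b_top, b_right, b_bottom = b
        return not (a_right <= b_left or b_right <= a_left or
                    a_bottom <= b_top or b_bottom <= a_top)

    print(rects_overlap((0, 0, 10, 10), (5, 5, 15, 15)))   # True
    print(rects_overlap((0, 0, 10, 10), (10, 0, 20, 10)))  # False: edges touch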


Hi there! That sounds bad. Could you let me know what's giving you the impression that that's Asana's goal?

We really try to explicitly avoid that kind of interview evaluation, and put strong emphasis on Communication, User empathy, and Learning, per the guide. But "did they say the right answer?" is definitely an easy trap to fall into, so it's entirely possible that some interviewers are doing it -- I would love any concrete feedback you could give to help us avoid it in the future.


The more concrete feedback is in the previous post (you commented on that one too). But I will elaborate here:

On 8/4/15, I received a message from an Asana representative via Stack Overflow Careers. After a few back-and-forths discussing specifics about whether I was a good fit, I agreed to do the take-home test.

I received the take-home test on 8/7/15. I emailed back on 8/8/15 with a verifiable error in a timestamp column of an AWS-hosted Postgres database that was configured with some test data (data resembling certain kinds of user data for modeling what an "engaged" user is, and what factors predict when a user will be "engaged"). I included code which could be run to demonstrate the erroneous data.

On 8/9/15 I wrote in again stating that I had created a workaround for the erroneous timestamp data, since the only way I could complete the exam was to do it on those particular days (I couldn't wait around for the exam data to be fixed).

I submitted my solution on 8/10/15. I got an email on 8/11/15 with a note from someone in engineering apologizing for the timestamp data error, but also saying it shouldn't affect my solution. The recruiter who sent that email also said that my submission was currently being evaluated.

Then I waited a week before hearing anything. On 8/18/15 I got a response apologizing for the delay and saying that I was invited for the next round interview.

The email said I would be speaking with a person named Jack (maybe you?) a few days later (8/21/15 at 4:00 pm EST). But that call actually included two data scientists over one conference call line.

Both interviewers made effectively no comment on my software choices. In fact, when I spoke about how I cared about software design and best practices even when working on a rapid prototype, it actually sounded as if the interviewers thought this was a bad thing, and there was some awkward silence. I brought up examples of how I had worked in a very fast-paced environment before (quant finance), where my team had learned some hard lessons: the naive approach of letting data scientists just write scripts for ad hoc modeling, assuming that engineering will "productionize" it later, tends to lead to some bad failure cases. It's far more efficient over time to simply require data scientists to construct rapid prototypes that adhere to many best practices and are only a short distance from "production", even from the first line of exploratory, ad hoc code.

Then they asked me some vague and hard-to-decipher-over-a-conference-call questions about how to measure the accuracy of the random forest model I had fitted, and about my conclusions regarding which predictors were most efficacious. Despite having used random forests for published research results, and even having taught part of a course on random forests when I was a grad student, I could not make heads or tails of what exact accuracy metric the interviewer was fishing for. I am sure that if such a question were asked in person, with pen and paper, or in some setting where it's not just the vague descriptions of a stranger I've never met before, we could have sorted it out. But the communication was just too poor over the conference call for this kind of question.

Given that I had to solve the problem in my spare time while conducting all the other stuff in my life, I felt it was quite reasonable for me to take a very conservative approach: fit a very simple model, don't waste time chasing down parameter optimizations (like depth of the trees or some huge set of features). Just walk through some very straightforward analysis, use some straightforward features, produce some common diagnostic curves (which showed a huge bias in the data labels -- almost all example users were not "engaged"), and draw only conservative conclusions (e.g. with such severe bias in the toy data set, it's pretty hard to conclusively say anything, so don't pretend you can say more than you really can, don't recommend strong conclusions, and leave it at that). The point is to show basically a mock-up of the end-to-end workflow, from querying the data, to constructing the features, to fitting the model, to assessing the fit. The point is clearly not to do that in some kind of insanely accurate way when you're talking about a busy adult doing it in their spare time over a few days.
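
To make the label-imbalance point concrete (toy numbers of my own, not the actual test data): when roughly 95% of users are "not engaged", raw accuracy is nearly useless as a diagnostic, because a model that predicts "not engaged" for everyone already scores about 95%.

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical labels: ~5% of users are "engaged" (1).
    y_true = (rng.random(1000) < 0.05).astype(int)

    # A degenerate model that predicts "not engaged" for everyone...
    y_pred = np.zeros_like(y_true)

    # ...still gets ~95% accuracy, which is why I leaned on diagnostic
    # curves and conservative conclusions instead.
    print((y_true == y_pred).mean())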

The two interviewers had some gripes with the specifics of my accuracy measures (which seemed really misplaced, again for an analysis done in a busy adult's spare time using erroneous toy data).


After that, the interviewers asked the questions about how to develop a news feed ranking algorithm at a super vague, high level. I spoke about how you would need to account for seasonality or cyclic effects -- e.g. election news would clearly be ranked higher in newsworthiness when approaching important elections, or the same idea for sports-related articles, etc. I talked about trying to determine which news articles are the most clicked-on by peers in your network, possibly using some centrality measures to rate certain users more as "trend setters" and use their news viewing habits as guides for what to show to others.
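
Something like this back-of-the-envelope sketch (reconstructed after the fact, with toy data -- not what I said in the interview verbatim) captures the "trend setter" idea: weight each click by the clicker's network centrality, here plain follower count.

    follows = {                      # who follows whom (toy data)
        "alice": ["bob", "carol"],
        "bob": ["carol"],
        "carol": [],
    }

    # Degree centrality: how many people follow you.
    centrality = {u: 0 for u in follows}
    for followees in follows.values():
        for v in followees:
            centrality[v] += 1

    clicks = [("carol", "article-1"), ("bob", "article-1"),
              ("alice", "article-2")]

    # Score each article by the centrality of the users who clicked it.
    score = {}
    for user, article in clicks:
        score[article] = score.get(article, 0) + centrality[user]

    # article-1 outranks article-2: higher-centrality users clicked it.
    print(sorted(score.items(), key=lambda kv: -kv[1]))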

I was actually fairly detailed in my description of some math ideas for these things, but the whole conversation sounded like someone sucked the air out of it. The interviewers kept trying to get me to talk more specifically about aspects of the newsfeed that actually occur in Facebook's newsfeed. When I told them I don't have a Facebook account and don't know what the modern version of the newsfeed does (which is true, I don't), they sounded like they just lost all interest in talking to me. My hunch was that they wanted me to possibly talk about A/B testing, but this is kind of silly. They definitely didn't ask me about A/B testing. It was like they wanted me to guess that phrase. Plus, at least for frequentist A/B testing, there are good reasons to believe a lot of it is junk science and a lot of applied mathematicians would not consider this a reasonable idea at all. I guess I could have talked a little about Bayesian A/B testing, but my mind associated their problem with network modeling and seasonality effect modeling, not A/B testing. (I can't be sure they were focused on A/B testing -- it's just what I independently thought after the fact when struggling to understand the outcome since no one gave any feedback).

Overall, I could tell it went badly, but I could not understand what the two interviewers were trying to get me to say. As someone who has worked on a lot of math modeling problems, from undergrad and on through grad school, I thought my way of breaking down and describing ideas for the newsfeed example was perfectly on-point and technically useful. To boot, the job I was interviewing for did not (at least as it was explained to me) have anything to do with ranking problems, A/B testing, or newsfeed-like work at all.

That was on 8/21/15. I then heard no response of any kind until 8/28/15, when I got a form-letter rejection email that simply said that, at that point, Asana did not want to continue the process, and gave no further details.

From start to finish it went from 8/4 to 8/28, so almost a month. I scrambled to put in a lot of hours to solve the take-home test in my spare time over a short window, and then went through two separate week-long periods of silence (one while my code test was being reviewed, and one after the vague/weird two-on-one phone interview), only to receive no feedback at all on my software solution, limited and tangential feedback on my actual data science approach (feedback which didn't seem at all appropriate for something a busy adult would put together over a couple of days), and no feedback at all about the decision to reject me based on the interviews.

The overall experience left me feeling like Asana asks very vague questions and reserves the right to be dissatisfied if you don't identify the magic answer to the vague question. It also made me feel that Asana does not act in a timely manner to review code submissions or to provide interview feedback, even though Asana required me (despite my being busy outside of interviews) to complete the code test in a timely manner. It also sounded like Asana might have some data science employees who act a bit like "gatekeepers" in the interview process -- they felt a need to assert that they are better than the candidate, even if it means bringing up a discussion about specific accuracy criteria or something when it may not really be appropriate for a quick solution to a poorly specified problem.

It's a bit like someone talking about Haskell programming for an interview, successfully solving some Monad problems in a quick way given the time constraints, and then getting grilled about unification algorithms in the Hindley-Milner type system. It's just to let the existing employee who is conducting the interview feel superior and act like it is some great filter on the incoming talent pool, when really it's mostly unrelated minutiae.

As I mentioned in the other comment, I later learned the extent to which Asana supports Agile-focused systems for enterprise project management, and this actually saddened me a lot. I always liked Asana because it embodied a type of minimalist efficiency that is fundamentally incompatible with the bureaucratic wastefulness of Agile/Scrum/Kanban/"lean" buzzword systems. I had high hopes that Asana would not be just another one of those cargo cult management junk systems, and maybe it still will avoid that fate, but agreeing to support Agile doesn't bode well.

So ultimately it was a good thing I was rejected, though I do not feel it was a fair reflection of my engineering effort on the exam, and I certainly don't feel that the evaluation was done in a way that showed respect for the amount of time and effort that I took to complete the task.


Yep, I'm that Jack. I remember you now. Thanks for the clear picture of what it's like to be on the candidate side. Efficiency is clearly a goal, and we clearly failed here given that it took us a week to get back to you after the phone interview. I do think your interview was an outlier in that regard, but that doesn't excuse it.

Re: the questions themselves, although I don't agree with your characterization of what's going on or what we're looking for, I definitely think there's a lot of room to improve our data-science screening. I'll send you an email with more re: what specifically came up in your interview that led us to reject you.

Again, thanks for taking the time to share this really explicit feedback.

(For the record, I don't really know what you're referring to re: Asana and Agile.)


Hopefully this thread's not too stale yet to escape notice.

I just wanted to add that the user above, Jack, absolutely made good on what he said.

He sent me some very well-written, extremely professional, and constructive feedback about my interview process. I am able to see a lot of things I can improve and it will help me for future interviews. I think it also highlights some things about the process that Asana can improve, and by all evidence of Jack's sincerity in the feedback, I believe Asana is very interested in ways of improving.

Don't let anyone say that all I do on Hacker News is grumble about open-plan offices... in this case I am expressing my sincere gratitude and appreciation for Jack's feedback.

If I had received this kind of response in a timely manner when Asana decided not to continue the interview process with me, I would have walked away with a significant positive feeling about interviewing with them.


I have interviewed (phone screened) at Asana. It was delightful. I really liked the people I met there and enjoyed the problem I worked through over the phone. It really did feel collaborative and like we were solving a problem together just as software engineers do when they're not in this artificially combative environment we often create when interviewing.

Regarding the problem, I didn't have the best approach out of the gate. In fact, I had a vague memory of the right approach ("I seem to recall it uses two data structures, and if I really stretch I think it was X and Y, but I don't remember the details.") That made it harder to get to the final solution. The interviewer was very good at talking things through with me in a way that didn't seem like they were frustrated or otherwise grumpy that I didn't know this particular problem cold. They apparently thought the results were encouraging enough to move to an onsite, but I got an offer from my current employer (where I am working with close personal friends) and had to pass on the chance to find out what that would be like.

All in all I think Asana is probably a great place to work. I also like the product, although it seems like you need a whole team to adopt it to make it sticky.


I find it funny that startups only implement these ridiculous hiring practices after they start to grow. Imagine if the founders had to quiz each other on algorithms before they agreed to do a startup in the first place. The first few hires at a startup are arguably the most important, yet they don't go through the bullshit that later-stage startups put potential employees through.


> I find it funny that startups only implement these ridiculous hiring practices after they start to grow

Hmm, I'm curious what you're assuming! At Asana, at least, our first hires went through many of the same questions that candidates today go through, but we were just a lot less systematic about our evaluation and less scrupulous about accidentally testing for knowledge/context rather than ability. The intention of this guide was to make the interview /less/ bullshit-y, by giving people enough grounding to not be thrown off by unexpressed expectations.


Their subscription pop-up at the bottom of the page, with a close button that does nothing, only makes me sure I don't want to work for them.


I like the idea behind this but the actual guide is pretty thin:

https://asana.com/eng/interview-guide

It would be great to see a more detailed guide covering the concepts they like to see mastery of and the problem-solving skills they are looking for.


Disappointing. The links to Triplebyte and Stripe are far more informative.


Thanks for the feedback! What did you learn from those that you didn't see in the Asana one?

- Jack from Asana


How do these questions align with the work Asana is doing? Side note: the blog's performance degrades as you keep scrolling down.


Maybe you could hire someone who will design you a mobile website?

I'm not installing your app on my phone...


There are many different types of engineers. This guide is for software engineers.


This place has so much cash, they may as well hire by trial by fire: create an onboarding experience for serious candidates, morph them to their liking or let them go, and stop guessing with goofball interview theater.


- smart

- gets things done

- able to actively engage in discussion about software development

- works hard

- wants to work here doing this job

- loves to learn

- not overly dogmatic

The rest, meh.



