Hacker News new | past | comments | ask | show | jobs | submit login

> Then I discovered quicksort, and graph search, and Bayesian inference, and Dijkstra, and Karatsuba.

I would guess most "discovered" those because they had to interview. The % of programmers who need to use Karatsuba or Bayesian inference to finish their project is much smaller than the % of programmers who thought "oh shit, I'd better learn this because Google and Facebook keep asking about graph theory".

It is perhaps an interesting secondary effect. CS concepts become useful not just because they are useful (and they are) but also because recruiters at many companies think they are. If tomorrow juggling skills somehow became popular in Google interviews, you'd see a lot of juggling programmers out there.




I distinguish between two types of puzzles: human-made (which I call puzzles) and everything else (which I call problems).

In those terms, I hate puzzles and love problems. Puzzles are contrived by humans and are generally as much psychology problems as anything else. They basically require you to think like the human who created them, and they have bizarre and arbitrary constraints that are totally unlike the real world, where, as Feyerabend told us, "Anything goes."

Puzzle-style interviews are 99.9% human-made, and one of the features of human-made puzzles is: the answer is known and very nearly unique. This makes them almost completely unlike the real problems programmers encounter, where there is no known answer (if there were, the programmer would find it on Stack Overflow or similar) and amongst the potential answers there is no obvious winner.

So it's really easy to be good at puzzles and terrible at problems, which require taste and good judgment to navigate through the combinatoric explosion of approaches and parameters (and no, design-of-experiments won't do more than give you a little assist on genuinely hard problems.) Problems also require one to know when to stop, when a solution is "good enough" for the given domain. Even problems that are nominally non-statistical in nature require decent statistical knowledge to evaluate potential solutions.

So as someone who designed algorithms for a living for quite a long time, and still does for fun and profit now and then, I'm as strongly against puzzles as the OP. I'm actually slightly biased against people who love puzzles, because it looks to me like they are spending way too much time playing with nice, neat pebbles on the beach while the ocean of ugly, intractable problems lies unexamined before them.


An interesting heuristic difference between the two types of, let's call them, challenges is how you define success.

In a puzzle, success is a well-defined solution, or (in a lot of the theoretical cases) a proof that no such solution exists.

In a problem, there are many goals to balance (simplicity, efficiency, cost, implementation time, etc.). Often the best solution is not needed, or the "optimal" one is too complex or takes too long to implement. Often the perfect solution is not possible, and you have to think about how to change the definition of the problem to get something acceptable.


Totally agree! Puzzles are just one way of studying human (or animal) behaviour in a controlled environment. It's absurd to use them as the main tool to evaluate a person's competence.


I agree so much and I'm going to adopt your terminology :)


I think this explains why I hated school.


So true; the Google, Facebook, Microsoft, fill-in-the-blank puzzle interviews are pretty silly. Rather than splitting people into groups of "good developers" and "bad developers", they really just split people into these two categories:

1) People who have read Cracking the Coding Interview

2) People who haven't heard of this book.

http://www.amazon.com/Cracking-Coding-Interview-Programming-...

Seriously; any competent programmer can crank through the stuff in that book in a week or two and be solid on the types of silly questions they'll be asked in those interviews.

Honestly, the best interview I ever had was one where I interviewed with a highly technical IT director and it was just very conversational. We spent about two hours just chatting about the different types of technologies we had been working with, diving deeper and deeper into knowledge areas. At the end of it he just said, "Ok, you are obviously a smart guy; I'm going to send you over to see the VP now so you two can discuss salary requirements."


As the author of said book, who has coached MANY people, I can assure you that many - heck, most - developers could not pass a Google interview regardless of how much they prepared. They'll get better with preparation, but it doesn't just fix all issues.


Gayle,

Don't get me wrong. I made it through the Amazon interviews thanks to your book. I spent a week reading through and cranking out examples before the interview. Then I spewed out all of the desirable answers at each step of the interview process.

Now I'm a highly experienced developer with 15+ years experience across a variety of technologies. I contribute to open source, answer questions on Stack Overflow, present at technical meet ups. I solve problems that need solving even when no one else will step up.

Despite all of my experience, implementing hashes and sorting linked lists in place, as well as many of the other questions asked by these companies, just don't come up on a daily basis. A good developer isn't someone who can explain those things in an interview; it is someone who can examine the problem and design a solution. They should have an idea of the tools necessary and then reach for the right book, or use a healthy dose of googling some API docs, to build the solution.
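To be concrete, one of the exercises mentioned above -- sorting a linked list in place -- is the sort of thing that's easy to look up but rarely written from memory on the job. A sketch of how it's typically done (merge sort by relinking nodes, no extra arrays; the `Node` class here is just an assumption for the example):

```python
class Node:
    def __init__(self, val, nxt=None):
        self.val = val
        self.next = nxt

def merge_sort(head):
    """Sort a singly linked list by relinking nodes; O(n log n) time."""
    if head is None or head.next is None:
        return head
    # Split the list in half with slow/fast pointers.
    slow, fast = head, head.next
    while fast and fast.next:
        slow, fast = slow.next, fast.next.next
    mid, slow.next = slow.next, None
    left, right = merge_sort(head), merge_sort(mid)
    # Merge the two sorted halves by pointer surgery.
    dummy = tail = Node(0)
    while left and right:
        if left.val <= right.val:
            tail.next, left = left, left.next
        else:
            tail.next, right = right, right.next
        tail = tail.next
    tail.next = left or right
    return dummy.next
```

Fiddly pointer code like this is exactly what interviews reward and day-to-day work mostly doesn't require.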

The current interviewing techniques are broken. I would like to offer the solution, but quite frankly I've spent plenty of time interviewing people and we haven't gotten it down yet either.

-------

The ideas Malcolm Gladwell discusses in "Blink: The Power of Thinking Without Thinking" apply here. Some of the most experienced developers have an intuition for how best to implement a solution; that does not necessarily translate into being able to articulate the why and how.


So, you're a good developer who is smart and passed the interviews with some studying. That's exactly how it's supposed to work :).

You're right that the problems that come up in interviews typically aren't real-life problems. But that's not necessarily an issue. Are the skills they test (developing good algorithms) important for real life?

An interviewer doesn't ask you to develop an algorithm to return a random node from a binary tree because they particularly care if you know THAT algorithm. They're asking as a way of evaluating your skills developing solutions to problems you haven't seen before.
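For illustration, one standard way to answer that particular question -- assuming a uniformly random node is wanted and you can't store per-subtree counts -- is reservoir sampling over a traversal (a sketch of one acceptable answer, not the only one):

```python
import random

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def random_node(root):
    """Return a uniformly random node: the k-th node visited
    replaces the current pick with probability 1/k."""
    chosen, count = None, 0
    stack = [root]
    while stack:
        node = stack.pop()
        if node is None:
            continue
        count += 1
        if random.randrange(count) == 0:  # true with probability 1/count
            chosen = node
        stack.extend((node.left, node.right))
    return chosen
```

The point of asking it isn't the code itself but whether the candidate can reason their way to the 1/k replacement trick under pressure.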


I believe that is the most common argument made, but therein also lies the fallacy.

I'm not developing anything new when I implement a quicksort algorithm; Tony Hoare certainly was back in 1960 when he created it. The same goes for the hash; someone at IBM made a big leap when they first used hashes back in the early 1950s.

Since that time numerous people have made improvements to these (and other algorithms, of course), but by and large companies are just testing whether the interviewing developer can code existing algorithms and explain their Big-O complexity.
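For what it's worth, the algorithm in question fits in a few lines; reproducing it in an interview is recall, not invention (a readable sketch using extra lists, rather than Hoare's in-place partition):

```python
def quicksort(items):
    """Quicksort: partition around a pivot, recurse on each side."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)
```

Knowing that this is O(n log n) on average and O(n^2) in the worst case is the "explain the Big-O" half of the test.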

I think you end up with 4 subsets that the interviewees fall into.

I = {interviewees}

A = {I | bad programmer and can't solve algorithms}

B = {I | good programmer and knows the algorithms}

C = {I | bad programmer but knows the algorithms}

D = {I | good programmer, doesn't know the algorithms}

It is subsets C and D that concern me.

In subset C we have someone who studied the algorithms really hard but just can't write good code or solve real, original problems. One approach to this is the "hire fast/fire fast" mentality, but in some organizations "fire fast" is not an option, so you would rather not hire them in the first place.

In subset D we end up missing out on good developers. In a bad job market there isn't much need to worry about this; if we miss a good developer easily enough we'll find another. Right now though developers are in such high demand I see this loss as a bigger deal.

I also think that this type of interviewing technique works better for the tech giants (MS, GOOG, FB, etc.) than it would for a lot of smaller organizations. A good set of programmers will self-select out of interviewing at these companies because they don't believe they are "good enough" for MS (or whichever). Additionally, a good set of developers who do believe they are high-caliber enough for these organizations will research the interview process beforehand and end up studying the right things before going in.

With smaller companies, or businesses with non-centralized hiring processes for developers, interview candidates may just not know what to prepare for. And while an algorithm may seem obvious in hindsight, there is a reason nobody was using it until it was "discovered"; I can't reasonably expect someone to devise the answer to that type of question when I interview them.

Honestly, I would love to hear some ideas on this, as it has been a pretty major concern for me for the last couple of years.


I suspected that this is true, thanks for confirming.


Actually I think graph theory is pretty important. For example many programming contexts have to deal with dependencies, and often dependencies form a directed acyclic graph. Understanding that will aid in writing better, more reliable code. Path finding is also pretty important, especially if you want to make a game.
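As a concrete illustration of the dependency case: ordering items so that each comes after everything it depends on is a topological sort, e.g. Kahn's algorithm (a minimal sketch; the dict-of-lists input format is just an assumption for the example):

```python
from collections import deque

def topological_order(deps):
    """deps maps each item to the items it depends on. Returns an
    order where every item follows its dependencies, or raises
    ValueError if the graph actually contains a cycle."""
    indegree = {node: 0 for node in deps}
    dependents = {node: [] for node in deps}
    for node, prereqs in deps.items():
        for p in prereqs:
            indegree[node] += 1
            dependents[p].append(node)
    queue = deque(n for n, d in indegree.items() if d == 0)
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for d in dependents[node]:
            indegree[d] -= 1
            if indegree[d] == 0:
                queue.append(d)
    if len(order) != len(deps):
        raise ValueError("dependency cycle detected")
    return order
```

The same twenty lines cover build systems, module loaders, task schedulers, and the foreign-key-ordering problem mentioned elsewhere in this thread, which is why knowing the DAG framing pays off.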


That's great if you're a game AI programmer (an incredibly small group) or you're creating Yet Another Dependency Loader (hey, I guess <language that becomes super popular next year> will only have 100 frameworks to manage modules, you'd better create your own as well).

You know what I really wish? Programmers would spend 1/100th of that algorithm-studying time on learning to write readable, maintainable code.


> You know what I really wish? Programmers would spend 1/100th of that algorithm-studying time on learning to write readable, maintainable code.

IME, programmers who write readable, maintainable code and those who have a solid grasp of algorithms and theory as applicable to the problem domain tend to overlap considerably. Having a clear analytical mental model of the approach to the solution leads to clearer and cleaner code -- and, where necessary, clearer and more useful comments -- than hacking your way to something that works. (And I'd say that's as true of my own code -- where I've worked at both extremes of the clear-model-to-hacking-my-way-through axis -- as of others' code I've seen.)


>>>learning to write readable, maintainable code.

Which is why I love Python. Have you seen poorly written JavaScript that's badly indented and has no comments? Until there is a universal style guideline for JavaScript that developers actually follow, I won't have a go at it again.


Indentation is pretty low on the list of things that make code maintainable or not. (If it bothers you, just pass the code through a pretty-printer.)

Much more important is to pay attention to system-level measures such as modularity, abstraction, (de)coupling, and documentation. 99% of maintenance nightmares are due to code that is not modular (and thus not easily replaced), is abstracted too much or too little (leading respectively to obtuse or repetitive code), highly coupled (causing brittleness when assumptions are broken), or poorly documented.

(There are probably others, but these are the first to come to my head. These measures are also not necessarily orthogonal.)


Your comment comes across as bitter and sarcastic. It doesn't sound like you like being a programmer.


Agreed. Some people say you'll only use it in games or AI or something fancier, but that's not entirely true. I work with what would be considered pretty boring enterprise software, and yet I once solved a pretty big problem using graph algorithms. That actually earned me a good reputation among my coworkers.


I don't think the primary complaint about these types of skills is that they are never useful. It's just that they are treated as disproportionately important, i.e. interviews consist of 80% CS knowledge but the real job only requires it 1% of the time on average.


I used some super simple graph theory recently to write a Java tool that spit out data from multiple tables on a database into SQL files, with statements in the right order so that foreign key relationships were preserved. And this was for a CRUD-ish web app.


This is very similar to the behaviour that Keynes described in the equity markets: http://en.wikipedia.org/wiki/Keynesian_beauty_contest


I don't get why people tend to think that computer science is something different from "actual programming". In the end, everything you can program can be expressed formally, and vice versa; the only thing that changes is how much you abstract from the actual implementation, which is handy for catching defects in the approach you are taking to a certain problem.


It's not the subject matter per se, it's the difference in habits and typical incentives for people who identify more with one description or the other.

For many products and businesses, it's not really necessary to avoid long-term dead-ends (say 3-5 years) if you can be much more effective in the short-term (say 3-12 months). So a high up-front cost (required knowledge and planning / research experience) approach often loses out to simply banging out more code, which requires little CS knowledge.


Because everybody can do "actual programming." Especially in grad school.

Computer science is broad (even AI is different), but I'd say it's more about Turing's academic papers and computational mathematics. Programming and computers aren't even needed; they're just models which either can or cannot derive solutions, or cannot do so in reasonable time. Think of the halting problem, P, NP, NP-complete.

In a way it's mathematical logic, but with more rigorous proofs.

Everybody else can learn a generic language with some external libraries and make almost anything they want. But that's no more academic than woodworking or sewing. Comp sci is academic, and one of the more important disciplines because it underpins a lot of the research other disciplines do: physicists and mathematicians do comp sci, while the softer sciences run models without understanding why they work.


As Knuth pointed out, "computer science is no more about computers than astronomy is about telescopes"


Dijkstra


Thanks, I knew that didn't sound right.


The difference between computer science and programming is analogous to the difference between physics and architecture.


I'd say fluid dynamics and plumbing. Most businesses really just want us to connect this to that and not have the shit back up. If you can deliver that quickly, without much analysis and without fuss, then you're very effective for many business and product types.


I would say engineering



