Hacker News

My theory is that the time taken for an engineer to solve a problem is a hockey-stick/exponential curve based on the complexity, something like __/

The key thing is that the knee of the curve sits in a different position for different engineers. Up to a certain point, you can crank through a problem pretty well. But once problems reach a certain level of complexity, you'll slow down. And at a still higher level of complexity, you'll just go in circles and never get anywhere.
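A toy sketch of that idea (the functional form and the `knee` parameter are my own hypothetical illustration, not anything the comment specifies): solve time grows roughly linearly up to an engineer-specific knee, then blows up exponentially past it.

```python
import math

def time_to_solve(complexity, knee, base_rate=1.0):
    """Toy model: solve time is near-linear up to an engineer-specific
    'knee' in problem complexity, then exponential beyond it.
    (Hypothetical functional form for illustration only.)"""
    if complexity <= knee:
        return base_rate * complexity
    # Past the knee, each extra unit of complexity compounds the cost.
    return base_rate * knee + math.exp(complexity - knee) - 1

# Two engineers with different knees facing the same problem:
average_knee, rockstar_knee = 5, 9
problem = 8
print(time_to_solve(problem, average_knee))   # deep in the exponential regime
print(time_to_solve(problem, rockstar_knee))  # still on the near-linear part
```

The point the model captures: the two engineers aren't separated by a constant factor; on easy problems their times are nearly identical, and the gap only explodes once the problem crosses one knee but not the other.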

The point of a "rockstar" is not that they are 10x faster on the flat part; it's that their curve is shifted, so they're still on the flat part where another engineer would already be on the exponential part. In other words, "rockstar" depends on the problem. Knuth is unlikely to build a basic CRUD application 10x faster than an average HN programmer, but if the problem is building a typography system, he's probably 100x as productive. Likewise, if I had to build a JavaScript engine, a self-driving car, Minecraft, or a translation system, I'd be stuck at roughly 0% productivity while a "rockstar" would deliver. On a simpler project, though, many more people would be able to deliver well.

TL;DR: for a particular project, people who can do that project well are 100x more productive than people who are overwhelmed by the project. But this line moves depending on the complexity of the project.




