
That’s very much an exaggeration. Pixar, Google, etc. can’t run on a single desktop CPU and spend a lot of money on hardware. The best estimate I have seen is that it’s scale dependent: at small budgets you’re generally spending most of the money on people, but as the budget increases the ratio tends to shift toward ever more hardware.



It is absolutely about scale. An employee costs $x regardless of how many servers they are managing, and might improve performance by y%. That hire only becomes worth it if y% of your hardware costs is greater than the $x for the employee.
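To make that break-even condition concrete, here is a minimal sketch with made-up numbers; the salary, hardware spend, and improvement percentage are all assumptions for illustration:

    # Rough break-even sketch: hiring pays off only if the fraction of
    # hardware spend the hire saves exceeds what the hire costs.
    def hire_pays_off(employee_cost, hardware_spend, improvement_pct):
        savings = hardware_spend * improvement_pct / 100
        return savings > employee_cost

    # Hypothetical: a $200k engineer who can shave 5% off the hardware bill.
    print(hire_pays_off(200_000, 1_000_000, 5))   # $50k savings  -> False
    print(hire_pays_off(200_000, 10_000_000, 5))  # $500k savings -> True

The same 5% improvement is worthless at a $1M hardware budget and an easy win at $10M, which is the scale argument in a nutshell.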


The issue is that extra employees run out of low-hanging fruit to optimize, so that y% isn’t a constant. Extra hardware benefits from all the optimized code your team has already written, whereas extra manpower has to find further improvements in code that is already optimized.
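Extending the earlier sketch under an assumed diminishing-returns curve (the halving per additional engineer is purely illustrative), the marginal savings per hire shrink quickly:

    # Assumed diminishing returns: each extra engineer improves things by
    # half as much as the previous one (the 0.5 decay factor is made up).
    def marginal_savings(hardware_spend, n_engineers, first_pct=5.0, decay=0.5):
        pct = first_pct * decay ** (n_engineers - 1)
        return hardware_spend * pct / 100

    for n in range(1, 6):
        print(n, marginal_savings(10_000_000, n))  # 500k, 250k, 125k, 62.5k, 31.25k

Under those assumptions, somewhere around the third or fourth hire the marginal savings drop below a typical salary, while adding hardware keeps scaling with the code you already have.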


Eventually you have to buy more hardware. The Pixars and Googles have the most to gain from added expertise.



