
Imagine the world had a single supercomputer, and somebody wanted 100% utilization for a hundred hours to, say, train a neural net to play Donkey Kong. It's not a huge cost, given the expected lifetime of the resource, but to everybody else waiting in the queue, it would be quite the opportunity cost.



But you don’t sum up all the other things for opportunity cost; you take the single max value. In other words, the opportunity cost doesn’t care about how many other people were waiting, only about how much useful output the best alternative proposal would have produced with the same number of hours.


That's an inaccurate picture of resource allocation. 100 hours of a globally unique resource could satisfy 100 1-hour demands. You sum those.


That would be one proposal: give 1 hour to 100 different groups.

What you don’t do is sum all 10,000 pending requests, nor the 100 requests that each want 100 hours of their own.
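A minimal sketch of the point the thread converges on, with made-up values: the opportunity cost of the chosen use is the value of the single best alternative proposal (which may itself bundle many small jobs), not the sum over every request in the queue.

  # Hypothetical competing proposals for the same 100 machine-hours,
  # each with a made-up estimate of the useful results it would produce.
  proposals = {
      "train Donkey Kong net": 40,
      "100 separate 1-hour jobs (one bundled proposal)": 75,
      "single 100-hour physics simulation": 60,
  }

  chosen = "train Donkey Kong net"

  # Opportunity cost of the chosen use = value of the best alternative,
  # not the sum over all alternatives.
  best_alternative = max(v for k, v in proposals.items() if k != chosen)
  print("opportunity cost:", best_alternative)  # 75, not 75 + 60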



