
It doesn't. Neither is the average.

To say such a thing, we need:

1. Some assumption about the distribution of the true value.

2. A metric measuring the "cost" of being wrong.

Assuming a uniform distribution, and measuring cost as the expected absolute value of the error, we find that the average of the interval is the best guess.
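A quick numerical check of that claim (a Python sketch; the endpoints 1/3 and 1/2 are just an arbitrary example interval, not anything from the thread):

    # For a true value drawn uniformly from [a, b], the expected absolute
    # error E|guess - t| is minimized at the midpoint (a + b) / 2.
    import numpy as np

    a, b = 1/3, 1/2                            # example interval
    t = np.random.uniform(a, b, 1_000_000)     # samples of the "true" value

    def expected_abs_error(guess):
        return np.abs(guess - t).mean()

    print(expected_abs_error(a))             # endpoint: ~ (b - a) / 2
    print(expected_abs_error(b))             # endpoint: ~ (b - a) / 2
    print(expected_abs_error((a + b) / 2))   # midpoint: ~ (b - a) / 4, the minimum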

Using the same assumptions, any number strictly inside the interval is a better estimate than either endpoint.

From that it follows that the mediant is a better approximation than both endpoints, since the mediant of p1/q1 and p2/q2, namely (p1+p2)/(q1+q2), always lies strictly between them.
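A small sketch of that step (again with 1/3 and 1/2 as an arbitrary example pair):

    # The mediant of two fractions lies strictly between them, so under a
    # uniform true value its expected absolute error beats either endpoint.
    from fractions import Fraction

    def mediant(x, y):
        return Fraction(x.numerator + y.numerator, x.denominator + y.denominator)

    lo, hi = Fraction(1, 3), Fraction(1, 2)
    m = mediant(lo, hi)          # 2/5
    assert lo < m < hi           # strictly inside the interval

    # Expected absolute error against a uniform true value on [lo, hi]:
    # endpoint: (hi - lo)/2; interior point x: ((x-lo)^2 + (hi-x)^2) / (2*(hi-lo))
    width = hi - lo
    err_endpoint = width / 2
    err_mediant = ((m - lo)**2 + (hi - m)**2) / (2 * width)
    print(err_endpoint, err_mediant)   # the mediant's error is strictly smaller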



The average of the interval is the best guess if you are only trying to minimize the error. But if you also want to keep the denominator small, that's when the problem gets more interesting.
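One way to see that trade-off is to brute-force the closest fraction for each denominator bound (a hedged sketch; the target pi and the bounds are purely illustrative):

    # Best p/q approximation of a target for each denominator bound.
    from fractions import Fraction
    from math import pi

    def best_fraction(target, max_den):
        best = None
        for q in range(1, max_den + 1):
            cand = Fraction(round(target * q), q)
            if best is None or abs(cand - target) < abs(best - target):
                best = cand
        return best

    for max_den in (10, 100, 1000):
        f = best_fraction(pi, max_den)
        print(max_den, f, float(abs(f - pi)))

Larger denominators buy smaller error; the interesting question is how much error reduction a bigger denominator is worth. (Fraction(pi).limit_denominator(max_den) in the standard library does the same search.)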



