
No, because latitude and longitude are infinitely precise, limited only by the accuracy of the measuring instrument. With the binary search idea you proposed, the dividing lines are more tightly packed longitudinally near the poles.


But near the poles, 1 degree of longitude spans a much smaller distance than at the equator. So with, say, 4 digits per latitude and longitude, you can specify a much smaller region at the poles than at the equator.
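A minimal sketch of that shrinkage (my own illustration, not from the thread): the ground distance covered by 1 degree of longitude scales with the cosine of the latitude, using the rough figure of ~111.32 km per degree at the equator.

```python
import math

def one_degree_longitude_km(latitude_deg):
    """Approximate ground distance (km) spanned by 1 degree of longitude
    at the given latitude; ~111.32 km at the equator, shrinking toward 0
    at the poles."""
    return 111.32 * math.cos(math.radians(latitude_deg))

one_degree_longitude_km(0)    # ~111.3 km at the equator
one_degree_longitude_km(80)   # ~19.3 km at 80 degrees north
```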


And that's the problem.

To get 10m accuracy at the equator, you'd end up with much finer than 10m accuracy at the poles.

You're effectively wasting information.

The surface area of the Earth is 5.10x10^8 km^2. In an ideal world, you could uniquely represent every 10m*10m area of the Earth - there are 5.1x10^12 such areas. log2(5.1x10^12) is about 42.2.

So you're wasting about 1.8 bits of information (~4%). Not too bad, now that I calculate it out.
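Redoing the arithmetic above as a quick check: convert the surface area to m^2, count the 10m×10m cells, and take the base-2 log to get the bits needed to index them.

```python
import math

# Earth's surface area: ~5.10e8 km^2, converted to m^2.
surface_area_m2 = 5.10e8 * 1e6

# Number of distinct 10m x 10m (100 m^2) cells.
cells = surface_area_m2 / (10 * 10)   # ~5.1e12

# Bits needed to uniquely index every cell.
bits_needed = math.log2(cells)        # ~42.2
```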


Yes, I get that. I was responding to user rtkwe who said that latitude/longitude didn't waste information.


Lat./long. doesn't have the fixed-length limitation of the binary search system (10:10), so the information isn't wasted: where the grid is sparser, near the equator, we just extend the decimal places.


That "extension" is itself the waste being discussed: some places require more digits than others to reach the same accuracy.



