
Isn't it impossible? Integers go arbitrarily large but computers don't.



It is possible within the limits of the computer's available memory, rather than the usual limit of a fixed number of bits.
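Roughly, that works by representing an integer as a growable array of fixed-size "limbs" and growing the array whenever a result needs more of them. A minimal sketch in Python (the names `BASE`, `add`, `from_int`, and `to_int` are mine; real libraries like GMP use the same idea with machine words and heavy optimization):

```python
BASE = 2 ** 32  # each limb holds 32 bits; real libraries use full machine words

def from_int(n):
    """Split a nonnegative int into little-endian limbs."""
    limbs = []
    while n:
        n, limb = divmod(n, BASE)
        limbs.append(limb)
    return limbs

def to_int(limbs):
    """Reassemble limbs into an int (for checking the result)."""
    return sum(limb * BASE ** i for i, limb in enumerate(limbs))

def add(a, b):
    """Add two limb lists; the result grows a limb when the carry demands it."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = carry
        if i < len(a):
            s += a[i]
        if i < len(b):
            s += b[i]
        carry, limb = divmod(s, BASE)
        result.append(limb)
    if carry:
        result.append(carry)  # the number outgrew its old limb count
    return result
```

The only hard ceiling here is how many limbs the machine can hold in memory, which is the point the comment above is making.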

Ironically, I've found that big-integer libraries sometimes optimize the math routines more than the string conversion. This was quite annoying when I optimized the factorial function in Ruby and found that generating the string was my bottleneck. I then optimized that as well. :-)
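The usual fix for slow string conversion is divide and conquer: split the number around a power of ten so each level halves the digit count, instead of peeling off one digit at a time. A sketch in Python (the function name and split heuristic are mine; GMP and recent CPython use the same idea with more careful split points):

```python
def to_decimal(n):
    """Convert a nonnegative int to a decimal string, divide-and-conquer style."""
    if n.bit_length() <= 60:
        return str(n)  # small enough that the builtin conversion is fine
    # Split at roughly half the decimal digits:
    # digits ~ bits * log10(2) ~ bits * 0.301, so half ~ bits * 0.15.
    m = n.bit_length() * 3 // 20
    hi, lo = divmod(n, 10 ** m)
    # The low half must be zero-padded to exactly m digits.
    return to_decimal(hi) + to_decimal(lo).rjust(m, "0")
```

Each level does one big division instead of thousands of small ones, which is what turns the quadratic digit-at-a-time conversion into something subquadratic.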


They can go large enough for anything that matters.

A quick Google search gives an estimate of 10^78 to 10^82 atoms in the observable universe. A count that size fits in well under 300 bits.
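That's easy to check with Python's built-in arbitrary-precision integers, since `bit_length` reports exactly how many bits a value needs:

```python
atoms_upper = 10 ** 82  # upper estimate of atoms in the observable universe
print(atoms_upper.bit_length())  # 273 -- well under 300 bits
```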


The universe is tiny compared to mathematics and software; even ordinary RSA keys are already 2048 bits.
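For scale, a 2048-bit RSA modulus is a number around 3 × 10^616, and arithmetic on values that size is routine for any bignum implementation. A toy size check in Python (no real key material, just the magnitudes involved):

```python
n = (1 << 2048) - 1       # the largest 2048-bit number
print(n.bit_length())     # 2048
print(len(str(n)))        # 617 decimal digits
```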

Lots of problems suffer from 'combinatorial explosion' [1].

I recently learned about Archimedes's cattle problem, whose smallest solution is of order 10^206544 [2].

[1] https://en.wikipedia.org/wiki/Combinatorial_explosion

[2] https://en.wikipedia.org/wiki/Archimedes%27s_cattle_problem


10^206544 is still only about 84 KiB. You could store it on a floppy disk!
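The storage needed can be computed directly: about 206544 · log2(10) ≈ 686,000 bits, which rounds up to roughly 84 KiB. Python will happily materialize a number that size in milliseconds:

```python
n = 10 ** 206544              # same order of magnitude as the cattle solution
bits = n.bit_length()
nbytes = (bits + 7) // 8      # round up to whole bytes
print(bits, nbytes)           # ~686,000 bits, ~86,000 bytes (about 84 KiB)
```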


Computers also go arbitrarily large. Not infinite, but arbitrarily large.

In practice, a number type is bounded only by the amount of RAM you have.


You can use disk storage too. And network storage.



