
I dunno... the only alternative seems to be storing as a float everywhere, since most programming languages don't have a built-in decimal type... and using floats for currency honestly just feels scary to me, since they're not fixed-point and can't represent most decimal fractions exactly.

I mean, I know there's a range within which a double can hold a two-decimal value and round-trip it with no loss of accuracy... but then I need to manually make sure there's never anywhere in the code that might sum up transactions beyond that range, or whatever. Or people thinking it's OK to divide and then multiply, which you're hopefully less likely to do if it's an integer.

I feel much more peace of mind storing currency data as integers and doing the conversion. It feels quite natural really, since I'm already used to doing that with datetimes for different timezones.
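To make the integer-cents idea concrete, here's a minimal sketch in Python, assuming a two-decimal currency like USD; the to_cents/to_display helpers are made up for illustration, not taken from any particular library.

    # Minimal sketch of the integer-cents approach for a two-decimal currency.
    # Helper names are hypothetical, for illustration only.

    def to_cents(amount_str: str) -> int:
        """Parse a decimal string like '19.99' into integer cents."""
        sign = -1 if amount_str.startswith("-") else 1
        dollars, _, cents = amount_str.lstrip("+-").partition(".")
        return sign * (int(dollars or "0") * 100 + int((cents + "00")[:2]))

    def to_display(cents: int) -> str:
        """Format integer cents back into a '19.99'-style string."""
        sign = "-" if cents < 0 else ""
        cents = abs(cents)
        return f"{sign}{cents // 100}.{cents % 100:02d}"

    # Float arithmetic drifts:
    print(0.1 + 0.2)                                   # 0.30000000000000004
    # Integer cents stay exact:
    total = to_cents("0.10") + to_cents("0.20")
    print(total)                                       # 30
    print(to_display(total))                           # 0.30

The conversion happens only at the edges (parsing input, rendering output); everything in between is plain integer arithmetic.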




Python has Decimal, JavaScript has BigInt, and so on. I disagree with you: most languages support arbitrary-precision numbers, either built in or in the standard library. Floats are not needed or wanted here.
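For reference, a small example of Python's built-in Decimal; the quantize/ROUND_HALF_UP rounding policy and the tax rate shown are just illustrative assumptions, not something prescribed above.

    from decimal import Decimal, ROUND_HALF_UP

    # Construct from strings, not floats, or the float error comes along for the ride.
    price = Decimal("19.99")
    qty = Decimal("3")
    tax_rate = Decimal("0.0825")   # hypothetical rate, for illustration

    subtotal = price * qty                         # Decimal('59.97'), exact
    tax = (subtotal * tax_rate).quantize(          # round to cents explicitly
        Decimal("0.01"), rounding=ROUND_HALF_UP
    )
    print(subtotal + tax)                          # Decimal('64.92')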

You do not want to invent your own arbitrary-precision numbers.

https://floating-point-gui.de/formats/integer/



