
No, I think JavaScript's "floats everywhere" is worse than most languages. I prefer languages like Python/Ruby that have one arbitrary-precision integer type and one floating-point type, or languages like Haskell or Lisp that have a rich selection of floating/fixed/arbitrary/rational types with well-defined performance. But some (not all) of the WTFs here exist in all of those languages too. Eventually, programmers working with numbers really do need to learn how to use different numerical representations.
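
For what it's worth, here's a quick Python sketch of that split (Python standing in for the others; fractions is in the standard library):

  >>> 2 ** 100                    # int: arbitrary precision
  1267650600228229401496703205376
  >>> 0.1 + 0.2                   # float: IEEE 754 double
  0.30000000000000004
  >>> from fractions import Fraction
  >>> Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10)  # exact rationals
  True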


Could you please grace us with a code sample for all of these? I'm not intimately familiar with Haskell, Lisp or Ruby.

  >>> from decimal import Decimal
  >>> Decimal('0.3') == Decimal('0.1') + Decimal('0.2')
  True
  >>> Decimal('NaN') == Decimal('NaN')
  False


You only get that result in Python by importing the decimal module. The standard behavior in Python is (0.1 + 0.2 == 0.3) == False.
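
For example, in a stock interpreter with no imports:

  >>> 0.1 + 0.2 == 0.3
  False
  >>> 0.1 + 0.2
  0.30000000000000004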

However, your code example makes the excellent point that Python has such a library in its standard distribution, whereas JavaScript does not, at least not as a simple import.
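
(And for ordinary float comparisons, the standard-library escape hatch since Python 3.5 is math.isclose:)

  >>> import math
  >>> math.isclose(0.1 + 0.2, 0.3)  # tolerance-based comparison
  True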



