No, I think JavaScript's "floats everywhere" is worse than most languages. I prefer languages like Python/Ruby that have one arbitrary-precision integer type and one floating point type, or languages like Haskell or Lisp that have a rich selection of floating/fixed/arbitrary/rational types with well-defined performance. But some (not all) of the WTFs here exist in all of those languages too. Eventually, programmers working with numbers really do need to learn how to use different numerical representations.
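A quick sketch of those categories, using Python since it comes up below (the Fraction example is my own addition, standing in for the exact rational types mentioned above):

    from fractions import Fraction

    # Arbitrary-precision integer: exact, grows as needed.
    print(2 ** 100)          # 1267650600228229401496703205376

    # The single float type: an IEEE 754 double, the same representation
    # JavaScript uses for every number.
    print(0.1 + 0.2)         # 0.30000000000000004

    # Exact rational arithmetic, with the usual performance trade-off.
    print(Fraction(1, 3) + Fraction(1, 6))   # 1/2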
You only get that result in Python by importing the standard-library decimal module. The default behavior in Python is (.1 + .2 == .3) == False
However, your code example makes the excellent point that Python has such a library available, whereas JavaScript does not, at least not as conveniently as a single import.
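To make that concrete, a minimal Python 3 REPL session using the standard-library decimal module (note that Decimal values need to be built from strings; building them from float literals just carries the binary rounding error along):

    >>> 0.1 + 0.2 == 0.3     # default binary floats, same representation as JavaScript
    False
    >>> 0.1 + 0.2
    0.30000000000000004
    >>> from decimal import Decimal
    >>> Decimal('0.1') + Decimal('0.2') == Decimal('0.3')
    True
    >>> Decimal(0.1)         # a float argument exposes the stored binary value
    Decimal('0.1000000000000000055511151231257827021181583404541015625')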