If you're very careful, a double can serve as an integer type. (Every integer up to 2^53 is exactly representable; `Number.MAX_SAFE_INTEGER` is 2^53 − 1.) (I don't love this line of thinking. It has a lot of sharp edges. But JS programmers effectively do this all the time, often without thinking too hard about it.)
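A quick sketch of where that integer-in-a-double illusion holds and where it breaks, using only standard builtins:

```javascript
// Doubles have a 53-bit significand, so every integer with magnitude
// up to 2**53 is exactly representable; past that, integers collide.
console.log(Number.MAX_SAFE_INTEGER);           // 9007199254740991, i.e. 2**53 - 1
console.log(2 ** 53 === 2 ** 53 + 1);           // true: 2**53 + 1 rounds back down
console.log(Number.isSafeInteger(2 ** 53 - 1)); // true
console.log(Number.isSafeInteger(2 ** 53));     // false: neighbors are no longer distinct
```

`Number.isSafeInteger` is the standard way to check you're still in the range where "double as integer" is sound.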
(And even before BigInt, there's an odd u32-esque "type" in JS. It's not a real type; it doesn't appear in the JS type system, but is an internal representation that certain operations convert their operands to. That's why `(0x100000000 | 0) == 0`, even though 0x100000000 (and every other number in that expression, and the right answer) is precisely representable as an f64. This doesn't matter for JSON decoding, though, or for most other things.)
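Concretely, the bitwise operators run their operands through ToInt32 (or ToUint32 for `>>>`), which truncates the f64 modulo 2^32 before operating:

```javascript
// ToInt32: take the value modulo 2**32, then reinterpret as signed 32-bit.
console.log(0x100000000 | 0); // 0: 2**32 wraps around to 0
console.log(0xFFFFFFFF | 0);  // -1: the u32 max, reinterpreted as i32
console.log(-1 >>> 0);        // 4294967295: >>> uses ToUint32, the one u32-producing operator
```

This is also why `x | 0` and `x >>> 0` show up as idioms for forcing a number into i32 or u32 range.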