Timestamp ints constantly get cast back and forth between seconds, microseconds, and nanoseconds. It's an absolute mess everywhere. They are, in effect, fixed-point values.
Everything sensor-related, plus countless logging and file-handling paths, or anywhere wait_for or timeouts are used, are typical use cases; and, indirectly, anywhere people need to do simple things like basic arithmetic on timestamps.
The code will either contain some loss of precision or an unfixed overflow bug of one kind or another.
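To make that failure mode concrete, here is a minimal C++ sketch (mine, not from the thread), assuming a timeout held as a bare 32-bit count of seconds: the manual scale-up to nanoseconds wraps at 32-bit width, while std::chrono's typed durations do the same conversion safely.

```
#include <chrono>
#include <cstdint>
#include <cstdio>

int main() {
    std::uint32_t timeout_s = 5; // a timeout kept as a bare 32-bit count of seconds

    // Classic bug: the multiply happens at 32-bit width and wraps *before*
    // the result is widened for storage. 5'000'000'000 mod 2^32 = 705'032'704.
    std::int64_t ns_buggy = timeout_s * 1000000000u;

    // Fix: widen first, then scale.
    std::int64_t ns_fixed = std::int64_t{timeout_s} * 1000000000;

    // With std::chrono the unit lives in the type, so the widening and the
    // scale factor are handled by the library instead of by hand.
    auto ns_chrono = std::chrono::nanoseconds(std::chrono::seconds(timeout_s));

    std::printf("buggy:  %lld\nfixed:  %lld\nchrono: %lld\n",
                static_cast<long long>(ns_buggy),
                static_cast<long long>(ns_fixed),
                static_cast<long long>(ns_chrono.count()));
}
```

The point is that with raw ints the unit and the width are both invisible at the call site, so every conversion is a chance to reintroduce exactly this bug.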
So a signed int64 is half of that, and I was off by a factor of three doing it in my head... I was wrong, no doubt about that. But the order of magnitude is the problem: it should be millennia, or millions of years, not a few centuries.
Uh... what? I have in fact never seen a timestamp being used that way. What kind of use cases are we talking about?
> Int64 is also too small, as the common nanosecond clocks would only give a century of range
An unsigned Int64 worth of nanoseconds has a range of over 584 years.
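For anyone checking the arithmetic, a quick sketch (my numbers, not the poster's):

```
#include <cstdio>

int main() {
    const double two_pow_64 = 18446744073709551616.0;       // 2^64
    const double ns_per_year = 1e9 * 60 * 60 * 24 * 365.25; // Julian year

    std::printf("uint64 ns span: %.1f years\n", two_pow_64 / ns_per_year);         // ~584.5
    std::printf("int64 ns span: +/- %.1f years\n", two_pow_64 / 2 / ns_per_year);  // ~292.3
}
```

That ~292-year signed span is also what the "int64 is half" and "off by a factor of three" remarks upthread refer to: roughly three centuries of range, not one.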