Hacker News

I feel like I read somewhere that, traditionally, numerical/scientific code ran on so many weird high-performance machines (Crays and the like, each with its own floating-point format) that you had to be robust to all sorts of FP behavior anyway.

Nowadays maybe that sort of diversity is less of an issue? Expecting determinism in the sense you mean just seems weird to me.




Having absolute determinism is probably still difficult, but using SSE on x64 on Windows, where all users have compatible compilers (i.e. determinism without diversity), is at least “good enough” nowadays. I haven’t seen any issues in that scenario so far, even though problems can certainly arise.
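One concrete way even a single-ISA setup can lose determinism (an illustration, not from the thread): floating-point addition isn't associative, so anything that reorders a sum, such as a different vector width, thread count, or reduction tree, can shift the low bits. A minimal Python sketch:

```python
# Floating-point addition is not associative: summing the same three
# values in a different order gives a different double-precision result.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c   # 0.6000000000000001
right = a + (b + c)  # 0.6

print(left == right)  # False
```

This is why "same binary, same inputs" is usually the practical bar: as long as the compiler emits one fixed evaluation order (and doesn't contract or reassociate under fast-math flags), results stay reproducible.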


I think it’s in Goldberg91 (“What Every Computer Scientist Should Know About Floating-Point Arithmetic”).



