Stanislaw Lem, The Star Diaries, where Tichy simultaneously apologizes for and brags about how he retroactively tried to create/fix the world, and how he failed.
You need to study Stanislaw Lem if you want a glimpse into the future... in The Cyberiad he invented the Electronic Bard, whose description is uncannily close to ChatGPT. Including the poets losing their jobs and protesting.
And more... Adams is kind of a more approachable version of Lem.
They both had the amazing ability to talk about human nature and project it into the future through the lens of technology. Lem was such a prolific writer, though, that I don't know if there are any books of his I would not recommend (except maybe his last works, incredibly dense compendia of abstraction).
• People imagine enabling fast float by "scope", but there's no coherent way to specify that when it can mix with closures (even across crates) and math operators expand to std functions defined in a different scope.
• Type-based float config could work, but any proposal of just "fastfloat32" grows into a HomerMobile of "Float<NaN=false, Inf=Maybe, NegZeroEqualsZero=OnFullMoonOnly, etc.>"
• Rust doesn't want to allow UB in safe code, and LLVM will cry UB if it sees Inf or NaN, but nobody wants the compiler inserting div != 0 checks.
• You could wrap existing fast intrinsics in a newtype, except newtypes don't support literals, and <insert user-defined literals bikeshed here>
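For illustration, here is a minimal sketch of that newtype idea on nightly Rust, assuming the unstable `core_intrinsics` feature; `FastF32` is a made-up name, not an existing type, and the literal problem shows up immediately because every constant has to be wrapped by hand:

```rust
#![feature(core_intrinsics)]
use std::intrinsics::{fadd_fast, fmul_fast};
use std::ops::{Add, Mul};

/// Hypothetical wrapper that routes + and * through LLVM's fast-math intrinsics.
#[derive(Clone, Copy, Debug, PartialEq)]
struct FastF32(f32);

impl Add for FastF32 {
    type Output = Self;
    fn add(self, rhs: Self) -> Self {
        // UB if either operand or the result is NaN or infinite,
        // which is exactly the safety question raised above.
        Self(unsafe { fadd_fast(self.0, rhs.0) })
    }
}

impl Mul for FastF32 {
    type Output = Self;
    fn mul(self, rhs: Self) -> Self {
        Self(unsafe { fmul_fast(self.0, rhs.0) })
    }
}

fn main() {
    // No literal support for newtypes: 2.0 and 3.5 must be wrapped explicitly.
    let x = FastF32(2.0);
    let y = FastF32(3.5);
    println!("{:?}", x * y + x);
}
```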
As a prerequisite, it must be clearly understood which of the flags are entirely safe and which need to be behind 'unsafe'.
Blanket flags for the whole program do not fit very well with Rust, while point use of these flags is inconvenient or needs new syntax... but there are discussions about these topics.
Also, maybe you would like more granular control over which parts of your program prioritize speed and which parts favour accuracy. Maybe this could be done with a separate type (e.g. ff64) or a decorator (which would be useful if you want to enable this for someone else's library).
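As a sketch of what such point use looks like today without any new type or syntax, assuming nightly's unstable `core_intrinsics` feature: only one hot function opts into fast division, and the caller, not the compiler, is on the hook for the no-NaN/no-Inf precondition.

```rust
#![feature(core_intrinsics)]

/// Normalize `values` so they sum to 1.0, using fast-math division
/// only in this hot loop; the rest of the crate keeps strict IEEE floats.
fn normalize(values: &mut [f32]) {
    let sum: f32 = values.iter().sum();
    if sum == 0.0 {
        return; // handle the division-by-zero case ourselves
    }
    for v in values.iter_mut() {
        // `fdiv_fast` is UB if an operand or the result is NaN or infinite,
        // hence the `unsafe` block and the manual check above.
        *v = unsafe { std::intrinsics::fdiv_fast(*v, sum) };
    }
}

fn main() {
    let mut v = [1.0f32, 3.0, 4.0];
    normalize(&mut v);
    println!("{:?}", v);
}
```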
Reasoning about the consequences along a couple of functions in your hot path is one thing. Reasoning about the consequences in your entire codebase and all libraries is quite another.
The whole point of ALGOL-derived languages for systems programming is not being fast & loose with anything, unless the seatbelt and helmet are on as well.
I don't know if most Rust programmers would be happy with any fast and loose features making it into the official Rust compiler.
Besides, algorithms that benefit from -ffast-math can be implemented in C, with Rust bindings generated automatically.
This solution isn't exactly "simple", but it could help projects keep track of the expectations of correctness between different algorithm implementations.
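A rough sketch of that split, with hypothetical names (`kernels.c`, `dot_fast`) and assuming the `cc` crate as a build dependency; only the C translation unit is compiled with -ffast-math, and the Rust side just binds to it:

```rust
// build.rs -- compiles a hypothetical kernels.c containing, e.g.,
//   double dot_fast(const double *a, const double *b, size_t n);
fn main() {
    cc::Build::new()
        .file("src/kernels.c")
        .flag("-ffast-math") // fast and loose, but confined to this C file
        .compile("kernels");
}
```

```rust
// src/main.rs -- hand-written binding; bindgen could generate this instead.
extern "C" {
    fn dot_fast(a: *const f64, b: *const f64, n: usize) -> f64;
}

fn dot(a: &[f64], b: &[f64]) -> f64 {
    assert_eq!(a.len(), b.len());
    // Safety: both pointers are valid for `a.len()` reads.
    unsafe { dot_fast(a.as_ptr(), b.as_ptr(), a.len()) }
}

fn main() {
    println!("{}", dot(&[1.0, 2.0, 3.0], &[4.0, 5.0, 6.0]));
}
```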
Rust should have native ways to express this stuff. You don't want the answer to be "write this part of your problem domain in C because we can't do it in Rust".
My first computer was a Spectrum, with a broken cassette recorder. There was no other way to play games but to type them in from a recipe book, picking up programming along the way. Discovering the new world, the world of the electron and the switch, and then the beauty of the baud.
Our team is developing machine learning algorithmic solutions that improve outcomes for our advertisers. It is part of Outbrain’s Recommendations Group: about 40 machine learning researchers, data scientists, and machine-learning engineers who are responsible for everything that Outbrain recommends in its feeds and widgets. The team uses an interplay of Python, Java, and Rust, in addition to Spark, BigQuery, and TensorFlow, to form our ML and AutoML pipelines.
The team’s responsibilities are:
- Leverage Outbrain's rich data sources, large-scale computing resources, and proprietary algorithms to build state-of-the-art models that improve outcomes for our publishers and users
- Implementation and integration of algorithmic features into the high-scale production system
- A/B testing and monitoring of new features
- Data analysis over huge datasets to validate the various hypotheses
- Collaborate with the operations team to further improve our tooling and results