> The State-of-the-Art in Practical Software Assurance
Unfortunately I cannot take this very seriously when it is completely ignorant of the SPARK/Ada ecosystem and presents Rust as the "state of the art".
Rust has some nice properties, but there is an entire field of programming languages and compilers designed for building safety-critical software for aerospace and defense. Unfortunately, it looks like that is going to be forgotten, or rather, intentionally overwritten by ignorance.
So they are going to teach a process for consistently making and delivering high assurance, robust systems written in Rust, when by their own admission they have never even heard of, let alone made, a single high assurance, robust system in Rust.
This is absolutely ridiculous. Having a process for making high assurance, robust systems is an extraordinary claim. It requires the extraordinary evidence of repeated success at delivering such systems in real-life adversarial environments, and of passing audits and verification efforts that are demonstrably passed only by robust systems. Anybody who has not reached that standard should not use the words “high assurance”, “robust”, or “secure”; they have no evidence for their claims, and anything they say should be completely, 100% ignored.
It is frankly outrageous that people so far from that standard, people who cannot even point to a single such system anyone has built, have the gall to use those words.
"Secure," "robust," "high-assurance," etc. are not standard terms with a single agreed upon definition. Like just about anything that concerns computer security, their definitions depend on threat models and maybe a domain-specific specification.
I don't particularly care whether a particular flavor of Rust (or C, for that matter) brands itself as "high-assurance," as long as it is not dishonestly claiming compliance with a particular domain's assurance or robustness requirements.
> when they knowingly claim that they have never even heard of, let alone made, even a single high assurance, robust system in Rust.
Huh?!
They said you can't build safety-critical software in Rust because it's not certified.
Not all high assurance software is safety-critical, not all correct software is certifiable, and FOR SURE not all certifiable software is correct.
> evidence of repeated success at delivering such systems in real-life adversarial environments and succeeding at audits and verification efforts that are demonstrably passed only by robust systems.
Can you name one such process? The old auto and aero software standards from the 90s/early aughts are crufty, have demonstrably failed on any number of occasions, and leave much to be desired.
(BTW, "verification" in those standards means something totally different from what "verification" means to software experts today.)
The Rust compiler and ecosystem would need a thorough audit before use in safety-critical settings, but I have no doubt that this audit will happen, and that when it does, the evidence generated by that audit will blow way past the standards applied to MISRA C/Ada/etc.
It's not a very good impression: MISRA C is basically a list of guidelines, half of which say "don't invoke undefined behaviour" (duh) while the other half are questionable coding-style rules. None of it really corresponds to any Ada features (not to mention that you can basically ignore any MISRA rule you like if you just write down that you've done so and attempt some justification).
The goal of MISRA C mostly isn't to prevent problems in your code. It's to make the code more amenable to static analysis, which _can_ detect problems.
Which, notably, is what Rust does by construction: you don't have to do (much) alias or escape analysis against Rust, because it's given to you by the compiler for free.
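As a minimal sketch of what "by construction" means here (plain stable Rust, nothing beyond rustc), the mutable-aliasing pattern a C analyser has to hunt for simply does not compile:

    fn main() {
        let mut v = vec![1, 2, 3];
        let first = &v[0];   // shared borrow of `v` begins here
        v.push(4);           // rejected at compile time: error[E0502], cannot
                             // borrow `v` as mutable while also borrowed as shared
        println!("{first}"); // the shared borrow is still live here
    }

The alias question a static analyser would have to answer is settled by the compiler before the binary even exists.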
There's also been a decent amount of empirical research on MISRA C's effectiveness, most of which has shown that attempting compliance actually makes programmer-introduced faults more likely[1].
It's extremely common to use MISRA without static analysis (I would personally advocate for the other way around). Also, if the goal is to make static analysis easier, why not just make the rule 'the static analyser must be able to say this is OK'? (Which will depend greatly on the capabilities of your static analyser.)
MISRA: Of course the C language provides loads of standard library functions which look ordinary but are actually defined in a way that's hideously dangerous. You can use most of those but please take care to avoid mistakes (MISRA D4.11).
Ada: It would be insane to provide a standard library full of dangerous sharp edges so we didn't do that.
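To make the contrast concrete, here's a small sketch of the Rust flavor of that design (standard library only): the equivalent sharp edges either fail loudly or return an Option you're forced to inspect.

    fn main() {
        let data = [10u8, 20, 30];

        // No silent out-of-bounds reads: checked access returns an Option
        // instead of reading adjacent memory.
        assert_eq!(data.get(2), Some(&30));
        assert_eq!(data.get(9), None);

        // Integer overflow is likewise surfaced rather than silent:
        assert_eq!(u8::MAX.checked_add(1), None);
    }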
They seem rather ignorant of the safety aspect of high assurance software and appear more focused on security. Neither is ever fully solved at the software level anyway.
What languages and compilers are certified for safety-critical software? And how is that decided? That is super interesting, and I didn't know it was a thing.
Author of the book here, thank you for your feedback!
Though not an Ada expert, I have used the SPARK subset, and I've worked on security assessment of military systems with components written in Ada. We looked at binaries rather than source code, because our assurance model did not consider the Ada compiler to be part of the chain of trust (so we test the artifact the CPU will actually execute). I really like Ada and hope to use more of it in the future!
The HN community may be interested in this 1 hour presentation [1] on Nvidia's adoption of Ada, including why they chose it over Rust in 2019 (regulatory certification was one factor). There’s also a great academic paper [2] on the recent advances SPARK has made in heap memory verification, inspired in part by Rust’s type system.
There are currently efforts underway to certify Rust for usage in safety-critical systems via an alternate, certified toolchain (among other pieces of the puzzle). Two related initiatives are linked in FAQ question #7 (which someone already shared). I’ll be sure to add the Ferrous Systems and AdaCore collaboration when I have a chance to batch together some updates.
I, like many others, believe Rust is a technology that can deliver value in safety and mission critical verticals due to its reduction of undefined behavior via principled static analysis (a type system for which there has been early success in formal verification of safety claims[3]). Looking forward to digging into Rust research tools for deductive verification in a later chapter (similar to SPARK: hand-written Hoare logic specifications, proven via SMT solving at compile-time).
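For a taste of what that looks like in practice, here is a sketch in the style of the prusti-contracts annotations used by the Prusti project (whether this exact function verifies out of the box is an assumption on my part):

    use prusti_contracts::*;

    #[requires(divisor != 0)]      // precondition, discharged by the SMT solver
    #[ensures(result <= dividend)] // postcondition; holds for unsigned division
    fn safe_div(dividend: u32, divisor: u32) -> u32 {
        dividend / divisor
    }

A caller that cannot prove the precondition fails at verification time rather than at runtime.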
We make comparisons to C and C++ in the book because they are the closest relatable analog for many developers. But I’m happy to add mention of Ada to not alienate potential readers like yourself, though my Ada knowledge is not currently deep enough to provide in-depth comparison.
This book is a best effort and an early-stage passion project. It strives to be technically accurate and data driven while remaining approachable to a range of readers. Like others have mentioned on this thread, this book does not claim to provide guidance for adherence to any particular standard or certification process. Chapter 3 does, however, map concepts from MISRA C 2012 to Rust, in order to remain grounded in realistic, industry-adopted best practices.
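As a hypothetical illustration of what such a mapping can look like (this particular pairing is my own, not a quote from Chapter 3): MISRA C 2012 Rule 17.7 requires that the value returned by a non-void function be used, and the closest Rust analog is the built-in unused_must_use lint, which can be promoted to a hard error.

    #![deny(unused_must_use)] // make discarding a must-use value a compile error

    fn parse_port(s: &str) -> Result<u16, std::num::ParseIntError> {
        s.parse()
    }

    fn main() {
        parse_port("8080");           // rejected: unused `Result` that must be used
        let _port = parse_port("80"); // an explicit binding satisfies the lint
    }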
The goal is to help readers build more secure and reliable software in general, to the furthest extent possible using entirely open-source tools. I look forward to iterating the content to meet high standards of quality, while keeping the entirety of the book freely available online.
Always open to critique and suggestions, thanks again!
This appears to be an online book (with a physical version forthcoming) that will teach you Rust with a focus on "building performant software we can justifiably trust".
I think this is a learning resource (and not, for example, a more formal set of tools to write "high assurance rust").
I was really interested in this sort of stuff for a year or two, but ended up coming to the conclusion that the best thing is to use tried-and-tested tools with rigorous standards/practices, such as C/C++, rather than theoretically interesting tools like formally verified Haskell/Rust. It's just far more practical, and there's something to be said for boring tools proven in the wild over academic creations.
I think my problem with writeups like these is that they attack the problem from the wrong (or uninteresting) angle. Say I'm writing software for an airplane: then I'm gonna be chained to DO-178C. How will using Rust help me achieve all the objectives it requires for, e.g., Level A? Is Rust gonna get me there cheaper (I think it might one day)? Is it gonna enable me to build systems using constructs I wouldn't otherwise have dared to (due to complexity, safety, or whatever)? Of course it's kind of hard to present this without a particular system design in mind (maybe it's a PSAC for an abstract system I'm really after...). I suspect the situation is similar when getting some kind of security certification/accreditation (haven't seen much of that side of the coin). I'm not saying this project isn't valuable (it is), I just miss the top-down perspective of the language in this context. :)
Thank you for this perspective, it's one I'll try to incorporate more of in a future revision.
While the content aims to be generally applicable to a broad range of software, further contextualization against a specific standard like DO-178C might make for a valuable appendix section.
There's a bit of a balancing act, however, since Rust is, at present, not a certified choice for such use cases.
Well, in my experience it is possible to claim Level A despite not using a certified compiler. Perhaps not if you're building a civilian airliner, and you're probably still gonna have to perform some kind of activities to show that what goes into the compiler comes out as expected. So although certification may be an ultimate goal, I don't think I'd consider its absence a complete showstopper when it comes to getting Rust airborne, and keeping whatever it executes on airborne too. :)
But you can avoid these by following best practices and using tools to find them. These are known knowns and there are solutions. The issue with Rust and others is that there are so many unknown unknowns, because it hasn't had the same number of hours of development time and usage.
Have you ever used a language like Haskell in a formal verification environment? You still get memory issues, but there are far fewer tools to tackle them.
What is the value of Rust's idiosyncrasies if it takes a book to explain how to actually benefit from them? You'd need to write a book to teach someone how to do the same in C too... is it just that that book would be longer?
Ada/SPARK is the standard for high integrity systems. There are many things about Rust that make it unsuitable for safety-critical software at the moment, e.g. the lack of formal verification, the lack of a standard, the fast-moving compiler (it is under constant change), widespread use of dynamic memory allocation, etc.