Postel's law isn't about accepting arbitrary invalid inputs. It's about inputs that are technically invalid but whose intent is obvious, and about handling those according to that intent.
In a distributed non-adversarial setting, this is exactly what you want for robustness.
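A toy sketch of my own (not from the thread) of what "liberal only where intent is unambiguous" can look like in practice, being tolerant of stray whitespace and case but refusing to guess when the input is genuinely ambiguous:

```python
# Toy illustration: accept technically invalid input only when the
# sender's intent is unambiguous; otherwise reject rather than guess.
def parse_header_line(line: str) -> tuple[str, str]:
    """Parse a 'Name: value' line. Stray whitespace and mixed case are
    nonconformant but leave the intent obvious, so we normalise them."""
    name, sep, value = line.partition(":")
    if not sep or not name.strip():
        # No separator (or no name): intent is NOT obvious, so refuse.
        raise ValueError(f"unparseable header line: {line!r}")
    return name.strip().lower(), value.strip()
```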
The problem, as we've come to realise in the time since Postel's law was formulated, is that there is no such thing as a distributed non-adversarial setting. So I get what you're saying.
But your definition of robustness is too narrow as well. There's more to robustness than security. When Outlook strips out a certificate from an email for alleged security reasons, then that's not robustness, that's the opposite, brokenness: You had one job, to deliver an attachment from A to B, and you failed.
Robustness and security can be at odds. It's quite OK to say, "on so and so occasion I choose to make the system not robust, because the robust solution would not be sufficiently secure".
The only area in which it is acceptable to reason this way is graphical user interfaces. (And only if you've already provided an API for reliable automation, so that nobody has to automate the application through its GUI.) I say graphical because, no, this does not apply to command interfaces.
Even in the area of GUIs, new heuristics about intent cause annoyances to the users. But only annoyances and nothing more.
Like for example when you update your operating system, and now the window manager thinks that whenever you move a window so that its title bar happens to touch the top of the screen, you must be indicating the intent to maximize it.
I suppose the ship has sailed now that people are deploying LLMs in this way and that and those things intuit intent. They are like Postel's Law on amphetamines. There is a big cost to it, like warming the planet, and the systems become fragile for their lack of specification.
> When Outlook strips out a certificate from an email for alleged security reasons
I would say it's being liberal in what it accepts, if it's an alternative to rejecting the e-mail for security reasons.
It has taken a datum with a security problem and "fixed" it, so that it now looks like a datum without that security problem.
(I can't find references online to this exact issue that you're referring to, so I don't have the facts. Are you talking about incoming or outgoing? Is it a situation like an expired or otherwise invalid certificate not being used when sending an outgoing mail? That would be "conservative in what you send/do".)
You are spot on in regards to consideration of adversarial context; however, it is instructive to review the nuanced difference between the RFC 761 (1980) statement, viz. "be conservative in what you do, be liberal in what you accept from others", and the substitution of "send" for "do" in RFC 1122 (1989). The latter is, with hindsight, an error, since it refocused the attention of some rigid thinkers entirely onto protocol mechanics and away from implementation behaviour, despite the commentary beneath that admonishes such a mindset and concurs wholly with your point.
Or to put it otherwise, Postel was right to begin with, albeit perhaps just a little too cryptic, and has been frequently misquoted and misinterpreted ever since.
It’s not wrong, because it doesn’t stand alone, and yes, if you’re cherry picking half a statement, then you’re like the Australian politicians calling it a “lucky country”, perpetuating a gross misrepresentation of both the letter and the sentiment of the original statement.
This is especially ironic given that the constructive argument against Postel’s law is generally based on the value of a strict interpretation of specification. If you’re intentionally omitting half of the law, then you have an implementation problem.
Furthermore, none of this has much to do with YAML being a shitty design.
I think the other half of the law is more or less good, so I don't have any reason to bring it up. I don't exactly know what it means to be conservative, but it sounds safe, like staying away from generating inputs for other programs that exercise dark corner cases. I don't really have anything to discuss about that half of the law.
The world doesn't need principles that are half good.
> I don't exactly know what it means to be conservative
That could undoubtedly lead to misapprehension. As the GP indicated, and as the word itself means, it references systemic stability and self-preservation behaviour. Reciprocally, however, the obligation to be liberal absolutely does not mean absolving faulty inputs of their flaws. For example, it would not excuse a dud response to an SSH handshake like trying to negotiate RC4. Both Steve Crocker and Eric Allman have been at pains to unpack the understanding of robustness, forgiveness, and format canonicalisation in a security context, and they're hardly wrong. It's also why I'm particularly an advocate of the "do", not the "send", formulation. This is a much more systemic and contextual verb in its consequences for implementation.
> staying away from generating inputs for other programs that exercise dark corner cases
This is exactly the kind of focus-solely-on-the-wire misdirection that I identified above as a common misinterpretation. Conforming to the most precise and unambiguous interpretation of a protocol, if there is one, in regards to what an implementation puts on the wire, can most certainly be a part of that, but that isn't always what being conservative looks like, and processing is equally if not more important.
The introduction of Explicit Congestion Notification (ECN), aka RFC 3168 (2001), springs to mind. RFC 791 (1981) defined bits 14 and 15 of the IPv4 header as "reserved for future use" and diagrammatically gave them as zero. RFC 1349 (Type of Service, 1992, now obsoleted) named them "MBZ" (Must Be Zero) bits but said they were otherwise to be ignored. RFC 2474 (DSCP, 1998) did much the same with what it termed the "Currently Unused" field.

When ECN was introduced, making use of those bits as a supposedly backwards-compatible congestion-signalling mechanism, we discovered that a significant proportion of IP implementations aboard endpoints, routers, and middleboxes were rejecting (by discard or reset) datagrams with nonzero values in those bits. Consequently, ECN has taken two decades to enable fully, and this is where both sides of the principle prove their joint and inseparable necessity: to this day, many ECN-aware TCP/IP stacks are passive, stochastic, or incremental in advertising ECN, and equally forgiving if the bits coming back don't conform, because an implementation that reset a connection in circumstances where the developer comprehends the impedance mismatch would be absurd. Fulfilling both sides of the maxim promotes systemic stability and practical availability, and it gave ECN a path to the widespread interoperability it has today.
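As a minimal sketch of the forgiving side (field layout per RFC 3168: the low two bits of the old TOS/DS byte are the ECN field, the upper six are DSCP), a robust receiver extracts the fields it understands rather than validating formerly-reserved bits against zero:

```python
# Sketch: split the IPv4 TOS/DS byte into its DSCP and ECN fields.
# Layout per RFC 3168: bits 7..2 are DSCP, bits 1..0 are ECN.
ECN_MASK = 0b11

def classify_tos(tos: int) -> tuple[int, int]:
    """Return (dscp, ecn). A robust receiver extracts what it
    understands and never rejects a packet merely because
    formerly-reserved bits are nonzero."""
    return tos >> 2, tos & ECN_MASK
```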
The exposition on page 13 of RFC 1122 (Requirements for Internet Hosts, 1989) broadly anticipated this entire scenario, even though the same section misquotes Postel (or, rather, uses the "send" restatement that I find too reductive).
The statement of the robustness principle is an integrated whole. A partial reading is, perhaps ironically, nonconformant; as with Popper's paradox of tolerance, one thing it cannot be liberal about is itself.
Yes, rejecting must-be-zero-reserved-bits due to being nonzero is obviously incorrect.
"Must be zero" means that when the datum is being constructed, they must be initialized to zero, not that when the datum is being consumed, they must also be validated to be zero.
Violating either rule will cause that implementation not to interoperate properly when it is still found deployed in a future in which the bits have now been put to use.
Rejecting must-be-initialized-to-zero fields for not being zero is not an example of a flaw caused by neglecting to be "liberal in what you accept". It's an example of failing to accept what is required: the requirement is to accept the datum regardless of what is in those reserved bits. It is arguably an instance of failing to "be conservative in what you do". If you are conservative, you stay away from reserved bits. You don't look at them, until such a time as when they have a meaning, accompanied by required (or perhaps optional) processing rules.
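The construction/consumption asymmetry above can be sketched in a few lines. The two-bit reserved field here is hypothetical, purely for illustration:

```python
RESERVED_MASK = 0b11  # hypothetical 2-bit reserved field in a flags byte

def build_flags(known_flags: int) -> int:
    """Construction side: 'must be zero' means we initialise the
    reserved bits to zero in what we emit."""
    return known_flags & ~RESERVED_MASK & 0xFF

def read_flags(wire_byte: int) -> int:
    """Consumption side: we do not validate the reserved bits; we
    simply ignore them, leaving them free for future definition."""
    return wire_byte & ~RESERVED_MASK & 0xFF
```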
Now, I see the point that reading Postel's law might steer some designer away from putting in that harmful check for zero, specifically due to the "be liberal" part of it. But that's just a case of two wrongs accidentally making right. That same designer might refer to "be liberal" again in some other work, and do something stupid in that context.
The only thing that will really help is clear specs which spell out requirements like "the implementation shall not examine these bits for any purpose, including validating their value".