Enforcement is the entire point. A failed return from recv() may be an application problem, and only the application can decide what to do about it. That complexity doesn't compress away.
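For concreteness, a minimal sketch in C of what that looks like at the call site (read_some is a hypothetical helper): recv() can report several distinct outcomes, and the caller has to cover all of them.

```c
#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/types.h>

/* Hypothetical helper: cover every outcome recv() can report,
 * instead of only the happy path. */
ssize_t read_some(int fd, char *buf, size_t len)
{
    for (;;) {
        ssize_t n = recv(fd, buf, len, 0);
        if (n > 0)
            return n;       /* n bytes, delivered in order */
        if (n == 0)
            return 0;       /* peer closed the connection  */
        if (errno == EINTR)
            continue;       /* interrupted: just retry     */
        /* Anything else is an application decision: log, count,
         * alarm, tear down, escalate. It can't be hidden here. */
        fprintf(stderr, "recv: %s\n", strerror(errno));
        return -1;
    }
}
```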
> ..software is a purely logical artifact
No. No, sir, it is not. There is no magical unicorn version of communications in which you can simply assume it all always gets there instantly and in order. We can get close - severely underutilized Ethernet & 802.11 spoil us - but nuh uh.
> Make no mistake, it is expensive.
And you wonder why they are like they are :) "you can't afford it, honey." :)
> No. No, sir, it is not. There is no magical unicorn version of communications in which you can simply assume it all always gets there instantly and in order. We can get close - severely underutilized Ethernet & 802.11 spoil us - but nuh uh.
That simply means you want an unimplementable abstraction. (Perfectly reliable sequential communication over a computer network.) Of course it doesn't make sense to want impossible things.
> And you wonder why they are like they are :) "you can't afford it, honey." :)
This brokenness can't be fixed at the level of business applications. Languages and standard libraries need to be fixed first.
I forget what the thing you just did is called, but you've managed to switch sides. :) I'm the one who said there is no unicorn version, etc.
You can't fix that in a library. There is a sequence of escalation: failures are formally checked for, counters are incremented, alarms are sent, actions are taken...
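A hedged sketch of that escalation ladder in C (all names and thresholds hypothetical): a detected failure is counted, an alarm fires past a threshold, and an action is taken, rather than the error being silently swallowed.

```c
#include <stdio.h>

/* Hypothetical escalation ladder: check, count, alarm, act. */
static unsigned long recv_failures;           /* failure counter       */
#define ALARM_THRESHOLD 10                    /* hypothetical cutoff   */

static void raise_alarm(const char *what)     /* stand-in alerting hook */
{
    fprintf(stderr, "ALARM: %s (%lu failures)\n", what, recv_failures);
}

void on_recv_failure(void)
{
    recv_failures++;                          /* 1. counter incremented */
    if (recv_failures % ALARM_THRESHOLD == 0) /* 2. alarm sent          */
        raise_alarm("recv failures");
    /* 3. action taken: reconnect, fail over, shed load, ... */
}
```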
You may not be interested in the Second Law, but the Second Law is interested in you.
> I forget what the thing you just did is called, but you've managed to switch sides. :)
I didn't switch sides. I stand by my assertion that software is a purely logical artifact. The laws of thermodynamics have no bearing on whether redirecting the control flow to a far-away exception handler (or, even worse, undefined behavior) is a reasonable way to deal with unforeseen circumstances.
> I'm the one who said there is no unicorn version, etc.
I'm not talking about unicorns, only about abstractions that don't leak. That being said, I'll admit that sometimes there are good reasons for using leaky abstractions. My favorite example of this is garbage collection. The abstraction is “you can always allocate memory and you don't need to bother deallocating it”. The second part is tight, because precise collectors guarantee objects will be reclaimed a bounded number of cycles after they become unused. But the first part is leaky, because the case “you've exhausted all memory” is uncovered. The reason why this isn't a problem in practice is that most programs don't come anywhere near exhausting all available memory, and, if it ever happens, ultimately the only possible fix is to add more RAM to the computer.
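The uncovered case is easy to see one layer down. In C terms (a sketch, not a prescription): the allocator's contract has an explicit failure branch, and the GC abstraction papers over exactly that branch.

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* The "you can always allocate" half of the abstraction leaks
     * precisely here: malloc may return NULL when memory is exhausted,
     * and no library layer can conjure RAM that isn't there. */
    size_t huge = (size_t)-1 / 2;   /* deliberately absurd request */
    void *p = malloc(huge);
    if (p == NULL) {
        fputs("allocation failed: the leak in the abstraction\n", stderr);
        return 1;
    }
    free(p);
    return 0;
}
```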
FWIW, I don't consider TCP a leaky abstraction, because it doesn't promise that actual communication will take place. It only promises that, if messages are received, they will be received in order by the user of the abstraction. That being said, most TCP implementations are leaky, as is pretty much anything written in C.
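To make that distinction concrete (a sketch; send_all is a hypothetical helper): a successful send() only means the kernel queued the bytes locally. TCP promises the peer will see them in order if it sees them at all; it does not promise delivery.

```c
#include <sys/socket.h>
#include <sys/types.h>

/* send() reporting success does NOT mean the peer received anything;
 * the bytes may still sit in the local kernel buffer when the network
 * partitions. TCP's actual promise is only ordering: whatever does
 * arrive, arrives in the order it was sent. */
int send_all(int fd, const char *buf, size_t len)
{
    size_t off = 0;
    while (off < len) {
        ssize_t n = send(fd, buf + off, len - off, 0);
        if (n < 0)
            return -1;  /* local failure; delivery status unknown */
        off += (size_t)n;
    }
    return 0;           /* queued locally, not acknowledged end-to-end */
}
```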
Most importantly, you need to enforce the assumptions. The lack of enforcement is where abstraction leaks come from.
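One way to read "enforce the assumptions" in C (parse_frame and the framing format are hypothetical): validate at the boundary and refuse, instead of assuming the caller or the wire obeyed the contract.

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical message frame: the length field is an assumption
 * about the input, so it is checked here, not trusted. */
#define MAX_FRAME 4096

int parse_frame(const uint8_t *buf, size_t len, size_t *payload_len)
{
    if (len < 2)
        return -1;                        /* too short to carry a header */
    size_t claimed = ((size_t)buf[0] << 8) | buf[1];
    if (claimed > MAX_FRAME || claimed > len - 2)
        return -1;                        /* enforce, don't assume */
    *payload_len = claimed;
    return 0;
}
```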
> the Second Law of Thermodynamics (which is the lynchpin of the Two Generals Problem) is unlikely to change to accommodate our foolishness :)
The second law of thermodynamics is fundamental to understanding how the physical world works, but software is a purely logical artifact.
> As I understand you, "polishing abstractions until they don't leak" is equivalent to "doing the whole job, not just part of it."
It means redesigning the abstraction until there are no cases uncovered by the abstraction's enforcement mechanisms.
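Sketching that in C (names hypothetical): make every outcome a value of the result type, so the enforcement mechanism, here an exhaustive switch that the compiler can check with -Wswitch, covers all cases by construction.

```c
#include <stdio.h>

/* Every outcome the operation can have is a named case;
 * nothing is left for a far-away handler to guess about. */
enum io_result { IO_OK, IO_EOF, IO_TIMEOUT, IO_ERROR };

static void handle(enum io_result r)
{
    switch (r) {                 /* -Wswitch flags any missing case */
    case IO_OK:      puts("data available");      break;
    case IO_EOF:     puts("peer closed cleanly"); break;
    case IO_TIMEOUT: puts("retry or escalate");   break;
    case IO_ERROR:   puts("count, alarm, act");   break;
    }
}

int main(void) { handle(IO_TIMEOUT); return 0; }
```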
> Economically, this is a pain point for the people we work for. It sounds expensive.
Make no mistake, it is expensive. But dealing with abstraction leaks is even more expensive.
> The accounting for it is very difficult. "Can't you just make it work" is not unreasonable.
It doesn't work if it breaks.