I can't help but think they're trying to fix something that isn't broken at all.
Adding new abstraction layers rarely helps when doing systems programming. You (as in "the developer") want to be as near to the machine as possible. C does this pretty well.
In this case it seems like a very thin wrapper that leverages the type system to allow catching a whole class of errors at compile time, like using exhaustiveness checks to make sure a function call handles all possible return values. I think the small overhead is well worth it.
The original API is not "broken" per se, it's just limited by the language features ("magical" return values vs. tagged unions or whatever they're called in Rust, I don't remember.)
It's not even clear to me that there's any overhead to the Rust version. Checking error codes that should be checked isn't overhead. Checking them inefficiently would be overhead, but the Rust version looks like it should compile down to something pretty similar to what the equivalent C switch blocks would produce.
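For reference, a minimal sketch of the kind of wrapper being discussed (names are illustrative, not the actual API; `interpret_fork` stands in for a safe wrapper around fork(2), with the raw return value passed in rather than produced by a real syscall):

```rust
// Illustrative sketch only: the real wrapper's types differ, and no
// actual fork(2) call is made here.

#[derive(Debug, PartialEq)]
enum ForkResult {
    Parent { child: i32 }, // fork() returned the child's pid
    Child,                 // fork() returned 0
}

// Map fork(2)'s "magical" return value into a tagged union.
// None models the -1 error case.
fn interpret_fork(raw: i32) -> Option<ForkResult> {
    match raw {
        -1 => None,
        0 => Some(ForkResult::Child),
        pid => Some(ForkResult::Parent { child: pid }),
    }
}

fn main() {
    // The compiler's exhaustiveness check forces callers to handle
    // all three cases; deleting any arm is a compile error.
    match interpret_fork(1234) {
        None => println!("fork failed"),
        Some(ForkResult::Child) => println!("in child"),
        Some(ForkResult::Parent { child }) => println!("parent of {}", child),
    }
}
```

The `match` here is exactly the "C switch block" shape, just with the tag checking made mandatory.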
> It's not even clear to me that there's any overhead to the Rust version.
There is a slight bit of stack overhead: Option<ForkResult> is at least {tag: u8, {tag: u8, pid: i32}}, and due to alignment constraints it's actually {tag: u32, {tag: u32, pid: i32}}. A nonzero wrapper[0] would allow folding either ForkResult or Option into a 0-valued pid_t and remove one level of tagging: http://is.gd/yxStW1
Beyond that you'd need generalised enum folding in order to fold two tags into the underlying value (you'd denote that pid_t is nonzero and nonnegative, for instance).
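For what it's worth, the nonzero wrapper idea did land in the standard library as std::num::NonZeroI32, and modern compilers perform this kind of niche folding. A quick size check (the enum definitions are illustrative stand-ins, not a real API):

```rust
use std::mem::size_of;
use std::num::NonZeroI32;

// Illustrative stand-ins for the types under discussion.
#[allow(dead_code)]
enum ForkResult { Parent(i32), Child }
#[allow(dead_code)]
enum ForkResultNz { Parent(NonZeroI32), Child }

fn main() {
    // Plain i32 payload: a separate tag is needed, padded to alignment.
    println!("ForkResult:         {}", size_of::<ForkResult>());         // 8
    // NonZeroI32 payload: Child can be encoded as the forbidden value 0,
    // so no separate tag is needed at all.
    println!("ForkResultNz:       {}", size_of::<ForkResultNz>());       // 4
    // Same trick for Option: None is encoded as 0.
    println!("Option<NonZeroI32>: {}", size_of::<Option<NonZeroI32>>()); // 4
}
```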
We do have a planned optimization that would fold the tags for cases like `Option<ForkResult>` to give a word pair, which should be returned in %eax:%edx (or %rax:%rdx).
But if the wrappers get inlined (which they should be) then SROA kicks in and promotes the tags to SSA values, where other optimizations such as SCCP can eliminate them. Optimizing compilers are awesome :)
I'm not talking about the efficiency of the resulting binary, but the "distance" from what the programmer is thinking to what the machine will really do.
Compiler optimizations aside, C does a pretty good job at this. It's way more efficient than writing assembly, but you're still basically just moving memory around while doing some arithmetic. Easy to understand in "machine" terms.
Of course, this is only relevant when you're doing low-level stuff, like kernel or drivers programming. For the userland, Rust really looks like a nice language (I've played with it just a bit), and I'd be really happy if it pushes C++ away ;-)
Compiler optimizations included, C does a terrible job at this. It puts forward a seductive but terrible mirage of simple mappings and understandings which are just plain broken. And then you add multithreading into the mix and it gets even worse, even without optimizations.
We live in a world of many cores, and multiple CPUs all over the place - in your GPUs, your hard drives, motherboard controllers - and the intrinsic language support for multithreading literally does not exist as part of the C99 standard? One has to reach out to a mixture of POSIX, and the compiler extensions the POSIX implementation uses to annotate memory barriers so the optimizer won't break things, and intrinsics that introduce atomic operations, and... gah!
C and C++ do such a terrible job of this I have to resort to disassembly to debug program behavior far too frequently. These are the only languages I'm forced to do this with. If C or C++ were really "close to what the machine will really do", I'd expect the opposite result.
Even simple things like class and structure layouts and type sizes are controlled by a mess of compiler and architecture specific rules and extensions to control the application of those rules with regards to padding, alignment, etc. which I get to debug. Ever had to debug differences in class layout between MSVC and Clang due to differently handling EBCO in a multiple inheritance environment? What about handling alignment of 8-byte types on 32-bit architectures differently? At least you've replaced all uses of "long" because of the mixture of LP64 and LLP64 compilers out there...? And what about when two incompatible versions of the standard library with different type layouts get linked in by a coworker? These are the symptoms of a language that doesn't control what the machine is really doing very well at all.
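To make the padding/alignment point concrete, here's a small repr(C) sketch (struct names are made up; the sizes in the comments assume a typical 64-bit SysV-style ABI, and the i386 MSVC-vs-SysV disagreement over 8-byte alignment is exactly the kind of thing that changes them):

```rust
use std::mem::{align_of, size_of};

// repr(C) asks Rust to follow the platform's C struct layout rules,
// so these sizes reflect the C ABI behavior being complained about.
#[allow(dead_code)]
#[repr(C)]
struct Padded {
    a: u8,  // followed by padding so `b` lands on its alignment boundary
    b: u64, // 8-aligned on x86-64 SysV; only 4-aligned on i386 SysV, 8 on MSVC
}

#[allow(dead_code)]
#[repr(C)]
struct Reordered {
    b: u64,
    a: u8,  // tail padding keeps arrays of Reordered correctly aligned
}

fn main() {
    // On a typical 64-bit target, both print size 16, align 8.
    println!("Padded:    size {} align {}", size_of::<Padded>(), align_of::<Padded>());
    println!("Reordered: size {} align {}", size_of::<Reordered>(), align_of::<Reordered>());
}
```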
When I really need tight control over what the machine will do at a low level, my tools are actual (dis)assembly, intrinsics, an understanding of the underlying hardware itself, and simple code that eschews features requiring significant runtime support or underpinnings. None of those are C or C++ specific. The last one requires some knowledge of how a language's features are implemented - C and C++ might be broken enough that you're forced to wrestle with that topic, when it's more optional in other languages, but... that still doesn't make it C or C++ specific.
Then you are missing the whole point of Rust. The point of Rust IS to let you stay close to the machine while maintaining a much higher level of safety. Rust is designed to do this with little overhead.
In this day and age, with big software packages and security an ever-increasing concern, it really is high time programming languages did more to help us avoid the bugs that expose us to hackers and crackers.
I do have an affinity for C, but as an Objective-C programmer currently coding in Swift, I am really seeing how many more bugs the compiler helps me uncover.
I think Rust is on the right track. It is a long overdue change to systems programming.
The proposition that Rust is offering is not new. In the 90s Modula-2 was touted as "a better, safer way" of doing system programming than C. It failed to get traction outside of education because it failed to offer a compelling reason for people to migrate. Those that do not study history are doomed to repeat its mistakes.
In the example given it's possible to write a similar library in C to protect against unwanted side effects or bad API design. I'm sure several have been written over the years.
Rust is a great language with lots of improvements over other systems programming languages, but that is not going to be enough to get people to switch. You have to show that it's good enough to be worth throwing away 40-odd years of experience and well-understood best practice. Something that is going to take a long time and big public projects to do. If just being better were good enough, Plan 9 would have been a roaring success and Linux (if it happened at all) would probably be a footnote in history.
C and UNIX have survived as long as they have not because better alternatives haven't come along, but because the alternatives haven't offered a compelling reason to switch. Unfortunately, at least for now, Rust is falling into the same category.
"NEWP is a block-structured language very similar to Extended ALGOL. It includes several features borrowed from other programming languages which help in proper software engineering. These include modules (and later, super-modules) which group together functions and their data, with defined import and export interfaces. This allows for data encapsulation and module integrity. Since NEWP is designed for use as an operating system language, it permits the use of several unsafe constructs. Each block of code can have specific unsafe elements permitted. Unsafe elements are those only permitted within the operating system. These include access to the tag of each word, access to arbitrary memory elements, low-level machine interfaces, etc. If a program does not make use of any unsafe elements, it can be compiled and executed by anyone. If any unsafe elements are used, the compiler marks the code as non-executable. It can still be executed if blessed by a security administrator."
Sounds similar to modern practices? Done before C and UNIX were a thing.
C and UNIX have survived this long because they go together as one: just as JavaScript is the king of the browser, C was the only way to go when coding on UNIX systems.
> Unfortunately at least now Rust is falling into the same category.
Rust offers one major thing that Modula-2 never did: eliminating memory management problems (and concurrency problems) with zero overhead. In the '80s and '90s it was not known just how dangerous memory management problems could be (use-after-free was thought to be a harmless annoyance). Not so in 2016, with every single browser engine falling to remote code execution via UAF at Pwn2Own.
Modula-2 was from an era before the internet was ubiquitous and everyone had a computer in their pocket. Comparing the lack of uptake of a "safer language" from a time when the internet and the attack surface were so much smaller to now seems disingenuous. C and UNIX go hand in hand; nobody is disputing their worth or tenacity. I fail to see how a proposition not being new detracts from Rust.
Modula-3 descends from Modula-2, although not directly.
Some of the Xerox PARC Mesa/Cedar researchers went to work for DEC (later Compaq) and created Modula-2+ with feedback from Niklaus Wirth, who had himself used Mesa as inspiration for Modula and Modula-2.
Eventually Modula-2+ evolved into Modula-3.
Nowadays I would say part of its ideas live on in C#.
I don't think you want to be as near to the machine as possible; otherwise all systems programmers would write machine code. You want powerful abstractions that the compiler can see through to produce optimal code.
I tend to agree with you, with one caveat. We did a C-like scripting language and added one thing that C was missing: a variable value that is "undefined", which is not the same as zero. Really simple, but now you can do stuff like
pid_t p;
if (defined(p = fork())) {
    // parent / child stuff here
} else {
    // fork error here
}
It's pretty much the same as try / catch, we just implemented it as part of the variable. And any scalar or complex type can be undefined.
I suspect if C had this a lot of these code samples would be a little more clear. Maybe? Dunno, it's worked well for us. And we like C a lot.
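For comparison, Rust's Option<T> is essentially that "undefined" bit made into a library type, and it works for any scalar or complex type. A sketch (`maybe_pid` is a made-up stand-in, not a real fork wrapper):

```rust
// Sketch only: maybe_pid is a stand-in that never actually forks.
#[derive(Debug)]
#[allow(dead_code)]
struct Complex { re: f64, im: f64 }

fn maybe_pid(ok: bool) -> Option<i32> {
    if ok { Some(1234) } else { None } // None plays the role of "undefined"
}

fn main() {
    // Reads much like the defined() idiom above:
    if let Some(p) = maybe_pid(true) {
        println!("parent / child stuff here, pid {}", p);
    } else {
        println!("fork error here");
    }

    // And any scalar or complex type can be "undefined":
    let c: Option<Complex> = None;
    println!("defined? {}", c.is_some());
}
```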
> Adding new abstraction layers rarely helps when doing systems programming. You (as in "the developer") want to be as near to the machine as possible. C does this pretty well.
Perhaps I'm just getting old :-(