Having C or C++ somewhere in the lower levels of your stack doesn't make them the primary language for your domain. Otherwise CPU microcode would be the primary language of every domain.
Different metrics tell a different story. For example, GitHub pull requests [1] show C++: 2.60%, Rust: 2.09%, C: 1.43%, with a clear trend of Rust overtaking C++ next year. Or you could look at the Stack Overflow survey of languages used among professionals [2], which gives C++: 20.17%, C: 16.7%, Rust: 8.8%, with Rust gaining 1-2% each year.
There's no best metric; they're all biased, so you need to consider a few different ones. Otherwise you won't notice when you've stumbled upon one with an extreme view.
Combining C and C++ in language stats is debatable; IMHO they should be measured separately. When grouped as a language category, "C/C++/Rust" is slowly becoming more common.
I assume [1] pulls only from public github.com projects. Many companies (a) don't make their code public, and (b) probably use GitHub Enterprise or other hosted solutions (especially including non-git-based ones).
Yes, public GitHub repos, Stack Overflow survey respondents, DevJobsScanner offers, Google searches, etc. are all skewed in some way.
It's very hard to quantify the effect of those biases though: for example, how does the public/private repo ratio differ between languages? Good luck giving a trustworthy answer to that. Apart from looking at lots of different kinds of sources, one thing that's fairly trustworthy is the trend of a specific language within a specific source.
On that topic, looking at the "SO questions" metric of the first link, C and C++ both show a strange regular spike in the last quarter of each year. I attribute that to new CS students flocking to SO at the beginning of their term. Another fun trend to look at is the hourly Google searches over a week: the weekday/work-hours spike is much more pronounced for some languages than others.
SWC (by the same author as this tentative tsc port) is part of more and more people's TypeScript toolchains, and is written in Rust. Having your whole ecosystem in a single language feels nice, but pragmatically, different tools might be better served by different languages.
The point of having it all embeddable in Go is that we could write a single executable that handles the entire process, with the toolchain embedded and the configuration written as code.
> Afaict, none of them requires multiple mutable references to the same objects
They ask to store objects ("turtles") in a single Vec. Two turtles from that Vec can breed to create a child turtle stored in the same Vec. Parents must retain a reference (a real reference, not an index or other workaround) to their children, meaning that children have multiple refs. Children can become parents themselves, so all the turtles are mutable.
There you have it: multiple mutable references to the same object. With proper Rust you'd need some kind of RefCell to implement this (convoluted) design. The runtime check guarantees a panic if you try to mutably borrow the same object twice (e.g. trying to breed a turtle with itself). With BronzeGc the compiler will believe they are different objects, and UB-optimize accordingly.
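A minimal sketch of what that design looks like in proper Rust, assuming a hypothetical `Turtle`/`breed` shape (the study's actual code may differ): parents hold `Rc<RefCell<Turtle>>` references to children, and breeding a turtle with itself trips the RefCell runtime check.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Hypothetical version of the benchmark's "turtle" design: every turtle
// lives in a shared Vec, and parents keep real references to children.
struct Turtle {
    children: Vec<Rc<RefCell<Turtle>>>,
}

fn breed(a: &Rc<RefCell<Turtle>>, b: &Rc<RefCell<Turtle>>) -> Rc<RefCell<Turtle>> {
    // Both parents are borrowed mutably at the same time, so RefCell's
    // runtime check fires if `a` and `b` are the same turtle.
    let mut pa = a.borrow_mut();
    let mut pb = b.borrow_mut();
    let child = Rc::new(RefCell::new(Turtle { children: Vec::new() }));
    pa.children.push(Rc::clone(&child));
    pb.children.push(Rc::clone(&child));
    child
}

fn main() {
    let mut turtles: Vec<Rc<RefCell<Turtle>>> = vec![
        Rc::new(RefCell::new(Turtle { children: Vec::new() })),
        Rc::new(RefCell::new(Turtle { children: Vec::new() })),
    ];

    // Two distinct turtles breed fine: the child goes into the same Vec
    // while both parents retain a real reference to it.
    let child = breed(&turtles[0], &turtles[1]);
    turtles.push(child);
    assert_eq!(turtles[0].borrow().children.len(), 1);

    // Breeding a turtle with itself is the double mutable borrow:
    // the second borrow_mut() panics at runtime (BorrowMutError).
    let selfie = Rc::clone(&turtles[0]);
    let result = std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| {
        breed(&selfie, &selfie);
    }));
    assert!(result.is_err());
    println!("self-breeding panicked as expected");
}
```

The point being: safe Rust still catches the aliasing, just at runtime instead of compile time, whereas an unsound GC wrapper would hand the compiler two "independent" mutable paths to the same turtle.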
There are a lot of Rust GC approaches here; it makes me wonder why the study author didn't use one (or more) of the existing sound GCs. That would remove a glaring flaw of the study and save time spent writing throw-away code.
He (and other prominent kernel developers) has expressed his interest many times, but is taking a strict "wait and see" approach (as you would expect from the manager of any big project). Whether it happens or not, Rust has gotten closer to inclusion in Linux than any other language.
The coreutils rewrite is essentially a hobby project that nobody is going to switch to until it can boast 100% compatibility. So of course it's slow in coming; it's not a fair comparison. There's also the issue that writing a drop-in replacement is harder and less attractive than making something better (see ripgrep and other coreutils alternatives).
But as bawolff said, the main issue is moving towards safer tech. If Ada had good enough momentum, we'd be discussing Ada in the kernel instead. C++ has been rejected from Linux, leaving Rust as the only kernel-capable language with enough momentum, so Rust it is.
That has nothing to do with memory safety though, let alone Rust specifically. What we need is a smarter way to manage authentication (non-persistent session cookies and using a password manager to handle the login are a step in the right direction).
A lot of those guarantees are only available in the alloc-free subset of Ada, which makes them much less attractive. The devil is in the details, and Ada's guarantees are not always better than Rust's.
D and Ada both use a GC in many contexts, and are not half as interesting in their GC-less contexts. They both initially had only a proprietary compiler, which durably harmed adoption. They both seem to cater to fewer domains than C/C++/Rust. Today Ada is only used in high-stakes industry, and D doesn't seem to have any claim to fame. It's a pity that neither succeeded in its time, but there's no good reason to use either of them today.
Zig is very promising, but it's just too new.
You failed to mention C++. It's well loved (and hated), has many pros and cons compared to C, and is used for kernel development (just not Linux).
It's silly to think that the only explanation for Rust's success is community lobbying. Rust has many concrete advantages: it's safer than C/C++/D/Zig, fully GC-free and suitable for kernel and embedded development, has actually gained significant traction, has great tooling/docs/ecosystem... and is generally a language that people enjoy. Rust isn't the end-game of kernel programming, but it isn't just "yet another better C".
The only reason it wasn't widely adopted was the high cost of its compilers; only SGI and Sun cared to have compilers on UNIX back in the day, while Microsoft, IBM and Apple doubled down on C++ instead.
Sorry, that was a mischaracterisation on my part; annoyingly it's too late to edit my post.
Looking at the Ada docs again, it considers manual deallocation an unsafe operation and suggests avoiding it altogether (IIUC, by restricting yourself to the stack or by leaking to the heap). That seems like a huge restriction, making the "Ada is safer than Rust" claims rather academic.