Nvidia Security Team: “What if we just stopped using C?” (adacore.com)
257 points by q-big on Nov 7, 2022 | 362 comments



C is like asbestos. It was fine at what it did and performed well, but the safety problems outweigh those benefits. The difference is that we _stopped using asbestos_ because it was unsafe. It’s still around but being replaced during renovations, and no new installations use it.

For whatever reason with C there’s this huge emotional component to it. Safer alternatives exist. You’d rightfully laugh at a contractor who suggested asbestos is fine, if you make sure to use only highly-skilled installers who patch up the drywall so that no fibers can escape. But with C we say that all the time, and the CVEs keep piling up.


> For whatever reason with C there’s this huge emotional component to it.

Frankly getting your knickers in a knot is emotional - the people using C are doing so for pragmatic reasons, not emotion.

I do see comments like yours a lot - there's an emotional attachment and/or ego attached to the comment, so the argument against is very vociferous and, honestly, ugly[1].

[1] C and asbestos have almost nothing in common in terms of negative impact on the world. Maybe people are still using C because hyperbole is simply not a persuasive argument.


> I do see comments like yours a lot - there's an emotional attachment and/or ego attached to the comment

Could you explain how it's emotional? It comes across as a pretty logical analogy to me - the author isn't suggesting that they don't like C (which would be categorically emotional), they are comparing it to phasing out something unsafe in another industry.


I hate all analogies with the physical world, because the person tends to pick the most emotional one. It's emotional because they picked the thing that makes my lungs bleed. C, with all its flaws, doesn't damage my physical health.

You call it a logical analogy, but how can an analogy be logical?

EDIT: and I hate analogies because, since they suck, someone will try to come up with a better analogy. And now we have a thread where people compare C to asbestos, then they compare C to a fire in the kitchen. Not the kind of thread I would like to be in.


How can you hate all analogies with the physical world? How do you propose somebody learns something completely abstract without any physical counterpart whatsoever? How do you intend to teach people something as simple as numbers without using your fingers or some other quantity of items in the world?

Vast quantities of language are built on analogy. Streaming, the internet, the World Wide Web, Starship. The word “cycle” comes from the Latin for “circle”. The list goes on.

I struggle to even comprehend how someone can hold this opinion. Have you ever tried instructing anyone in anything?


> Vast quantities of language are built on analogy. Streaming, the internet, the World Wide Web, Starship.

These are metaphors, not analogies. [1]

The GP is cautioning against argument by analogy, not the use of metaphorical language. Conceptual traps engendered by argument from analogy have been raised for thousands of years, indeed probably ever since formal discussion of language and reasoning have been recorded.

[1] https://prowritingaid.com/analogy-vs-metaphor


Yes, but the metaphorical name has obviously come about through a simile or analogy. Like a ship but in space going to stars. We can understand a cycle through looking at how we draw a circle, etc. The words themselves are metaphors, but when you explain them to a child for the first time you do it through similes and analogies. They’re intrinsically linked.

And I’m firmly in the camp that it is fine to use analogy in debate. Rarely do people claim, when they make a comparison, that it is a one-to-one mapping. Just that there are possible insights to be gleaned from the comparison, which could potentially mean that we may be able to solve problems in one domain using solutions similar to those in the existing domain.

Take this quote for example:

> I hate all analogies with the physical world, because the person tends to pick the most emotional one. It's emotional because they picked the thing that makes my lungs bleed. C, with all its flaws, doesn't damage my physical health.

Obviously interpretation is subjective, but this seems a huge misinterpretation of what OP was saying. To me, OP is clearly arguing that C code is damaging to the long-term health of computer programs the same way asbestos exposure is damaging to long-term human health, and that we should consider replacing or retiring C code with modern alternatives like Rust the same way asbestos has been replaced with modern building materials. Saying that you don’t like the argument because bad C programs are nowhere near as damaging to physical health as asbestos seems like a bad-faith interpretation of what OP was trying to say. I also think the criticiser may feel differently if he were hooked up to a life support machine that was known to have buggy C code, but that’s by the by.

Hacker News seems to have gone really hellbent on this pedantic ‘this is fallacy x’, ‘this is fallacy y’ stuff recently, as if every comment is some kind of formal scientific paper. It’s really weird and I’m sure it never used to be like this. It sucks the joy and fun out of everything; I don’t know why people can’t just express interesting viewpoints here anymore, like a normal conversation, without the fallacy police showing up.


I'd prefer people saying "C has a long history of damaging security" and leaving it at that, no asbestos involved. It's a personal preference of mine. Fewer chances to misinterpret something.

At least it's not a "comparing Linux kernel to a car" situation I've also seen.


I’ve never compared the Linux kernel to a car but I wouldn’t necessarily throw it out as a useful analogy for two reasons:

- 1: sometimes we have to learn simplified but faulty models in order to understand the more complex and more accurate models

- 2: Everyone sees the world in different ways. It is often through combining these different viewpoints (or analogies) that we arrive at something closer to the objective truth. This is eloquently illustrated in the [Blind men and an elephant parable](https://en.m.wikipedia.org/wiki/Blind_men_and_an_elephant). The moral of the parable isn’t that we should throw away all analogy because it can be misinterpreted. The moral is, as Wikipedia puts it:

> The parable has been used to illustrate a range of truths and fallacies; broadly, the parable implies that one's subjective experience can be true, but that such experience is inherently limited by its failure to account for other truths or a totality of truth. At various times the parable has provided insight into the relativism, opaqueness or inexpressible nature of truth, the behaviour of experts in fields of contradicting theories, the need for deeper understanding, and respect for different perspectives on the same object of observation.


Analogies might not help you understand a perspective on a topic but they do help many, many people.


[flagged]


Sure, maybe the analogy isn't logical.

> It's also logical to kill young children with autism so they don't become a financial or emotional burden on their family or society as a whole. It turns out claiming something is logical isn't, itself, enough of a reason to do something and those doing so should endeavor to give more convincing arguments.

That's an analogy, and we are talking about a programming language - not humans.

Again, where is the passion in the original comment? I have asked for nothing more than that.


> That's an analogy, and we are talking about a programming language - not humans.

Is it your supposition that asbestos was banned for non-human reasons?

Or does this "logic" only work when convenient for your own defenses?


A truth stated passionately doesn’t become false. A falsehood stated calmly doesn’t become true. This is at the heart of why appealing to emotion is a logical fallacy.

Everyone would be better served by using and hearing it less. Focus on the actual argument.


I am neurodivergent, so I may be missing something here. The argument seems pretty passionless. Excluding the assumption that C is unsafe, it contains purely facts.

Your reply doesn't point out the passionate component of it either, and doesn't answer my question.


> I am neurodivergent, so I may be missing something here.

Yes, you're missing the logic.

> Excluding the assumption that C is unsafe, it contains purely facts.

You think C is carcinogenic and mere exposure to it kills?

From a "purely facts" PoV, leaving all emotion aside, C runs, and has done so for a number of decades, life-critical systems, including munitions, weapons, life-support, industrial machinery and aviation.

It's only because you are married to your argument that you think comparing C to a fatal poison is "pure facts".


I interpreted the original comment as saying that C is detrimental to the long term health of a piece of software in a similar way to how asbestos is detrimental to the long term health of humans.

I don’t think there was any suggestion made that C is detrimental to human health.


Analogy is not an argument.


Sure it is, depending on the analogy and the purpose of the analogy. An analogy also isn't always fallacious; to be fallacious, it needs a faulty assumption extrapolated from the comparison.

In this particular case, the argument is that C should be replaced with something safer for similar reasons that asbestos is replaced with something safer. However, one can certainly challenge the validity of this comparison and whether there's something faulty in it.

I don't feel one way or the other because I don't work with C or other low-level languages, but it's a bit absurd to say that an analogy isn't an argument. It absolutely can be if the argument within is clear, even if fallacious.


Thanks for clarifying. I meant valid, non-fallacious argument.

More often than not I see analogies (ab)used as a tool to construct a straw man. This particular case demonstrates how the analogy takes the discussion away from the merits and downsides of C to the emotionally charged subject of carcinogenic substances, as other commenters highlighted in this thread.


Personally, I don't think it's the commenter's fault that some others have such strong emotions to the analogy that they're distracted from the point. There's no indication that the commenter was appealing to emotion and, as others have also pointed out, making a comparison to an "emotionally charged" subject does not nullify the validity of the comparison. In fact, it's fallacious to falsely invalidate the argument based on some cherry-picked, subjective, peripheral perspective like that. In my opinion, the appeal to emotion and the derailment of the discussion is from those doing so, not the original commenter.

It seemed like just a simple, general, easily-mappable analogy that most people would get, even non-programmer laymen, that emphasizes that unsafe should be replaced with safe. Any analogy of something unsafe can be seen as "emotionally charged" if someone was personally impacted by it.


> Analogy is not an argument.

Just for sake of clarity, formal analogy is a valid, non-fallacious argument because it's based on parity of reasoning.

The original comment made a pretty good, arguably formal analogy between asbestos and C. The only way to dismiss the conclusion is if there were no alternative to asbestos, then you're kinda forced to use asbestos. This is the case for C in some contexts, like some microcontrollers that only come with C compilers. It's not generally true that we need C for much else though.


There is seldom a microcontroller that doesn't also have a Basic or Pascal compiler from https://www.mikroe.com, but naturally that is an additional cost that many would rather not pay.


> Analogy is not an argument.

"An analogical argument is an explicit representation of a form of analogical reasoning that cites accepted similarities between two systems to support the conclusion that some further similarity exists." [1]

[1] https://plato.stanford.edu/entries/reasoning-analogy/


That still doesn't point out the emotion in the comment. Your reply here would work as a reply to the root comment, but doesn't address what I am asking at all. Where is the passion?


Asbestos is an emotionally charged topic for many people, especially the many who have (primarily) older relatives who have died or are dying from cancers caused by it. The analogy is resting primarily on the appeal to that kind of emotion without actually establishing a strong logical similarity between C and asbestos.


> Asbestos is an emotionally charged topic for many people, especially the many who have (primarily) older relatives who have died or are dying from cancers caused by it.

I’ve not personally been impacted by this, but this seems reasonable.

> The analogy is resting primarily on the appeal to that kind of emotion

This seems like projection. It struck me as a perfectly legitimate analogy without any emotional correlation.


I don't know, C isn't out there causing cancer and killing people left and right. It's hyperbolic to draw that comparison. Critical systems (especially post Therac-25) that are using C are already using a subset of C, static analysis tools, and comprehensive testing that raise the quality level of actual life-and-health-impacting C programs.

EDIT: Because someone will probably make a note of it: Therac-25 wasn't written in C, but the flaws in that system and others helped lead to a massive change in the way critical systems are developed and treated. I referenced it because it's one of the most notable failures in safety-critical software systems.


It’s fine to argue that the analogy is flawed (all of them are) or that it’s irrelevant, or simply wrong.

What this entire thread is about, however, is this highly charged response to what seemed like a straightforward analogy.

> Frankly getting your knickers in a knot is emotional - the people using C are doing so for pragmatic reasons, not emotion.

> I do see comments like yours a lot - there's an emotional attachment and/or ego attached to the comment, so the argument against is very vociferous and, honestly, ugly[1].


At the end of the day it seems as though an emotional argument occurred in the eye of the beholder. This only adds even more credibility to the root comment.

A failed experiment with the Socratic method on my part.


Where does this argument end? Are we banned from using analogies containing animals because someone out there has a relative who died from a bear/shark/bee sting? Are we banned from talking about cows in a debate with vegans because someone lost a loved one when they were out hiking and got caught up in a cattle stampede? Are we banned from mentioning cars because everyone knows someone who had a traumatic car injury? By this logic, you either end up mollycoddling the entire world into silence because everything is a possible traumatic trigger to somebody, or you declare yourself the king of acceptable debate, where only the topics you approve of are acceptable because you are the chosen one anointed by the heavens.


What a bizarre comment.


It’s a classic reductio ad absurdum which is bizarre by nature. It shows that silencing people for discussing things you personally find upsetting is a slippery slope that leads to absurd or undesirable endpoints.

https://en.m.wikipedia.org/wiki/Reductio_ad_absurdum

https://en.m.wikipedia.org/wiki/Slippery_slope


It's also worth pointing out that there is nothing that can be done to fix asbestos, but C can be made safer. The things that block that are

1 A culture that thinks APIs always need to have a gotcha. See the inability to field a safe string copy (a sketch follows this list).

2 An ISO committee that thinks it's in charge of maintaining a historical artifact. Suggestions to change the language, like Bright's suggestions about passing arrays, fail because the ISO committee thinks that's heresy.

3 People who think C is hopeless. Usually these are people who had a bad time with it in school.

4 Idiots who think the only reason to use C is speed. Therefore any improved way of doing things that might be 10-20% slower than raw unsafe code is pointless.
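A minimal sketch of the safe string copy issue from point 1, assuming a hypothetical helper (safe_copy is not a standard API, just an illustration of what a bounded, always-terminated copy could look like):

  #include <string.h>

  /* Classic footgun: strcpy() happily writes past the end of a small buffer. */
  void unsafe_copy(const char *input) {
      char buf[8];
      strcpy(buf, input);            /* undefined behavior if input needs >= 8 bytes */
  }

  /* A bounded copy that always NUL-terminates and reports truncation. */
  int safe_copy(char *dst, size_t dstsize, const char *src) {
      if (dstsize == 0)
          return -1;
      size_t n = strlen(src);
      if (n >= dstsize) {            /* source does not fit: truncate */
          memcpy(dst, src, dstsize - 1);
          dst[dstsize - 1] = '\0';
          return -1;
      }
      memcpy(dst, src, n + 1);       /* n + 1 includes the terminating NUL */
      return 0;
  }

Whether the return-code-on-truncation convention is the right one is debatable (strlcpy, snprintf and Annex K's strcpy_s each chose differently), which is arguably the cultural problem point 1 describes.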


That makes sense, thank you.


Fine, but that leaves us with a passionless non-argument. How is it helpful?


Unsafe C is a fact, there’s no need for an argument in their comment.


You don't seem to address the "pragmatic reasons" expressly, so I'll go ahead and do it:

1. "Don't rewrite code" a fundamental precept of software that has been "engineered"

    - This is more about development lifecycles than something that contributes usefully to software quality in the long term. Corporations and people haven't been treating these things like bridges or overpasses
2. Outsourcing is believed cheaper than the solution.

    - Again, paying low-wage junior SEs to patch uncovered bugs is obviously cheaper than proactively finding them all.
3. You can sell the New Coke too!

    - If you have a good rewrite, sell it at a fat markup, and keep selling the old crap too, it's not your risk.

    - Obviously don't rewrite unless your horrid monolithic source leaks, and you're certain that it's a problem, those disgruntled workers you wrongfully laid off know too much.
4. If your tech references are a security risk, sell those too!

    - Don't include pesky manuals where people can uncover issues with your development processes, outsource a minimum blackbox of guess work to Singapore and let that double as QA.

    - It's not a manual, it's 'training' -- supporting education of hopeful workers! Wow!
I won't name any names.


> Frankly getting your knickers in a knot is emotional - the people using C are doing so for pragmatic reasons, not emotion.

I think a lot of us using C would like to move away, but for whatever reason can’t. One example is that we hotpatch our binaries, but that sort of infrastructure doesn’t exist for Rust.

Beyond tooling issues, which certainly can be fixed, it’s also non-trivial to move a large codebase to a safer language. Maybe incremental use of languages like Rust is still better?


Believing that all software security problems would be solved if we got rid of C is a bigger disease than C itself. I work in a fairly large (for my country) financial institution and the problems we deal with on a day to day basis are so far removed from buffer overflows that it's not even funny.

We still have software running in Excel macros that control web browsers via COM. C is nothing compared to the attack surface of that.


OP is not claiming that it would reduce _all_ software security problems though, so this is a straw man (intentional or not).

Even the analogy holds up in this regard: using other materials instead of asbestos does not remove all the construction problems.

I think C has other things going for it (simplicity of the compiler), just wanted to point that out.


> Believing that all software security problems would be solved if we got rid of C is a bigger disease than C itself.

According to studies done at Google and Microsoft, about 70% of their serious security issues are a result of memory safety errors.

>Around 70 percent of all the vulnerabilities in Microsoft products addressed through a security update each year are memory safety issues; a Microsoft engineer revealed last week at a security conference.

https://www.zdnet.com/article/microsoft-70-percent-of-all-se...

and

>Nearly 70% of the high severity security bugs in Chrome's code are memory unsafety problems, Google's engineers have revealed.

https://tech.hindustantimes.com/tech/news/70-of-security-bug...


As has been pointed out repeatedly whenever this statistic is brought up in discussion, that figure is not indicative of much.

And in discussions like this, it's verging on the dishonest - leading with a large number makes readers think that a large difference in bug count will result.

> Around 70 percent of all the vulnerabilities in Microsoft products addressed through a security update each year are memory safety issues; a Microsoft engineer revealed last week at a security conference.

What's the total bug count?

A codebase with 10000 bugs logged against it might have maybe 10[1] that are CVEs, of which, according to that statistic, only 7 would be fixed by moving to a safer language (either GC or Rust or similar).

That's just not enough motivation for switching when the payoff for preventing "70% of CVEs" is so small it may never even happen.

[1] I've worked on C codebases that were developed over a decade, and 1000:1 of all-bugs:memory-safety-bugs is a very conservative ratio. IME, it's been closer to 10000:1.


> that figure is not indicative of much

Microsoft does not appear to agree.

>Mark Russinovich, the chief technology officer of Microsoft Azure, says developers should avoid using C or C++ programming languages in new projects and instead use Rust because of security and reliability concerns.

https://www.zdnet.com/article/programming-languages-its-time...


> Believing that all software security problems would be solved if we got rid of C is a bigger disease than C itself.

Getting rid of a class of errors (correctness) allows you to focus more on other errors (logical). It could be argued that it could slightly reduce the likelihood of the latter.


Our industry went through this with the move to Java (and later .Net).

"It will solve memory leaks!"

...

"whoops, you can still leak memory in garbage collected languages, my bad everyone!".


Ditching C doesn't solve memory leaks, but it does solve use-after-free. A memory leak is a performance problem that isn't generally exploitable. A use-after-free is almost always a security vulnerability.
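A minimal C sketch of the distinction, purely illustrative (both functions assume nothing beyond the standard library):

  #include <stdlib.h>
  #include <string.h>

  /* A leak: the memory is wasted, but the program's behavior stays defined. */
  void leak(void) {
      char *p = malloc(64);
      (void)p;                     /* never freed: a resource/performance bug */
  }

  /* A use-after-free: the freed block may be reused for unrelated data,
     so touching the dangling pointer can read or corrupt that data. */
  void use_after_free(void) {
      char *session = malloc(64);
      if (session == NULL)
          return;
      strcpy(session, "token");
      free(session);
      session[0] = 'X';            /* undefined behavior: writes into whatever
                                      now occupies that heap slot */
  }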


You can use resources after releasing them in pretty much any language. That's maybe not as dangerous as reinterpreting arbitrary memory, but then you can also use object pools for everything in C, which is what some very high performance code I used to work on did.


quick, someone tell everyone locking up resources without releasing them isn't exploitable for bad actors.


A memory leak is better than privesc in most cases of active exploitation.


Both a paper cut and a gunshot wound bleed. It doesn't make them equal.


I had to move part of the code to C for a different reason (imaging device) and used that code in a .NET application. But allocating unmanaged memory on .NET's heap led to memory fragmentation, and while overall there was enough memory available, you ran into allocation errors since unmanaged memory needed to be allocated in one contiguous block. Usually doesn't happen, but if you have large objects like high-resolution images in memory, you can quickly run into problems. And to my knowledge there was no way to check if there was enough contiguous memory available.

Had to wait on a patch for .NET... not a problem anymore today, although I don't know the details of how they solved the problem.


I don't remember a single memory leak in my half-decade .NET career,

which speaks to how rare they are


.net 1.0 was 20 years ago, you've been working in it for 5 years, roughly 25% of its lifetime.


We're evaluating techs by their state decade(s) ago here or what?


Judgment is not the only reason for conversation; I strongly suggest you read back over the conversation from the beginning and endeavor to understand the context in which responses are made.

I also suggest you say 5 years instead of "half-a-decade", most of us see through that and it's off-putting.


I have worked with .Net, in a hobby capacity, since beta. I have worked with it professionally since 2007.

I have seen two instances of rooted memory:

- WPF. The implicit GC root path introduced by events. We were easily able to track this down due to the information that the GC holds in memory[1] (fixing it was not so easy).

- A leak in managed C++ (non-pure, i.e. native code). We had to use the normal tools to track this down, not a 5 minute diagnosis.

At the end of the day GC'd languages can have (and usually do) richer tooling surrounding memory usage. They are better in this regard even during a failure state.

[1]: https://learn.microsoft.com/en-us/dotnet/core/diagnostics/so...


Let's loop back to the original point.

Getting rid of C won't get rid of security problems any more than moving to Java and .NET got rid of needing to think about memory leaks.

Those that think otherwise are ignorant of history, or choosing to ignore it.

And it gets worse when you consider resource exhaustion, not just memory. There are entire mechanisms in .NET and Java whose sole purpose in life is to try to deal with resource exhaustion, because the GC only deals with one aspect of it (memory leaks).

---

For those of us who watched that transition, it's obvious it won't work out the way people are claiming.


In C you leak memory because of all the land mines in the language. In Java/.NET you leak memory because you have a flaw in your logic.


Circular references and other things that tend to leak memory in GC languages aren't a logic flaw in the sense of your business logic being wrong or your algorithms being bad, though. They're things the language allows you to do, that can be completely correct at a business level and visually correct at a code level, but result in fundamental breakage of the VM's operation.

I say this as someone who does high-level languages as a day job, and doesn't find them to be a problem personally, but, yeah, Java as a concept still allows some unintuitive footguns like that.

Difference in magnitude of impact, difference in the response of the standards organizations in handling it, but yeah, it's certainly the same kind of footgun as C lets you have. The difference being that Java has devoted an enormous amount of resources to squashing these errors and they largely aren't a problem anymore, whereas C is afraid to touch the "undefined behavior" and "implementation dependent" sacred cows.

Is there an unspoken social contract between language committee and language users that aliasing a variable or causing a circular reference should not shoot your dog and burn down the house? That's the fundamental disagreement between Java and C's standards bodies.


A circular reference does not leak memory if you're using a proper GC.


Memory errors are always logic errors in C. Freeing something that was already freed or accessing a buffer out of bounds and so on are pure logic errors.


Plenty of logic errors in the last 50 years, CVE appreciates it.


But that person was trying to say memory errors are some kind of special "magical" error, which is simply wrong. Memory errors are merely logical errors. There is nothing "special" or "mysterious" about them.


Given the common "nasal demons" meme for UB, there are certainly some magical creatures associated with them.


From my comment, emphasis added:

> it could slightly reduce


> Getting rid of a class of errors (correctness) allows you to focus more on other errors (logical).

I think you mean "memory safety", not "correctness".

Correctness is a much harder target to hit and requires a good deal of time.


Did you see the claim "70% of all security bugs are related to memory safety"? Which isn't to say 70% of successful exploitation of said bugs resulted in genuine harm, but low-level vulnerabilities in widely used system-level software (network stacks etc.) can potentially have devastating effects in the way an excel macro is unlikely to.


What I'm trying to get at here is that security is not just "bugs". Nowhere in the Excel macro hell that is my job are there any bugs. Using a macro to transfer money in and out of people's accounts in bulk is bad design, it's insecure, and I have at times argued that it's unethical, but it is also not a bug; it's exactly how the system was designed to work. Edge was designed to embed an Internet Explorer OLE frame, and that frame was designed to have an IE7 compatibility mode. No part of that would count in the "70% of bugs" list.

Moreover, almost no bugs are discovered in the actual reasonable software we have. That's not because we are the best engineers, or because we write it with the utmost care and attention to detail. It's because we're a total of 5 people (mostly a couple of years out of uni) looking at it. C, and other systems programming languages are greatly overrepresented in the corpus of software that is actually being combed through for bugs. Java and Cobol are heavily underrepresented.

I want to make this clear again. I work in banking. If 2008 showed us anything it's that banking underpins most of our modern society so these quality issues are not unimportant, they can be devastating.


Nobody believes that all software security problems will be solved if we get rid of C. Nobody. Absolutely nobody. Really.

What people do believe is it would remove whole classes of security problems that affect everybody. Even if you have software running in excel macros. Maybe in your environment that is a minority of your problems. That is probably not a good thing, and we feel for you, and respect preferring to tackle that problem first. But it would be a good idea to not entrench C further while you are doing that. Once you shrink your attack surface, it would be great if the code being invoked doesn't have exploitable memory overflows to make all your hard work pointless.


Ok, but why are the Excel macros so risky? It’s often because the underlying software or OS was written in C/C++.


What? No it isn't. It's because the users think they're opening a document to look at some data and suddenly it's an executable transferring money via their bank account.

And that's all working as designed. Someone built it like that. There's no bug anywhere in that chain, just an extremely rickety system with no affordances for human intervention. No separation of what is trusted and what is not.


Excel macros are risky because they offer a kind of privilege escalation. Most Excel built-in functions (that is, what you write into the cells) have limited reach. They impact that cell and (by reference) other cells, but not the underlying filesystem and other things. Even looking at the database ones, I'm not seeing a single function that writes to the DB but plenty that read from it, that's "safe" in that it would be very hard to abuse these kinds of things to screw with other systems or the host system. But macros can run arbitrary code and access whatever the user can access in a way most Excel functions cannot and can trigger arbitrary effects. The equivalent of `rm -r ~/` is not possible in straight Excel, but is feasible with a macro. C and C++ have nothing to do with that, it's the underlying privilege model that permits it.

There's a good case to be made for using better languages than C and C++ for large classes of software. Don't weaken it by making absurd claims.


> There's a good case to be made for using better languages than C and C++ for large classes of software. Don't weaken it by making absurd claims.

And this idea can be applied to so many things.

Seriously, very often things have enough downsides that you don't need to exaggerate things that are not downsides. You're actually weakening your side by doing so because if people cannot trust even that bit of honesty, why would they trust anything you say?

Being trustworthy is the first rule of convincing others.


Broken security model/sandboxing aside (which is inexcusable), the use of C/C++ expands the attack surface - there are some Excel exploits that take advantage of C vulns like buffer overflows: https://www.exploit-db.com/exploits/18087


You can be socially engineered into opening a malicious Excel file.


This is not a very useful analogy. It suggests that we should regulate C out of existence. But regulation was appropriate and effective for a particular reason with asbestos.

We stopped using asbestos because of the link between its manufacture/installation and lung cancer. It’s a clear case where government regulation helps deal with a market failure: asbestos is cheap and effective, so consumers like it, but nobody involved in asbestos transactions bears the cost of the lung cancers the industry causes.

Writing C doesn’t make underpaid programmers die early deaths. The issues with C are borne directly by the customers who are using C software. It’s not really a market failure of any kind.


> The issues with C are borne directly by the customers who are using C software. It’s not really a market failure of any kind.

While I agree that equating a programming language to a carcinogenic substance is inherently wrong, your argument is partially false too.

The cost is borne by the customers of the software, not the developers. In that sense, it's actually directly opposed to your point.


Isn’t that what I said? I don’t see any difference.


If you did, then I don't understand the point you're making with the given example.

with the language you've got

  * the creators of C
  * the developers writing software (tool/library/whatever) with C
  * the users which use this software 
If the software has a security issue, the cost will have to be borne by the users, not the developers.

That makes it (imo) directly opposed to your argument, as it's technically the same as with the carcinogenic substance

  * the creators of the substance dont get cancer
  * the contractors using it to build the house dont get cancer
  * the people that live in the house get cancer
(but the people that use software written in C don't ... die, which is why I agree that it's wrong to equate them)


I see where your confusion stems from. In the case of asbestos, it is the contractors using it to build the house who tend to experience adverse health effects, and not always the people who live there. Asbestos is harmless when it's inert in the wall, paint, etc. The danger is when you start doing construction / renovations and the dust is free floating.

So OP's point is correct. The two situations are not exactly analogous.


Well the creators of the substance (manufacturers who used asbestos in their products) and the contractors that installed the products did in fact also get cancer.

You're both a little bit wrong, and both want to be right, so seem to be ignoring the negative aspects of both your arguments.

C does not have a direct proven link to causing death or catastrophic disease like asbestos does, but the cost of any bugs due to the language "allowing" the dev to shoot themselves in the foot is largely borne by the users - at least initially.


As others said - asbestos is really bad for asbestos miners, and pretty bad for installers. But it’s pretty much totally inert when installed - it doesn’t affect homeowners.


Asbestos is killing mostly the contractors instead of the users, which breaks the analogy here. (That being said, any argument using the “market failure” concept is dubious from the beginning.)


> equating a programming language to a carcinogenic substance is inherently wrong,

I work with C and it kills me a little more each day, just like cigarettes. It's a pretty fair comparison.


I don’t think the federal government should step in and regulate it, but why not some kind of professional or trade organization, like UL?

And really, I think your argument bolsters my case. Federal and state governments are freaking out about cybersecurity and protecting infrastructure and voting systems, so why shouldn’t they ban the use of software written in C in their acquisition process?


Such regulations already exist, see for instance https://en.wikipedia.org/wiki/MISRA_C. Similar rules could be created for Rust (I would assume that this would have to be some sort of 'sane subset' of Rust, which for instance would forbid dynamic memory allocation altogether).


Or you could use https://en.m.wikipedia.org/wiki/SPARK_(programming_language) which is already a formal defined language and literally the thing discussed in the link right at the top.


C in software stacks is more like flame in kitchens. Fire is dangerous and bad for your health. Kitchen fires kill people, gas leaks kill people. It's possible to do without. But billions of people around the world cook with flame and do OK. All our ancestors did, and here we are. There will always be professional kitchens that use gas/flame (almost all of them). And it'll always be possible for your food to catch on fire anyway, if you cook it wrong.

Unlike asbestos insulation in houses, or flame in your personal kitchen, you actually can't avoid C in your software stack. Do you use Linux, FreeBSD, or OpenBSD, or macOS, or Windows? Do you use Postgres or MySql or sqlite, or maybe redis or memcached? How about a language like Python or JS or Java or Ruby or Lua? Do you use rust? guess what the business-end of that compiler is written in! so you dabble at the tippy-top of the stack in some language like rust or go, so what?


> Unlike asbestos insulation in houses, you actually can't avoid C in your software stack

You know what, the construction industry could have made the same argument - oh look, all insulation suppliers use asbestos, the factory is built out of it too, it's everywhere in the supply chain, you can't actually avoid asbestos if you are trying to build a house, so we should learn to live with it.

You can't avoid fossil fuels, you have to use fossil fuels to produce a solar panel, so dreaming about clean energy is pointless.

The difference is that some industries pull the finger out of their bum and clean up their act, and some industries, like ours, do not. Even though perfectly serviceable alternatives exist.


That's a great analogy. I like C, and when people resist it I worry about their programming skills. That may not be fair, but sometimes the complexity of C / C++ is a version of the complexity you will see in a big application. Sometimes the complexity is needless, confusing, dangerous -- insert what you hate here.

But really, Rust seems to have the qualities I like about C and fewer problems. Also, all the reasons I used to have for wanting to be "close to the metal" are getting abstracted away in hardware. I think the only remaining reason to use C is the huge existing code base.


A couple of points from somebody who enthusiastically writes new C code every day:

> ...the complexity of C / C++

In this case, please don't throw C and C++ into the same bucket, C++ is vastly more complex than C.

> Rust seems to have the qualities I like about C

...a subset of Rust has those qualities, but the whole of Rust has the same problems as C++: excessive complexity both in the language and the stdlib.

To lure C users over to a memory-safe language, we need a programming language that just adds the comptime memory-safety features to a 'small language' like C, but none of the higher level features (which only serve to split the language community into different tribes - which is also the biggest problem of C++).

Finally: yes, C is not memory safe, but when 'language-level memory safety' is essential for security, then there's something fundamentally wrong with the sandbox the compiled C code runs in.


I actually find that Go has the qualities of C that I like. Namely, simpler language constructs and no classes.


“...when 'language-level memory safety' is essential for security, then there's something fundamentally wrong with the sandbox the compiled C code runs in.”

Doesn’t the problem then just become what do you write the sandbox in? If you have security-critical code that can’t be sandboxed, because it is the sandbox, then it seems that language-level safety is your only remaining defense?


Yes, it definitely makes a lot of sense to write the sandbox in a memory safe language like Rust. Not even the most die hard C fan would argue against that ;)


Even if a C program is sandboxed, lots of programs handle sensitive data that memory errors can leak. For a simple example, even in a sandbox, a memory error could pretty easily lead to being able to grab another user's password out of memory.
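A minimal, hypothetical C sketch of that kind of leak (a Heartbleed-style over-read); the struct layout and the handler are illustrative only, not taken from any real program:

  #include <stdio.h>

  /* Sensitive data sitting right next to attacker-influenced data. */
  struct session {
      char name[16];
      char password[16];
  };

  /* Echoes back 'len' bytes of the user's name. The missing check is
     'len <= strlen(s->name)': a larger len spills into the adjacent
     password field and sends it straight back to the caller. */
  void echo_name(const struct session *s, size_t len) {
      fwrite(s->name, 1, len, stdout);   /* over-read if len > sizeof s->name */
  }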


Although the article is about Ada, which is technically more memory safe than Rust.


I'd like to see the Ada guys, and especially the SPARK guys, talk about Rust the way Rust people talk about C. That said, I'm just gonna add ACL2 to my Common Lisp toolbox and then I too will have the power of theorem proving.


I am not yet convinced that Rust is a sufficient answer to C. Its constraints do solve some issues, but I am not really convinced. I also personally dislike the self-identification of Rust users. Rustafarian? Seriously?

You probably get accustomed to it, but I am not sure I want to yet. I'm not deep enough into compilers to know why variables can't be implicitly mutable; probably a compiler necessity to enact its borrowing rules. But ultimately the compiler should make my life easier. Only ever using one thread or process does solve several common locking issues.

In that regard, C's string handling will probably drive more people away from it than its security flaws will.

It will probably end up like always. There are some applications where some languages shine. And the simplicity of C has merits, otherwise another language would have taken its place. Of course C still has a lot of momentum and perhaps another language would do better in the specific use case.


Rust users are called Rustaceans.


Heh, right. I believe I'll stick with my point anyway.


> For whatever reason with C there’s this huge emotional component to it. Safer alternatives exist.

For a lot of things, I just don't think this is true.

For example, here is sqlite's reasoning for choosing C: https://www.sqlite.org/whyc.html

I can't say I can really disagree with it.


That article wins the argument for me, and I'm not a fan of C particularly. Admittedly it doesn't address security concerns, but the fact they're able to achieve 100% machine code level branch coverage (I wonder if that includes the possibility of executing injected code...) should go a long way towards generating confidence that there are no readily exploitable vulnerabilities.


There are many C developers out there who will insist they have gained enough experience to never write code vulnerable to various memory safety exploits. Real "I actually drive safer slightly buzzed" vibes.


> Real "I actually drive safer slightly buzzed" vibes.

“I don’t need seatbelts, I actually know how to drive unlike others”


Phase out C on the same schedule the internal combustion engine is going out of circulation, as is being required.


It's not an emotional component, it's just plain old network effects.

Unfortunately, a lot more switching cost is involved in getting embedded and systems programmers and platforms over to a new programming language, than is involved in getting construction contractors and industrial engineers to choose a different material.


Lots of them just have their own toolchain and language subset, which makes porting literally impossible, because you don't know how the original one works.

And C-like languages on these embedded devices are probably due to the fact that they're the easiest way to implement a programming language (abstraction) that works on bare metal.


I despise this security-first attitude. It just leads to so much mental fatigue everywhere and any step is a potential minefield. It's ruining programming.


> any step is a potential minefield

That's the exact problem with C. It's the tool, not the attitude, which is a problem.


Securely containing untrusted user code so that it can't do any harm is ultimately the responsibility of the OS, or whatever sandbox the code is running in.

Language-level memory safety is useful for eliminating an important class of bugs, but if those bugs can be exploited to escape the sandbox, then that's the problem of the sandbox, not of the buggy program running in that sandbox. So please, by all means, write the sandbox itself in a memory safe language like Rust, but requiring the same for all code running inside the sandbox is nothing more than an admission of failure.


That's a pretty limited way to look at it. You don't need to escape the sandbox to do harm.

You can have your app in a perfect sandbox where nothing can escape, then have a memory safety bug inside allowing one user to get to another user's stuff (because your sandboxed app talking to a sandboxed database still has one set of credentials to do everything).

There is of course https://xkcd.com/1200/ for that.

And no "move that separation to DB" also isn't a solution.


I agree, same with popular multiplayer games, for instance, which could be exploited via memory safety bugs in the network protocol implementation. But these are cases where just implementing a couple of critical modules with a memory-safe language, or using code generation (or both), to reduce the amount of 'manual bugs' goes a long way. AFAIK exploits almost always happen in places where data comes into the system, so those are the places to focus on.


What if the sandbox, kernel, device drivers are written in C? Sounds like a good case against C for these uses.


Yes, I fully agree! But such subtleties are usually overlooked by the 'C users are criminals' crowd ;)


Imagine any other engineering field complaining about safety being a mental drain.


I think a better analogy may be nuclear power, than asbestos, because of the labor side of the tech.

Using nuclear power correctly takes a ton of engineering prowess, lots of additional construction. If done well, it works great. When it goes bad, it goes really bad.

So if it's 1982, maybe nuclear power or C is a good option. If it's 2022, new technology may provide a cheaper way of getting to the same end point, with less engineering and construction, and probably even less cost.


This is a worse analogy because good nuclear plants exist. With C, I've yet to see a widely used library written by multiple humans in C that doesn't have critical memory bugs.


Linux comes pretty close for practical usage. Or sqlite.


https://www.cvedetails.com/product/47/Linux-Linux-Kernel.htm... and https://www.cvedetails.com/vulnerability-list.php?vendor_id=.... Linux has new security vulnerabilities found every year due to memory issues caused by C.


Those C code bases all get used and many have lives depending on them. Generally the really bad disasters are almost always narrowly avoided before they come to public attention.

Why would your experience there make you think the good nuclear plants are any different?


Nuclear power yields amazing results compared to alternatives, this isn’t the case with C.


Forgive my ignorance, I'm a hobby embedded programmer at best, but aren't many programming languages built right on top of C? Is the whole lineage unsafe, or is it really just C?


When people say "C is unsafe", what they mean is "it is too easy to create unintended vulnerabilities when writing C". The language won't catch your mistakes, and some of those mistakes are dangerous. "Safety" in this context means "language, compiler, or runtime features to prevent, catch, or mitigate mistakes made by human programmers".

So this doesn't actually affect things built on top of C. E.g. transpiling Rust to C won't make it any less safe because C is "unsafe" - Rust has its own safety features, that work regardless of compile target.

Or think of it this way - assembly language is even more unsafe than C, yet everything (including C) is built on top of that.


Transpiling Rust to C is still dangerous if you don't control the compiler on the other side - one of the many footguns with C is thinking that there is any "inherently safe" C code. There isn't, because tomorrow the optimizing compiler may optimize away your null checks or start optimizing away your variables due to aliasing, and this danger is ever-present because of the sweeping nature of "undefined behavior" in C.

C is safe if you are using a 1970s-style compiler that doesn't perform optimizations, or performs a limited and predictable subset. C that is passed through an optimizing compiler (which in the context of this comment is all modern compilers) can never be inherently safe unless you conform strictly to the "undefined behavior" spec, which is impossible in practice.

In contrast transpiling (really this is just compiling) Rust to assembly should never be unsafe, because you know there is no additional transformation step. Or at least, unless the processor has an implementation flaw... which is, of course, the whole thing about Spectre/Meltdown.

C has far, far too many things in the "undefined behavior" category that really should just be compiler errors. But like "implementation-dependent" behavior in general, this was seen as an advantage at the time: it makes it possible for low-power or esoteric hardware to be driven with C code. But in modern high-level software development contexts, this makes it a foot-howitzer to try and use.
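A minimal sketch of the null-check pattern being described; whether a given compiler actually deletes the check depends on optimization settings, but the language rules permit it (the function is purely illustrative):

  #include <stddef.h>

  /* Dereferencing 'p' before the check lets an optimizing compiler assume
     p is non-null, so the later test can legally be removed. */
  int get_len(const int *p) {
      int len = *p;          /* undefined behavior if p == NULL */
      if (p == NULL)         /* ...so the compiler may drop this branch */
          return 0;
      return len;
  }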


Why don't you think it's possible to avoid undefined behavior in machine-emitted C?

Machine-emitted C can avoid most of the language, guarantee it won't incorrectly alias anything, and guarantee bounds are always checked.

Bugs are possible but you can make Rust->C bugs no more likely than Rust->assembly bugs.
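A hypothetical sketch of the kind of access a machine emitter could generate, assuming every indexing operation is routed through an explicit bounds check (real Rust-to-C or similar backends may differ):

  #include <stdlib.h>

  /* Out-of-range indices trap deterministically instead of invoking
     undefined behavior. */
  static void bounds_trap(void) { abort(); }

  int checked_index(const int *arr, size_t len, size_t i) {
      if (i >= len)
          bounds_trap();
      return arr[i];
  }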


Thank you for the correction. I agree with everything you wrote, and should have known better and added a caveat regarding the difficulty of correctly divining the behavior of a C program, instead of assuming compiler writers are flawless omniscient beings.


Anything where a human is writing C is probably unsafe. Machine generated C can also be unsafe, but I'm unaware of an epidemic of systematic issues resulting from it the way we get with human C.


Not really. Many programming languages compile themselves nowadays (it's called bootstrapping) so there's no direct C lineage in them.


Slight correction: a programming language compiling itself is called self-hosted. The process of getting to that point (i.e. first making something that can compile the language so you can compile the language with itself) is called bootstrapping.

(at least that is how I know it as)


Yes. If the compiler was written in C and there is a bug in the compiler, that language could be tainted.


We use all sorts of hazardous materials, which are safe if properly used, transported and stored.

C is not "unsafe" in the sense that little bits of it break off in installations and make people sick, but in the sense that the assurance of safety almost all rests on the programmer.

If code is safe, and not changed, it stays that way.


> Safer alternatives exist.

Sure, but do they have certified compilers for Functional Safety (ISO26262) development for Automotive or Medical applications? Rust does not, although there's a company working on it.


Yes, the compiler used for the work described in the post: https://www.adacore.com/industries/automotive/iso26262


Depression can be bad, too. Maybe it can lead to cancer, and it undisputedly causes deaths.

Other languages make me depressive and steal my focus. They make me go to great lengths trying to get the simplest things done. In C, it's like memset(p, 0, sizeof *p), done. Other languages make it such that I always code on 3 layers of abstraction to not have to do the "hard" task, but now I don't know the implementation of those layers. They make it such that I'm constantly torn to come up with an even more clever way to say the same things, with even more abstractions and moving parts. They pull so much attention to themselves, distracting from the task that I want to get done. With C, this is not the case.

Maybe with a different personality, I could be similarly or even more productive in Rust, ignoring all the kitsch in its ecosystem. But I've started reading Rust projects dozens of times and have always turned away. They always use one more abstraction than I can be arsed to look up, always hide the most interesting and motivating parts behind a crate dependency.

This is not the case with C, where it is easy to find good (enough) code that is honest and direct, and rewarding. I know that C is not perfect. We're having an open relationship, and I don't have any attachments (besides knowing it well) that prevent me from jumping ship when I find a better one.


To adjust your analogy a little, we actually still use asbestos, but it just isn't used for the general public. It still has unique properties that are industrially or commercially useful, such as certain asphaltic roofing compounds and friction products like brake pads.

But, it isn't used in products designed for the general public where strict oversight can't generally be provided.


This is the key point that's often missed: asbestos isn't used anymore in applications where the risks outweigh the benefits.

We use incredibly dangerous materials in photolithography, nuclear weapons production, armor piercing projectiles, and aerospace... because there are no alternatives and/or the benefits outweigh the risks.

What changed is that we learned more about asbestos (it causes cancer) and phased it out of applications where alternatives existed, even more expensive ones.

In the C analogy, there may be places C continues to make sense... but it shouldn't be used anymore out of default, because it's the way things have always been done, or any other reason that doesn't end with "... and that's more important than an increased risk of security vulnerabilities."


> Safer alternatives exist.

Do they? The only “safe” comparable alternative is unsafe Rust which as the name suggests isn’t actually safe.


You're right. Alternatives that tick all the boxes that C does, and are also safer, do not exist.

This guy is just doing the typical religious yelling about people doing their jobs (writing kernels, drivers, browsers, games, etc) with the most flexible tool for programming we've developed to date.

Maybe one day Rust will become dominant. Maybe someone will overhaul the aliasing rules of C, write some similar lifetime tracking tooling, and it will live on for another 50 years. Who knows. The reason people still use C and C++ isn't because they're good languages (or even that people like them, as the comment states), it's that until recently there have been very few viable alternatives, and for some domains literally none at all. They've largely been the only option for a long time.


The article is about Ada SPARK, and nVidia using it to develop their firmware today.


Yes, I wasn't referring to the article, I was referring to the person comparing C to asbestos.

It makes sense for nVidia to trade development time and prefer a language with formal verification for their firmware.


You typically don't have to write any unsafe Rust as an end user. The unsafe stuff is contained in very small and very rigorously analyzed standard library types. Which then provides you a safe interface to do all the stuff you normally do.


In practice unsafe Rust is necessary to write many classes of nontrivial code, and is harder to get right in many ways than C++ (possibly C) and I think Zig.


Examples? I have been writing production rust for about 6 months now and haven’t had to use unsafe yet. Looking at the standard library internals I see it a lot but everything I have had to do works in safe.


> The only “safe” comparable alternative is unsafe Rust which as the name suggests isn’t actually safe.

And for some applications, Rust isn't even an alternative. Not every chip is supported by LLVM.


There are still industrial uses of asbestos today, because for some applications, there isn't a ready drop-in replacement. Uses where safer drop-in replacements were possible (shingles, automotive friction materials, flooring, insulation) have gone away. That might be a good comparison against "portable assembly", also known as C.


I think the problem NVidia has is that it has a massive low-level high-performance computing infrastructure written in C. Things like WebGL are a thin veneer over it, which is terrifying.

Their use combines the proper use of C (low-level driver code) with the unsafe use (large-scale systems). They should absolutely stop using C. On the other hand, I think it's fine for my network driver.


I was talking with someone in charge of coordinating WebGL rollout with various IHV driver teams and they said while AMD and NVIDIA were definitely annoyed at the work it was going to take, Intel was the one where the team reacted with sheer terror.

Maybe that speaks to some of the driver problems they're having now, if their house just wasn't in order beforehand.


The cynic in me says that Intel is the only vendor of the three who has thought about security before.

I am willing to bet that NVidia and AMD's GPU units have floodgates of vulnerabilities along the lines of Spectre, which they simply never had to deal with before (not to mention likely hundreds of more basic ones).


> I am willing to bet that NVidia and AMD's GPU unit have floodgates of vulnerabilities along the lines of Spectre,

100%, I'm not anybody important but I've been saying this for years, as soon as I saw Spectre/Meltdown announced it's like "lol yeah I bet GPUs are even worse, everyone is just racing for performance and not timing-correctness so they probably have zero hardening against that". I hadn't thought of the connection between Meltdown and that person's comment but yeah it's entirely possible they saw the potential for that or other security shenanigans.

Multi-tenant or multi-privilege-level GPU is likely a shitshow, so in a way it's actually a blessing that heavy GPGPU compute never really took off. Because I bet you can totally do things like leak the desktop or your browser windows to malicious webGL or another tenant on a shared vGPU. We live in a world where clients are mostly running one GPGPU application at a time, plus the desktop, and that means there's nothing there to leak.

(Although of course, it's always a little useless to speculate about massively counterfactual scenarios and assume everything would have gone the same... if multi-tenant/multi-app GPGPU compute had taken off, more attention probably would have been paid to multi-user security/hardening.)

Of course the real game-over is if you can get the GPU to leak CPU memory (or get the GPU to get the driver stack to leak kernel memory/etc via the CPU). That's bad even without multi-tenant.

Intel in particular may also be more vulnerable to that sort of thing since they have uniquely tight ties between the iGPU and the CPU. They come from a world where dGPUs didn't exist and the iGPU was only ever a ringbus away from memory or the CPU... it's apparently been a huge problem with the Xe/Arc drivers (there have been a couple patches where they produced 100x speedups by fixing operations that allocate memory in the wrong place, etc) since the GPU is suddenly no longer super close. It would not surprise me at all if AMD would be more secure because they're not working with an iGPU that's super tightly tied to the CPU like that.

https://www.pcworld.com/article/819397/intels-graphics-drive...

That's super funny you bring that up, thanks for tickling that particular neuron. Great comment and again, just a rando who tech-watches for fun, but, I agree 100%.


Out of curiosity what are the modern uses for asbestos?



So it's not everything but as long as operating systems are written in C, it will be the language of choice for many projects.


I have no emotion towards C. It is an old language and certainly an imperfect one. The thing is you just need to come up with something better. Maybe someday that can be Rust or another language. But you need something to replace it.

This will not be a quick change. C is interoperable, and the industry is moving slowly.


Ouch, that's a hurtful analogy. I was about to start playing with Turbo C on my 486, ordered K&R book...


> Ouch, that's a hurtful analogy.

It's also wrong. A lot of the time the people making these types of comments about C are just plain bitter that their favourite $NEW_LANGUAGE is mostly a rounding error in terms of usage.


> A lot of the time the people making these types of comments about McDonald’s are just plain bitter that their favourite $3_STAR_MICHELIN_RESTAURANT is mostly a rounding error in terms of visitors.


> A lot of the time the people making these types of comments about McDonald’s are just plain bitter that their favourite $3_STAR_MICHELIN_RESTAURANT is mostly a rounding error in terms of visitors.

That's true too; what's your point?


The supposed bitterness doesn't actually make their criticism of McDonald's wrong.


I think it is a little bit melodramatic as an analogy. It's perfectly fine to tinker with C, to use it in personal projects, to use it where it's not possible to use something safer (some embedded system that Rust doesn't have a backend for) or if you want to contribute to an existing project like Linux, OpenBSD or CPython. Whereas it's really not a good idea to work with Asbestos under any circumstances.

But the message overall - that maybe we don’t need to instinctively reach for C for newbuild low-level programming anymore - is probably about right


I guess you need to know it well enough to know why people don't want it now.

Like, everyone knows TypeScript catches more human mistakes than JavaScript. But you won't know why if you don't know JavaScript in the first place.


Were you planning to write production software on your 486? I don't see why it should hurt you at all.

If there's a joke here I missed it, sorry.


As long as it doesn’t have a modem you’re probably safe. Probably.


There is a lot of software out there written in C. All of which needs maintenance. At work new C code gets checked in every day for software that runs major companies around the world.

Having said that, I would never recommend C as the language of choice for new projects.


Are there really good alternatives to C? You can use Rust where you can use C, but the memory safety doesn’t come for free. A lot of people find Rust extremely annoying to write. Maybe something like Zig? But the ecosystem isn’t that big…


In my opinion, the problem is that we failed to create some additional layers on top of C: an extended standard library with data structures and algorithms, as well as a style and set of practices for everyday use (i.e. non-compiler-writers).

I adore C, it was my first language and I still like its simplicity very much. Still, I have come to believe that only strictly vetted, licensed people should be allowed to use it :)


There are an incredibly large number of analysis tools for C to check things like memory safety. It doesn't look like they are always used in important code, though.

A lot of memory safety CVEs come down to "we wrote a static analyzer and found a latent memory bug here."


Maybe a language with the checking by default in the compiler would be a good idea, like Ada or SPARK.


That doesn't stop bugs, it just makes them less likely, versus making them impossible to introduce by accident.


People also had a large and unexplainable emotional attachment to asbestos.


MISRA C avoids those security flaws in a big way.


I wish that were true. It imposes a lot of restrictions but doesn't solve the fundamental issues of C.


> It was fine at what it did

No need to talk in past tense. C is still the best language at what it is meant for: writing fast software. Not safe software, _fast and portable_ software.

> but the safety problems outweigh them

Safety is to speed what security is to convenience. Frankly, most of the arguments I see today are just fear mongering. In my experience, there is no protecting users from developer mistakes that lead to software exploitation and 'unsafety', independent of the language being used. Memory issues are a common avenue, but if you take it away, the next issue will become the main avenue for exploitation, and you'll be back in the same spot, with the next generation of developers fearing that the new software safety level is insufficient to reach some ideal safety level, all while having traded off real world speed across the board to reach that point.

> no new installations use it

I'm writing a tool in C. It doesn't need to be perfectly secure. It does need to be fast.

By using C we acknowledge that users want speed, more than they want safety, and that's our developer reality. That doesn't mean users don't want safety, it just means that outside of specialized domains, on average speed takes priority - people have been voting for decades that they want speed more than safety, and the process of deciding the priority is as you'd expect: users use money to buy a faster product and companies selling software invest in making their software fast first.

The security argument is also weak. The whole point of cracking software is to find flaws in existing systems, regardless of the language they were coded in. One thing will always be true: anything that sends data can be cracked. You can only make the entry bar harder, but you will not create a language that is significantly safer and as fast as C because the optimizations that make software fast (like not checking array lengths, not checking for overflow, etc) are also the ones making it less secure.
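To make that trade-off concrete, here is a minimal sketch (the function names are invented for illustration): the checked variant pays for a test and an error path on every call, the unchecked one does not, and that is exactly the cycle cost being argued about.

    #include <stdint.h>
    #include <stdbool.h>

    /* Unchecked: fast, but a large len * elem_size silently wraps around. */
    uint32_t alloc_size_unchecked(uint32_t len, uint32_t elem_size) {
        return len * elem_size;
    }

    /* Checked: refuses to overflow, at the cost of a branch per call.
       __builtin_mul_overflow is a GCC/Clang builtin returning true on overflow. */
    bool alloc_size_checked(uint32_t len, uint32_t elem_size, uint32_t *out) {
        return !__builtin_mul_overflow(len, elem_size, out);
    }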

One can create a language that's safer than C and fast (in specific situations), but there will always be people cracking the core of whatever the 'safe' language du jour is, so you will have decreased security incidents on average, but have not eliminated them, while leaving room for a competitor to swoop in with a faster product that's safe enough, and it's your users who will decide who wins, not you.

Just to emphasize this, so far, I have not seen any indication that the general population will shift to picking safety over speed, so you'd be betting against your users.


At this point there isn't strong evidence that C is going to deliver better performance than C++ or Rust, especially for a given amount of work. One could argue that better code reuse (templates, generics, crates, etc.) allows programmers to write faster code in C++ or Rust. For example you need barely any work to use a really good hash table in Rust, one that's really tricky to rewrite by hand because it involves SIMD and whatnot.


The language that for the past 50 years has:

1. consistently had much faster compilation times than C++ or Rust

2. consistently generated some of the smallest binaries

3. consistently ended up at #1 on programming language benchmarks

4. consistently been used to write the worlds OS kernels, where speed is critical

does not have _strong evidence_ that it is going to continue to deliver better performance than C++ or Rust?

On the contrary, there is no evidence that Rust will ever be as fast as C. The burden of proof is on you and Rust to support your bold claims. C++ is in a different boat - it already tried to displace C in the 90s and it has been successful only in the environments where its other properties are a much more needed advantage, not for being faster than C. That's where Rust will end up - in between Java and C++. It will never displace C, because you pay for runtime safety with CPU cycles. Static analysis can only get you so far.

Even Java's JIT, which got everyone excited about beating C in the early 2000s, failed miserably at being faster than C, even if it could generate better assembly code at runtime, precisely because extra features cost cycles which necessarily slow down execution speed, so you're not the first group of people with starry eyes and bold claims about performance and C.

As far as the generic simd hashmap goes, it will always be slower than a specialized custom built hashmap in C, even if it takes longer to code it and has more bugs.


C has existed for a longer time, and for historical reasons it's used for some of the biggest and most popular kernels. That makes sense. However, newer kernels don't necessarily stick to C (e.g. Fuchsia's kernel is in C++); there is also no evidence that kernels in C++ would be slower than the C equivalent. Regarding your other points:

1. compilation time is not related to runtime performance.

2. smaller binaries can mean better perf sometimes, but it's not always a win. It's a deep tradeoff between inlining/specialization/code size.

3. benchmarks are fun, but I think it's misleading to claim they're absolute truth. In particular, a popular benchmark will be tuned endlessly till it yields what you want. For real code you get a tradeoff between the effort put in optimization, and the total time to deliver features. I think this is where C++ (and rust) shine over C.

4. see above

For the rest: "continue to deliver better performance than C++ or Rust" that makes no sense since C++ and rust are more recent. And C++ can be just as performant as C already, or better (for a given amount of programmer effort). Code reuse is a big deal.

The runtime impact of safety features is real, but (for non-elided bound checking) it's pretty low. Rust is also known for compile-time safety features which can help writing better code because you need to spend less time on debugging; the classic examples are string handling (keeping slices of the input is easier to do safely in Rust) and threading. Some features of C++ and Rust are also better optimizable than the equivalent C idiom (e.g. to represent objects/dynamic dispatch).

Comparing C++ and Rust with Java is a red herring. Java is JIT compiled and garbage collected, with little control over memory layout. People might have overhyped it in the 1990s and 2000s, sure. Rust and C++ give you as much control as C if you need it, and they go through the same static code generator (LLVM) as one of the leading C compilers. Rust will also optimize the memory layout of structs for you by rearranging fields unless you explicitly use `#[repr(C)]`, which means that it'll be smaller than C's equivalent on average.

> As far as the generic simd hashmap goes, it will always be slower than a specialized custom built hashmap in C, even if it takes longer to code it and has more bugs.

Maybe if you spend as much time to write your C hashmap as was spent on the generic Rust hashmap. That's really a long time. If you just need a hashmap somewhere, pulling the standard Rust hashmap (or, in C++, abseil, from which the Rust version took inspiration) will deliver great performance for little effort, so you can move on with your life and spend more time on the profiling phase. I think we talk too often about "absolute" performance without accounting for the total time it takes to achieve it, including debugging, profiling, etc.


More asbestos! More asbestos!



[flagged]


I've never given Rust a chance because I'm utterly repulsed by the community.


I actually am quite pleased with it, or at least the programs I've used written in Rust... but the community has been pretty toxic and it hasn't been without its share of security vulnerabilities.


Ada is the better Rust :)

I know the approach is not the same, but the goals are similar enough. And I think that Ada's core principles are more useful and less "magical" for hardening code.


I would actually really love Ada if it weren't for a couple extremely frustrating syntax decisions: using parentheses for both call arguments and array subscripts, and making parentheses optional for calls with no arguments. Those two decisions make it incredibly difficult to distinguish between function calls, variables, and array subscripts. I absolutely have to use an IDE for Ada as a result. Also, Title_Case_With_Underscores is just silly.

Otherwise, I agree. Ada is pretty great. It deserves more love.


Is there potential for an Erlang -> Elixir play here? Ada -> Grace :). I find it a shame that languages with a Pascal lineage are unpopular simply for syntactic reasons.


Both decisions, using parentheses for both arrays and argument lists and omitting parentheses for calls without arguments, come from Ada's design goal of supporting programming in the large. While this is contrary to some other languages that focus on specifics of implementation, the decision isolates design intent from implementation details.

The original "Rationale for the Ada programming language" book provides a lengthier discussion on these points.

TLDR; The syntax is an intentional language design choice which puts capture of design intent in the forefront over implementation details.


That's all well and good but whatever the rationale it still makes code unnecessarily hard to read.

  A := B;
  X := Y(Z);
Is B a variable name or a function call? Is Y a function or an array? Is Z itself a function call? There is no way to know the answers without looking up the definitions for B, Y, and Z. And the answers are important because function calls can alter the program state and affect performance in ways that a variable dereference or array subscript cannot.

So when reading Ada code, a developer has to constantly jump around the code base to understand which fundamental language mechanics are being used. It's a frustrating problem for someone who has to review real-time, safety-critical Ada code. And I've never run into that problem with any other language.
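For contrast, a C-family reader can tell the two cases apart at the call site without chasing definitions (a trivial, made-up snippet):

    int y[10];                  /* array */
    int f(int z) { return z; }  /* function */

    int demo(int z) {
        int a = y[z];           /* unambiguously an array subscript */
        int b = f(z);           /* unambiguously a function call */
        return a + b;
    }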


There's the same issue with C++ vs C: C++ is better for 'programming in a large codebase' because it can hide/encapsulate more things.

But that makes C++ worse for hard real time programming where you want to know what's happening.


With more descriptive variable names it’s not really a problem in practice. If B and Y contain verbs, they’re probably procedure/function calls.


When I used Ada the IDE was vim. LoL.

Actually, I wrote an optimizing compiler for Ada in C++. Also in vim.


Have you done all those things in vim simply because you couldn’t find how to exit from vim?


Perhaps more to the point, it sounds like SPARK is better suited to this particular use case. Some of the problems they're using SPARK to solve aren't even on Rust's radar.


What if all problems were solvable by adding or removing a layer of abstraction?

Perhaps, like any other tool-set... C has an optimal problem domain not everyone can understand, and it is often unrelated to llvm compilers.

Those who failed to learn why C++ evolved into its current state, are simply doomed to naively make another iteration of the same tantalizing bad design choices.

https://en.wikipedia.org/wiki/Second-system_effect


    The fundamental theorem of software engineering (FTSE) is a term originated by Andrew Koenig to describe a remark by Butler Lampson attributed to David J. Wheeler:

    "We can solve any problem by introducing an extra level of indirection."

    The theorem is often expanded by the humorous clause "…except for the problem of too many levels of indirection," referring to the fact that too many abstractions may create intrinsic complexity issues of their own.
https://en.wikipedia.org/wiki/Fundamental_theorem_of_softwar...


It's a funny aphorism at best but I don't think software would work nearly as well if there were more than a grain of truth to it.

Think of all the layers of abstraction that have led to this moment, you reading this comment, me writing it: the hardware, the operating systems, libraries, run-times, the network... it's immense. And it's a wonder that it works so well let alone at all.

I like this quote but I think it applies to weak abstractions; ie: not abstractions at all but, as they say, indirection and obfuscation. Using the word "abstraction" has done a lot of harm to efforts to convey the importance and utility of various formal methods from property testing to proof automation. A solid abstraction has fundamental laws and properties. It aids in mastering complexity rather than hinders.


> "We can solve any problem by introducing an extra level of indirection."

It's worth keeping in mind that this isn't a fundamental law of the Universe. Not all abstraction layers are equally good. Just because it's possible to do a bad job at something doesn't mean we can't try and do a good job.


Yes, written by someone who has never inherited a code-base that hundreds of juniors patched over 12 years. At that point, you launch a separate division, slowly transfer functional support, and jettison the old structure after a year.

;)


> "…except for the problem of too many levels of indirection,"

Isn't this what neural nets are solving, right now? ;)


With things like WASM and GraalVM this is becoming more and more realistic. We don’t really write ASM anymore because it’s tedious and error prone, same can be said for C. I wouldn’t choose it today unless I absolutely had to. Abstraction is the way.


Already in 1961, one of the achievements of Burroughs was an OS fully written in a safe systems language (one of the first uses of UNSAFE code blocks), with a bytecode-based format for executables.

This system is still sold nowadays as Unisys ClearPath MCP.


And the language you use to target WASM is... C?

I'm not necessarily saying it's a good language, but it's an excellent abstract machine. It won't die.


Just about every language can compile or transpile to WASM:

https://github.com/appcypher/awesome-wasm-langs


There are plenty of WASM runtimes without any C in them.


I'm not talking about the language you use to write the runtime. I'm talking about the language you use to write the code that becomes the WASM that the runtime runs. They need not be the same.

As others have pointed out you can compile pretty much everything to WASM, but it's most often done with the same languages you'd use to compile to native code; C, C++, Rust, the usual suspects.


D, Go and C# use their own bootstrapped compilers, just to give three examples from many others.


The annoying part is having to endure the Astroturfing campaigns.

I'd prefer a clear 1 liner in Julia or Prolog over 80 pages of unknown correctness. ;)


Abstraction is probably the most expensive (and bug-ridden) thing we do in computing. For OS writers and users, it is not a good idea to add any more abstraction than you absolutely need.


There are benefits to using better and better tools as they are invented and evolve. In many aspects C is a primitive tool.


NodeJS project: 441MB, 32872 dependencies, and uptime <5 days due to patching cycles

C project Version: 2.5MB, 6 dependencies, and uptime 3+ years (OpenSSL patch ended the streak.)

Being busy is not necessarily productive, and "Better" is often use-case dependent. =)


Who's proposing replacing C with NodeJS? In the context of a discussion of how using verifiable/safe by design languages is an alternative to C?


It was in jest of a "Novelty bias" assertion. ;)


So how did you write the same app, the one that has half a gig of dependencies in NodeJS?

I have a 10+ year old Perl app that never needed fixing, but I don't present it as an example of Perl being better than everything.

Also, JS is a particularly bad language in nearly every respect and not even in the same category as C, so I dunno why you compare them in the first place.


Perl was awesome when no other options were available. And one will find many languages inherited its features, as it was great for parsing text.

"So how you wrote same app that have half a gig of dependencies in NodeJS ?"

Nope, some have greatness cast upon them... great heaping piles of steaming greatness... One likely already knows the greatness of which I speak. ;)


How long did it take to build both?

How fast can you add features to both?

How easy it is to find developers for both?


I will let you know if the NodeJS underwear stain on my budget ever finishes.

;)


How will undefined operations change with your new layer of abstraction?

Anyway, you are really claiming that a sandbox capable of running C programs will solve more problems than writing your code in Rust?


I am inferring languages like Rust are not a replacement, but rather focus on a different problem domain already addressed in higher level languages.

It's not a complete waste of time, but I have yet to see a use-case that only Rust features could better achieve. I am not smart, so may have missed something important. =)


> languages like Rust

Hum... If you know other languages similar to Rust, I'm curious to know them. Maybe there is something better out there, but AFAIK, the closest thing in existence is C.

> a different problem domain already addressed in higher level languages

Things like device drivers and embedded software. It's also what people push into submission when they want high performance software, but this is always a hard thing to do.

I really do not recommend using Rust if a higher-level language solves your problem.

> I have yet to see a use-case only Rust features could better achieve

It helps achieving things like making sure your GPU driver fucking works, instead of doing what the nvidia driver does.

Of course, this article is about a different way of making sure it works. One that requires a lot more work, but can provide some assurances that Rust can't. It is also easier to apply to Rust than to C, but neither is the best target for it.


"making sure your GPU driver fucking works"

Well I did notice how nVidia left out the part that the CUDA C API had a known bug that would crash on exit due to a de-allocation bug for years. Thus, C++ became the de facto interface for their GPU library.

All code is terrible, but some of it is useful (except rust and VB). ;)


This is a wrong interpretation of the obvious solution, enforcing sane standards for the typical imperative patterns used and assuming feasibility of static analysis.

Programmer as user just makes sense once you've moved past punch cards to interactive IDEs. Everyone should have stopped using punch cards by now, but lo-and-behold this is still basically the standard in some outfits.


In some remote places... weaving machines still use punch cards, as the factory has run for over a century without electricity.

One can never assume our use-case assumptions will generalize. =)


Right, but they almost certainly have on-hand a skilled reviewer that verifies that good-faith programmers haven't done anything incorrectly that would bork a machine.

... that step wouldn't constitute introducing an abstraction either.


C on a modern compiler is very abstract, and the abstractions are an insecure pain in the ass.


Ah yes, the infamous gcc -O3 gamble ;)


>Perhaps, like any other tool-set... C has an optimal problem domain not everyone can understand

Maybe this is true specifically for C. However you imply this is true for "any other tool-set".

This is also a form of bias. It is illogical and irrational to think that every tool set is good for something. Things that exist can be horribly bad for everything and things that exist can in theory be good for almost anything.

There is no magical rule that says all tool-sets are great because they're always optimal for some niche problem domain.

The universe is not made up of apples and oranges. There are rotten oranges and rotten apples as well.


Languages like Python/C# have fundamentally broken threading models. And yet people like to use these for cluster work. Kind of like eating steak with a spoon, as you never knew about forks.

There are often language specific features that are not isomorphic, and are the primary reason a language was developed in the first place. =)


Can you explain what you mean by C# having a "fundamentally broken threading model"?


Sure there are forks and there are spoons. But there are also lumps of coal.

You have a problem: You need to eat soup and steak. So you use a spoon or a fork. The lump of coal is useless. You can probably smash the steak with it then eat the remains or lick the soup off the wet coal you dipped into it. Possible but a horrible tool overall.

There is an argument to be made whether certain languages are lumps of coal rather than a spoon or a fork =).



I actually enjoy programming in C. I don't get to use it as much as I'd like to though. Almost all the work I do these days has some other language that makes more practical sense to use: JavaScript, Golang, Rust, etc. But coming from an ECE background I enjoy the low level nature of C.


The problem is not the people who are proficient in and enjoy the language. The problem is the people of the other kind who get told to do it for a living. I know the top-level asbestos analogy (not mine) got shot down, but let me try another. The problem is that we send both a trucker and a cyclist onto the freeway without any kind of protection for or against the cyclist.



If we risk societal collapse because of buffer overflows, I think our society needs fixing, not our fun and sharp creative tools.

Personally I think “what if we stopped using npm?” and “what if we stopped using cargo?” are much better security questions to be asking at this point.


There may be better questions, but we aren't only allowed to ask the best possible question.


Security teams 10 years ago: Hey, why are you using that library?

Security teams today: Hey, why are you using that language?


C.A.R Hoare in 1980,

"Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to--they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."


Once the same kind of programming mistakes reappear in every other library, all over again, you start to question the language paradigms.

Humans are not good at programming. I've never seen a "done" C++ codebase in my life, maintainers are always too busy reinventing parts that had originally nothing to do with the feature scope of their library.


> Humans are not good at programming. I've never seen a "done" C++ codebase in my life, maintainers are always too busy reinventing parts that had originally nothing to do with the feature scope of their library.

No reason to limit to "C++" there. It's an industry wide problem with every language. Even in open source, if a repo hasn't had commits in the last week, most people treat it as a dead project.


> if a repo hasn't had commits in the last week, most people treat it as a dead project

By "most people", you mean GitHub-era folks who are secretly more interested in the things about the site that fulfill their social needs than they are interested in solving the problem that the software is supposed to be used for.


I think that's the major problem with software engineering in general: when is software "done" ?


When you can no longer remove anything else.


I can’t even imagine getting back into C and knowing where to begin to do it “right” and get to a useful level.


Security teams in 10 years: Hey, why are you programming?


I absolutely love AdaCore. My last employer (~6 years ago) worked heavily with Ada and AdaCore, and the support from them was just mind-blowing. Sounds like an advertisement, I know, but not often have I seen something like it... well, maybe from SGI and DEC, a long, long time ago.


I like my Rust and furries, as the saying goes, but Ada is the underrated language out there. It really has a lot going for it, and it's really nice to write shit in. AdaCore being the single entity behind the modern toolchain is probably both the good and the bad of it. Yes, GNAT... but, same thing.


> last employer (~6 years ago

It does not sound like an ad, but it does sound like Fortran. The last Ada job I remember was the Mars rover in the 90s. Great, but nothing for new hires!


I was expecting this to be a rust article lol.

Technically, Ada is more memory safe than Rust, although with fewer devs using it.


Except dynamic memory allocation, which in Ada either needs a GC or is unsafe.


Stuck in Ada 83?

Ada compilers never used GC, and it was eventually dropped from the standard.

Since Ada 95, RAII has been supported via controlled types.

Additionally, in many cases like strings and vectors, the compiler does the memory management for us anyway.


And rust is unsafe if you use unsafe rust (or even if you use Arc in safe rust and reference count incorrectly).

Ada doesn’t use allocation, remaining 100% memory safe.


Erm it is perfectly possible to perform dynamic allocation in Ada. You have the 'new' keyword to allocate a new object of a type. You have the 'unchecked deallocation' mechanism. You have controlled types that deallocate when an object is out of scope. You have all sorts of weak references schemes in some libraries. You have storage pools to handle allocation specifics for a type. You have the secondary stack that handles returning objects of unknown-size-at-call-site.

Most of those can be disabled though through the 'restriction' mechanism (look up Pragma Restriction which is very interesting in itself).

SPARK itself can handle and prove some ownership properties but to the best of my knowledge isn't at the level of rust in memory safety on dynamically allocated memory.


> SPARK itself can handle and prove some ownership properties but to the best of my knowledge isn't at the level of rust in memory safety on dynamically allocated memory.

It actually is: https://www.adacore.com/uploads/techPapers/Safe-Dynamic-Memo...

And using https://www.adacore.com/sparkpro as a reference (ignore the 'Pro' bit as it's also available in the GPL edition) - anything certified to SPARK Silver level is far safer than any Rust code out there.


Seems I missed some of the progress... Things are moving fast these days.


> And rust is unsafe if you use unsafe rust (or even if you use Arc in safe rust and reference count incorrectly).

I don't believe it's possible to cause unsafety using Arc in safe Rust. I don't know what "reference count incorrectly" means here. Could you explain?


I believe they are referring to memory leaks. Not really a "safety concern" as it will simply cause the application to crash (eventually).


Arc in safe Rust won't let you cause use-after-free no matter how much you screw up.


Ada is one of those languages that I really like in theory, but don’t really get to use it because I’m usually better off picking something else (for any number of reasons, but right tool for the job, y’know)


In the end, how many people "speak" a language matters a lot more than its properties, both in the real world (compare Esperanto and other designed languages vs. English) and with programming languages.

If language A is "better" but language B has all the libraries and developers, the right choice is usually still B.


Not if language A pays twice language B.


Ada had many good ideas, but I can’t imagine using it for anything but existing large legacy code bases.

I am impressed by the Swift language: nice REPL based development if you want that, typed, great support for Apple devices and some support for Linux.


Isn’t reference counting a type of garbage collection?

I feel like you can’t put Swift in the world of Rust/Ada/C/C++


It is, and for Apple, it is the successor of their system languages, explicitly stated on Swift documentation.

Here enjoy a full graphical workstation OS developed at Xerox PARC, Mesa/Cedar uses reference counting with a cycle collector.

https://www.youtube.com/watch?v=z_dt7NG38V4


> Isn’t reference counting a type of garbage collection?

Reference counting leads to memory leaks if cycles occur. Garbage collection algorithms don't have this problem.


I checked and you are not correct.

https://en.m.wikipedia.org/wiki/Garbage_collection_(computer...

Reference counting is a type of garbage collection.


On the other hand:

"Garbage collection vs. ARC"

https://atp.fm/205-chris-lattner-interview-transcript#gc (Hacker News: https://news.ycombinator.com/item?id=31139610)

This (sub)headline only makes sense if garbage collection and ARC are different concepts.


Hmm I think they are referring to GC as “traditional” GC, but I will concede that they don’t seem to say that.


True enough. I probably should not have inserted Swift into an Ada conversation. BTW, I am a Lisp developer, but for some reason I find Swift palatable.


Er, does Swift have any of Ada's safety properties or proof tooling?


> Swift

> REPL based

What?


I don’t use Swift very often, but when I do, I use the REPL to experiment with new code snippets, just as I would for Lisp languages, Julia, or Python.


Consider the source. It's AdaCore; they support open-source Ada. I know them from the Ada on x86 seminar they held at my employer (we were using Ada on PA-RISC). Nice people, but pro Ada (or its "SPARK" language, which I guess is like Ada).

But another huge issue is that C is the language used by the OS. If you want to use that OS functionality to allocate memory, do networking, IPC, etc., you are using a C header file and calling into it. We had a binding library to make calling C from Ada easier, but it's still an extra step. The good news is that other languages are starting to have some great libraries.

Ada and Rust and a bunch of other languages are safer than C, and likely a better choice. It's just that there is a lot of existing code to port over.


I’ll never forget this neckbeard older dude in an algorithms course with me in College

“C is the only good language. And you should use it for everything.”

“Even web servers that power apps?”

“How could you consider anything else?”

During the same program (over ten years ago), someone came from Adobe and basically said “C/C++ are the source of the majority of our security bugs and would just go away with another language.”

Change is hard folks


I wouldn't believe Adobe, a company known for shit security practices for decades now, to be any authority on the subject. They'd cut themselves with safety scissors.


A company isn't an authority on any subject. Employees though... I'd bet the survivors have stories the rest of us just wouldn't believe, straight from the trenches. World class experts on how to not do things.


They wouldn’t, but this was a pretty bright staff scientist iirc.


This is literally a PR article from a company who sells SPARK products, talking about a company buying into SPARK. Please provide your own salt lick.


I'm dreaming of a C with only pointers that has an optional VM (with and without GC) and namespaces, string and stream included.

You can cook your own (C++, WASM and Java), but it would be nice if it were standardized and cross-compatible.

Also curious why so many dislike streams?


Aren't you describing golang?

Golang is basically C with pointers, and still very very static in its programming paradigms.


Golang only has a GC, it doesn't have an option to manage memory in other ways.

Were you referring to unsafe pointers and calls to Cgo?


> Golang only has a GC, it doesn't have an option to manage memory in other ways.

But as far as I understand golang's memory internals, they still let you use the copy-based stack directly ("var some SomeStruct") or allocate things directly on the heap (via "new(SomeStruct)" / "make(SomeStruct)").

I might be wrong about this, but this is what I understood from casually reading the spec [1]; while they never mention stack or heap specifically and describe it more as memory being allocated at run time, which kind of hints to a copying garbage collector underneath. But they also seem to implement a mark and sweep mechanism [2] so I'd say it's a hybrid GC, similar to how ECMAScript VMs work these days.

Nevertheless you're right with the argument that it doesn't offer a way to manage memory yourself, which I think is a good thing. Technically you could use "C.malloc()" and "C.free()" though.

> Were you referring to unsafe pointers and calls to Cgo?

Yeah, I was kind of referring to the possibility to implement C-interface adapters using CGO (the internal "C" and "unsafe" packages). Personally I would only use C APIs if there's no way around them, though, and keep as much code in golang as possible.

[1] https://go.dev/ref/spec#Allocation

[2] https://github.com/golang/go/blob/master/src/runtime/mgc.go#...


> But as far as I understand golang's memory internals, they still offer you to use the copy-based stack directly ("var some SomeStruct;") or to allocate things directly on the heap (via "new(SomeStruct) / make(SomeStruct)".

Those are equivalent calls, you can't explicitly choose to allocate on the stack/heap.

Conceptually there's no distinction between the stack and heap in Go, you just allocate whatever memory you want and the runtime handles cleanup for you.

In practice the compiler will perform escape analysis to place everything it can on the stack, but that's an implementation detail, you don't get to explicitly choose when allocating.

At best you can get the escape analyzer to show what it thinks escapes to the heap and try to coax it to allocate on the stack instead.


> I'm dreaming of a C with only pointers that has an optional VM (with and without GC) and namespaces, string and stream included.

It'd be really nice to be able to trigger some scope-exit behavior, too, IMO.


Do you mean for Exception handling?


Maybe parent hinted at what GCC extensions like __cleanup__ variable attribute do.


Indeed: maybe not a full "RAII" kind of feature, but at least something simple that allows the author to specify (e.g. around the same place where a resource allocation happened) that some other symmetric behavior should take place at any scope exit point.
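Something along those lines already exists as a compiler extension. A minimal sketch using GCC/Clang's cleanup attribute (the helper names are made up); the cleanup function receives a pointer to the annotated variable and runs on every path out of the scope:

    #include <stdio.h>
    #include <stdlib.h>

    static void free_buf(char **p) {
        free(*p);               /* runs automatically when buf leaves scope */
    }

    int parse(const char *path) {
        char *buf __attribute__((cleanup(free_buf))) = malloc(4096);
        if (buf == NULL)
            return -1;
        if (path == NULL)
            return -1;          /* buf is freed here too */
        printf("parsing %s\n", path);
        return 0;               /* ...and here */
    }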


Hm, ok, my take on this is inexperienced; I come from Java and only learned C++ recently. I would like to have the heap inspection tools which a VM would give you. Then you can remove the VM if/when you need performance and have profiled memory enough.


At the moment, Zig and Rust are the most exciting places to look, IMO.


Have a look at D.


https://en.wikipedia.org/wiki/List_of_programming_languages

I think I'm not going to like it because I want something that is foundational = works everywhere, is stable everywhere and has all features everywhere (f.ex linux 32-bit on ARM)

But I will give it a fair shot.


LDC is a frontend to LLVM. https://github.com/ldc-developers/ldc


As an addendum, GDC is part of GCC on the other side.

So plenty of platforms are already covered.


> C with only pointers

Interested in what you meant by that.


He probably doesn’t want pointer arithmetic.


I don't want to add/remove &, * and sometimes *& everywhere until it compiles... I don't care about memory in that way.


Five years later...

"Welp, back to C because AMD's performance is overtaking ours."


Ada/SPARK is a strongly typed C with visibility done correctly (you know, the "private" notion in C++... and sane scoping using semantic-grade packages). All checks in Ada/SPARK can be removed, so essentially you end up, in production, with code as performant as C.


I'm following the work of the rust compiler team these days, they're trying to leverage all rust frontend guarantees to perform better optimisations.


This is only true if your C code is as correct as your Ada/SPARK code. Incorrect software can run faster than correct software...


So true! A loop that crashes at the first iteration of a zillion is as quick as electricity in a NOR gate.


AMD drivers are crap though. It's not an old joke, it's current: for the last 6 months AMD drivers have been riddled with driver timeouts (Timeout Detection and Recovery), Chrome hardware acceleration not working, BSODs, black screens, etc... really that bad.

https://www.reddit.com/r/Amd/comments/xvtn2u/amd_your_driver...

Truth is, to have good drivers you need a lot of people and $$$, and Nvidia has the upper hand on that.


YMMV! As a non-power, graphics-wise, Linux user (that is, no 3D), my experience has been better with AMD than Nvidia (owned multiple Nvidia cards and a modern AMD one).

In daily usage, I think I've found one issue with the AMD card, and a couple with the Nvidia card. What's worrying is that when filing bugs while using an Nvidia card, devs both times gave the pseudo-automated answer "we can't solve that, Nvidia drivers are closed source", which was wrong in one case.

In Windows, I've used all the cards for gaming only, and never had any issues.


Agreed that YMMV, I have heard about these issues in that reddit thread about NVIDIA cards as well and have experienced the hardware acceleration bug for example.


AMD drivers on Linux have been getting better. Though I don't run a bleeding edge card…


This is about firmware, nothing to do with the performance of GPUs...


Firmware and drivers have a massive impact on the performance of GPUs. It's not just hardware.


The article states they had no performance hit from switching to SPARK.


It’s rare to see a thread where everyone is simultaneously correct but talking past each other.

None of you are mistaken.


I noticed this happening and just stopped replying :p



Yes, and security has a large performance impact.

Just look at the performance costs of bounds-checking array access in C++ code.

Or, more macro, the performance impact of AV tools or Windows Defender on your system.


> Yes, and security has a large performance impact.

Not necessarily. The linked blog talks about SPARK which is about running your code through theorem provers to mathematically formally verify that your code does the correct thing _in all instances_.

Once you have passed this level of verification - you can disable assertions and checks in the release version of the application (whilst of course - having the option of keeping them enabled in development releases).


>Just look at the performance costs of bounds-checking array access in C++ code.

If your compiler can prove you don't need bounds-checking, it will remove the check and the performance will be the same. Hence, if your program has been proven to have no runtime errors, you don't need them.


> If your compiler can prove you dont need bounds-checking it will remove the check and the performance would be the same

and in practice that is a very big "if"


Wouldn’t the performance costs of bounds checking on arrays be the same if the computer was doing it or if your code was doing it?

By that logic C/C++ doing no bounds checking speeds your code up?


> Wouldn’t the performance costs of bounds checking on arrays be the same if the computer was doing it or if your code was doing it?

It depends. The C programmer can choose to do the bounds checking in a for loop by just checking once before the loop begins, or once per iteration even if an array is accessed multiple times in the loop, or the safe language might have more overhead than a simple if statement in the C code. This can, of course, go the opposite direction (the safe language has verified the loop bounds, but the C programmer is checking before every array access). It's a battle between the C programmer and the designer and/or implementer of the safe language.
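A minimal sketch of that choice (hypothetical function, not from the thread): one range check before the loop stands in for a check on every access.

    #include <stddef.h>

    /* Returns -1 if the requested window is out of range, otherwise the sum. */
    long sum_window(const int *a, size_t len, size_t start, size_t count) {
        if (start > len || count > len - start)
            return -1;              /* single up-front bounds check */
        long sum = 0;
        for (size_t i = 0; i < count; i++)
            sum += a[start + i];    /* no per-iteration check needed */
        return sum;
    }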

One of the reasons I like C is it gives you more control. This can be a good or a bad thing. This can lead to some really performant code you couldn't do in most languages or it can lead to some gnarly security problems. Maybe both in the same spot of code.

I use C to write mostly pet projects at home. I use it at work without having a choice in the matter.


Yes, which is why compiling on different optimization settings will have bounds checking on or off in C++


Well, yes, it does. Whether or not that’s a good tradeoff is a different question.


Did you read the article? They said there was no performance hit.


"Did you read the article?" is a particularly unhelpful comment when the linked article has been hugged to death, and the real article is behind a sign in wall.


> "Did you read the article?" is a particularly unhelpful comment when the linked article has been hugged to death, and the real article is behind a sign in wall.

As is making presumptuous comments without reading the article, especially when the comment may be wrong. I think it's not only unhelpful, but actually harmful, because other people, who also jump straight to the comment section, may form an opinion based on misinformation. Also, timbit42 did provide us with what they read in the linked content; unless they added it with an edit.


I read it using the archive.org link. Should people comment without having read the article and knowing what it claims?


Are you kidding? Asking if people read the article is more helpful when there's an unusually large number of people that haven't.

And if they need help accessing it, help can be arranged.


Well sometimes people don't. I know I've done this on occasion. Just read the title and dive in with my own opinion.


Yup. I don't think HN can avoid this at scale, though. It's been a problem on sites like this since early in the days of Slashdot.

The fundamental problem is that the voting and karma system actively incentivizes this kind of behavior. No amount of "did you read the article?" comments can counteract that force. All they do is increase the noise level even further.


"Be the change change you wish to see in the world." — Arleen Lorrance†

† No Gandhi: https://quoteinvestigator.com/2017/10/23/be-change/


I enjoyed reading this quoteinvestigator article, but the conclusion it reaches credits Arleen Lorrance:

> In conclusion, Mohandas Gandhi did write a pertinent passage in 1913 that expressed a similar idea, but the popular modern saying is considerably more concise and forceful. QI believes Arleen Lorrance should receive credit for the expression she wrote in 1974.


They talk about SPARK, but I never heard about it until now. Reading up on Wikipedia, it seems to be an Ada derivative.

I assumed they were switching to Rust, but that doesn't seem to be the case.


SPARK allows you to formally prove that your code is correct according to a given specification. It can thus provide much stronger guarantees than what Rust would be able to provide.

Similar technology exists for Rust, but it is much less advanced than SPARK is (https://github.com/xldenis/creusot)


Possibly naive question from someone who's never worked with this kind of language, but something I've been wondering about:

> it is possible to prove mathematically that your code behaves in precise accordance with its specification

So that bridges from one thing to another; but what prevents flaws in the specification itself? Is it just simpler/easier to read than the code is, so they're easier to spot?


> Is it just simpler/easier to read than the code is, so they're easier to spot?

Kind of. You basically annotate your program with pre/post-conditions, then you run the program through a theorem prover, which attempts to prove all of your conditions are true in all cases.

The basic guarantee is that if you pass the theorem prover, your code will be free of array out-of-bounds accesses, null pointer dereferences, integer overflow, divide by zero, and other runtime errors. These are useful security properties to prove always hold, and you don’t have to explicitly write specifications for them (just conditions to help the prover along).

If you want to go further (prove functional correctness of your program), you can add more specifications. A lot of times you just want to prove some basic sanity checks hold 100% of the time (e.g. always return a sorted array). These can be pretty easy to specify correctly.
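As a rough illustration of the shape such annotations take, here is a sketch in ACSL (the C annotation language used by the Frama-C prover) rather than SPARK's own syntax; the function and its contract are invented for the example:

    /*@ requires n > 0;
      @ requires \valid_read(a + (0 .. n-1));
      @ ensures \forall integer k; 0 <= k < n ==> \result >= a[k];
      @*/
    int max_of(const int *a, int n) {
        int best = a[0];
        /*@ loop invariant 1 <= i <= n;
          @ loop invariant \forall integer k; 0 <= k < i ==> best >= a[k];
          @ loop variant n - i;
          @*/
        for (int i = 1; i < n; i++)
            if (a[i] > best)
                best = a[i];
        return best;
    }

If the prover discharges those conditions, the checks they describe never need to run at runtime, which is also why such code can match plain C on performance.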


"What if you just stopped writing bad code?"

"What if you used static analyzers to detect these issues before shipping?"

"What if you put bounds checking into your functions to prevent this?"

"What if you tested your software more effectively before shipping it?"


> What if you just stopped writing bad code?

This is a popular but useless question. The reality is that to err is human. At scale it's even worse because now you're not just accepting that you personally may be capable of mistakes when writing software, you might have made a mistake when hiring the people who in turn hire the programmers. Ouch.

> What if you used static analyzers to detect these issues before shipping?

In principle all the modern safer languages are, in some sense, more static analysis. But the analysis is made possible in considerable measure through language design. If you insist on an unsafe language (like C or C++) then the analyser can't help much beyond "Don't use this language". Good advice.

> What if you put bounds checking into your functions to prevent this?

Bounds checks are a very small part of the problem. They're symptomatic (a language designed to do well at this will enforce bounds by one means or another) but not sufficient.

> What if you tested your software more effectively before shipping it?

You can't afford to actually test software thoroughly, by its nature. You will only be able to test a microscopic fraction of possible system states; whether the software works as intended under all the many other states remains unknown.


They don't actually mention what SPARK is nor link to any product page. Not very useful self-promotion.


> Performance compared to C: “I did not see any performance difference at all.”

See? but have you measured it?


Getting a 504 when I click the link. Is anyone seeing this issue?


Yes. OpenResty needs some rest. Give it time. 2 minutes, or so, after a click on reload.


> 1999

> CPUs are now 1GHz

> C is obsoleted outside small sections of code such as image processing

> 23 years later

> "What if we just stopped using C?"


This is about image processing.


In a JPEG implementation, there are 10 lines of code that need to be in C, assembly, etc.; the rest can be any language.


Not this echo chamber again.


Since when is using SPARK an echo chamber?


Someone commented on that page: "Just use Rust, you stupid corporate normies." which seems a bit odd considering the different goals. The 'big deal' here seems to be formal verification, not just "something that is not C" which is what people appear to assume here.

Something I suspect they focus on here is the boot ROM or the SEP RtOS, which is small enough to warrant this kind of scrutiny.


The timing of their POC being in 2018 makes me wonder if it was possibly a direct response to the boot ROM exploit disclosed by Kate Temkin in 2018 (https://www.ktemkin.com/faq-fusee-gelee/).


commenters like that have turned me off of rust entirely and completely.


So you're avoiding a technology because of unrelated teenagers' internet comments. Is that any more useful than adopting a technology because of unrelated teenagers' internet comments?


Both are entirely reasonable, especially if it's something you're doing in your free time.

Internet toxicity can entirely suck the fun out of participating in a community, thereby defeating the purpose of participating in it.

But, similarly, a really fun community can do the opposite. I used to participate in the Ruby community more for the people than anything else. I never actually loved the language all that much.


> So you're avoiding a technology because of unrelated teenagers' internet comments.

A) What makes you think it's a teenager? The internet makes even ordinary people act in ways they would not act IRL.

B) It's completely reasonable to avoid something because you don't like the community. Comments like the quoted one in GP post are definitely not rare from the Rust community.


the behavior of advocates matters.

the rust community can deal with caustic advocacy or they can do nothing. they've chosen to do nothing, from what I can tell.

accordingly, I've chosen to stay away from that product and its community.

this is not a crackpot point of view, this is the point of view of a sane person who realizes that his time on earth is finite and who has zero appetite for games.


This is completely wrong. You should never hold people responsible for the behavior of others whom they have no control over. What do you even want the Rust maintainers to do about it?


Rust really is a great replacement for C++, despite how oversold it is by the more enthusiastic elements of the community. I'd recommend trying it out with an open mind.


You shouldn't use Linux then. Linux evangelists were far more obnoxious when Linux was first making the rounds.

This is just how tech works and how tech people tend to behave.


> Linux evangelists were far more obnoxious when Linux was first making the rounds.

Yes, absolutely.

> This is just how tech works and how tech people tend to behave.

I really think that that this old guard mentality needs to die out. Rudeness is protected in the industry because we normalize that it's... what, intrinsic to programming? But that just isn't true, and it's harmful to everyone in the field.


I think you're attempting a reductio ad absurdum here, but, since your 2nd sentence isn't really true, you haven't achieved the "ad absurdum" bit.

Back in the day, yes, the Linux community was indeed quite bothersome. That's why I fairly quickly switched to FreeBSD as my primary OS. I found I had a lot fewer toxic interactions in the FreeBSD community than I did the Linux one. In time, FreeBSD didn't really cut it for me any more, so I switched to OS X. The Mac community does have its incorrigible elements, too, but they are much easier to avoid than they are with Linux, where interacting with the community is a must in order to get help.

Yes, these things do have implications for open source communities. Personally, I suspect that, all along, the biggest headwind for Linux on the desktop has actually been that the social environment surrounding Linux tends to alienate people who might otherwise have stuck around to help make it more successful.


Don't make me remember the Python evangelists back in the day when they were targeting Perl.


I remember those fools. those people are why I won't use python even today, 20 yrs or so later.

if python was 10% as good as they claimed, it would be the most wonderful creation of mankind, past and future, by a factor of three.


LOL, I can still remember the python "enthusiast" who lived across the hall from me in the freshman dorm in 1996/97. He is forever etched in my brain.

I ran into him at a bar in Indianapolis 10 years or so ago when our alma mater was in the Sweet Sixteen. Must have drunkenly passed along my email address.

Because about a year ago, he sent me a link to an interview he had given about the new thing he was passionate about. I congratulated him and asked a few questions out of kindness. Yeah, the response was exactly what you could imagine. All future emails from him go directly to the trash.

Some people


Just because people tend to behave in an obnoxious way doesn't make it even an ounce more acceptable.


> when Linux was first making the rounds

Rust is already 12 years old.


Read Aurynn Shaw's essays on "Contempt Culture". Tech people can, and need to, change.


that is why I won't use python and why just the mention of "Linux desktop" puts a sour taste in my mouth.

feverish advocates detract from the thing they aim to augment, and yeah, the worst thing about anything on the internet is its community, by a wide margin.


:/


The security demands of a system depend on the use case.

I’m not going to be importing all of Rust to tinker on an RPi.

What if we accepted that not everyone is working on big tech corp problems?


[flagged]


If you bothered to read the link you would have found out that it is about firmware written in Ada/SPARK, that is even lower level than raw drivers.


I didn't even read the link, but it points to Adacore, which is a not-very-subtle hint what language it's going to be about.


OT: Please post an archived link in comments, especially for small blogs

Edit: https://web.archive.org/web/20221107114522/https://blog.adac...



