
> The personal opinions of future compiler authors shouldn't affect whether your program builds successfully.

So, let's say that you find some 2003 software on some deep corner of SourceForge that you want to use.

Would you rather:

* Have the build fail because new warnings were introduced?

* Have the build succeed and then crash at runtime because the behaviour of the compiler changed anyway and the program was rendered invalid?

I'll personally always want the first one.




There's a name for a kind of change that makes a previously correct program incorrect: "compiler bug".


> There's a name for a kind of change that makes a previously correct program incorrect: "compiler bug".

No. The only thing that matters is that the language rules are respected. Or maybe you'd want programs that were correct with the first C compilers to still work today? They didn't do type checks at all, for instance. `void f(float, float); int x = f;` you say? No problem! `char c; c[-15];`? No problem! Everything did compile, sure. Is that the world we want? Most certainly not.


Trigraphs used to be part of the standard and are now removed, so really old programs that used them would now fail to compile. Granted, that is a bit of a low-hanging fruit.
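
For anyone who never ran into them, here's a rough sketch of what trigraph-era code looked like (compiling it today typically needs an old standard mode or an explicit -trigraphs switch, and C23/C++17 dropped the feature entirely):

  ??=include <stdio.h>      /* ??= was the trigraph for #                */
  int main(void)
  ??<                       /* ??< and ??> stood for { and }             */
      printf("hello??/n");  /* ??/ becomes \, so the string is "hello\n" */
      return 0;
  ??>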


Also "horribly written code."

I've been playing around with some code from 1993, the Viola web browser (one of the first graphical web browsers). Ooh boy was it fun when I ran it with "-pedantic -Wall -Wextra". Hundreds, if not thousands, of warnings: a ton of unused arguments, implicit declarations (no prototypes), unused variables, return type not specified, a few "control reaches end of non-void function," some format string mismatches. Pretty much what you would expect from K&R C (the code paid some lip service to ANSI C, but not a whole lot).

Also, the code runs (not well, but it does) on 32-bit systems, but immediately crashes on a 64-bit system. That's because of the whole "int == long == ptr == 32 bits" assumption that permeates the code.
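
To illustrate the kind of pattern that breaks (a hypothetical sketch, not actual Viola code): on an ILP32 system a pointer survives a round trip through an int, while on LP64 the top 32 bits are silently thrown away.

  #include <stdlib.h>
  int main(void)
  {
      char *p = malloc(64);
      int handle = (int)p;       /* "works" when int and pointers are both 32 bits */
      char *q = (char *)handle;  /* on a 64-bit system the upper half is lost      */
      *q = 'x';                  /* likely an immediate crash on LP64              */
      return 0;
  }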


Who says the program was previously correct? Just because it compiled and didn't blow up so far doesn't mean it's correct. It might or might not have been.


The program wasn't "previously correct"; otherwise a new, but still language-compliant, compiler would build it just fine.


Worth noting that some warning flags (at least when combined with -Werror*) make the compiler not standard-compliant.

* It's been some time since I last partook in language lawyering, and I can't remember whether standard-compliant C compilers are allowed to produce "diagnostics" (the standard's term) for well-defined code as long as compilation also succeeds.


Perhaps what we really want is a flag to cause the compiler to error for code that would produce undefined behavior and give warnings for the rest? Turning all warnings into errors is a great "worse is better" technique but it seems a bit too coarse-grained to me.


Alas, undefined behaviour in C++ and C is in general not that easily detected.


Oh, to be sure! However, my claim is that if compiler authors are able to add more no-false-positive undefined behavior warnings over time, I kinda want those to "break" the build for my existing software... But if something is just a style check or produces false positives, I'd rather the build be allowed to happen.


It's true, but there are a few tools out there to help now, such as tis-interpreter (https://github.com/TrustInSoft/tis-interpreter).


Yes, an interpreter has a much better shot at detecting runtime undefined behaviour than a compiler.

The whole point of undefined behaviour in C and C++ is to let the compiler cheat: i.e. a Java or Haskell compiler would have to take into account that (i < i + 1) can sometimes be false for native ints, and would have to prove that overflow can't happen in order to optimize this comparison away to true. Undefined behaviour in the standard frees C and C++ compilers from these obligations, and they can just assume that overflow for signed ints won't happen.

These shortcuts (plus a lot of smarts) make it feasible to write a fast optimizing compiler with the 1970s state of the art in static analysis.
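
A small sketch of what that freedom buys: because signed overflow is undefined, a C compiler may fold the comparison below to true and drop the else branch entirely (gcc and clang typically do so at -O2), whereas a Java compiler has to keep the check, since int overflow wraps there.

  int always_smaller(int i)
  {
      if (i < i + 1)  /* UB when i == INT_MAX, so the compiler assumes it's always true */
          return 1;
      else
          return 0;   /* typically removed as dead code by the optimizer                */
  }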


Agreed. If the compiler wants to optimize something because of Undefined Behavior, that's something that -Wall should break on.

'Warning: you have undefined behavior', and let the programmer decide what the intent of that section is and fix it.


Just FYI, not all undefined behaviour can be detected at compile time using current C semantics. I'm very very in favour of cracking down on undefined behaviour and changing things from undefined to implementation-defined and stuff like that, but it's not as easy as just flicking a switch and making the compiler warn whenever it assumes undefined behaviour won't happen.


Do you have an example I could take a look at? I'm actually not well versed in the C world


A simple case is the warning clang will generate for:

  int f(void);       /* assume some function that returns an error code */
  int err = 0;
  if (err = f()) {   /* clang warns: assignment used as a condition     */
  }
This is indeed a useful warning, as it's common to mistype == as = in this situation, but it's also absolutely correct code, and the idiom that silences the warning (surrounding with extra parentheses) does not change anything about the correctness of the code.
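
For completeness, this is the parenthesizing idiom being referred to (assuming the same f as above); the extra pair of parentheses tells clang, and gcc for that matter, that the assignment is intentional, without changing what the code does:

  int err = 0;
  if ((err = f())) {  /* the extra parentheses silence the warning */
      /* handle the error */
  }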


A hint could be that you're initializing a local to zero just to overwrite it. It's not an error or a warning, but it could hint that you didn't mean to do that, couldn't it?


The initialization is just an example to show that err is a previously declared variable. The actual code could easily be different, but the warning is emitted on seeing the assignment in the conditional.


Unintended code (like having both arms of an if-else statement execute the same action, or a duplicate condition in a logical expression) is sometimes perfectly valid as far as the standard goes. A double condition, for example, may simply be the result of poor refactoring, and thus the program may even work as intended – however, the compiler cannot know that. That's why the helpful warnings flag those statements; -Werror then makes the compiler abort the compilation altogether on any warning, which is clearly not standard-compliant in cases like the one I outlined.

The majority of the warning options shown in the article are intended to find unintended code (which can however be perfectly defined, if not intended, behaviour, as opposed to invalid, undefined behaviour). The very basic examples shown for those options also do not themselves contain anything that would by itself make the code not standard-compliant.
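
As a concrete sketch of the kind of perfectly defined but probably unintended code described above (the flag name assumes GCC, where -Wduplicated-branches has existed since GCC 7): both arms do the same thing, the program is standard-compliant, yet with -Werror it no longer compiles.

  int clamp(int x, int limit)
  {
      if (x > limit)
          return limit;
      else
          return limit;  /* almost certainly meant "return x;", which -Wduplicated-branches flags */
  }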


Well, the standard does not say that if you ask the compiler to flag potentially suspect code, it should still compile successfully. It doesn't make the compiler non-standard if you decide you do want it to stop compiling when it encounters potentially suspect code and enable that mode with a compiler flag.



