The C language (and the C++ language, for that matter) basically _never_ makes backwards-incompatible changes. This is a first-class priority for the standards committees.
Adding simple abstractions which don't hide complexity in the underlying assembly is a huge productivity and correctness boost. Not that the assembly matters much anyway; modern optimizers are always going to do things that absolutely bend your brain.
Wouldn't it be nice if you could use a `scope(exit) {}` instead of accidentally missing two or three `goto cleanup_7;` statements and shipping an RCE?
I would absolutely love a standards-compliant way to scope-guard sections of code. It would simplify huge swaths of C code; it's even in enough demand that Clang and GCC each implement it as an extension.
Please do provide examples.
Revisions to the C and C++ standards have almost never broken meaningful backwards compatibility, at least at the level of language semantics.
Besides what nrclark wrote in their comment: C11 dropped gets(), and the new memory model introduced in C11 might break code that relied on CPU-specific semantics out of sync with it.
Regarding C++:
- auto changed its meaning in C++11
- export templates were removed in C++11
- exception specifications were deprecated in C++11, removed in C++17, and might make a comeback in C++23 with value-based exceptions
- gets() was removed in C++11
- decltype(auto) and auto had a small semantic difference, settled in C++17
- the auto deduction rules for initializer lists introduced in C++11 were changed in C++17
- the implementation semantics required of std::string in C++11 broke COW in compilers like GCC
C's variable-length arrays are one example. They were introduced in C99 and downgraded to optional in C11, so that's one place where a newer C compiler could refuse old C code and still be standards-conformant.
Also, the draft C2x standard does away with K&R-style function definitions, which will be another compatibility break.
Here is a counterpoint: the C language is not finished, and many programmers like myself would like to see the language continue to evolve and grow in reasonable ways, especially given that we'll keep writing it for a while to come. The ISO C people should be commended for their hard work.
Nope; I will be writing in "it". What you obviously want is to be writing in something other than "it". You basically want to be able to write Rust, Go or D into a file that has a .c suffix and is passed through a preprocessor.
There's no evidence of this. There's just existing code. Ok. There's existing code in lots of other languages with different syntax (large and small differences). So what?
The latest video about Odin was a fantastic primer on QoL changes that should be standard, but there are people who always think that what they learned is optimal. These are the same people who trivialize evidence to the contrary in defense of their particular viewpoint.
If you think that standard-driven changes to the compiler will never break your C89 code, because you're using the -std=c89 switch or whatever, you are pretty naive.
The only guaranteed way your compiler won't break your code is if it's left alone.
That can happen due to changes unrelated to new standards too. So if you are worried about compiler changes, then definitely freeze the compiler version you use.
Alternatively, invest in quality assurance to verify that your code still works with a new compiler version.