which is no worse than the provided code, which results in
error: negative width in bit-field '<anonymous>'
It's worth noting that declaring an array with 0 elements is not allowed in C99. However, using a struct with no named members has undefined behavior in C99 [1].
And this is why I hated learning C. You went through a 500-page C programming book, did all the exercises, and had a good grasp of the language, but you were still useless because you didn't know anything about those weird macros, Makefiles, automake, gcc flags, etc.
Bad C code is bad C code. There's equally bad JavaScript, Ruby, and Python code too; none of those languages have particularly simple semantics.
Moreover, this particular example is due to compatibility issues across the multitude of platforms that Linux supports, necessitating support for versions of C older than any language you might like. Were Linux only ever to be compiled with a C11 compiler, this could be replaced with simply `static_assert(x, "message")`.
And speaking as someone who's coded C for 16 years and makes a living doing so: it is rare that I write a macro; my Makefiles are about 5 lines long; I've never touched automake; and the only GCC flags I ever use (and use consistently) are `-std=gnu11 -Wall -Werror -O3`. Heck, I wrote a cross-platform NES emulator last month following these rules.
There's a lot of bad C out there to find. Don't let that turn you off from writing good C.
People often regard C code as inherently ugly because of nasty coding styles that were prevalent in the 80s and 90s – all local variables declared at the top of functions, lots of global variables, "space saving" indentation and often no whitespace between operators, bad separation of concerns. C can actually be a decent "high-level" language if you conform to a more modern style of programming. Codebases like Git and Nginx demonstrate that this is possible.
I generally agree with you, but C even at its nicest is not "high level" except in the trivial sense of having a recursive grammar (which assembly doesn't have).
The whole advantage of C is that it is a powerful, nice, and portable way to express roughly the same things you would otherwise express in assembly. Most of the really nasty undefined behaviour of modern compilers comes from forgetting this.
C is actually "high-level" in most aspects and we tend to forget that because of newer languages with lots of bells and whistles. C gives you a lot over assembly: functions with (mostly) no need to worry about calling conventions, automatic variables instead of dealing with registers and stack frame offsets, expression-oriented syntax that facilitates nesting of many operations within a single statement, structured programming support instead of scattered jumps and a rudimentary type system around values, pointers, arrays and structs.
Nothing fancy, but it means that the language can be compiled in a straightforward manner without any runtime support and a programmer can easily have a complete mental model of it.
Indeed. I don't want to give the impression that C is just assembly. It gives all the advantages you name, and they are very important.
But I still say that C is a way of talking about the same things you would talk about in assembly, albeit while automating some tedious but important things like register allocation. You are still commanding the computer at a low level: still telling it which byte to put on which I/O port or memory location.
Of course you can build higher level abstractions on this -- but only to a point. C compilers go wrong when they imagine a C program lives in such a higher level of abstraction -- whereas the other languages thrive on defining such abstract machines.
But you can't actually directly address I/O ports in C, and directly addressing memory (via pointer arithmetic which oversteps the bounds of an object, or which puns the type of an object) invokes undefined behavior. Both those things depend on details of the underlying architecture, ABI, and operating system, and are abstracted away in C.
But in general functions shouldn't really be all that long anyway (if they are, break 'em up!). I kinda like declaring my variables all at the top — I think it looks nice & clean.
Even in short functions, "declare it when it's needed" makes your program follow a data-flow style and enables you to use const more, since in many cases the initialization is the only assignment that you do to that variable.
While the parent technically said "at the start of functions", at least C89 (I don't remember about K&R) allowed declarations at the start of any block. This is not incompatible with minimizing scope - just introduce more blocks. And in fact, introducing a block to bound the scope of a variable allows a smaller scope than simply declaring it halfway through a function, as you can also end it early.
If you declare a variable as `const`, you have to initialize it during declaration. So you have to place the declaration exactly where the value expression becomes available.
Also, it acts as a declaration of intent and makes the data dependencies in a function more explicit.
If I declare a variable in the middle of a function, I am declaring intent that this variable shouldn't be used in the first half of the function – maybe the data it is supposed to hold cannot be available at that point.
> People often regard C code as inherently ugly because of nasty coding styles that were prevalent in the 80s and 90s
Those coding styles are still prevalent in 2016, speaking from the experience of occasionally having to look into how people write C code to integrate into Java, .NET, Python, and Ruby at the enterprise level, which only reinforces my point of view about the language.
Especially since, outside the HN bubble, many don't even know what a static analyser or code review is.
Saw Hungarian notation used by two distinct developers in 2016. One of them freshly graduated and otherwise a halfway decent programmer. As far as I could tell both used it just as wrong as most developers did almost two decades ago. Bad styles find a way to live on.
>>Bad C code is bad C code. There's equally bad JavaScript, Ruby, and Python code too; none of those languages have particularly simple semantics.
The problem is that C is in everything. OK, not literally, but you know what I mean.
Bad JavaScript is easy to avoid: you just navigate to another website. Bad C, though, not so much. Oftentimes it's in the kernel or somewhere deep like that.
I agree that C is complicated, but this one has nothing to do with macros, flags or anything external. All you need to know to understand it is in the language itself.
C is easy, computers are complex. Sometimes these are confused. Like in this case, the syntax is easy to human-parse, but it's harder to reason about the result of the expression on different architectures and with different compilers, yet the complexity is caused by the computer, not the language.
Quite. I started out in a world of C99, endianness and differing bit packing. My early day to day involved many more unions and a lot more time wondering if SunOS cc was going to behave the same as Dynix or MSC on DOS.
It's easy to forget how much of that mindless preprocessor and make conditionality we've left behind. Sadly in exchange for hardware that's much blander now.
The rules are simple for simple projects. Then you get into things like undefined behaviours, implementation-specific behaviours, etc., which compilers will exploit heavily for optimisations without telling you about it - for example, silently removing NULL checks. Any undefined behaviour in your source code also lets the compiler assume that code path is never executed and discard code around it. Without telling you about it.
Maybe the book I read was not that good (Deitel & Deitel), but the section on macros was pretty tiny and I don't think I'd have understood that macro on my own. Anyway, I think C is great; it's just that most books don't seem to cover the important stuff needed to work on serious C projects.
The macro isn't the complicated part in the OP (`e` just gets substituted with whatever the argument is). The definition the macro generates is (for most cases, needlessly) complicated, but that has nothing to do with macros or the preprocessor.
The general rules of macros are:
1. Don't use them; prefer static functions.
2. Use them as symbolic constants only.
3. Use parameterized macros only when you must use # or ## (i.e., for code abstraction), and then use them sparingly.
4. If you really insist, wrap every use of the arguments in parentheses, and be careful not to write the name of an argument somewhere where you don't mean it.
You'll get pretty far knowing next to nothing about macros (e.g. expansion phases and tokenization rules) by following the above rules.
Deitel&Deitel make some amazing books as far as I'm concerned. They were my source material through college and a very enjoyable introduction to C, C++, and Java.
It was, five years ago. (C11 provides `static_assert`.) C was probably one of the first widely used languages to support such a thing, besides Ada and C++.
While this construct looks pretty weird, if you decompose it element by element using the rules of the language it becomes rather clear IMO. The most arcane feature here is probably the bit-field syntax, which you might not encounter very often in the wild.
The problem with C IMO is that the compilers are very permissive by default and it's easy to trigger undefined behaviour with seemingly harmless code if you're not careful. Things like promotion rules make it difficult to guess at a glance how the code is going to behave if you don't have a very good understanding of these (rather quirky) rules. There's also the whole mess of the pointer vs. array distinction, which sometimes matters and sometimes doesn't, etc.
By comparison these BUILD_BUG_ON macros are relatively straightforward IMO. The naming is a bit misleading unfortunately but at least it's in full CAPS so you know it's a macro...
Funnily enough, I learned JavaScript at about the same time I learned C (~13 years ago), and at that time, JavaScript was dead simple to learn in comparison. I couldn't imagine myself trying to learn JavaScript today though; I'd probably be banging my head against walls.
I'm not sure that macro in this submission shows that Go is better.
Neither Go nor pre-C11 C has static assertions; the macro in question uses clever techniques to implement one in C, but you probably can't do that in Go. So C is more powerful here.
Another example of clever C macro use is the implementation of a foreach loop (C doesn't have foreach) in the Linux kernel; here's a question about it:
I don't agree that Go's tooling is what makes it popular, mainly because its tooling is not very good. All of the vendoring tooling is broken in one way or another, the linters and vet checkers give questionable advice in some cases, and the standard library has so many quirks that come from the fact that Go is trying to be a systems programming language that hides details about the system from you.
[ Disclaimer: I'm a maintainer of runC and have been programming in Go for many years. But that doesn't mean I have to like the language. Give me C any day. ]
Comparing vendoring tooling for Go and C in C's favor seems odd.
Personally I think Go actually makes a pretty bold statement about dependency management: people don't know what they want. I also have a sneaking suspicion that Git and GitHub's interface are in no small part to blame.
Go made me come to one important realization: if I can't build a project by just cloning the git repository, then why the hell not?
Or rather, it's why almost every popular tool is popular. Take Rails for instance. It's just that Go competes with C and boy howdy does the comparison look good to anyone who can make the switch.
I know Fortran 90 and use it for some simulations -- since it has a matrix syntax, it's not that hard to translate Matlab code. But I know Fortran is going where the wild roses grow, and I wonder if I should spend the time to learn C for high-dimensional numerics.
[1] http://stackoverflow.com/a/12918937/959866
Edit: You can get around having 0 elements by using
but you're starting to lose clarity again.