"21st Century C" is for people who have learned C in the past but need to brush up their knowledge with modern practices.
As someone who learned C at the time of the first edition of the K&R, "21st Century C" would be more useful if I ever had to code in C again (which hopefully I won't). A refresh, 5 years later, would be useful, though.
Is there a place to report errors? I've just skimmed through and the pdf has some possible formatting bugs: tables inside code blocks, page breaks splitting the tops of code blocks, small things like that.
I'd love a language that feels like a more convenient C. C as it is, with some niceties like built-in containers with some nice syntax sugar. Maybe someday Zig will get there.
D is pretty darn nice, and you can write code as if it were a C project; you don't have to use classes if you don't need them. I also view Go as very C-like, honestly. People complain about things missing in Go that are technically not in C either, except Go has a lot more batteries-included stuff out of the box.
Why would conservative use of C++ not meet this mark? It's highly compatible with C, has container types, ranged for-loops, operator overloading which makes use of container types more sugary.
You can just use features of C++ which improve the ergonomics of C; it's not necessary to buy into modern C++ wholesale. If you want to write a C program but with generic container types, just use the features of C++ which facilitate that.
> And the ergonomics of modern C++ look absolutely horrid compared to actual modern languages.
True, but if you want to write "C as it is, with some niceties", then you're not talking about modern languages.
It's simpler than the tutorial example for python-clang. It's strictly less work than a responsible unit test suite, and there's not much downside to getting some details wrong on this particular application.
I disagree heavily with that. Templates, classes, move semantics and destructors make for very tight and elegant programs I have found. No inheritance.
A language called clay was basically this. With better marketing and follow through it could have taken off. It had templates and move semantics and was interchangeable with C. The author was using it to write parts of a program that was already written in C.
Since you mention typesetting: I am working on a volunteer project that is transcribing old, out of print books from the 1700s through early 1900s (all public domain). For now the plan is simply to make them available online formatted for web, e-readers, and PDF, but thought is being given to formatting for print-on-demand. We've never done the latter so we have no idea what tools would be best for that: is there a good reference for free tools/software for doing page layout for physical books?
Any connection to https://standardebooks.org/contribute/ or if not, are you online yet? Wouldn't mind having a look. (And also interested in the print typesetting question.)
LaTeX or TeX would be excellent choices if you want good typographic control, though there's a significant learning curve if you're going to produce beautiful books. Check out https://www.ctan.org to start learning more.
ConTeXt would be a convenient tool for the job. You can create PDF, XML, and EPUB from the same source. You get the power of TeX, but producing modern-looking output is IMO much easier than with LaTeX. I personally tend to use LaTeX for short articles and ConTeXt for books.
I hate LaTeX typesetting; it looks very archaic and reeks of academism (is that a word?). Unless the document requires a lot of formulas, LaTeX is not the right tool in 2019.
Of course it can, but by default it does not look good, and it is not trivial to change things, hence you see documents which look straight out of the 90s.
You probably don't hate LaTeX typesetting. It's just that the default styles that people, especially beginners, tend to use often look archaic. LaTeX doesn't make creating your own unique styles too easy, though, which is why I personally prefer ConTeXt.
The Linux Programming Interface: A Linux and UNIX System Programming Handbook is arguably the best C programming book nowadays if you code on Linux/Unix. Most other C books cover ANSI C, which might be more useful for bare-metal embedded systems that don't run a full multitasking POSIX OS.
On a related note, do we have any fully compliant C11 compiler for MS Windows, or is using gcc under WSL (or a VM) the best option? I take it there is no C17-compliant compiler yet?
mingw: For native build and execution. A lot of Windows IDEs will just work with mingw.
msys2: For a POSIX-compatible build with native win32 targets. Useful when your project uses autotools, like when trying to build someone's library. msys2 just tends to work. Trying to get autotools to work under Windows tends to exceed my patience, time, and brain cells.
cygwin: For a POSIX-compatible build and runtime environment.
Also, gcc can be built as a cross compiler. The two times I tried it, it was more straightforward than I would have thought.
Note that msys2 isn't just for building programs, it provides a full bash environment with many unix tools and a package manager (pacman, same as in Arch Linux) to install more. The build environment is unixy enough to be able to build more tools from source using the classic ./configure; make; make install route (more or less, may need some modifications in the code).
For me, the main use of msys2 is as a shell where I run scripts and command-line utilities, with the build stuff being a distant secondary use.
I did. It’s short, clear, well written and I found it an excellent introduction to C. I’d recommend it if you read it alongside some more modern supplements.
Used it in college about 8 years ago. Can't say I'd do the same today, but back at that time it made some sense as college programming classes were teaching more than just practical programming knowledge.
That's what I used first. I did a lot more study, but I think that's actually the only book I used. What did it for me was trial-and-error and practice, but the most important part was getting the basics down so I could start reading through other people's good code.
Despite being amazingly well written, the most recent edition of K&R was published in 1988. The current version of C is C18, released last year, so K&R is missing about 30 years of evolution. It's not a bad book, by any means, but it doesn't include important aspects about how C is used today.
K&R is a well written book, and is very pleasant to read. IMO, it is one of those books that everyone should read at some point.
The main thing to watch out for these days is that K&R is a bit carefree about the numerous ways you can shoot yourself in the foot using C. For example, K&R famously implements strcpy as `while(dst++ = src++);`. These days, most people would say that code is excessively terse, and that you shouldn't be using strcpy in the first place (because it can potentially overflow the dst buffer).
Both of these replies are missing the asterisks to indicate pointer indirection. Probably HN is eating each asterisk, treating it as emphasis or something.
Also terse it may be, but this (assuming the missing asterisks are restored) is a good, idiomatic implementation of strcpy. Whether strcpy is good or evil is a separate discussion.
Because it's 30 years out of date. It's missing three revisions of the language. Teaches a lot of practices that lead to obtuse brittle code riddled with security holes.
I just tried (with gcc 9.2.1), and got no warning. As far as I know, these _s versions aren't that good, and often aren't available at all; see http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1967.htm which says among other things
"[...] none of the popular Open Source distribution such as BSD or Linux has chosen to make either available to their users. At least one (GNU C Library) has repeatedly rejected proposals for inclusion [...]"
"[...] As a result of the numerous deviations from the specification the Microsoft implementation cannot be considered conforming or portable."
As long as he's publishing under a CC license, it might be nice to release the LaTeX source as well. PDF is usually good enough, but sometimes needs to be converted to other formats, and you usually get better results going latex->epub than pdf->epub. On some devices it's a lot easier to be able to change attributes to better fit the form factor.
Looks reasonable to me. I've been programming C daily for 25 years. Cosmetic but it is nice to see "int const foo;" being preferred to "const int foo". And "We define variables as close to their first use as possible". Less cosmetically, section 5.6 on named constants is great. That's a fiddly area of C from which to pull out reliable advice.
There is no technical reason to prefer one or the other; they are both declaration specifiers which can occur in any order. long const int is valid, even.
The const qualification is in the type; it is not a direct attribute of foo, so putting the const closer to foo means nothing.
The parent says it is "cosmetic", not technical. The parent, being someone who wrote C daily for 25 years, knows what the syntax allows and that it isn't necessary. A constant pointer to a constant pointer to a constant integer should be written "int const * const * const x" because there's only one other way, "const int * const * const x," which is inconsistent, ugly and potentially confusing.
My perspective is that "const type" order consistently appears in all versions of the ISO C standard and POSIX.
Every major system API I can remember working with that didn't hide const behind a typedef (like Win32's LPCSTR and whatnot) has it const first.
Speaking of Microsoft, though the API's hide const with a typedef, the MSDN samples are predominantly "const char" order.
I've never seen a man page with "int const".
Searching all of the man pages I have installed on an Ubuntu 18 system here (literally grepping through all of /usr/share/man), I found just one with "char const" or "int const". Namely this one:
You just worked on some unusual code bases, and are glossing over the language and system API's those code bases depended on, and their respective docs.
const int isn't a declarator; it's a list of declaration specifiers.
Though these can be reordered, it's clear they were intended to follow English.
In a compiler implementation, it's troublesome to enforce the order compared to processing a simple list and setting/checking flags in some data structure, diagnosing invalid duplicates or mutually exclusive situations.
I think you're inadvertently just proving the point.
I was speaking of declarators, not declaration specifiers. In C, a pointer type is part of the declarator and not the specifiers. Therefore, the CV qualifier is also part of the declarator except for the one exception of the specifier. Regardless of whether specifiers were intended to follow English (and this claim is dubious), declarators certainly were not, since it's often more natural to read them right to left, and it's generally most natural to read the leftmost type specifier last.
example:
char *x[10]; Translated to English, this is naturally stated as "an array of 10 pointers to char," not "char something something array."
Edit: later on I see you've moved the goal posts about the "uselessness" of cv qualifiers on pointers. I don't know what to tell you, I think that's bullshit, and there are a number of viable use-cases for them regardless of whether you have encountered them. I'm just addressing the consistency part here.
Indeed, declarators must have their postfix chain read first, left to right, then the prefix portions right to left. (Separately at each level of parenthesization, if any, working inside out.)
There is no choice about where to put const inside a declarator; the position affects the nesting level which alters the semantics. So there is no debate to be had there.
Note that in declarators, the const is necessarily to the left of the thing it qualifies, so why would we put it to the right of int:
int ** const ptr;
^^^ this is what is const
int * const *ptr;
^^^^ this is what is const
const int **ptr;
^^^ ^^^^^ these two are const
In the canonical specifier order, nothing to the left of const is tainted by const:
extern const int **ptr;
^^^ ^^^^^^ const and const
^^^^^^ unrelated to const
For the best possible consistency, imagine const to be pissing, while the wind is blowing left to right.
Your third example could just as easily and correctly be:
int const **ptr;
^^^^^ this is what is const
True, it is a special case as to how const applies when there is more than one item in the declarator list, but that by itself is secondary to where const goes.
Const qualifies the object. Whether the const is applying to the "int" in some way or is an independent type specifier that applies to each declarator is pretty philosophical. I say it's the latter. The fact is, the very allowance of a cv qualifier in the specifier list introduces an inconsistency or imbalance.
This is the ultimate bikeshedding argument. There's no right answer, and appeals to spoken language aren't particularly compelling when bikeshedding C of all languages. This isn't AppleScript.
I'm not even terribly interested in this particular point. I'm more interested in quashing the notions that bikeshedded opinions of language syntax have some divine providence. It's just bullshit.
We're then missing that the const qualification is applying to the int type.
Also, declarations can have multiple declarators.
int const *p, volatile **q[3]; // syntax error! not how it works
This isn't bike-shedding. The shed has already been painted. There is a right answer which is not to make your code look weird just because the standard allows it.
No "int const", no i[array] instead of array[i], no deceptive trompe l'oeil nonsense like:
int* p, q;
You will not solve any issue in C programming by swapping around declarators.
When we write code, stuff that looks weird should signal a bug. When bug-free code looks weird, that is a distracting false positive that grabs my attention for no reason. Don't add gratuitous weird.
There's no "except" because one cannot conclude how people should write programs from how they do write programs, regardless of circumstances. In any case, languages and their use change. C is 47 years old. Consistent expression structure is more important than analogies to English.
Further, C is not like English, and even if it were, one could not conclude that it should be or should continue to be. Moreover, that it was designed by English speakers is not evidence that it is like English or should be like English: most assembly languages were also designed by English speakers.
It's isomorphic, but you're right, I used an overly specific term. In any case, one can't conclude that something should be the way it is from the fact that it is the way it is without further premises, regardless of the kind of value judgements involved.
The implicit key premise seems to be that the more akin a formal (programming) language is to its users' natural one, the more ergonomic it is. I would not necessarily agree, but I don't read it as ought-from-is.
Yeah, this only works if you initialize at the same time as declaration. And as you say, without const, the same code is generated. All it prevents is accidental reuse of current_foo, which may have some value.
If you declare a variable in this way in C, it can never be assigned. Not once — zero times.
What is or is not good practice in C has little bearing on or relationship to functional languages, so I'm not sure why you bring them up in this context.
"const int * foo"
means that
foo is a pointer to an int that is constant.
Which could also be
"int const * foo"
foo is a pointer to a constant int.
But since the const qualifier for a const pointer can't be reordered like this, I think the point is that it's better practice to have the const come after, so that it ALWAYS comes after in your codebase regardless of the context?
Thanks for the explanation. I meant it mostly facetiously, but my more serious point is that there are some parts of some languages (and APIs and tools and architectures) where the answer you get from experienced practitioners is "think harder". It's not that I can't grok it, it's that it takes more effort than it should given how much the concept comes up, and the more I wrestle with my tools, the less I potentially get done.
C++ rvalue references are a great example. Whenever I see them I have to go back and relearn the concept because accounting for the language feature sometimes feels more complicated to get right than the unsafe pointer chucking it replaced.
const-qualified pointers are largely useless. If a struct contains such a thing, it can't be dynamically allocated. As a function parameter, it doesn't protect the caller from anything because passing is by value.
Let's see how many times this occurs in my TXR project (~75K lines):
One place: a comparison function for a qsort call, where a vector of const wchar_t * strings is being sorted. I didn't write this (though I did convert it to wchar). qsort passes const void * pointers to the elements, which are const wchar_t *, thus things end up like this.
How about Linux kernel 4.9? There are numerous occurrences of this involving arrays declared at file scope, like this:
# lines that start with at least one tab and contain *const,
# possibly with optional spaces between * and const:
kernel-4.9$ git grep '^[\t].*\*[ ]*const\>'
kernel-4.9$
Wow, not a single result!
Let's relax that and allow a space or tab [\t ]. Then there are lots of false positives due to lines inside block comments, which start with a space-asterisk sequence. If we require at least one lowercase alphabetic character before star-const, we filter most of these out:
drivers/staging/vc04_services/interface/vchi/vchi.h: const char * const vll_filename; /* VLL to load to start this service. This is an empty string if VLL is "static" */
Your grep of the Linux kernel came up empty because the regex is bad. `\t` does not seem to be available in grep's basic syntax. It is available with Perl-compatible regular expressions, but then you can't use `\>`.
Many of the matches are the same as your "array of string constants" example, but just defined at local scope. And many are just constant local variables which some may not consider worth making const. The ones from tracepoint.c above are probably walking an array of const pointers by pointer instead of by index for example though, which is a common case where pointers to const pointers are used, which are legitimately useful.
I'm only an infrequent C programmer, but that goal interests me simply because it seems unattainable given other aspects of C's syntax. If qualifiers went to the right of things they qualify, I'd expect variable declarations to look more like:
foo: int const;
And similarly, function declarations might also take a more ML-like syntax. C seems more like it wants to be an adjectives-before-nouns type of language.
So, IOW, it's not purely a point of style, it's more about what's a sane way to write things given the requirements of the language's syntax when you're dealing with pointer declarations?
Which would explain why it seems so odd to me; I spent a fair amount of time in C-style languages, but typically only ones that lack pointers.
Though, if the goal is communication, wouldn't it be clearer still to use typedefs to clean up more complex declarations a bit? Or does that end up making things worse?
People do that if such a type is used a lot but probably not if it's just a one-off, just like when they write a function instead of using the same snippet a lot but don't if it isn't.
I really like the first edition. Old C diehards hate it because it breaks a lot of tradition. Personally I think it's really refreshing and made me appreciate C and lot more.
The first edition has been my go-to for introducing some of the newer (1999 and later) features of C (and some of the subtle footguns) to people. I'd definitely recommend it as a first introduction to C.
I'm not a fan so far. Lots of deliberately bad C code. Using uninitialized variables, etc.
Explanation of jargon seems to take priority over explanation of the language - the term "string literal" is explained before the concept of a function.
Yes. As I said - it was deliberate. That does not excuse it.
By the very fact that this is a book, it presumes that the intended audience consists of visual learners. When working with visual learners, if something is erroneous, it is imperative that it be explicitly and visually called out as such. In linguistics pedagogy, for instance, this is done by marking any ungrammatical construct or example not found in natural language with an asterisk every time.
This book does not do this. It freely uses deliberately bad code in example programs. This can only result in confusion on the part of learners. It is therefore inconsistent with pedagogical best practices. I would not recommend it.
I would second 21st Century C, it's an excellent book. It has a very pragmatic approach, skipping features like unions that are not used in modern C, but going into great detail on the important things like memory management.
Huh? Since when are unions not used in modern C? Tagged unions are a really common pattern in language interpreters. What are they supposed to be replaced with?
Thirded, especially on the pragmatism; this is by far the most pragmatic programming book I've ever read. I fell in love with the C-as-a-scripting-language parts (https://github.com/RhysU/c99sh); now I don't have much of a use for a scripting language to sit between bash and a "proper language" (apart from some occasional awk).
Read the reviews on Amazon. It's not a good book, the author makes embarrassing mistakes to the point where you start questioning his programming knowledge.
An Introduction to C & GUI Programming by Simon Long is also good for beginners, especially for its easy-to-understand content and beautiful typographic style.
I've only skimmed the book, but I wouldn't recommend it. The author has a tendency to be snarky and come up with "fixes" for "mistakes" that "old" C programmers make, except he does so in a way that is at odds with the fundamental design of C constructs and does not really show a correct understanding of undefined behavior.
I vaguely recall the whole fallout where it was revealed he clearly did not understand undefined behavior, and I believe he did walk back and accept his misunderstanding.
That said, yes, it is damned annoying to have someone who doesn't understand the intricate details of the language poop all over that community and act as if "well, I've contributed lots of code in C, why listen to a bunch of language lawyers, that isn't important anyway", as if the two are mutually exclusive. Some of us have been using C for years for actual work and are also intricately familiar with the prickly details of the standard. Those are the people you should be looking to for C books. Zed Shaw consistently shows he would rather hear the sound of his own voice and diminish things that are uninteresting to him than put in the work to become an expert on all facets of the topic he claims expertise in.
I just finished Learn C the Hard Way and would be interested in a comparison between it and those other books mentioned.
Just from a quick glance, Modern C appears to be a much more in depth book. It appears to cover things like memory alignment, advanced types like unions, malloc, threads, etc.
Learn C the Hard Way is more of a quick introduction. You learn the basics, like pointers, basic data structures, basic types and then learn to do something interesting with those basics (build a web server). It also teaches some valuable real life practices that Modern C doesn't touch on, like how to structure your project, makefiles, how to use Valgrind and gdb.
Wait what, Learn C the Hard Way doesn't teach malloc? Programming without dynamic memory allocation is pretty restrictive. What "data structures" does it even teach if it doesn't cover malloc?
Like I said, it teaches the basics. There's a single page on heap vs stack allocation and it says malloc allocates memory to the heap. That's it. That's all you really need to know to do stuff and it doesn't go into what malloc actually does under the hood.
Modern C has an entire chapter called "malloc and friends" where it goes into much more depth about malloc, calloc, realloc, etc.
The book is a good introduction to programming in C and I'm glad I read it. You obviously have a history with the author and I'll thank you to keep your past impressions out of this particular discussion.
> There's a single page on heap vs stack allocation and it says malloc allocates memory to the heap. That's it. That's all you really need to know to do stuff
Every other C book seems to disagree, but OK.
Does the book cover the -> operator? Does it mention free and realloc? You need those to do stuff, at least.
The C language can be best learned from "The C Programming Language, 2nd Ed" [1], also known as "K&R book". Pair it with excellent notes by Steve Summit [2], and you don't need any other book to master C.
Love Deep C Secrets. I have it and flick through it whenever I want to remind myself about the dark corners I never mastered during the last 20 years of programming in C.
Where is the Holy C edition :)
(PS. I know about his rants on ycomb)
If you can get past his rants, some of his concepts end up pretty neat. To experience it, just download the ~12 MB OS and run it in VirtualBox; choose the 64-bit alternative OS option.
https://www.vice.com/en_us/article/wnj43x/gods-lonely-progra...
Whether it's helpful or not, I hope it's self aware. Jumping from denigrating men to celebrating a culture without denigration in a single sentence is impressive.
> Is this casual sexism helpful to anyone, men or women?
It communicates the author's perspective to the reader, and that may be useful to them if they believe (correctly?) that such a statement will bring them favour.
Personally, I dislike it and hold a dim view of it; but I can't speak for others.
Weird. My first thought reading that "even men!" parenthetical was not that the author actually had a dim view of men.
If the parenthetical read "even women!" that would not be perceived as a minute poke at the female ego. Commentary would be at least an order of magnitude louder/more inflamed. You know damn well there would never be an "even women!" parenthetical by any author who cared about their career.
So it's weird for you to take what little commentary there is and shoehorn it into 'minute poke at the male ego'. Honestly I paid way more attention to it due to your comment, otherwise it would have just slid out of my awareness.
>If the parenthetical read "even women!" that would not be perceived as a minute poke at the female ego. Commentary would be at least an order of magnitude louder/more inflamed.
Come now. Far worse things are said about women all over the web all the time.
And the point is the double standard. When women are outraged about toxic male behavior that leads to sexual harassment, violence, rape culture, keeps them out of employment in certain fields, etc, they're derided and dismissed as radical feminist man-hating social justice warriors engaged in mob outrage and misogynistic witch-hunts.
When men are outraged about literally anything, no matter how trivial (and this is very trivial), it's always perfectly justified.
> Come now. Far worse things are said about women all over the web all the time.
You can't possibly be ignorant enough to think that the existence of blatant (albeit throwaway) misogyny like the example here would be tolerated on HN? There'd be rightful complaints right at the top of the thread and dissenters would be downvoted to hell. To be clear, I'd very much be joining in on the downvoting: a throwaway joke about women being bad at something isn't within the bounds of what I consider acceptable in eg learning materials.
Believe it or not, there are people out there who were raised with a shred of moral consistency, and who have actual moral beliefs (like being against sexism) instead of whatever grotesque facsimile of a conscience you've convinced yourself is your moral center. "But there are people out there who do worse!" isn't an ethical argument, it's a toddler's excuse for acting out.
I looked at that book, but I was worried about the Gtk part, the book is now a few years old, is the Gtk stuff still relevant? My worry is that Gtk seems to always be coming out with new versions and its hard to know whats still relevant and good.
That sounds like irony. Given that the author is German and has lived in France for years, it could well be sarcasm meant to denounce the current situation. It's hard to tell just from text, though.
> Why is it always that SJWs like you need to find that one joke, one excerpt and turn it into some kind of personal war/problem/argument/discussion/hate?
Namecalling ("SJWs like you") is against the hacker news guidelines. I would suggest you read them.
Also, it's worth noting that actual, self-proclaimed SJWs (I am one, I know many, and am in their communities) wouldn't have any problem with the joke in question, because it's "punching up", i.e. it is making a joke about social problems in a group that has the systemic advantage. So your name calling isn't even accurate in this case.
> Instead of focusing on the VALUE brought by the author, you managed to dig up one benign joke and went on to discredit the entire thing.
Please point out where the poster "discredit[ed] the entire thing". I don't see them discrediting the entire book, I see them questioning a bit of context around the joke, which is a perfectly valid thing to do.
> Why aren't you ashamed of yourself?
Why aren't you? The poster is asking a legitimate question around the context of a line of the book, and you're insulting them and denigrating them for this.
Maybe this is more about outrage culture (being upset over relatively small things like microagressions rather than "real problems", often argued to be a fallacy of relative privation, unless the small things are not a problem at all or the argument is that the response is disproportional to the magnitude of the problem) than about whether an SJW is of a feminist or some other persuasion. Some disagree that SJWs are necessarily feminist or vice versa. A lot of these terms aren't exactly clearly defined.
The argument could have been a lot clearer and less confrontational. At least it seems more consistent than anti-feminists being "reactionary" for the old times where we could make jokes about women but being offended by jokes about men.
To clarify, by parent I was referring to the original commenter who made the parent comment to yours. My comment had nothing to do with biological parenthood. I'm glad you had a good laugh :)
egghead.io has some awesome illustration for the courses:
https://egghead.io/browse/languages/javascript
Technical book publishers should just stop hiring ink illustrators who like medieval drawings and get graphic illustrators who like Adobe Illustrator and SVGs.
My comment also isn't really a comment about rust. It's about a common misconception that the unix/linux kernel somehow talks in "C".
Libc talks in C, and most of the other libraries for interfacing with the kernel talk in C. The kernel speaks its own language that has no special relation to C at all.
Except that you'll be hard-pressed to find a syscall without an accompanying C library. It's common for other languages to build upon that rather than call the syscalls directly, especially during the early days of a new kernel feature.
This article has nothing to do with Rust. Bringing Rust up like this is inappropriate, and looks bad on the Rust community. It fuels an "us vs them" mentality, and doesn't win over any hearts nor minds. It often hardens them, instead.
I'd argue that Rust is much closer to C++ than to C or ML.
Rust was designed by C++ programmers, to replace code that was previously written in C++. The concept that most guides its design is the idea of "zero cost abstractions", which is a very C++ thing as well. ML doesn't mind if abstractions have a cost, while C is a smaller and more spartan language.
Damn it, two giant gross "toenail fungus" ads taking up most of the page - shocked me a bit and I closed the page really quickly (and probably now associate this book with fungal diseases - choose your ad providers carefully!)... Is there a non-fungus-y link to this book somewhere?
That's an ethical question that I've weighed a decent bit. I don't use ad blockers personally, as ads are truly the reason most of the internet is free. If ads are too intrusive, I will 'vote with my clicks' and avoid that particular website in the future.
There are privacy concerns with this, of course, but DuckDuckGo and modern Safari (with cross-site tracking prevention) help mitigate a bit of it.
I felt the same way once, especially when Google first came onto the scene with their plain-text unintrusive ads. After the 90's, they truly were a breath of fresh air. They weren't intrusive at all and for the first time were often even relevant. If memory serves, I even bought at least one item from a text ad on Google sometime around 2005 or 2006.
What's happening today couldn't be more dramatically different. Tracking your activities around the entire Internet, then using your own computing resources to auction them off to the highest bidder for every new page you view.
I don't have a solution to the ethical conundrum. But accepting this kind of thing cannot be it.
"If ads are too intrusive, I will 'vote with my clicks' and avoid that particular website in the future."
They'll never even notice, and in the meantime you're at risk of all sorts of trackers, malware etc. Plus...you're seeing adverts all the time. I used chrome on android the other day and I couldn't believe how many there were, or how annoying. If you're not clicking on them it means nothing anyway, as far as I can tell. I remember back in the day sites saying "click on the ads - it helps us" but I don't think that's a thing any more. Perhaps I should knock up a script to visit random sites with ads and maybe click on a few, in a container/vm. Would that help?
Once I started using "request policy", then eventually uMatrix, I found I no longer needed to use an ad blocker.
Especially if one browses with javascript disabled.
I do occasionally see ads, but now they tend to be non-intrusive, and for sites I frequently read, they can be avoided without the use of a blocker (and the regex engines blockers tend to depend upon).
C itself is an old language that lacks a lot of features that are near-universal in newer languages, but the language is still evolving, and there is still a valuable distinction to be made between how people preferred to write C decades ago and what's considered good style today.
Not really; C has a number of things that C++ never adopted, such as restrict and partial structure initialization (although the latter might be coming to C++).
Runtime-sized local arrays (VLAs), designated initializers, and restrict, which C++ never took. C++ cribbed designated initializers from C, coming in C++20. Restrict was itself cribbed, late, from Fortran.
AFAIU C++ will require that you initialize the members in the same order as they are defined. It also won't permit you to omit any members. Why even bother?
Somewhat controversially, C permits you to write an initializer for the same member multiple times, with the last one taking precedence. Compilers sometimes warn about this, but I've personally found the behavior useful: I'll write an API that provides a macro with default values yet allows the user to override any particular member. I don't think I've ever had a bug in an initialization, at least not one involving multiple definitions. That seems like a very easy and superficially useful diagnostic to write, but it prevents useful behaviors; behaviors that designated initializers were deliberately designed to provide.
In a C++ designated initialization, members may be omitted. (You were told wrong.)
There is probably no reason to enforce order unless the unmentioned elements have non-trivial destructors, so that could be relaxed in a future Standard. Members that need destruction must be destroyed in the opposite order of construction; enforcing initialization order allows reusing the code already generated for the containing object's destructor, but in many interesting cases (e.g. C-style structs) there are no destructors to run anyway. In the others, there is no reason why it would need to reuse the class destructor.
IMHO, the phrase “C is a compiled programming language” is super confusing.
Edit: there are many languages that have both compilers and interpreters. There are several C interpreters as well. The classification of languages as “interpreted” or “compiled” does not appear to be a sound concept, IMHO.
What about the phrase do you find to be super confusing? It is a short summary (even labeled as takeaway 0.1.2.1 in the text) of the first paragraph of section 1.2. That paragraph explains how C source code is just text and that it is turned into an executable program with a compiler. In context the phrase seems more than clear.
The explanation of what a compiler is and what it does is perfectly fine. A C compiler is how most of us do C. But going to “C is a compiled programming language” is quite the wrong generalization IMHO.
Because languages are specifications, "books". They are neither interpreted nor compiled.
The most common implementations of the C language are compilers, yes, but "compiled" is not a language property.
As another denizen of this chain, I still don't see the confusion at all.
> because languages are specification, "books". they are not interpreted nor compiled
Languages are a specification, and an implementation working together in perfect harmony, with absolutely no undefined behaviour at all, yes, keep walking now.
At least... the good ones try to be, cough ignoring small half baked interpreters and compilers I've had the pleasure of working with cough and never touching again.
> The most common implementations of the C language are compilers, yes, but "compiled" it's not a language property
It absolutely is a large part of the C language and worth teaching; I'd only split hairs in a programming-language-theory class. This is a book that presumably leaves the reader with a better grasp of C than before. And arguing against it is an exercise in personal experience, I assume (you have C interpreter experience, I wager?).
Most people are still taught C in terms of a compiler like gcc or clang. Source code in, object code (for a specific language specification, target architecture, etc.) out. Think operating systems, kernel modules, executables, DLLs, and so on.
I never touched a C interpreter (though I'm curious about such a beast), but I know that C is fine to be described as "compiled".
So, is JavaScript an interpreted language?
What about JITs?
...or Babel, for what it's worth.
If languages are "compiled", why do both C and C++ (and also Java, etc.) need a "memory model"? Bare physical addresses and real threads should be enough, no?
> Languages are a specification, and an implementation working together in perfect harmony, with absolutely no undefined behaviour at all, yes, keep walking now.
I never talked about the "quality" or "comprehensiveness" of the specs.
Just that languages are specs, not implementations.
Yes, in the real world you will use gcc, and you will learn that you write some text and it gets transformed into an executable for you.
Still, why not use the right terminology?
The problem is that, if a book slips on this (so basic) definition, how good can it be on the complex parts?
The root thing here is that the author said "C is a compiled programming language". According to the objection, C is also an interpreted programming language, so that's not the correct classification. That's the complaint I read...
I just don't see this complaint as valid. From the discussion garnered here, C can be both a compiled and an interpreted language. This is a book about C in its compiled variant, and it references gcc and clang. The level of detail and nuance in the terminology is matched to the context it appears in. Excess verbiage isn't a panacea for confusion; in fact it can increase it. Can't "C is a compiled language" and "C is an interpreted language" both be true independently?
Don't get me wrong, I like reading perspectives like yours, because I just discovered a bunch about C interpreters.
But I feel like you are asking for a "spec" level document on modern C, when you are reading a more practical work on it.
It just so happens that the specification in this case makes very specific demands on the translation environment, closely describing compilation step by step from source files into a program image.
Your "interpreted C" is only C in the loose sense that one may describe other non-compliant but roughly similar implementations.
I know that, unlike "Modern C", "21st Century C" can't possibly cover C17 because that book was released in 2014.
But otherwise, what would be notable differences? In terms of style, correctness, idiom, depth, and breadth?