If you're interested only in learning about lower levels of the computer, and don't need to actually write at lower levels, it may be better to learn, say, MMIX.
But lots of people do need to learn C. They might not be working on hip new social web services, but they're out there.
In my work, I have observed recent computer science graduates filter in and get assigned to program embedded systems using C. Their knowledge of C was minimal. Their code tended to be suboptimal, or even bizarre to the point that I wasn't sure why it compiled. I recommended they get a copy of K&R. They did. I don't think they got much out of it.
I don't blame them personally. They simply had not been taught C in school, and had never had a reason to learn it before. Trying to learn it on the job with antiquated books didn't seem to work very well. (Though some did eventually come to learn C well. Others migrated to different assignments not involving C.)
Whatever your reason for wanting to learn C, be it purely academic or because you actually need to program with it, fresh, modern educational resources are a good thing.
Because if you don't know C, you don't know what's actually happening when you write software.
Because if you don't know C, you've only seen the map, not the territory.
Because if you don't know C, you're looking at the finger, not the moon.
Because if you don't know C, everything you think about computers is a leaky abstraction.
The counterpoint is that flawed models are often good enough. Newton's theory of gravitation comes to mind.
But if you like knowing what's going on, either for personal satisfaction, or because what you're working on is not fully encapsulated by a different language (inevitably, originally, written in C), then C is both interesting and useful.
Comments like these seem to have the underlying idea that C somehow explains how things work. It doesn't. It doesn't explain anything at all. It's only one abstraction among others - granted, it's a lower-level abstraction.
Understanding how a computer works when it executes a program, accesses some data, writes to a disk, spawns a process and forks it, then kills the child after a certain interrupt can pretty much only be gained by learning assembly and reading some hardware- and OS-related literature. Of course, for most programmers (let alone people in general) this is nearly useless information today. It's obsolete. And that's good.
But what C teaches you about how "things are under the hood" is just a leaky abstraction. C doesn't care about such low-level things because they are implementation details. Standard C could be run with pen and paper just as well as on a computer. It's a "black box": it takes input and produces output according to the standard.
Frankly, the only "low-level" thing C has over e.g. Java is that most domains in which C is used work close to the hardware and it runs on "the real hard machine". With C, in its modern domains, you are exposed to these "low-level" details not because of C, but because of the domain. With many other languages the user doesn't have to deal with such problems because the problem domain is different.
"Of course, for most programmers(let alone people in general) this is nearly useless information today. It's obsolete. And that's good."
I'm not sure how you can say that knowledge of assembly/hardware is obsolete. Perhaps unneeded for tasks that most programmers have in our hip new ad-driven world, but someone still has to understand and write the code that runs the code that eventually puts the cat-based meme on your screen.
Engineers seldom use calculus, but they use functions derived from it every day. The same applies to programming: everyone who programs should be exposed to the low-level innards enough to at least take the magic away and have a basic concept of what a computer actually does. You'll be better for it.
> Understanding how a computer works when it executes a program, accesses some data, writes to a disk, spawns a process and forks it, then kills the child after a certain interrupt can pretty much only be gained by learning assembly and reading some hardware- and OS-related literature.
That's funny, because 97% of the Linux kernel (the code that implements the things you mentioned) is written in C.
I think you're getting confused by the fact that C has a standard library, so it's true that you don't have to write process spawning yourself. But the standard library is written mostly in C and calls into an OS that is written mostly in C. So C's computational model can indeed explain how low-level things work.
The C standard even defines a "freestanding implementation," which is a C platform that has no access to an OS. So C-the-language can work without any OS at all, and without any runtime/interpreter that is filling the role of a traditional OS. Very few languages can say that, and indeed it's the reason why it makes sense to develop an OS in the language. To write an OS in a language that requires an OS begs the question.
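As a rough illustration of what "freestanding" means in practice, here is a minimal sketch of a C translation unit that assumes no OS and no libc at all; the kmain entry point and the VGA text-buffer address are assumptions typical of hobby x86 kernels, not anything the standard mandates:

/* Hypothetical freestanding entry point: no OS, no libc, only the freestanding subset of C. */
void kmain(void) {
    volatile char *video = (volatile char *)0xB8000; /* memory-mapped text screen on classic PC hardware (assumption) */
    video[0] = 'C';   /* character cell */
    video[1] = 0x07;  /* attribute byte: light grey on black */
    for (;;) { }      /* nothing to return to */
}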
I personally learned a lot more about how computers work from assembly than from C. One example: C programmers talk about "the stack", but it's an implementation detail, you can learn C without knowing what the stack really is. Assembly will teach you.
As for process spawning, I/O, etc.: it might be that C is just as good as assembly for those, but I would be surprised if either was sufficient. Presumably that's where "hardware and OS-related literature" comes in.
> Of course, for most programmers (let alone people in general) this is nearly useless information today. It's obsolete. And that's good.
Damn right it's good--for me--that most programmers think low-level programming/machine knowledge is obsolete. I'm pretty certain it means I'll never want for a (good) job as long as I can still think and move enough to edit code.
Of course C doesn't "explain" how things work. C is a language, it doesn't explain anything. However, when you write and read real-life C code, you have a chance to learn how software works under the hood much better than with higher-level languages. The low-level rigid type system makes you see how data structures are really laid out. Pointers and direct memory addressing makes you realize how memory is really managed, the real meaning of passing things by value and by reference, and many more things.
> However, when you write and read real-life C code, you have a chance to learn how software works under the hood much better than with higher-level languages.
getchar(); //C
STDIN.getc //Ruby
getChar //Haskell
System.in.read(); //java
How do any of those show me how software works under the hood? In none of them do I have any clue how a character makes it from my terminal to my program.
int* a;
Teaches me nothing about caches, memory latency, NUMA, etc. Hell, dereferences aren't even guaranteed to read the same physical location in memory (just the same logical location).
struct stuff {
    int a;
    int b;
};
Doesn't teach me anything about memory layout. C assumes you are running on some hardware from the 70s; it doesn't know about virtual memory, address spaces, memory pages, NUMA, RAM with multiple channels, the no-execute bit, or GPGPU programming. The only thing C has going for it is simplicity.
> struct stuff{ int a; int b; }; Doesn't teach me anything about memory layout.
Maybe not, but this does:
offsetof(struct stuff, a);
offsetof(struct stuff, b);
sizeof(struct stuff);
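To make that runnable, here is a small sketch; the second struct is added to show why sizeof can exceed the sum of the members, and the exact numbers are implementation-defined:

#include <stdio.h>
#include <stddef.h>

struct stuff  { int a; int b; };
struct padded { char tag; int a; };  /* hypothetical struct, added to make padding visible */

int main(void) {
    printf("%zu %zu %zu\n",
           offsetof(struct stuff, a),   /* 0 */
           offsetof(struct stuff, b),   /* typically 4 */
           sizeof(struct stuff));       /* typically 8 */
    printf("%zu %zu\n",
           offsetof(struct padded, a),  /* typically 4, not 1: padding after the char */
           sizeof(struct padded));      /* typically 8, not 5 */
    return 0;
}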
> C assumes you are running on some hardware from the 70s; it doesn't know about virtual memory, address spaces, memory pages, NUMA, RAM with multiple channels, the no-execute bit, or GPGPU programming.
I'm not sure what you're complaining about; virtual memory is explicitly designed so that no code (even assembly language) "knows" about it except for the very small amount of code that sets it up. Likewise with most of the features you are mentioning. A memory reference in C operates at about the same level of abstraction as it does in assembly language, which is the lowest-level software interface available.
> The only thing C has going for it is simplicity.
But it's an interesting /kind/ of simplicity.
FORTRAN is simple, for instance. Yet we don't use it much any more.
C is simple enough to not make you go bananas trying to learn the language [C++], but rich enough that you don't go bananas solving large, interesting problems [assembly]. It pretty much nailed the uncanny valley of just complex enough.
It's a bit creaky. It desperately needs namespaces. I'm on the fence about memory models (this is a platform thing, in my mind), and definitely Do Not Want threads jammed into the language. I would love a decent macro language, but that's probably a decade-long debate (if what happened in the Scheme community is any indication). I would love a compilation system that didn't suck [yes, macros and a Go-like build system probably don't mix well].
I've been writing C for over 30 years. I plan to keep writing it for another 20. The unscientific, neat thing about C is that it's /fun/. I know this doesn't go over well with standards types and cow-orkers who feel the urge to override operator = and take dependencies on Koenig lookup all the time, but C has a charm that other, similar languages have been unable to capture.
> FORTRAN is simple, for instance. Yet we don't use it much any more.
Computing is like an iceberg, with the web being the bit above the surface.
> rich enough that you don't go bananas solving large, interesting problems [assembly]
Two words: macro assembler. It may surprise you to know that not that long ago, sophisticated GUI apps were written in assembly language (in fact the IDE I use on my ST, Devpac, was written in assembly, and it's everything you would expect of a modern IDE - editor, compiler, debugger, etc - running under GEM). Many games were written in pure ASM.
I've done macro assembler. (Hell, I've written a couple). I know all about macros. C is not just a fancy macro assembler.
This doesn't get you away from the ooky stuff that C does for you, like register allocation and code optimization (link-time code generation is a wonderful thing, even for C). Very few assemblers are bright enough -- or should be trusted enough -- to do code motion, strength reduction, or common subexpression analysis. And, oh bog, just writing a plain expression? Not even possible in a macro assembler.
[I rewrote the ST's file system in assembly, btw. It started out as an honest effort, then got bogged down in stuff that would have been a no-brainer in C.]
Indeed, and even more... The entire GEOS operating system, GUI/windowing system, application suite, and the vast majority of the rest of the apps were written in 8086 assembly language and ran on an original IBM PC.
There are a few niches where new code is written in assembler (numerical code, parts of standard libraries), but in general, I think 'maintained' is a better word to use.
I don't think you have to go all the way back to the ST/Amiga era; I recall there was a surge of Win32 assembly after that, with some nice assembly applications as the result. However, with the level of optimization offered by compilers today, assembly seems mainly relegated to where its fine-grained control allows for better performance in extremely performance-oriented parts of code. An example would be SIMD code, where highly fine-tuned assembly often runs circles around the compiler-generated equivalent, as proven by simply comparing x264 performance with and without its assembly optimizations.
Personally I haven't done any assembly programming in at least the past 6-7 years, but I still get the urge now and then to program in it again. However, even though it's unlikely that happens, my assembly experience has given me a thorough understanding of how the computer works at a basic instruction/memory level, which has been extremely valuable when I want to create optimized code in higher-level languages (like C). So yes, while learning C is certainly worthwhile even if you are going to write in even higher-level languages, learning or at least grasping the fundamentals of assembly is in my opinion even better.
What if I want to know how registers are used and allocated in a program? Most C compilers ignore "register" annotations, and with good reason: honoring them would hamstring the compiler's ability to optimize, which is usually a completely opaque process from the programmer's point of view in the first place.
Why have a type system that includes implicit coercions between types, sometimes with different internal representations? (Just imagine an int promoting to a float.) Doesn't that obscure the "real meaning" of the program, or is it merely a detail you find uninteresting? C is not the only language that makes these low-level concepts accessible, and C does not represent the "floor" with respect to making the behavior of hardware explicit.
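To make the int-to-float example concrete, a small sketch; the particular value is chosen because float typically has a 24-bit significand:

#include <stdio.h>

int main(void) {
    int   i = 16777217;   /* 2^24 + 1 */
    float f = i;          /* implicit conversion to a different internal representation */
    printf("%d\n", i);    /* 16777217 */
    printf("%.1f\n", f);  /* typically 16777216.0 - the promotion silently lost precision */
    return 0;
}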
If what you're actually getting at is that C is the most popular language today that exposes all of those things, that sounds less convincing: systems languages cannot advance if we take C's position as a given.
You're right, C is the most popular language with the least amount of magic between you and assembly. It's a useful balance of abstractions on top of assembly without a total loss of the processor model.
This isn't less convincing to me, I would never argue that someone learn C instead of all other languages. But I do believe that knowing C makes you a better programmer in all (current popular production) languages.
Re: your point about systems programming not advancing...I didn't mean to imply that C was "perfect" (in the Latin sense, meaning "done, finished"), just that it's the best we have for many things.
I do really like Go, and wish I could use it more, for non-personal projects.
Having really bizarre type conversions can allow for brilliant code in some cases - the Quake fast inverse sqrt function contains a float->int coercion, iirc.
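For reference, the widely circulated Quake III routine looks roughly like this; strictly speaking it reinterprets the float's bits through a pointer cast rather than a value coercion, and the type pun is technically undefined behaviour in standard C:

float Q_rsqrt(float number) {
    long i;
    float x2, y;
    const float threehalfs = 1.5F;

    x2 = number * 0.5F;
    y  = number;
    i  = *(long *)&y;              /* reinterpret the float's bits as an integer */
    i  = 0x5f3759df - (i >> 1);    /* the famous magic constant */
    y  = *(float *)&i;             /* bits back to float */
    y  = y * (threehalfs - (x2 * y * y));  /* one Newton-Raphson refinement */
    return y;
}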
> Understanding how a computer works when it executes a program, accesses some data, writes to a disk, spawns a process and forks it, then kills the child after a certain interrupt can pretty much only be gained by learning assembly and reading some hardware- and OS-related literature.
Well yes, but you're talking about library code there. Your code, which links in libraries, will never fully describe those libraries.
But that library code was very likely written in C...so...
I've seen a fair share of Lispers and Clojurians blogging about compiled assembly, CPU cycles, and cache usage of their ludicrously high-level macro-expanded DSLs. As people said already, C semantics is closer to the metal, but that's about it; it's just one small language (no insult intended). It's not reality; even assembly isn't. In any language, doing full-stack design will get you close to reality.
C is a leaky abstraction based on a simplified model of how computers and their memory worked in the 1970s. C code is transformed in complex and heavily context-dependent ways by all but the most primitive compilers before it hits metal, despite the popular opinion that "the language is close to the machine".
If you like C, use it. If you want to know what's going on, learn assembly.
You're conflating the language with its modern compilers on modern processor architectures, but I agree. My statements above were the iconoclastic and anachronistic version of reality, purposely so.
If you know C, it is possible to comprehend a direct mapping of your "high level" code to a set of processor instructions. This is complicated by modern huge glib-style libraries and modern compiler optimizations, but it remains possible.
That, combined with the fact that C is foundational in almost all current production languages, and remains highly useful among them, are why I think anyone who takes programming seriously should know C.
Someone else compared C to Latin. Obviously C is more useful on its own than Latin, but the comparison is valid for the foundational aspect. Understanding threads and locking in C gets you a long way toward understanding those concepts in most popular languages.
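A sketch of what "threads and locking in C" looks like in its plainest form (POSIX threads assumed; the counts are arbitrary):

#include <pthread.h>
#include <stdio.h>

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg) {
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);    /* without the lock, increments get lost */
        counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("%ld\n", counter);   /* 200000 with the mutex; usually less without it */
    return 0;
}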
I'd make the same arguments for understanding Unix, by the way...and also that you can't really understand Unix without understanding C. And the converse.
> This is complicated by modern huge glib-style libraries and modern compiler optimizations, but it remains possible.
I disagree. It's complicated by those things, but more importantly it's complicated by modern processor architectures, which do not execute one instruction after another anymore like C would have you believe - they just mostly pretend to.
Which isn't to say this is any less true of the other high-level languages we have at hand, and I still think knowing C is valuable (although I'm wavering more on that than I was 5 years ago). I just think your characterization is incorrect.
OK, but you're diving deep into arcane abstraction there. All the tricks that modern processor architectures do for speed are only relevant to your understanding if you are designing microprocessors (which everyone should also try, we did it in college, and it was a blast).
At that level, even the compiled opcodes aren't strictly demonstrative of what's happening inside the processor, but you'd only know that if you were running a simulator and watching traces light up.
Well, that's a part of it, but it can be relevant to performance. Cache behavior and branch prediction aren't very visible in C. There's also a lot of goings-on with memory where C just says, "Uh, unspecified!", particularly with multiple cores and/or CPUs.
Yes and no. There is no way in plain C to exert direct control over branch prediction or cache usage; however, that's where GCC extensions come in.
Extensions such as __builtin_prefetch and __builtin_expect are heavily used in the Linux kernel to allow a higher level of optimization by directly instructing the compiler how to handle branch prediction and caching (based upon careful benchmarking) in performance-critical areas, rather than leaving it up to the compiler's compile-time heuristics.
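The kernel's usage is essentially the familiar likely()/unlikely() wrappers; a rough sketch of the idiom (GCC/Clang-specific, and the branch shape here is just for illustration):

#include <stdio.h>

#define likely(x)   __builtin_expect(!!(x), 1)
#define unlikely(x) __builtin_expect(!!(x), 0)

int process(int fd) {
    if (unlikely(fd < 0)) {          /* hint: the error path is cold */
        fprintf(stderr, "bad fd\n");
        return -1;
    }
    /* the hot path gets laid out as the fall-through case */
    return 0;
}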
I could make exactly the same assertions about assembly language (I learnt 6502 when I was a kid, and dabble in 6502 and 68k these days for fun, keep meaning to learn ARM). Being able to reason at that level, and read generated ASM (x86 ugh) is an invaluable skill which I use daily, and it's a level below C.
But, lines of assembly language professionally written: 0.
Why can't this characterisation of "close-to-the-metal enlightenment" come from any language that surfaces your OS's SysCalls, along with an understanding of what compilers and assembly are?
It's possible to understand the entirety of practical computation from top to bottom without C needing to be in your toolbox. Correct me if I'm wrong.
Nope, you're right. But your "any language" almost certainly passes those syscall bindings through C to get there. Some languages are higher fidelity than others.
However, C requires you to know more about how the compiler is going to turn your character string, say, or your looping construct, into assembly code.
Some languages don't even have structs or pointers, or explicit memory allocation! You're laughing (I hope), but these are real things in memory, with only a thin layer of magic (layout, pointer math) in C... whereas in other languages they do exist, but are hidden -- usually pretty well, but rarely so completely that you are better off not knowing about them.
Not really. Using a for loop in C requires no more understanding than using a for loop in Python. As for character strings, what you need to know in C is the "gotchas" related to running off the end; in most cases the machine would be just as happy with a Pascal-style string[1].
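To spell out the contrast, a sketch (not how any particular Pascal implementation lays it out):

#include <string.h>

/* C style: just bytes, terminated by '\0'; nothing stops code from running past the end. */
char cstr[] = "hello";   /* occupies 6 bytes: 'h','e','l','l','o','\0' */

/* Hypothetical Pascal-style layout: the length is stored up front, no sentinel needed. */
struct pstring {
    unsigned char len;
    char data[255];
};

void pstring_set(struct pstring *p, const char *s) {
    size_t n = strlen(s);
    if (n > 255) n = 255;            /* length is bounded by the counter's range */
    p->len = (unsigned char)n;
    memcpy(p->data, s, n);
}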
I think Go, for example, performs syscalls directly.
> However, C requires you to know more about how the compiler is going to turn your ... into assembly code.
But does it really require (with emphasis)? The nature of some debugging and optimisation might push you towards that, but it's still a choice.
I feel your point comes down to the use of C as a "vehicle" to learn low-level concepts. Which is fine, but someone can also just learn them directly (upfront or retroactively) and use whatever progression of languages they want.
C++ is C with libraries to hide some stuff and add others.
Removing C++ OSes from your list, I'd say that you're down to a vanishingly small percent of deployed systems. So yes, knowing C is important.
And I have never met a single person who knows any of your remaining languages who isn't at least passably familiar with C. I've never asked, but I don't think any of them would call their C knowledge unimportant or not useful.
In Windows all the new APIs since Windows Vista are mostly COM (C++) based, including the new Userspace Driver Framework.
With Windows 8 this will increase, as Microsoft is moving to have C++ as their official systems language, by introducing WinRT as the new subsystem, and dropping support for anything besides C89.
Just because C is used a lot, you cannot say that all (or even 99.9% of) operating systems use it.
Actually, I failed to mention the number of operating systems you may have running on your watch, VCR, radio, microwave, etc., which summed together are a much bigger base than OS X, Windows, and Linux.
"..possible to understand the entirety of practical computation from top to bottom without C..." - what level of understanding are you talking about? You don't need to implement data structures, for example, in order to know how to use them, but implementing them gives you deep understanding of the trade-offs you need to make, and implementing them in C specifically helps even more. You get to dig into the memory with malloc for example, you "see" and can even calculate all the memory that each struct will use up. I mean, this level of enlightenment does so much good, even if you don't use C to write production code on a daily basis.
I'm just saying that it's technically possible to implement anything (including what you describe) without being forced to use C (though perhaps awkward), so it's not safe to assume that someone who doesn't know C is ignorant of the low-level. At the same time it's obviously not rational to ignore C just for the sake of it.
I do like C myself and plan to continue using it. I've been looking in to OCaml recently and as someone else in the comments indicated, they seem like they'd make a good team.
Hum. Unless what you're working with is translated into C, which is pretty rare these days, learning C won't tell you much that's of any use. The relationship between knowing C and writing good Python, say, is approximately 0. (The value may even be negative.)
Same even goes for something like C# and Java. And in any event, the major influence on how efficiently code runs is the underlying CPU, not C. C is the same, no matter which CPU it's going to be compiled for; the resulting code, which is what actually gets run, will vary. `p[i]', where `int p[100]' is a global, is going to turn into something quite different when built for MIPS from what it would for x86.
So what good is learning C? Well, if somebody will give you a job with it, and that job is well paid, then maybe it would be worth learning seriously... not sure how common this is these days, though.
> So what good is learning C? Well, if somebody will give you a job with it, and that job is well paid, then maybe it would be worth learning seriously... not sure how common this is these days, though.
Or if you want to hack at and/or understand parts of Linux or any of countless utilities.
I think learning many different kinds of languages is important. The breadth of experience makes you a better programmer as a whole. During the trial and sampling of many languages you will fall in love with one. Stick with that one! Your job should be fun. Each person has different learning modalities and concepts they just "get" that are congruent with how they think so no one language is perfect for everyone.
But, why learn C? Learn C to broaden your experience and gain a marketable skill in the most popular programming language[1]. You don't have to become an expert in it but I think everyone should dabble in it a little bit!
>I think learning many different kinds of languages is important.
I agree with you, 100%. There are no ultimate languages. You have a broad field, as a developer, of words to use - choose the one you a) get along with, b) can comfortably use, c) and are interested in actually using for something.
What I have observed is that good developers learn other languages faster, the more they do it, i.e. if you set your goal higher, you get better at it each time you iterate. I tend to think, as a generality, that eventually there is a point for each individual developer where the effort to learn some new language/codebase/taxonomy gets flatter and flatter, to the point that there are no 'easier nor harder' levels of it any more.
C is still very, very useful. $35 worth of pocketable computing power and a built-in C compiler can still deliver kick-ass results.
I've recently started relearning my C skills, since I hadn't touched the language in almost a decade. I'm combining LCTHW, K&R, and Sedgewick's Algorithms in C series in the process.
I have not had so much fun programming since high school. The general feeling of increased brainpower stays with me these days mostly because of it.
Well, think about it. Probably 95% of the instructions executed on your computer were generated by a C compiler or inlined within a C program. Check out how many of your programs are linked to a standard or special C library. Perl, bash, Python, etc. are written in C. Most libraries are written in C. Throw in what's basically C programming in Objective-C and C++ and we're closer to 99%.
No.
C++ is a superset of C.
You can write C and jam it into a C++ program.
This happens often enough in C++ programs.
Java absolutely does count. The Sun JVM was written in C!!! So, yes, every Java program is executing instructions generated by a C program.
A distinguishing characteristic of Jikes RVM is that it is implemented in the Java™ programming language and is self-hosted, i.e., its Java code runs on itself without requiring a second virtual machine. Most other virtual machines for the Java platform are written in native code (typically, C or C++). A Java implementation provides ease of portability, and a seamless integration of virtual machine and application resources such as objects, threads, and operating-system interfaces.
I program on a midrange system. If I wanted to write in C for it I could, but I have no reason to. I use two languages that appeared well before C, and I produce very effective and near-bulletproof code because of it.
There are some big names in the language world that were here before C, and some probably had an influence on that language.
While I would not mind refreshing my C, having not had any use for it since '85, I am not even sure where to start.
Out of curiosity, what do you use? My history is not too great, but I can't think of many languages that came before C that are still in use today. I actually use Fortran at work (along with C), and I'll give a pass to Lisp (it seems like C predates the dialects that are still in common use, i.e. Common Lisp and Scheme), but it seems like most others (Pascal, COBOL, BASIC) are more curiosities in the modern world. Please correct me if I'm wrong.
Actually COBOL is pretty much everywhere in the substrate, but not in visible places... banks, insurance companies, and the like have tons of COBOL code in use.
And if we count VBA, VB6 and VB.net as spiritual successors to BASIC there's an absolutely ungodly amount of new BASIC code in use and being written every day anywhere people use Excel. It's ugly, but it's a large part of how business is done.
Because when programming in any language, you can't rise above your own proficiency in C.
That is my observation anyway. I think it has something to do with the deeper understanding of references and dereferencing, and the inner workings of languages in general.
I've been programming Perl for going on ten years, I know a bit of Java as well, and have spent the last 4 or 5 years learning a lot about web development. I even recently learned nodejs and coffeescript to a pretty decent level and released https://emailprivacytester.com/ using it.
I just started to learn C and C++ about a week ago. I did a module of it at University many years ago, but I never kept it up afterwards and forgot nearly all of it. Anyway, I am completely loving learning it at the moment. It just feels like I have so much more control of what I'm doing, and I can't wait to build and release my first non-trivial application using it.
I want to know C well. Mostly, I just want to be able to confidently hack on the massive amount of software that has been written in C over the years.
[edit] I've always had this feeling in the back of my mind, that no matter how good I am at Perl or JavaScript or HTML or Java, I will never feel like an "expert" until I am good at C.
Coming from a web development and Python background, and starting to really learn C about a year ago, I know the exact feeling you're talking about. No matter how many cool Python programs I had written, I just didn't feel like it was a "real" program until I programmed and compiled an application in C. And I love how C has made my Python so much more powerful, both through ctypes and Cython. Finally, C was the gateway to Lua, which I'm now enthralled with: a tiny, extraordinarily fast and flexible scripting language that is easily compiled into a C program, thus giving me that single-file binary executable I cherish so much. I know I could do the same with Python, but LuaJIT is just so easy to embed, and yet so incredibly fast. Learning C has opened a lot of doors.
I first dipped my toe into C waters at 12 (years old) and very soon felt out of my depth enough to leave it alone. Cut to much later and I think I'm ready to try again.
I've no project in mind yet, but, Eric, your libtm plan caught my eye a few months back and if you've anything on github I'd be interested to follow along. I'm a translator myself and often find myself optimising how MemoQ could be matching segments.
Hey, I just e-mailed you. I've ended up starting to prototype libtm in Python -- getting started in C just wasn't working. I still "think" in Python. Python works out well because all the libraries I'm using (leveldb and ICU, mainly) also have very nice Python bindings. Once I've got the prototype in Python, I'll port it to C.
I'll see if I can put what I've got in Python up on github after I finish this big project at work this week and next. I'd be really interested in your input. Did you read the Tim Baldwin presentation on translation retrieval I referenced in my post on libtm? He's done some really cool research that has had a real impact on the way I keep my TMs (e.g., trying not to go over 20,000 segments).
I just want to check in with you: I too am in the midst of a total love affair with Lua/LuaJIT. That one little VM has really turned into something amazing. And now, the ways it is used!!! Phenomenal.
I'm astounded with LuaJIT. Have you read some on Mike Pall's posts about the inner workings of LuaJIT on the Lua mailing list? I can't follow everything, but what I do follow just blows me away.
This quote is the most important out of the whole bunch: "It's an important, foundational language that requires you to understand the full stack of the technology. If you learn C, you'll understand computers at a much more profound level than if you don't."
We should be careful about the word "learn." I read it as "understand C and master C," not simply being able to write C code.
Why read Shakespeare when you're learning to write? Why study Bach when you're playing jazz? You can definitely skip Shakespeare and Bach, but I think it's universally acknowledged that studying their work gives you important foundational knowledge even if they don't apply directly to your work at hand. Same with C.
> There will be some poor soul studying and using C in a thousand years.
I would hope that by that time there will be powerful algorithms you can use to transcompile old C code into $LANGUAGE code.
EDIT: Well, actually, I would hope by that time strong AI has been invented and asking flesh-and-blood humans to write code will elicit some really strange looks.
I think C++ has largely made C obsolete in many fields; I'd rather see people embracing C++ than C when it comes to learning and using either. For example, almost everything C teaches about "low-level" and "how things work" is taught by C++ too, while giving you the possibility to remain at a higher level of abstraction most of the time, if so desired.
The majority of good, reusable code is in libraries - as it should be. If you write a library in C++, you're limiting potential users of said library to only using C++, because there's no ABI compatibility with other languages.
C on the other hand, has almost global compatibility with other general purpose languages. It should therefore be everyone's effort to write good C libraries and promote code reuse.
Of course, there's no harm in writing your library in C++ and exposing a C interface to it - but that's not necessarily as simple as it might sound. The better approach is to design your API in C and then figure out how to implement it in C++.
As for writing the actual applications (non-reusable part), there are obvious advantages to using C++ over C.
C++ is awesome, no doubt about it. However, it's a superset of C and the entry cost is much, much higher. Learning C is good for your general education; learning C++ is useful only if you plan on using it.
The post is about building GCC itself with a C++ compiler, which is already possible on most(?) platforms: Basically, the default of the --enable-build-with-cxx configuration option will change, which does not affect the default input language of the gcc executable (which is chosen by file extension anyway).
Learning C before C++ is a very bad idea, as it will make you try to use language patterns that are considered bad ideas in C++, as there are safer alternatives.
I think that in order to be proficient in C++, you also need to understand what separates C++ from C and how C++'s features (classes, overloaded functions, templates) map to C.
> Kernighan and Ritchie's The C Programming Language is one of the most popular, if not the most popular, programming books, and it defined the ANSI standard.
Wait, what? ANSI defined the ANSI standard, which had quite a few additions from K&R C. It'd be more proper to say that the book defined C itself before the publication of the standard.
I've been a professional C developer for 30 years. I've written C in practically all spheres of application - in embedded firmware, drivers/OS stack, userspace, web, etc.
These days I'm not writing so much C, but I'm depending more heavily on the deep understanding I have of how the various layers of the whole stack of a computer running an application work. How C works, what a compiler is doing with C code, how to write good (and bad) C code... all of this is a deep skill, still applicable to understanding and debugging at the OS and application level.
The reason to learn C is this: there are a lot of components of the modern stack that are still composed of it. If you learn C, and know C, and can comfortably produce reliable, rock-solid, working systems around a C framework, then you will have exercised an ability that is, naturally, broadly applicable throughout the computer world.
That is not to say that you should make C your main language; rather, use other more advanced, more tailored languages as you see fit. But if you're going to exercise the ability to freely shift over the whole stack, C is going to be a mightier tool than most.
Re: giving K&R to newbies: it should be, give K&R plus "Expert C Programming: Deep C Secrets" to newbies and professionals alike. Together, those two easy-to-read books will help you gain a fast grasp of how to write C programs, why to write them in certain ways, and so on. I find K&R a great reference, but Deep C Secrets a rather fun read; I keep only the latter on the crapper shelf in the bathroom, for example. Plus, I never get Deep C Secrets back when I loan it to fellow coders (ever), so I have a few extra copies, too, to give to newbies I work with.
See: (http://books.google.at/books/about/Expert_C_Programming.html...)
If you are learning C, and haven't heard of cscope, you can do no better than get it set up, do the tutorial, learn how to use it as a tool. cscope is a very capable text-based navigation/browsing tool, for large C code bases. Unpack your favourite F/OSS application, fire up cscope on the root dir, search for main, build a key word list, navigate freely. Bonus points: it has superlative vim integration.
C will not fail you when most other languages might. There are times you may realize that, in fact, you are doing things that would be easier in C, off in Java and Haskell and Ruby land, ad infinitum...
One last really, really good reason to learn C: Lua.
Lua kicks ass. Why? Because you can glom Lua into any C code base, and give yourself a much comfier language to do business/game/architecture code in, while still having a fairly large degree of raw control over the C runtime's power, to boot. Master putting Lua into a small C lib collection, and you'll see what I mean.
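A minimal sketch of that glomming, using the standard Lua C API (exact headers and behaviour vary a little between Lua versions; luaL_dostring is the convenience macro from lauxlib.h):

#include <stdio.h>
#include <lua.h>
#include <lualib.h>
#include <lauxlib.h>

int main(void) {
    lua_State *L = luaL_newstate();   /* one embedded interpreter */
    luaL_openlibs(L);                 /* load the standard Lua libraries */

    /* the "comfier language" part: business/game logic lives in Lua */
    if (luaL_dostring(L, "print('hello from Lua inside a C host')") != 0) {
        fprintf(stderr, "lua error: %s\n", lua_tostring(L, -1));
    }

    lua_close(L);
    return 0;
}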
Learning to add Lua to a big code base is like... somehow... a final 'delivery' of the whole 'write code once, run it everywhere' promise, albeit it's a developer mantra, not some CorporateOS-decides-to-bundle-your-interpreter/runtime issue.
Anyway, just my two cents worth. Hope I still see new C coders being made in a few more decades, eh ..