Honestly, that conventional wisdom is a little hackneyed.
I'm not sure most folks would call C "wonderful" for low level code. Its warts are well known, and really don't need to be there: the structure syntax is needlessly complicated (. and -> are separate operators for no good reason); the calling conventions are backwards to support a varargs facility that almost no one understands; the type conventions are underspecified, leading to perennial confusion about the sizes of ints and longs or the signedness of char; and so on. If you were starting from scratch to write a language to live in the same space, you would probably fix most of those.
But conversely, it's not "atrocious" for "application" code either. Plenty of good software has been and will continue to be written in C. It's probably a poor choice for code that handles mostly string data, and it doesn't have a whole lot of advantages on an architecture with lots of parallelism (i.e. one which doesn't need to worry about single-CPU performance issues). Since those two criteria generally define "web development", you don't see a whole lot of C used for the kind of problems YC startups work on. Just recognize that those criteria define a very limited notion of "application", and that lots of people in the real world are working on "applications" for which C et al. are very useful, eminently practical language choices.
Really, C is there to abstract the CPU. If the CPU has an instruction to do something, you'll probably find it as an operator in C. If it doesn't, you won't. If your problem is appropriately targeted at that level of abstraction, you'll think C is great. If it's not, you won't. I guess that's hardly a surprise.
I've worked on a large client/server property/casualty insurance application written in C. I've also written programs and middleware in C++, Java, Perl, and Smalltalk.
C is great for getting the CPU to do something at a somewhat low level, while still having some ability to abstract. If you want lots of abstraction, my experience is that C++, Java, and Smalltalk are much better choices. Of those, C++ gives you the most leeway for getting yourself into deep trouble that's hard to debug. Smalltalk allows you to blow up or lock up the world with one errant statement, but messing with those parts of the library is rare, and you can get everything back anyhow because your source code and every change to the image are kept in something like a transaction log. Java gives you a lot of the benefits of Smalltalk, but saddles you with a syntax that was designed to allow low level programming alongside somewhat high level abstraction, even though you aren't doing the former and want the latter in spades.