
Not exactly; read the relevant paragraphs in The Development of the C Language:

> [...] In all these cases the declaration of a variable resembles its usage in an expression whose type is the one named at the head of the declaration.

> The scheme of type composition adopted by C owes considerable debt to Algol 68, although it did not, perhaps, emerge in a form that Algol's adherents would approve of. The central notion I captured from Algol was a type structure based on atomic types (including structures), composed into arrays, pointers (references), and functions (procedures). Algol 68's concept of unions and casts also had an influence that appeared later.

Algol 68, by the way, had a sensible type syntax akin to Go's. Ritchie "explains" the syntax difference by appealing to the idea of using the same syntax for the type and the expression, but that's not the whole story. There are actually two relevant operators here, `*` (dereference) and `&` (reference). Why was the former used? And why couldn't `*` have been made a postfix operator at this point, if matching the expression syntax was such a concern? Ritchie never fully elaborated on those points, and I think that's because they were never his primary concern at all.
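
To make the two operators concrete (a trivial illustration of my own, not from Ritchie):

     int  x = 42;
     int *p = &x;   /* & takes the address of x ("reference") */
     int  y = *p;   /* * follows the pointer ("dereference") */

Note that declarations only ever reuse the prefix `*`; `&` never appears in a declarator, which is exactly the asymmetry I'm pointing at.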

As you have noted, it is very important to realize that C did have its design goals (which C met exceedingly well). On the other hand it would be very misleading to claim that C had some grander vision even at that time, and that this vision validates its design! Ritchie and others were clearly smart, but they didn't design C to be something magnificent; it was a rough product of the various trade-offs they had to endure. So why this particular syntax? Because B had already picked a prefix `*`, which Ritchie didn't want to change at all, and he allowed it to infect all other aspects of the type syntax without much consideration. (Or, more neutrally, Ritchie couldn't figure out any other option once he had to keep B compatibility. But keep in mind that B had already changed that operator from BCPL.)

Technically speaking it was designed in some sense, but only by following the path of least resistance, not by solid reasoning, hence my use of the word "accidental". There are many other examples, such as logical `&` (etc.) being renamed to `&&` while bitwise `&` kept its original precedence because nothing was done about it. To be fair to Ritchie though, it is not his fault, but rather the fault of a whole software community that was fixated on this ancient language, designed for specific purposes, for way too long.
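
A quick illustration of that precedence leftover, in case it isn't familiar (the names and values are made up for the example):

     #include <stdio.h>

     #define MASK 0x4

     int main(void) {
         int flags = 0x1;   /* bit 0 set, the MASK bit (0x4) clear */

         /* Bitwise & kept the low precedence it had before && was split off,
            so it binds more loosely than ==: this parses as
            flags & (MASK == MASK), i.e. flags & 1. */
         if (flags & MASK == MASK)
             printf("wrongly reports the MASK bit as set\n");

         /* What was almost certainly meant: */
         if ((flags & MASK) == MASK)
             printf("the MASK bit really is set\n");

         return 0;
     }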




The relevant paragraphs from Ritchie's paper are not just what you quoted (much of your comment is not relevant) but much more:

For each object of such a composed type, there was already a way to mention the underlying object: index the array, call the function, use the indirection operator on the pointer. Analogical reasoning led to a declaration syntax for names mirroring that of the expression syntax in which the names typically appear. Thus,

     int i, *pi, **ppi;
declare an integer, a pointer to an integer, a pointer to a pointer to an integer. The syntax of these declarations reflects the observation that i, pi, and ppi all yield an int type when used in an expression. Similarly,

     int f(), *f(), (*f)();
declare a function returning an integer, a function returning a pointer to an integer, a pointer to a function returning an integer;

     int *api[10], (*pai)[10];
declare an array of pointers to integers, and a pointer to an array of integers. In all these cases the declaration of a variable resembles its usage in an expression whose type is the one named at the head of the declaration.

The above can be summarized as the following two points:

1) "Declaration reflects Use" (from K&R C book)

which leads us to the corollary,

2) Syntax is variable-centric rather than type-centric, i.e. you look at how the variable is supposed to be used and then work out its type, as in the annotated declarations below.
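
(Annotations mine, reading each declaration the way Ritchie describes:)

     int *api[10];    /* *api[i] is an int   => api is an array of 10 pointers to int        */
     int (*pai)[10];  /* (*pai)[i] is an int => pai is a pointer to an array of 10 ints      */
     int (*pf)(void); /* (*pf)() is an int   => pf is a pointer to a function returning int  */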

> To be fair to Ritchie though, it is not his fault, but rather the fault of a whole software community that was fixated on this ancient language, designed for specific purposes, for way too long.

Again, this is completely baseless and merely your opinion. I have already pointed out the main goals which drove its design, and the fact that it is still relevant today is proof of that. The "simplicity" of its very design is its greatest strength. The fact that modern needs (eg. Strong Type Safety, Multi-paradigm, Security, Concurrency etc.) require us to design/seek out more language features is not a reflection on the language itself, since it was designed well before these "wants" became "needs". On the other hand the various extensions of C (eg. Objective-C, Concurrent-C, Handel-C etc.) are proof of its versatility and extensibility and hence its enduring relevance.


Not only did you swiftly dismiss most of my comment without even trying, but you also have no evidence for why declaration has to reflect use (K&R gives no rationale for it AFAIK). And even that principle doesn't imply your claim; it only means that syntax changes to declarations and to uses have to be kept in sync.

> On the other hand the various extensions of C (eg. Objective-C, Concurrent-C, Handel-C etc.) are proof of its versatility and extensibility and hence its enduring relevance.

Most sufficiently popular languages will eventually have tons of extensions, though most of them remain obscure and unknown to the general public. I'm not even sure which "concurrent" C you are talking about (there have been multiple extensions with that name in the literature, and none of them ever gained traction in the industry). Have you ever actually seen or used the extensions you cite firsthand?

Also extending a language is actually kinda easy and doesn't imply some kind of quality. You seem to value your own decade-long experience, but I too was a programming language researcher decades ago, and I know this because any PL researcher makes new languages all the time. Or let's put it this way: Brainfuck has been extended and adapted many times over [1]. Does that make Brainfuck "versatile and extensible" by your definition? Is enduring relevance even a necessary consequence of such qualities? Think about that.

[1] https://esolangs.org/wiki/Brainfuck#Related_languages


You seem to have lost the plot in this and your previous comments. Hence my dismissal, and my bringing the discussion back to pointer types and summarizing it.

In addition to The Development of the C Language - https://www.bell-labs.com/usr/dmr/www/chist.html, read also this presentation by Ritchie titled Five Little Languages and How They Grew: Talk at HOPL - https://www.bell-labs.com/usr/dmr/www/hopl.html

Thus asking questions like "why was * used for pointer syntax" is meaningless since Ritchie himself says he took it from B; that's all there is to it. Also, when i said C's design was validated, it did not mean that the designers knew everything (they are on record saying that they themselves were surprised by its success) but that later usage in the industry validated it. The parsimonious design itself was a huge reason, i.e. it was not "accidental" as you claim.

Thinking about the evolutionary path BCPL->B->C answers your other questions.

> why declaration has to reflect use

While Ritchie took types from Algol, he didn't go the whole way (hence weakly typed). Since both BCPL and B had no type system to speak of, they treated memory as a linear array of cells and a pointer was just an index into this array. When you combine these two, you get pointer operations defined in terms of the type of the object pointed to. Now you can see how "Declaration reflects Use" and variable-centric syntax make sense (see all the quoted examples in my previous comment). There is no such thing as "a pointer" but only "a pointer to an object of a type", with the object in a sense propagating its type details to the pointer.
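
A minimal sketch of what that means in today's C (standard behavior, nothing exotic):

     #include <stdio.h>

     int main(void) {
         int   a[4] = {0, 1, 2, 3};
         int  *pi = a;              /* not "a pointer", but "a pointer to int" */
         char *pc = (char *)a;

         /* p + 1 advances by sizeof(*p): the pointed-to type, not the pointer
            itself, decides how far one step is. */
         printf("pi + 1 is %td bytes past pi\n", (char *)(pi + 1) - (char *)pi);
         printf("pc + 1 is %td bytes past pc\n", (pc + 1) - pc);
         return 0;
     }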

> Most sufficiently popular languages will eventually have tons of extensions,

Nothing comes close to the "C family of languages" by which i mean not just extensions but anybody who took inspiration from it.

> I'm not even sure which "concurrent" C you are talking about

There is only one "official" one, that was designed by Narain Gehani and explained in the book "The Concurrent C Programming Language".

> Also extending a language is actually kinda easy and doesn't imply some kind of quality.

I am not talking about trivial additions but about the idea of a carefully designed "Core/Kernel language" which is then extended into various multi-paradigm domains. The best example here is "The Oz Programming Language" (https://en.wikipedia.org/wiki/Oz_(programming_language)). By design and accident C turned out to be a pretty good Core/Kernel language for others. The three examples i listed above show how C was extended into the Smalltalk-style OO domain, the parallel programming domain and the hardware description domain, thus demonstrating its versatility and extensibility. I could say even more using C++ as an example (viz. Cfront, C-with-classes, C++-as-a-better-C, templates-implemented-using-the-preprocessor etc.) but my point should be clear by now.


I have already spent too much time on this discussion, so this will be my last comment, for anyone still following.

> Thus asking questions like "why was * used for pointer syntax" is meaningless since Ritchie himself says he took it from B; [...] Thinking about the evolutionary path BCPL->B->C answers your other questions.

It rather means that Ritchie took it from B and didn't feel like it should change. It should be noted that B did change its dereference syntax from that of BCPL, which eventually settled on `!e` and `e1!e2` (the equivalents of C's `*e` and `e1[e2]`, with everything implicitly being a word pointer). As far as I'm aware there is no documented reason why Thompson turned `!` into `*`, which by the way left `*` serving in too many roles at once, at least until C moved away from BCPL-style escape sequences in string literals. Maybe Thompson's other language, Bon, holds some clue, but I have no information about that. In any case Ritchie clearly didn't think far into this particular change, because C was originally conceived as a largely compatible extension to B (then NB), and the choice simply stuck. Isn't that accidental enough?

> Now you can see how "Declaration reflects Use" and variable-centric syntax make sense.

There are multiple concrete implementations of that driving principle. `int a;` and `a int;` would have been equally okay under this principle, so the only real reason to pick the former is the influence of B (`auto a;`). I wondered whether you were using that term only to mean C's strictest implementation of it, and whether you actually dislike ANSI/ISO C's newer function declaration syntax, but then your arguments for the general principle would not justify the particular syntax C ended up with.
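
(For anyone who hasn't seen the older style, the contrast I mean is roughly this; the function names are made up for the example:)

     /* K&R-style definition: parameter types are declared after the list */
     int add(a, b)
     int a, b;
     {
         return a + b;
     }

     /* ANSI/ISO prototype style: the types appear in the declarator itself */
     int add2(int a, int b)
     {
         return a + b;
     }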

> Nothing comes close to the "C family of languages" by which i mean not just extensions but anybody who took inspiration from it.

There are many different classes of influence to consider. In the broadest sense even Python is said to be inspired by C, even though it would be absurd to take Python as proof of C's inherent quality rather than merely of its popularity. Some languages are also only syntactically similar to C because its block syntax did have some nice syntactic property (known as "curly-brace languages" nowadays). Those superficial similarities can't imply your claim.

> There is only one "official" one, that was designed by Narain Gehani and explained in the book "The Concurrent C Programming Language".

It was never standardized, and apparently it wasn't available much outside of AT&T Labs. If a book is what makes it somehow "official", I could write and publish my own book with the same title today. By any measure, that Concurrent C language is not as notable as other concurrent languages based on or influenced by C.

> By design and accident C turned out to be a pretty good Core/Kernel language for others.

There are a lot of extension languages that are NOT based on C or C++. In fact, I believe every single language in the TIOBE Top 20 has at least 10 notable ones on average. C was used as a base for extension mainly because it was popular for so many years, and many such extensions had to work around or bend its limitations for their purposes. (For example, C itself has no types for distinct address spaces, so many extensions add reserved keywords for them, which you can do to any existing language, as I noted earlier.)
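
(To pick one concrete, publicly documented instance of the address-space point, an example of my own rather than one from this thread: OpenCL C, a C dialect, bolts address spaces on as extra qualifier keywords.)

     /* OpenCL C kernel: __kernel and __global are keywords the dialect adds,
        because base C has no type-level notion of distinct address spaces. */
     __kernel void scale(__global float *data, float factor) {
         size_t i = get_global_id(0);
         data[i] *= factor;
     }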


If you want a discussion on the overall design of C and your opinion of it, create a separate thread and i (and others) will be more than happy to engage you there.

> It rather means that Ritchie took it from B and didn't feel like it should change ... Isn't that accidental enough?

That is not called "accidental" but a "design decision".

> In any case Ritchie clearly didn't think far into this particular change,

That is just your opinion and is not borne out by Ritchie's own writings or what we can infer from them.

> `int a;` and `a int;` would have been equally okay under this principle,

No, the decision to have "int a" was already made, and only a "pointer to int a" was being thought through, leading to a variable-centric view.

> Some languages are also only syntactically similar to C because its block syntax did have some nice syntactic property (known as "curly-brace languages" nowadays). Those superficial similarities can't imply your claim

https://en.wikipedia.org/wiki/List_of_C-family_programming_l...

> By any measure, that Concurrent C language is not as notable as other concurrent languages based on or influenced by C.

The point was neither popularity nor standardization but the domains into which C was extended. That is why i gave three specific examples, one for each important domain.

> There are a lot of extension languages that are NOT based on C or C++.

That is not the point. I am talking about using C as a "Core/Kernel language" in the design of other languages. No other language (other than the Lisp family) comes close to C in versatility and eventual success here.



