
These are all simple (though not necessarily intuitive) if you know how operator binding works in C (using braces to highlight):

  int *x[10]; ---> { int* } x[10];

  int (*f(void))[10]; ---> int { { * { f(void) } } [10] };
The point is that once you have had some practice you can work it out, and Go's syntax is not necessarily much better.
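
For instance, both declarations compile as written; the comments spell out the binding order (a minimal sketch):

  int *x[10];          /* [] binds tighter than *, so x is an array of
                          10 pointers to int                           */
  int (*f(void))[10];  /* f(void) binds first, then * (the returned
                          value is a pointer), then [10]: a function
                          returning a pointer to an array of 10 ints   */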



Your original comment said it "is easier to read and understand", not "can be worked out after some practice". Of course you shouldn't inline every `typedef` into a single mess of complicated types, but you never said that you believe only the simplest types should be used and that C syntax is easier for them.

In any case, I think Go is a clear winner here because all logical types are consecutive tokens. For example `f` in my example is (as you correctly parsed) a function that returns a pointer to an array of 10 integers, but that array type is normally written `int [10]` or `int NAME[10]`, while here it is written in two chunks, `int` at the front and `[10]` at the end, with a big parenthesized declarator in between.
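
A typedef makes the contrast concrete (a sketch; the name arr10 is mine, with the Go form in a comment):

  typedef int arr10[10];  /* name the array type once               */
  arr10 *f(void);         /* the return type now reads in one piece */

  /* The Go equivalent keeps the whole type in consecutive tokens:
         func f() *[10]int                                          */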


Again, my use of "is easier to read and understand" was relative to the parent's claim about Go's syntax. You wrongly took it to be an absolute, general claim.

You need practice for complicated things, and that is what I was pointing out with "can be worked out after some practice"; not that everything trivial needs practice.

"Go is a clear winner here" is your claim and not necessarily one that i agree with since as mentioned, knowing the binding rules and a little practice complicated declarations are not that big of a deal.


Agreed that it's not actually a big deal (hence "here"), but it does strengthen the point that the C syntax wasn't designed carefully after all. The current C type syntax was completely accidental, and any reasonable design could have avoided that. If that was too late for some reason, one could have defined a new parallel syntax that solves this problem. In fact C++ did so via its new function declaration syntax `auto f(...) -> ...`. Guess why...
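
For illustration, both lines below declare a function returning a pointer to an array of 10 ints; in the trailing form the return type reads as one consecutive chunk (a sketch; the names f and g are mine):

  int (*f(void))[10];       // classic: return type split around the name
  auto g() -> int (*)[10];  // C++11 trailing form: the type in one piece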


> The current C type syntax was completely accidental and any reasonable design could have avoided that.

Absolutely baseless claim.

The C syntax and language are the product of a small group (not a committee) of smart people, with the goals of syntactic brevity, closeness to the machine architecture (PDP-7/11) and building along the path BCPL->B->C. Dennis Ritchie himself explains the rationale in his paper The Development of the C Language, so one does not need to make untenable assumptions. The enduring success of the language (even in the face of all the developments in computer HW and PLT since then) is proof of the validity of its design goals. Its "abstract machine" is simple, and there is no complicated object model; the syntax is merely a thin veneer over a sequence of bytes. Contrast it with most modern languages (which seem to be designed to solve world peace/hunger and everything in between) and C appears more and more relevant these days. C++, used judiciously and without a lot of the "new features" introduced by the standards committee (the bane of the language), takes it to the next level.


Not exactly; read the exact paragraphs in The Development of the C Language:

> [...] In all these cases the declaration of a variable resembles its usage in an expression whose type is the one named at the head of the declaration.

> The scheme of type composition adopted by C owes considerable debt to Algol 68, although it did not, perhaps, emerge in a form that Algol's adherents would approve of. The central notion I captured from Algol was a type structure based on atomic types (including structures), composed into arrays, pointers (references), and functions (procedures). Algol 68's concept of unions and casts also had an influence that appeared later.

Algol 68, by the way, had a sensible type syntax that is akin to Go's. Ritchie "explains" the syntax difference by appealing to the type and the expression sharing the same syntax, but that's not exactly why. For example there are actually two relevant expressions here, `*` (dereference) and `&` (address-of). Why was the former used? Why couldn't `*` have been made a postfix at that point, if syntactic uniformity was such a big concern? Ritchie himself never fully elaborated on those points, and I think that's because it was never his primary concern at all.

As you have noted, it is very important to realize that C did have its design goals (at which C did an excellent job). On the other hand it would be very misleading to claim that C had some bigger vision even at that time and that this validates its design! Ritchie and others were clearly smart, but they didn't design C to be something magnificent; it was a rough product of the various trade-offs they had to endure. So why this particular syntax? Well, because B had already picked a prefix `*`, which Ritchie didn't want to change at all, and he allowed it to infect all other aspects of the type syntax without much consideration. (Or, more neutrally, Ritchie couldn't figure out any other option once he had to keep B compatibility. But keep in mind that B had already changed that operator from BCPL.)

Technically speaking it was designed in some sense, but only by following the path of least resistance, not solid reasoning; hence my use of the word "accidental". There are many other examples, such as the logical `&` (etc.) being renamed to `&&` while the bitwise `&` kept the original precedence because nothing was done about it. To be fair to Ritchie though, it is not his fault, but rather a fault of the whole software community, which stayed fixated for way too long on an old language designed for specific purposes.
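
The `&` precedence leftover shows up in a classic gotcha (a sketch; the example is mine, not from the paper):

  #include <stdio.h>

  int main(void) {
      int x = 1;
      /* == binds tighter than &, because & kept its pre-&& precedence */
      if (x & 3 == 1)   printf("parses as x & (3 == 1), i.e. x & 0\n");
      if ((x & 3) == 1) printf("the comparison that was likely meant\n");
      return 0;
  }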


The relevant paragraphs from Ritchie's paper are not just what you quoted (much of your comment is not relevant) but go much further:

For each object of such a composed type, there was already a way to mention the underlying object: index the array, call the function, use the indirection operator on the pointer. Analogical reasoning led to a declaration syntax for names mirroring that of the expression syntax in which the names typically appear. Thus,

     int i, *pi, **ppi;
declare an integer, a pointer to an integer, a pointer to a pointer to an integer. The syntax of these declarations reflects the observation that i, pi, and ppi all yield an int type when used in an expression. Similarly,

     int f(), *f(), (*f)();
declare a function returning an integer, a function returning a pointer to an integer, a pointer to a function returning an integer;

     int *api[10], (*pai)[10];
declare an array of pointers to integers, and a pointer to an array of integers. In all these cases the declaration of a variable resembles its usage in an expression whose type is the one named at the head of the declaration.

The above can be summarized as the following two points:

1) "Declaration reflects Use" (from K&R C book)

which leads us to the corollary,

2) The syntax is variable-centric rather than type-centric, i.e. you look at how the variable is supposed to be used and then work out its type (see the sketch below).
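
To make point 2 concrete, here is a minimal sketch (the initializers and the `use` function are mine; the declarator shapes come from the quoted paper):

  int i = 0, *pi = &i, **ppi = &pi;
  int a[10], *api[10] = { a }, (*pai)[10] = &a;

  void use(void) {
      i = 1;          /* i used directly yields an int   */
      *pi = 2;        /* so does dereferencing pi        */
      **ppi = 3;      /* ...and double-dereferencing ppi */
      *api[0] = 4;    /* index, then dereference: an int */
      (*pai)[5] = 6;  /* dereference, then index: an int */
  }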

> To be fair to Ritchie though, it is not his fault, but rather a fault of the whole software community that was fixated on this ancient language designed for specific purposes way too long.

Again, this is completely baseless and merely your opinion. I have already pointed out the main design goals which drove its design, and the fact that it is still relevant today is proof of that. The "simplicity" of its very design is its greatest strength. The fact that modern needs (e.g. strong type safety, multi-paradigm support, security, concurrency etc.) require us to design/seek out more language features is not a reflection on the language itself, since it was designed well before these "wants" became "needs". On the other hand the various extensions of C (e.g. Objective-C, Concurrent C, Handel-C etc.) are proof of its versatility and extensibility and hence its enduring relevance.


Not only did you swiftly dismiss most of my comment without even trying, but you have no backing evidence for why declaration has to reflect use (K&R gives no rationale for it, AFAIK). And even that principle doesn't imply your claim; it only means that the declaration syntax and the use syntax have to be kept in sync.

> On the other hand the various extensions of C (e.g. Objective-C, Concurrent C, Handel-C etc.) are proof of its versatility and extensibility and hence its enduring relevance.

Most sufficiently popular languages will eventually have tons of extensions, though most of them are obscure and unknown to the general public. I'm not even sure which "concurrent" C you are talking about (there have been multiple extensions with that name in the literature, and none of them ever gained traction in the industry). Have you ever actually seen or used the extensions you cite firsthand?

Also, extending a language is actually fairly easy and doesn't imply any particular quality. You seem to value your own decades-long experience, but I was also a programming language researcher decades ago, and I know this because any PL researcher will make new languages all the time. Or let's put it this way: Brainfuck has been extended and adapted many times [1]. Does that make Brainfuck "versatile and extensible" by your definition? Is enduring relevance even a necessary consequence of such qualities? Think about that.

[1] https://esolangs.org/wiki/Brainfuck#Related_languages


You seem to have lost the plot in this and your previous comments. Hence my dismissing them, bringing the discussion back to pointer types and summarizing.

In addition to The Development of the C Language - https://www.bell-labs.com/usr/dmr/www/chist.html, also read this presentation by Ritchie titled Five Little Languages and How They Grew: Talk at HOPL - https://www.bell-labs.com/usr/dmr/www/hopl.html

Thus asking questions like "why was * used for pointer syntax" is meaningless, since Ritchie himself says he took it from B; that's all there is to it. Also, when I said C's design was validated, it did not mean that the designers knew everything (they are on record saying that they themselves were surprised by its success) but that later usage in the industry validated it. The parsimonious design itself was a huge reason, i.e. it was not "accidental" as you claim.

Thinking about the evolutionary path BCPL->B->C answers your other questions.

> why declaration has to reflect use

While Ritchie took types from Algol, he didn't go the whole way (hence weak typing). Since both BCPL and B had no type system to speak of, they treated memory as a linear array of cells, and a pointer was just an index into this array. When you combine these two, you get pointer operations defined in terms of the type of the object pointed to. Now you can see how "Declaration reflects Use" and the variable-centric syntax make sense (all the quoted examples in my previous comment). There is no such thing as "a pointer", only "a pointer to an object of a type", with the object in a sense propagating its type details to the pointer.
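
Here is that point in code (a sketch; the 4 assumes a typical platform where int is four bytes):

  #include <stdio.h>

  int main(void) {
      int  a[4];
      char c[4];
      int  *pi = a;
      char *pc = c;
      /* p + 1 advances by sizeof(*p): the pointee's type scales the step */
      printf("%td\n", (char *)(pi + 1) - (char *)pi);  /* 4 on such platforms */
      printf("%td\n", (pc + 1) - pc);                  /* always 1            */
      return 0;
  }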

> Most sufficiently popular languages will eventually have tons of extensions,

Nothing comes close to the "C family of languages", by which I mean not just extensions but any language that took inspiration from it.

> I'm not even sure which "concurrent" C you are talking about

There is only one "official" one, designed by Narain Gehani and explained in the book "The Concurrent C Programming Language".

> Also, extending a language is actually fairly easy and doesn't imply any particular quality.

I am not talking about trivial additions but about the idea of a carefully designed "core/kernel language" which is then extended into various multi-paradigm domains. The best example here is the Oz programming language (https://en.wikipedia.org/wiki/Oz_(programming_language)). By design and by accident, C turned out to be a pretty good core/kernel language for others. The three examples I listed above show how C was extended into the Smalltalk-style OO domain, the parallel programming domain and the hardware description domain, thus demonstrating its versatility and extensibility. I could say even more using C++ as an example (viz. Cfront, C-with-classes, C++-as-a-better-C, templates implemented using the preprocessor, etc.) but my point should be clear now.


I have already spent too much time on this discussion, so this comment will be the last one, for anyone still following.

> Thus asking questions like "why was * used for pointer syntax" is meaningless, since Ritchie himself says he took it from B; [...] Thinking about the evolutionary path BCPL->B->C answers your other questions.

It rather means that Ritchie took it from B and didn't feel it should change. It should be noted that B did change its dereference syntax from BCPL, which had settled on `!e` and `e1!e2` for what became `*e` and `e1[e2]` in C (with everything implicitly being a word pointer). As far as I'm aware there is no documented reason why Thompson turned `!` into `*`, a character which by the way was already overloaded at that point; it remained the string-escape character until C moved away from BCPL-style escape sequences in string literals. Maybe Thompson's other language, Bon, holds some clue, but I have no information about it. In any case Ritchie clearly didn't think hard about this particular choice, because C was originally conceived as a largely compatible extension to B (then NB), and it simply stuck. Isn't that accidental enough?
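
The subscripting identity C inherited is easy to check (a sketch; the assertions are mine):

  #include <assert.h>

  int main(void) {
      int a[3] = {10, 20, 30};
      /* e1[e2] is defined as *(e1 + e2), the descendant of B's e1!e2 */
      assert(a[1] == *(a + 1));
      assert(a[1] == 1[a]);  /* addition commutes, so this parses too */
      return 0;
  }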

> Now you can see how "Declaration reflects Use" and the variable-centric syntax make sense.

There are multiple concrete implementations of that driving principle. `int a;` and `a int;` would have been equally acceptable under this principle, so the only real reason to pick the former is the influence of B (`auto a;`). I wondered whether you were using that term only to mean C's strictest implementation of it, and whether you actually dislike the newer ANSI/ISO C function declaration syntax; but then your arguments for the general principle would not back the eventual syntax used by C.

> Nothing comes close to the "C family of languages" by which i mean not just extensions but anybody who took inspiration from it.

There are many different classes to consider. In the broadest sense even Python is said to be inspired by C, even though it would be absurd to consider Python proof of C's inherent quality rather than just its popularity. Some languages are also only syntactically similar to C, because its block syntax did have some nice syntactic properties (such languages are known as "curly-brace languages" nowadays). Those superficial similarities can't support your claim.

> There is only one "official" one, designed by Narain Gehani and explained in the book "The Concurrent C Programming Language".

It was never standardized, and apparently it wasn't available much outside AT&T Labs. If a book were enough to make it somehow official, I could write and publish my own book with the same title today. By any measure, that Concurrent C language is not as notable as other concurrent languages based on or influenced by C.

> By design and by accident, C turned out to be a pretty good core/kernel language for others.

There are a lot of extension languages that are NOT based on C or C++. In fact, I believe every single language in the TIOBE top 20 ranking has at least 10 notable ones on average. C was used for extension only because it was popular for many years, and many such extensions had to work around or bend its limitations for their purposes. (For example, C itself has no types for distinct address spaces, so many extensions add reserved keywords, which, as I noted earlier, you can add to any existing language.)


If you want a discussion of the overall design of C and your opinion of it, create a separate thread and I (and others) will be more than happy to engage you there.

> It rather means that Ritchie took it from B and didn't feel it should change ... Isn't that accidental enough?

That is not called "accidental" but a "design decision".

> In any case Ritchie clearly didn't think hard about this particular choice,

That is just your opinion, and it is not borne out by Ritchie's own writings or what we can infer from them.

> `int a;` and `a int;` would have been equally acceptable under this principle,

No; the decision to have "int a" had already been made, and only a "pointer to int a" was being worked out, leading to a variable-centric view.

> Some languages are also only syntactically similar to C, because its block syntax did have some nice syntactic properties (such languages are known as "curly-brace languages" nowadays). Those superficial similarities can't support your claim.

https://en.wikipedia.org/wiki/List_of_C-family_programming_l...

> By any measure, that Concurrent C language is not as notable as other concurrent languages based on or influenced by C.

The point was neither popularity nor standardization but the domains into which C was extended. That is why I gave the three specific examples, one for each important domain.

> There are a lot of extension languages that are NOT based on C or C++.

That is not the point. I am talking about using C as a "core/kernel language" in the design of other languages. No other language (other than the Lisp family) comes close to C in versatility and eventual success here.


> C appears more and more relevant these days

C is not relevant any more; not sure what world you are living in. It only has any relevance because it was the best option decades ago, so people are forced to use it when making syscalls. That's it.


What? There are more embedded devices than ever running C/C++ code today. All OSes, system utils etc. are still written in C/C++. All higher-level performance-oriented frameworks/libraries in any domain (e.g. AI/ML) are implemented in C/C++, and then an interface to them is exposed through wrappers in other languages. Also, C is the common "lingua franca" across languages.

C is still in the top five in the TIOBE index today.


> All OSes, system utils etc. are still written in C/C++.

First of all, no: plenty of OSes are made in other languages. Also, the big OSes WERE written in C, and only remain so in order to avoid redoing millions of lines of code.

> and then an interface to them is exposed through wrappers in other languages

Again, this is only done because the OS is using an outdated language, so people are forced to work with it.

> C is still in the top five in the TIOBE index today

That doesn't matter; this does:

https://madnight.github.io/githut/#/pull_requests/2024/1


You seem to be living in your own world and not willing to face reality.

First of all, TIOBE considers data from multiple sources to come up with its ranking. GitHub by itself is not enough; there are orders of magnitude more code outside of it, and hence your assumption is wrong. Also, most C/C++ folks prefer to keep code local (for proprietary and personal reasons) and hence are not sampled. You can only get an idea indirectly, from the volume of devices/software shipped and used that contain C/C++ code.

> plenty of OSes are made in other languages.

No. All of them are "experimental" and I don't know of any that are production-worthy. Also, writing an OS as part of some study project in $FAV_LANG is not tenable here.

> Again, this is only done because the OS is using an outdated language, so people are forced to work with it.

You have not understood what I wrote at all. I mentioned "performance-oriented frameworks/libraries", which have nothing to do with the OS but are just domain-specific code, e.g. AI/ML, gaming etc. The OS interface itself is just a very small part of them. Their domain logic is all implemented in C/C++, with thin wrappers for other languages.

Summarizing: C/C++ are as relevant as ever today for all systems. To drive the point home even more strongly, C and C++ occupy two of the top five spots in the TIOBE index.


> GitHub by itself is not enough; there are orders of magnitude more code outside of it, and hence your assumption is wrong. Also, most C/C++ folks prefer to keep code local (for proprietary and personal reasons) and hence are not sampled.

"you're wrong man, C totally has a bunch of code being used thats private, I swear". you could say that about every single other language. only thing that matter is what can be measured. C is dead man, you are just in denial. its an old crap language that hasn't been relevant in at least a decade. if you need some evidence, just look to the fact that after decades it still doesn't have a package manager, so many people laughably just vendor code when working with C projects.


[flagged]


You broke the site guidelines egregiously in more than one place in this thread. That's not allowed here and we ban accounts that do it. Moreover, you've done it repeatedly in other places also, e.g.:

https://news.ycombinator.com/item?id=41601160

https://news.ycombinator.com/item?id=41590528

https://news.ycombinator.com/item?id=41563488

If you keep doing that, we're going to have to ban you. I don't want to ban you! Therefore if you'd review https://news.ycombinator.com/newsguidelines.html and stick to the rules from now on, we'd appreciate it.

Among other things, that means not posting any more personal attacks.


TIOBE can be transparent and still be seriously flawed in its methodology, which has already been questioned for many years. No popularity indicator is entirely free of flaws, but anyone who is aware of TIOBE's possible flaws will quote multiple indicators, including PYPL [1] and the RedMonk top 20 [2]. That's how you remain convincing even with possibly flawed data sources.

While C/C++ will remain important languages for many years, their continuing decline is also clear from those indicators. In fact, even the most recent TIOBE has reported the lowest-ever position (4th) for C, and that trend was already well known from the other indicators: the PYPL indicator for C has been roughly in decline for a decade, as has its RedMonk ranking. Both estimate the current use of a given language by looking at the popularity of tutorials or questions, while TIOBE estimates its cumulative use. All things being equal, TIOBE will be systematically delayed compared to the others, yet even the TIOBE ranking for C is now falling, and there is no reason to believe otherwise.

[1] https://pypl.github.io/PYPL.html

[2] https://redmonk.com/rstephens/2024/09/12/top20-jun2024/


What you are pointing out is nothing revelatory. That a statistical index depends on its data and methodology is vacuously true and not an argument. That there are multiple indexes trying to measure the same thing is also true and not an argument. The point was to cite an index which is well respected (criticisms notwithstanding) as a counter to silly claims.

I personally do not place much stock in these rankings, since all of them are flawed in their sampling methodology: they use only publicly accessible indicators like Google searches, Stack Overflow questions, job postings, GitHub and similar public repositories, etc. C programmers are on average more experienced and hence have little need for these. They are already aware of most of the Good/Bad/Ugly about the language and are used to working out problems for themselves, and hence don't show up in these metrics. Thus C/C++ rankings might appear to be waning when they are actually holding steady or merely rising more slowly relative to others.



