I wonder if programming languages having precedence rules at all is an atavism from the 70s when it was an interesting problem for parsing. Adding parentheses makes things clearer in 99% of the cases.
Programming languages require precedence rules because mathematicians decided that a screwball precedence would make their notation more concise. Over the hundreds of years these notations have been used and taught, they have become the expected form of math, and programming languages (all of which tend to be math-heavy) usually try to follow the expected rules.
Honestly, as an interested amateur, I think that much of the notation used in math is one of the weakest parts of the discipline. There are two things that I think would aid the notation greatly:
1. Most complex equations have a lot of moving parts, but the notation has no way of indicating what these parts are for or why they are there; you had better hope the author has taken the time to document their equation properly.
2. The terrible symbology. You have heard the joke about the two hard things in computer science. It turns out that programmers are usually fairly good at naming things; mathematicians are the true dark masters of names. If you have an item in your equation that represents the confidence of a rating, they will not name it confidence_rating. No, they will name it σ (a small sigma). Good luck searching for (or even typing) that.
Now I get why it was done: they tended toward a notation that was as concise as possible, which makes it much faster to manipulate parts when you are working on something. However, I feel this has the opposite effect when trying to teach it to others.
As its designers explain: A bare a * b + c is an invalid expression. You must explicitly write either (a * b) + c or a * (b + c).
Why doesn't WUFFS have precedence? Because it's focused intensely on solving its specific problem, Wrangling Untrusted File Formats Safely. Having solved the buffer problem (you can't write buffer[oops_too_big] type mistakes in WUFFS; they will not compile) and the integer overflow problem (likewise, it doesn't compile), they wanted to also solve the problem where the programmer thought they were making one calculation but, due to precedence rules, was actually making a different one.
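A toy sketch of the idea (not WUFFS's actual grammar — the function name and character-level scan here are my own simplification): reject any expression that mixes different operators at the same parenthesis depth.

```python
def needs_parens(expr: str) -> bool:
    """Return True if expr mixes different binary operators at the same
    parenthesis depth, which a WUFFS-style grammar would reject.
    Toy version: single-character operators only, no unary minus."""
    ops_at_depth_zero = set()
    depth = 0
    for ch in expr:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
        elif depth == 0 and ch in "+-*/":
            ops_at_depth_zero.add(ch)
    return len(ops_at_depth_zero) > 1

print(needs_parens("a * b + c"))    # True: mixed operators, rejected
print(needs_parens("(a * b) + c"))  # False: grouping is explicit
print(needs_parens("a + b + c"))    # False: repeating one operator is fine
```

Repeating the same operator stays legal (as in WUFFS), since there is no ambiguity for precedence rules to resolve.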
Well yeah, having precedence rules makes the notation more concise and can save you from typing lots and lots of brackets. And I don't think multiplication/division having higher precedence than addition/subtraction (and, by extension, AND having higher precedence than OR) is all that "screwball" - and even if it were, if you paid attention in elementary school, you already know it, so programming languages can rely on it. Of course, the moment they start getting "creative" with precedence, those exceptions turn into massive footguns...
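Python is a concrete example of how even a "sane" precedence table holds surprises once it grows beyond the elementary-school rules: unary minus binds looser than exponentiation, and ** associates right to left.

```python
# Unary minus binds looser than **, so this is -(3 ** 2), not (-3) ** 2.
print(-3 ** 2)      # -9

# ** is right-associative: 2 ** (3 ** 2), not (2 ** 3) ** 2.
print(2 ** 3 ** 2)  # 512
```

Both results follow mathematical convention, but neither is covered by the multiplication-before-addition rule everyone actually remembers.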
> I wonder if programming languages having precedence rules at all is an atavism from the 70s
It's not an "atavism from the 70s", it's a mirror of mathematical conventions, as well as a syntactic convenience.
Language designers have tried to do away with precedence all along, by having uniform evaluation (e.g. Smalltalk, as well as APL and all its descendants, hence probably Ivy), removing infixes (Lisp, Forth, assemblies), or requiring explicit prioritisation (I think I saw that a while back, though I don't remember the language).
> Adding parentheses makes things clearer in 99% of the cases.
It also adds a significant amount of noise for 99% of the cases, for no value since any schoolchild past 12 or so has integrated the precedence rules.
> It also adds a significant amount of noise for 99% of the cases, for no value since any schoolchild past 12 or so has integrated the precedence rules.
As all the viral "Only 25% of people got this right" stuff that pops up from time to time proves: No they haven't.
Also, since we're rarely writing equations composed entirely of single-digit numeric literals, and we are using reasonable variable names, the percentage of character overhead (and therefore the noisiness) of using brackets is much lower than in the short examples being thrown around.
If memory serves well, normal operator precedence was already present in the first FORTRAN, which was proposed in 1953.
And APL's priority system is: there is no priority. Everything is evaluated right to left. Pike's Ivy is no exception. 3/4 differing from 3 / 4 is a lexical convention: 3/4 is a rational number.
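A minimal Python sketch of that rule (the function name and token format are mine, not anything from APL or Ivy): every operator has equal precedence and the expression is folded from the right.

```python
import operator

def eval_rtl(tokens):
    """Evaluate a flat token list with no precedence, strictly right to
    left, APL-style. Toy version: binary operators and numbers only."""
    ops = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}
    acc = float(tokens[-1])
    # Fold leftwards in (operand, operator) pairs.
    for i in range(len(tokens) - 2, 0, -2):
        acc = ops[tokens[i]](float(tokens[i - 1]), acc)
    return acc

# 3*2+1 evaluates as 3*(2+1) = 9, not the conventional (3*2)+1 = 7.
print(eval_rtl(["3", "*", "2", "+", "1"]))  # 9.0
```

The appeal is that there is exactly one rule to remember; the cost is that expressions familiar from school evaluate differently.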
The latter example is noisier, and yes, you do learn precedence in high school, yet the latter example is also clearer, because it is the most accurate way of conveying intent.