I think our positions are pretty close; I'm just coming around to the idea that any operator precedence at all is of dubious value.
It's nice to write `a + b * c` and have it parsed the "right" way, but I've become fairly comfortable with the idea of getting rid of that and always requiring parentheses. I don't think writing `a + (b * c)` is that terrible; `a + b == c` is a little more annoying as `(a + b) == c`.
I think the worst cases are addition and subtraction, since those are so primitive -- subtraction can be thought of as addition of a negation, and in that form it is directly associative with addition. Indexing into arrays with constructs like `x[a + b - 1]` is incredibly common, and here almost any parenthesization makes it look like it carries meaning -- if you see code that says `x[(a + b) - 1]`, is that a different intent than `x[a + (b - 1)]`? It feels different to me: `x[(start + offset) - 1]` is an adjustment, but `x[start + (first_comma - 1)]` is finding the location just before an offset.
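To make the "subtraction as addition of a negation" point concrete, here's a minimal sketch (my own toy numbers, not from the thread) showing that once `- 1` is read as `+ (-1)`, every grouping of `a + b - 1` computes the same index, even though the parenthesizations *read* differently:

```python
# Hypothetical values standing in for an index computation like x[a + b - 1].
a, b = 10, 4

adjustment = (a + b) - 1       # the "adjust the sum" grouping
offset     = a + (b - 1)       # the "position before an offset" grouping
negation   = a + b + (-1)      # subtraction rewritten as adding a negation

# All three groupings agree, because + with negation is associative.
assert adjustment == offset == negation == 13
```

So the parentheses only change how a reader narrates the expression, not what it computes -- which is exactly why forced parenthesization here feels like it's inventing intent.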
On associativity I think this gets even worse: if you have to parenthesize `a + b + c`, it feels like the language is getting in your way, just by force of habit. Nobody blinks at deciding between `a b + c +` and `a b c + +` in a stack language, so maybe people would just get used to typing `a + (b + c)` and it would fade into the background.
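The stack-language comparison can be sketched with a toy postfix evaluator (hypothetical code, `+`-only, just to illustrate): `a b + c +` groups left and `a b c + +` groups right, and the writer picks one explicitly without ever thinking about "associativity":

```python
def eval_rpn(tokens, env):
    """Evaluate a postfix (RPN) token list; only `+` is supported here."""
    stack = []
    for tok in tokens:
        if tok == "+":
            rhs = stack.pop()
            lhs = stack.pop()
            stack.append(lhs + rhs)
        else:
            stack.append(env[tok])  # look up a variable's value
    return stack.pop()

env = {"a": 1, "b": 2, "c": 3}
left_grouped  = eval_rpn("a b + c +".split(), env)   # (a + b) + c
right_grouped = eval_rpn("a b c + +".split(), env)   # a + (b + c)
assert left_grouped == right_grouped == 6
```

The two orderings are different programs that happen to agree because `+` is associative -- the same situation an infix writer is in when forced to choose between `(a + b) + c` and `a + (b + c)`.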
This is all kind of rambly; my general feeling about computer languages is what I said above: the harder it is for a machine to parse, the harder it is for a person to read. But some idioms are so ingrained that it's hard to even recognize you're making assumptions about associativity, and I'd venture there's a significant class of programmers who don't know what associativity is in any formal sense but routinely rely on the property because it's baked into how we're taught to calculate.