Your examples would have looked much better, without any suggestion of wrong precedence, if C had not also replaced the Algol operators ":=" and "=" with "=" and "==".
In my opinion this is one of the greatest mistakes of C, and one that has unfortunately been inherited by too many other languages. Even Dennis Ritchie admitted that it might not have been a good idea.
That is due to the limitation of source code to ASCII, which I consider obsolete.
The operators now written "!=", "<=" and the like were single characters ("≠", "≤") in Algol and in many other early programming languages. They were replaced with 2-character operators only because IBM and most of the other important US computer manufacturers were not willing to support character sets with enough characters to cover the needs of mathematics and of languages other than English.
The target for the US character sets was only the text that might appear in commercial letters written in English. For any other use, such as programming languages, characters were accepted only when they could occupy one of the few vacant places in the character map.
The dominance of US-made computer hardware forced this ugly limitation upon the programming languages, so they had to use various commercial characters, e.g. "!", "@", "#", "$", "%", instead of the more appropriate traditional mathematical symbols.
Now, with Unicode and UTF-8, I consider the use of ASCII for source code as stupid.
Unicode even includes the ":=" of Algol as a single character, "≔" (U+2254 COLON EQUALS).
Regardless of whether operators are encoded as single or multiple characters, a good source-code editor allows one to change the correspondence between the keys pressed and the text inserted.
Because assignment is used more frequently than comparison, even when restricted to ASCII a text editor can be configured to insert ":=" when you press "=", and, for example, to insert "=" for Ctrl+= and "!=" for Alt+=, in order to minimize key presses.
Many early programming languages were even more limited than ASCII, for the sake of portability. In Algol itself the operators and keywords were defined as abstract symbols, with multiple possible ways of serializing them to computer-encoded text.