The benefit would be that the spelled-out form is optional: you only need it while you're learning the language, whereas once you're familiar with the notation the terseness becomes a feature.
Of course, once you've come up with clear names for each symbol you could do the opposite and let the IDE turn `plus reduce range 100` into `+/!100`. But as long as IDEs are still glorified text editors and devs care about the representation that gets stored on disk, I'd argue making the terse notation the default is the right choice.
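To make that concrete, here's a minimal sketch of such a round-trip rename. It's in Python rather than anything a real IDE plugin would actually be built on, the name/symbol table and the whitespace/per-character tokenisation are assumptions purely for illustration, and it knows nothing about the real grammar of an array language:

```python
import re

# Toy name<->symbol table for the three pieces of `+/!100`; a real APL/K
# editor plugin would need the language's actual grammar, not this.
NAME_TO_SYMBOL = {"plus": "+", "reduce": "/", "range": "!"}
SYMBOL_TO_NAME = {sym: name for name, sym in NAME_TO_SYMBOL.items()}

def to_terse(source: str) -> str:
    """Collapse space-separated names into symbols: 'plus reduce range 100' -> '+/!100'."""
    return "".join(NAME_TO_SYMBOL.get(tok, tok) for tok in source.split())

def to_verbose(source: str) -> str:
    """Expand symbols back into names: '+/!100' -> 'plus reduce range 100'."""
    tokens = re.findall(r"\d+|.", source)  # keep digit runs together, split everything else per character
    return " ".join(SYMBOL_TO_NAME.get(tok, tok) for tok in tokens)

assert to_terse("plus reduce range 100") == "+/!100"
assert to_verbose("+/!100") == "plus reduce range 100"
```

The point is only that the mapping is mechanical in both directions, so which form lives on disk is a policy choice rather than a technical constraint.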
It's a very weak benefit. I don't know about you, but most of my time programming is spent thinking about the program, not writing it. If anything, this would only increase the time I spend thinking about how to write things (what was the symbol for reduce again?).
And it makes it easier to spot patterns in the source files. Whether someone finds that beneficial is subjective, I guess, but I think it's useful to be able to immediately recognise what something like `+/!` (a plus-reduce over a range, i.e. a sum) is doing whenever you see it in source, after having used it only once or twice.
So what would be the benefit compared to just writing `plus reduce range 100`?