
Notation is extremely important. It's basically a way of organizing how you abstract a problem. If your abstraction is "bad", certain problems become harder to solve.

It's not much different anywhere else in society. Different programming languages/frameworks/etc. are doing essentially the same thing (if you ignore execution speed): they're all Turing complete and can do more or less the same I/O. But it's still much easier to use certain tools for certain problems than others.

The right notation allows you to focus on what's important and forget what is not.
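To make that concrete, here's a rough sketch in D (my choice of language, not the parent's) of the same computation, the sum of the first ten squares, written in two notations. Both are equally "powerful"; the second just lets you state the intent and forget the loop machinery.

    import std.algorithm : map, sum;
    import std.range : iota;
    import std.stdio : writeln;

    void main()
    {
        // Imperative notation: the mechanics of iteration are spelled out.
        int total = 0;
        foreach (x; 1 .. 11)
            total += x * x;
        writeln(total); // 385

        // Range notation: only the intent (square each value, then sum) remains.
        writeln(iota(1, 11).map!(x => x * x).sum); // 385
    }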




This couldn't really be stated any more clearly[1]; well put. I'll only add that this is true for any variety of abstraction, natural language included. Abstractions encode the biases of their creators[2]. The 'power' of an abstraction comes from the set of things that are easily and concisely expressible: its primitives. This is balanced, however, by the truths that are no longer easily expressible, because the encoding doesn't allow for them. There's a certain intuition that semantics and abstraction are tightly tied in this sense; you can't really convey what something means unless it's concisely expressible in the abstraction you're using. Slang, idioms, calculus, etc.

---

[1] and yet, I guess I'll just babble on adding more words anyway...

[2] be they mathematical concepts, programming paradigms, or cultural norms and quirks.


I learned this in designing the D programming language. The original syntax for lambdas seemed natural from a compiler point of view, but from a user point of view it was awkward and pretty much unusable. Changing the syntax was a complete transformation in its usage.
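The comment doesn't show the exact before-and-after syntaxes, but the flavor of the difference is roughly the contrast between D's longer function-literal form and the later => shorthand (a hedged illustration of the kind of change being described, not the actual historical diff):

    import std.stdio : writeln;

    void main()
    {
        // Longer function-literal form: explicit keyword, braces, return.
        auto dbl1 = function int(int x) { return x * 2; };

        // The terse => lambda form: same meaning, far less ceremony.
        auto dbl2 = (int x) => x * 2;

        writeln(dbl1(21), " ", dbl2(21)); // 42 42
    }

The two are semantically identical; the difference is purely notational, which is exactly the point of the thread.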



