IMO the problem is that 'high' is a relative term: it's calibrated to whatever abstraction level the speaker is used to working at.

The other problem is that 'level' doesn't really mean anything, or it means something different for each language. Is it feature count? Abstraction potential? Memory addresses vs. objects? Static vs. dynamic typing? Closeness of fit to the machine it runs on (imagine a Lisp on a Lisp machine, or x86 on an emulator: is that high or low level)? Etc.
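
To make one of those axes concrete, here's a rough sketch of my own (nothing from the thread, function names invented): the same summation written twice in Rust, once against memory indices and once against an iterator object. Both 'levels' live in one language, which is part of why a single low-to-high ordering doesn't work.

    fn sum_indexed(xs: &[i32]) -> i32 {
        // "lower" style: explicit indices into a flat run of memory
        let mut total = 0;
        for i in 0..xs.len() {
            total += xs[i];
        }
        total
    }

    fn sum_objects(xs: &[i32]) -> i32 {
        // "higher" style: an iterator object owns the traversal
        xs.iter().sum()
    }

    fn main() {
        let xs = [1, 2, 3, 4];
        // same result either way; only the abstraction differs
        assert_eq!(sum_indexed(&xs), sum_objects(&xs));
    }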

If "level" is ill-defined, then low and high aren't relative terms; they're meaningless. Still, there is a lowest either way: machine code. High-level languages then have the programmer writing code in a model independent of the machine architecture. Those models aren't one-dimensional, so let's not bother ordering them from low to high; let's talk about their features instead.
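
As a small hedged illustration of "a model independent of machine architecture" (my example, not the commenter's): in Rust the language model, not the target, fixes integer width, and the machine only shows through where you explicitly ask for its native byte order.

    fn main() {
        let n: i32 = 0x0102_0304; // 32 bits wide on every target; the model says so
        // defined identically on every architecture
        println!("little-endian bytes: {:?}", n.to_le_bytes());
        // the one call here that leaks the machine underneath
        println!("native-order bytes:  {:?}", n.to_ne_bytes());
    }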

That's how I'd have it if it were up to me, but that isn't how language evolves.
