
One of the sections in the article is headed "study existing languages", but why not also study languages that shaped programming language history? No mention of the Pascal-family of languages (Pascal, Modula-2, Oberon), no mention of Ada. Even Algol 68 is full of ideas that would reward study.

Also, the language descriptions are too brief to be helpful (Java and C# described merely as enterprisey, C++ and Rust as being about pointers and other systems constructs).




I think it's even more important to study the history of programming languages if you want to make a new one. That history was shaped heavily by Moore's law and the fact that if you waited 18 months, your previously slow program would likely run twice as fast on newer chips. The result was that languages rose to the top by riding this exponential wave through mechanical sympathy.

Today, however, Moore's law has stalled, and a different kind of power curve is on the rise: core counts. Languages that previously saw success by being imperative and optimized for single-core execution are falling over trying to keep up. Newer languages make asynchrony and parallelism first-class citizens to harness the power of multiprocessing, while others struggle to stay relevant.

Languages of the future will be built from the ground up to harness the massive CPU core counts that will be available on consumer desktop machines in the near future. My advice to any budding language designer: stop trying to reinvent C and C++. There's already a wave of programming languages that came of age circa 2010-2020 focused on just that (Go/Rust/Zig etc.), and if you're only getting into that space now, you're late to the party. Instead, look back to Erlang and the original promise of object-oriented programming as the basis for a new language in 2022.


I expect an extreme version of this to be the future: not just many cores, but many machines. My ETLs at work run on a Spark cluster and are specified across programs organized in a separate DAG. That's the kind of "program" with lots of headroom to improve.

I’d bet future many-machine heterogeneous-resource languages will make that a lot easier.


Erlang and Elixir are already pretty strong in this regard.


Any ideas why the strengths of Erlang and Elixir haven't made their way into the mainstream?


I would say that the strengths of Erlang have made their way into the mainstream, but into infrastructure and tooling rather than languages. I believe that's because the current crop of popular mainstream languages will never forsake their imperative roots. Erlang's true strength is that it approaches distributed computing from first principles, which yields a language that feels right to work with in that domain; the mainstream instead tries to shoehorn distributed computing onto an imperative foundation, which has always been awkward because it amounts to squaring the circle.


The wiki page is rather informative in the sense of choosing which historical languages may be worth a closer look:

https://en.m.wikipedia.org/wiki/History_of_programming_langu...


Yeah, that list is glib. To include Brainfuck but not Pascal is telling.


The context of this page is that it's part of a university course, a prerequisite for which would have been already studying a lot of older languages. The examples listed there are a survey of modern languages, trying to hit as many paradigms/language design choices as possible.


...and no mention of Nim, which shares a lot with Pascal, Python, C and Ada and is more innovative.



