> but if you want to cause a so called "paradigm shift" then you'll need to understand where the progress is happening and what kinds of opportunities it creates.
This is the most Hacker News thing I read all year. What are you talking about? Where did he say anything about wanting to cause a paradigm shift?
And I remember when Python, React, and Node were all fringe, until they disrupted the industry. But the important takeaway is to choose for a specific reason, expecting traction in the future. While OCaml might be good for a research project, it is unrealistic to expect it to give you some future advantage. On the other hand, you can see how Rust can give you an advantage right now in some cases, or arguably even ReScript.
Typically, you should not attempt multiple paradigm shifts simultaneously. In fact, I would argue that the more innovative your end-user product is, the more boring your tech stack should be.
Facebook was PHP.
Google was C++.
Bitcoin was C++.
Netflix was Java.
Spend your innovation points on your product, not on your programming language.
Google is mostly Java and C++; Go sees more use outside the company than on internal projects.
Nokia Networks' customers were using a mix of C++ and Perl running on HP-UX back in 2004, and nowadays it is mostly C++ and Java running on Linux distributions. Not every telco is using Siemens' Erlang-based switches.
Apple created Clascal and Object Pascal, migrated to C++, and got Objective-C via the NeXT acquisition, which had previously licensed it from StepStone. They also created Macintosh Common Lisp, HyperCard, Dylan, and NewtonScript.
Microsoft used BASIC for a looong time, dabbled with Pascal, had one of the best macro assemblers on the market, was the last MS-DOS vendor to actually add C++ support to its C compiler, and focused on VB and C++ until .NET came to be.