> what I've called the ML-ification of programming languages (static types, type inference, higher order functions, exceptions, ...) has been pervasive
It has been popular in some class of languages that directly influence one another, and less popular in others. If you look at the software world in, say, 1995, I think a larger percentage of programming was done in typed languages than today. The question of "to type or not to type" wasn't on anyone's mind because virtually all languages used for serious stuff were typed. In any event, even if there is an ML-ification in important corners of the industry, it's still nowhere near a silver-bullet level of adoption. There is, I agree, a return to the popularity of typed languages, but it hasn't yet reached the level it was at before 2000.
In fairness, the adoption of typed languages in the 1990s was about C/C++, Java, possibly Pascal, Delphi, Ada. None of those languages were built upon ML's innovations (type inference, first-class generics, first-class higher-order functions, pattern matching, etc.). I conjecture that the rise of dynamically typed languages was primarily because the aforementioned typed languages adopted an awkward approach to typing (no inference; subtyping + casting to implement both ad-hoc polymorphism and generics; or, in the case of C++, generics by compile-time meta-programming, which has been awkward in other ways).
So 1990s types were a worst-case scenario: not expressive enough for many practical cases (you had to cast a lot anyway), so the safety you got from typing was limited; syntactically heavyweight; and with bad error messages. Pointless trivial type annotations like
A a = new A(...);
are truly grating, but ubiquitous in the typed languages popular in the 1990s. No wonder people preferred Python ...
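To make the contrast concrete, here is a minimal sketch in OCaml (an ML descendant) of the same kind of code with inference, pattern matching, and higher-order functions; the shape type and function names are invented for illustration:

    (* The compiler infers every type below; the redundant
       "A a = new A(...)" style of annotation never appears. *)
    type shape =
      | Circle of float          (* radius *)
      | Rect of float * float    (* width, height *)

    (* Inferred as shape -> float; pattern matching replaces
       casts and instanceof-style checks. *)
    let area = function
      | Circle r -> Float.pi *. r *. r
      | Rect (w, h) -> w *. h

    (* A higher-order function, parametrically polymorphic
       underneath; inferred as shape list -> float. *)
    let total_area shapes =
      List.fold_left (fun acc s -> acc +. area s) 0.0 shapes

    let () =
      Printf.printf "total: %f\n" (total_area [ Circle 1.0; Rect (2.0, 3.0) ])

Not a single type annotation anywhere, yet the whole thing is statically checked.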
I’m not sure why you start in the 90s; plenty of people were using C and Pascal in the 80s.
Are you sure there was a big rise in dynamic languages at any particular point? I’m not so sure there really was (but I’m interested in documentary evidence if you know of any).
I always figured dynamic languages became more popular as computers got bigger and faster, so efficiency became less important in most scenarios. But that’s not the only trend -- plenty of people programmed in BASIC in the 80s, on tiny 8-bit computers! It was slow as hell, but P-code can be very memory efficient, so even on a tiny slow computer dynamic languages can make sense.
(I count BASIC as dynamic even though it lacks useful abstraction mechanisms because most dialects manage memory for you automatically.)
Edit to add: maybe Forth is a better example (and a much better language!) although I don’t think it was ever as popular as BASIC.
Lisp is from the 1950s, dynamically typed and one of the most influential languages of all time. Lisp introduced GC, the single biggest advance in programming languages (but it took 30 years to make it fast). I am not trying to argue that dynamically typed languages became popular in the 1990s.
If anything, I see ML as the (spectacularly) successful attempt at typing Lisp to increase reliability. Remember Milner was at Stanford in the early 1970s; surely he will have conversed with McCarthy. Milner's early theorem provers were in Lisp. He invented ML to reduce the pain of using Lisp (remember, ML stands for "meta language"), and to make theorem proving less error prone. More precisely, to reduce the TCB (= trusted computing base) in theorem provers through types (viz. the famous Thm abstraction).
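For readers unfamiliar with the Thm abstraction: the LCF idea is that theorems form an abstract type whose only constructors are the inference rules, so only the small kernel module must be trusted. A toy sketch in OCaml (the two-rule logic and all names here are invented; real LCF kernels are much richer):

    module Kernel : sig
      type formula = Atom of string | Imp of formula * formula
      type thm                              (* abstract: cannot be forged *)
      val assume : formula -> thm           (* {A} |- A *)
      val modus_ponens : thm -> thm -> thm  (* from A -> B and A, derive B *)
      val hyps : thm -> formula list        (* inspect hypotheses *)
      val concl : thm -> formula            (* inspect conclusion *)
    end = struct
      type formula = Atom of string | Imp of formula * formula
      (* hypotheses |- conclusion *)
      type thm = Thm of formula list * formula
      let assume f = Thm ([ f ], f)
      let modus_ponens (Thm (h1, ab)) (Thm (h2, a)) =
        match ab with
        | Imp (a', b) when a' = a -> Thm (h1 @ h2, b)
        | _ -> failwith "modus_ponens: rule does not apply"
      let hyps (Thm (h, _)) = h
      let concl (Thm (_, c)) = c
    end

    let () =
      let open Kernel in
      (* {A -> B} |- A -> B  plus  {A} |- A  yields  {A -> B; A} |- B *)
      let ab = assume (Imp (Atom "A", Atom "B")) in
      let b = modus_ponens ab (assume (Atom "A")) in
      assert (concl b = Atom "B")

Whatever code sits on top of the kernel, any value of type Kernel.thm was necessarily produced by the rules, which is exactly how types shrink the trusted computing base.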
ML was essentially finished in the late 1980s [1]. Every responsible programming language designer could have taken on board ML's lessons since then, and it's an interesting historical question why this did not happen.
Thanks for that, I wasn’t fully aware of the history of ML! Very interesting (and very impressive that this is just one of Milner’s contributions).
I started with Haskell, where the main influence besides ML was Miranda. (If I remember correctly, Haskell was only created because Miranda was proprietary; similar to BitKeeper and Git.) I guess Miranda’s main innovation was lazy evaluation. That has certainly been influential but outside Haskell I don’t think it’s ever had widespread adoption in the same way as ML-style typing and type inference.
*Every responsible programming language designer could have taken on board ML's lessons since then, and it's an interesting historical question why this did not happen.*
Agreed! But maybe it is happening, it’s just that it’s taken 30 years instead of the 5 or 10 one might have expected?
Yes, Miranda was very expensive, had an onerous license and only ran on Unix, that's why researchers felt the need to create a free and open alternative.
Lazy evaluation was invented multiple times. First in theoretical studies of the lambda calculus by Wadsworth in 1971. Later, Friedman & Wise, and independently Henderson & Morris, proposed a lazy Lisp, and Turner (who later did Miranda) gave SASL a lazy semantics (SASL was eager initially). All three were from 1976, so it was clearly an idea in the air. SASL later evolved into Miranda. Another of Miranda's innovations was to use combinators for graph reduction to implement the Miranda run-time.
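To see what the lazy semantics buys, here is a small sketch using OCaml's Lazy module (chosen purely for concreteness; Miranda and Haskell make this behaviour the default): a suspension is evaluated at most once and its result shared, which is the sharing that combinator graph reduction provides.

    (* The body of the suspension runs at most once; the result
       is memoized, so forcing twice prints "computing..." once. *)
    let expensive : int Lazy.t =
      lazy (print_endline "computing..."; 6 * 7)

    let () =
      print_endline "before forcing";   (* nothing computed yet *)
      Printf.printf "first:  %d\n" (Lazy.force expensive);
      Printf.printf "second: %d\n" (Lazy.force expensive)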
Which is double “damning” because plentiful memory means more objects, and often deeper interactions, which means the likelihood of getting object lifetime exactly right declines. Not only does manual management become less necessary, it also becomes less tenable.
Hmm, I’m not sure you’re right about the historical trends. As I see it, the static typing battle has been going on forever. Each style has been popular in certain areas but neither has ever been completely dominant.
JavaScript, Python and Ruby are the big dynamic languages today. But you can go back as far as you like, to when languages like Perl, Tcl, Lisp and BASIC were big. There have always been dynamic languages.
Static languages have gotten more robustly type-safe over time (even today’s C is much safer than K&R C). But apart from that shift, it seems to me that the static vs dynamic landscape has been much the same for decades.
I'm not sure why you're including BASIC in with the dynamic languages -- in its classic microcomputer form, it's pretty well limited to int or float, and string, and arrays of those things. When you make a variable, you always define its type (A is a double, but A$ is a string).
Some versions are a little bit richer in that you can have both ints AND doubles (woot, woot!).