Lisp has a profound effect on the business model because it can give you a dramatic productivity boost (like an order of magnitude or more), but it makes staffing harder: there aren't many experienced Lisp programmers because very few organizations use it, and very few organizations use it because staffing is hard, a vicious cycle. But this is precisely the sort of situation where, if enough people simply changed their minds, that by itself could change the underlying reality.
It's not just Lisp. There's a similar thing happening today with Rust, which is clearly superior to C from a technical point of view, but which very few people use simply because there are very few people using it. But Rust might be one of the rare exceptions where the technical superiority is enough to allow it to break this cycle.
> An order of magnitude "or more" is an extraordinary claim. The evidence just isn't there.
Let me be clear: I am claiming that these kinds of productivity gains are possible, not that using Lisp will automatically give you a 10x improvement under all circumstances. And yes, I can give you concrete examples of demonstrable >10x productivity improvements which resulted in products succeeding where they otherwise would undoubtedly have failed. These are generally found in niche applications where a lot of domain knowledge needs to be brought to bear. So you're not going to see big wins in, say, commodity consumer products, which is why the wins don't get much press. But the evidence is definitely there if you look in the right places.
> I can give you concrete examples of demonstrable >10x productivity improvements which resulted in products succeeding where they otherwise would undoubtedly have failed.
Well, I would definitely be interested in those examples. I occasionally write Lisp (admittedly, Emacs Lisp rather than Common Lisp) and while I appreciate having macros and other metaprogramming tools at my fingertips, I've never encountered a situation in which their use was critically important. I can always replicate the thing I want to do in Python with a bit of boilerplate; if I had to choose, I would certainly take Python's huge ecosystem over Lisp's metaprogramming. Frankly, I don't think there have been any language silver bullets since structured programming and garbage collection. So I'm very skeptical of claims of extreme Lisp productivity. I'm open to being convinced otherwise though.
I think a couple factors are at play here. First, most developers never really learn metaprogramming or use it, even in languages with native facilities for it. You don't need it to get the job done, strictly speaking, and it is a difficult skill to acquire. Second, many software applications don't benefit that much from metaprogramming even when you have those skills. The benefits aren't universal, which brings the costs into question.
Nonetheless, for some types of software, writing code without metaprogramming will have several times the LoC, complexity, etc. of the equivalent with metaprogramming. But if you never developed metaprogramming skills, you are unlikely to recognize these opportunities when they arise. In those cases, you do see large productivity multipliers. I see this pattern all the time in C++; most C++ developers have no idea how much concision (and type safety) metaprogramming enables in contexts where it is perfectly suited for the job, because they never learned metaprogramming in C++, so they write vast amounts of brittle boilerplate instead.
I've used metaprogramming in enough languages and contexts to recognize it as solving a broad class of problems in a general way, but you still want to pick your moments because it isn't free. Similarly, garbage collection is the right choice for many software applications but it isn't free and there are contexts in which garbage collection introduces far more complexity than is justified by the benefits.
Recognizing these situations and being able to take advantage of them is a market opportunity.
The big wins are in niches that involve a lot of domain-specific knowledge. The two best examples that I was personally involved with were the NASA Deep Space One Remote Agent and the Meta chip design tool from Barefoot Networks (acquired by Intel in 2019). In the former case, an attempt was made to do the implementation in C++, which failed outright. In the latter case you can do a pretty direct apples-to-apples comparison of the design cycle time relative to off-the-shelf design tools. Meta lets you iterate in minutes on what would take hours using standard tools. (To be fair, Meta does not do everything that the standard tools do, and before you can tape out you have to do a few iterations on standard place-and-route and timing verification. But it's still a huge win over using those for the entire design.)
I have experience with EUROPA, which has gone on to have a long life independent of RAX (e.g. I believe it's still used via SACE for the ISS solar arrays). EUROPA is C++, not Lisp, even though Lisp isn't an unusual choice in AI planning.
I don't doubt that RAX used Lisp, but they replaced it with C++ within a few years, somewhere between HSTS and the second generation of EUROPA. The C++ versions are the ones that have been in 'production', so to speak, for a couple decades now. The early 2000s Mars Exploration Rovers might've been on an old enough version to still be running Lisp, though.
That's a good question without an easy answer, but there are two leading theories. One is that languages are infrastructure and it's really hard to replace infrastructure once it gets established (look at how much time it's taking for electric cars to replace gas-powered ones). The other is that Lisp's productivity boost allows individuals to get things done by themselves and so it tends to attract people who aren't good at collaborating (the famous "Lisp curse"). So on an individual level it's a win, but at an organizational level it might not be unless you manage it very carefully.
> So on an individual level it's a win, but at an organizational level it might not be unless you manage it very carefully.
This is why I often choose Go over other languages. It's a bit easier to stay on the rails since it is so restrictive (at the cost of repetitive, explicit verbosity).
Historically I think there's a very strong case that Symbolics did outperform others in software productivity, especially in graphics. They had an ability to wade into certain domains and produce shockingly competitive products, which really should not have been possible.
But I also think Lisp leads to spectacular burnout as I think it imposes a greater cognitive requirement on the part of the developer.
> But I also think Lisp leads to spectacular burnout as I think it imposes a greater cognitive requirement on the part of the developer.
I have never heard of this as something specific to Lisp.
> there's a very strong case that Symbolics did outperform others with their software productivity
Symbolics used object-oriented programming in the form of Flavors. It had an integrated, GUI-based development environment. Development was mostly incremental: you could write some piece of code and run it immediately inside the application under development. Everything effectively ran in debug mode, without needing an explicit debug mode. The whole system, the whole documentation, and the whole source code were always available. The applications were large, but nothing shockingly large by today's standards.
But there were also downsides. Mostly that hardware outside the Lisp-machine world developed much faster, and the software had to move there.
First the Symbolics graphics suite was ported to SGIs running Allegro CL, then also to Windows NT systems. A refresh of the graphics suite was developed under the name Mirai.
I developed my first 3D graphics system on a Symbolics back in the mid-80's, before Symbolics released their S-Geometry package.
Though the software was powerful and made use of the extensibility provided by Lisp, it failed to gain much traction in the industry because the hardware was not suited to 3D graphics. Silicon Graphics (SGI) workstations took that market by the early 90's.
I do think it's the relatively superficial uniformity of Lisp, regardless of the level of abstraction at which you're operating, that eventually makes things crack.
My hunch is that top-performing Lisp devs are capable of holding more of this in their heads at once than mere mortals. A key benefit of more conventional languages is that you can limit the scope of your concerns and reasoning (within limits); however, that also prevents you from making whole-system optimisations, which is notably what Symbolics (at least in graphics) outperformed everyone else at. What the Symbolics devs achieved seems to me only viable with an unusually capable team focused on a specific domain for a long time, without too much diversion or distraction. The bus factor must have been horrific.
This thread made me go off and look more into this, and I couldn't help thinking the software industry has really lost something with the commercial failure of these very deeply specialised, incredibly high-end tools (ICAD was the other). It's odd how people could justify spending huge amounts on the hardware just to run the software, but didn't want to pay for the software once it became possible to run it on just about anything; and so the software stopped being developed.
> Lisp ... can give you a dramatic productivity boost
... if you are smart enough. If it were that easy to get more productivity, everyone would be using Lisp. But you need to hire very smart developers to get that productivity, and most developers are by definition average. Your average developer will be frustrated, not more productive, with Lisp.
Those very smart developers would be just as effective in a mainstream language, and probably more so in an appropriately chosen mix of languages. It seems like the real advantage of Lisp is that it self-selects for smart people who program for fun, and those people typically are also good developers.
The difference I would point out is that Rust has corporate sponsorship. I don't recall any large corporation sponsoring e.g. CMUCL/SBCL to the same level.
There are plenty of counterexamples here, with languages that had corporate sponsorship but did not succeed (e.g. Go) and vice versa (Perl, Python).
In the case of Lisp, it was done in by two things: AI winter, and the fact that the Lisp community was never able to organize itself. This is the famous "Lisp curse": it is precisely the fact that Lisp is a productivity multiplier that seals its fate because it allows individuals to get things done without collaborating.
I always thought this essay was a great take on Lisp(ers), but it's interesting how statements like that can turn a couple of negative events into a self-perpetuating state of learned helplessness.