If Lisp is so great (mihaiolteanu.me)
33 points by molteanu on Jan 26, 2024 | 58 comments



I'm afraid that most people will just read the title and fill in the rest with whatever ideas they already have about why Lisp is not mainstream.

I read the whole article and it's a lot of words to simply say: there are a lot of factors at play, and it could be a mix of those factors.

Beyond that, the article does not seem to give any specific insights. It provides many loose analogies to show how something good might not be mainstream in other walks of life, but it does not go on to elaborate on what it is, specifically, about Lisp.

So I guess I am trying to say I don't know what to do with this article. The talking points are all common sense. And there's nothing specific about Lisp I can learn from this article.


I'm not even sure all the points are common sense.

Is he saying that English is better than Chinese? Or that using a different kind of keyboard for Chinese would be better? Or that the author believes that only English uses the Latin alphabet?

What point is he trying to make about golden crowns and Caesar having nothing to conquer? That using Lisp would be pointless if everyone used Lisp because it's only useful for gloating about using the best language?


This article is not about lisp. It is about values and judgement.


The main factors are alien syntax and extremely deep abstraction levels, leading to cryptic/dense code like APL but with nested lists instead of arrays.


I’d argue modern languages have taken many of the attributes of LISP, not least the garbage collector, development tools, higher-order functions, etc. The difference is that other languages use parsing technology based on Chomsky’s grammar whereas blub languages like Lisp can’t see the value in it. But I’d say if you are programming in Java or Python you are most of the way to LISP from Pascal or C.

For that matter, Common Lisp had what was probably the first modern language specification, one that, as much as it pretended to be machine independent, was carefully designed to be implementable on the 32-bit microprocessors that were then coming online. There was a time when I thought Java was the first programming language to be specified by adults, but now I know they were following in the footsteps of Common Lisp.


> I’d argue modern languages have taken many of the attributes of LISP, not least the garbage collector, development tools, higher-order functions, etc.

Having features of a given language doesn't make something a full or even partial replacement for that language.

"It's not what languages do, it's what they shepherd you to" - https://nibblestew.blogspot.com/2020/03/its-not-what-program...


Raku has metaprogramming and an almost unprecedented amount of syntax... and an insanely slow implementation. Although they're chipping away at the speed thing.


Metaprogramming will struggle to go mainstream. It's the way you shrink a 50,000-line program down to 1,500 lines that nobody can understand but the author. I absolutely love doing it, and I think I could hand the code off to somebody and have them make small changes in the DSL code, but if they have to change something in the implementation to extend the DSL, all bets are off.
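
A toy sketch of how this happens, in Common Lisp (a hypothetical macro, not from any real codebase): one small macro buys a lot of brevity at the call sites, but anyone extending the DSL must understand the whole expansion.

  ;; Hypothetical DSL: define a struct plus a validated constructor in one form.
  ;; Each field spec is (FIELD-NAME PREDICATE).
  (defmacro defrecord (name &rest fields)
    `(progn
       (defstruct ,name ,@(mapcar #'first fields))
       (defun ,(intern (format nil "MAKE-CHECKED-~a" name))
           (&key ,@(mapcar #'first fields))
         ;; Validate every field against its predicate...
         ,@(loop for (field pred) in fields
                 collect `(assert (funcall ,pred ,field) (,field)))
         ;; ...then delegate to the DEFSTRUCT-generated constructor.
         (,(intern (format nil "MAKE-~a" name))
          ,@(loop for (field) in fields
                  append `(,(intern (symbol-name field) :keyword) ,field))))))

  ;; (defrecord invoice (amount #'numberp) (customer #'stringp))
  ;; Call sites shrink to (make-checked-invoice :amount 10 :customer "ACME"),
  ;; which is pleasant until someone has to change DEFRECORD itself.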

After all these years we still don't have macro assemblers as good as what the IBM 360 had even though we now have architectures that have enough registers that it would be reasonable to pass a register name as an argument to a macro like you could do in IBM Macro Assembler.


As with most scripting languages (e.g. Python), execution speed was not the top priority - in fact, like Python, most heavy lifting can be / is done by modules written in native C code, or Rust or similar via the C FFI / Inline::Perl interfaces.

To your point, I recently measured the compile time of this raku module...

  # speed (2020)
  # use Physics::Measure :ALL; ...13s first-, 2.8s pre-compiled

  # speed (2024)
  # use Physics::Measure :ALL; ...4.4s first-, 0.9s pre-compiled

... so about a 3x speed up in the last 4 years.

Also raku has no GIL and has good support for hyper / race, so it can get a lot out of your 32 cores (if you want speed).

Another raku module worth mentioning (Dan::Polars) connects to the Rust Polars library via FFI, thus getting Rust-level execution speed, since Polars is a lot faster than Python Pandas thanks to the underlying Apache Arrow data structures. This takes about 2s for the raku to compile and about 15s for the rust cargo stack to compile.


When was the last time you checked performance? :-)


I actually did make a quick search to see if I was blowing hot air, and found this blog post that shows a bunch of benchmarks over time with a fairly typical Raku/Perl flavored text processing task, and it was taking 0.23s for the July 2022 release vs. 0.59s in Jan 2016 [1].

So that's a pretty impressive improvement--roughly 3x over 6 years--but I remember Raku being something like 4x or 5x slower than Python on benchmarks from the last few years, so by my very sloppy math it still has to speed up by at least 2x or 3x to match Python.

It's also possible that there has been a ton of speedup in the last year-and-a-half since that benchmark, or it's not representative, but that's where I got the idea from.

[1] https://blogs.perl.org/users/sylvain_colinet/2023/01/benchma...


> blub languages like Lisp

I see what you did[0] here.

0: https://wiki.c2.com/?BlubParadox


And to me the parsing aspect of lisp is key. It means the language doesn't have to resort to external tooling to improve. Most of the programming world defaults to increasing the number of things you need in order to do more, while lispers seem to favor absorbing concepts into the language as the main strategy (see Guy Steele's "Growing a Language" talk, IIRC).


And you wind up overloading a small number of syntax constructions for an endlessly expanding number of uses, and soon you're in a twisty maze of parentheses that all look alike. Look at how HP calculators got trashed by TI.

I'd grant that parser generators still suck; people still act like you're crazy when you say you want to be able to write one grammar and automatically generate not just a parser but an unparser. (I could do amazing stuff with Sphinx if only it supported RST output as well as schemaless RST.)

CASE tools in the 1990s could parse code, let you edit it in a GUI, and make a clean edit to the code (not mess up comments, whitespace, or ordering that matters only to your version control tools) like a professional programmer would. That's still like something that fell off a UFO.

You should totally be able to compose two grammars. I ought to be able to stick a SQL query right into the middle of a program in Java or any other language and have it parsed to an AST. If parser generators were sane I could add

  unless(X) { Y }
to a language like Java and add a method that rewrites it to

  if(!X) { Y }
and it shouldn't be more than 50 lines of code including imports and ceremony: just a patch to the grammar (AST objects ought to be code-generated from the grammar), a simple rewriting function, and telling the system where to find the grammar patch and the new function.
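
For contrast, the equivalent unless-to-if rewrite in a Lisp is a couple of lines, because the reader already hands you the tree. A sketch in Common Lisp (which of course has UNLESS built in already):

  ;; The "grammar patch" and the rewriting function collapse into one form.
  (defmacro my-unless (test &body body)
    `(if (not ,test) (progn ,@body)))

  ;; (my-unless (zerop n) (process n))
  ;; expands to (IF (NOT (ZEROP N)) (PROGN (PROCESS N)))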

Parsing Expression Grammars are a step in the right direction, but for a PEG parser to be really revolutionary it needs a few features I've yet to see in one; in particular, there has to be some easy way to specify operator precedence, either numerically or with a set of statements like

   Closer(*,+)

I am mostly disappointed with the PEG parser in Python because it falls short of those revolutionary promises, but you always get "fooled again" with parsing because people care about how fast their compilers are to the exclusion of almost everything else.


The twisty maze of parens I never understood. (And the fact that TI won is probably, imho, only vaguely related to syntax; kids barely use calcs anyway. I'd argue that if you showed them RPL's immense programming surface (lists, vectors, lambdas), they'd use it more than the usual fixed-function graphing calc.)

I never had a chance to see proper CASE tools; my education started in 2000. Java/UML took the limelight, and the few environments I saw were very, very subpar (post-IBM, Eclipse-based, the Rational suite).

Grammar composition is still a big open problem. I think Laurence Tratt tried to attack it (maybe this: https://soft-dev.org/pubs/pdf/diekmann_tratt__parsing_compos... ?) but concluded it was still fragile/difficult.


> Look at how HP calculators got trashed by TI.

That had almost nothing to do with syntax and almost everything to do with lobbying.

HP was not an "approved" calculator for all manner of standardized tests while really shitty TI ones were.

HP calculators were everywhere in my engineering department from 1988 to 1992 (the HP-28S and then the HP-48SX went through them like wildfire).


And they're still both amazing pieces of hardware and software, and very much prized by people today. They fall in a totally different space of immense possibilities.


My point is that they were objectively better than anything TI was producing in the calculator space for a very long time--the evidence is that engineering students at a state school were forking over what were non-trivial amounts of money at the time for them.

The problem was that places like the College Board allowed calculators in 1994. That meant that an "approved" list of calculators appeared and for a long time the HP programmable calculators were considered "too powerful" and disallowed. That "approved" list propagated to other things that used calculators in official capacities.

That absolutely killed HP, as the volume all went to TI: everybody bought TI calculators, and nobody was going to then spend on an HP calculator that might not be allowed on their test.

By contrast, look at business and finance majors, where lobbying didn't create an "approved" list--you will pry an HP-12C from their cold, dead hands.


Yeah, I understand. I wonder what would have happened if schools had a 50/50 blend of both brands. Educationally speaking it would have propelled a lot of things forward... but I guess educational institutions weren't able to foresee that much of a leap.


I keep trying to dedicate time to getting comfortable with lisp, and it hasn't happened yet. So I just live vicariously through posts about lisps.

One thing that stands out about them is that they're all so happy. Try it. Search up an HN post about a Lisp. They'll be using words like "joyful".

So my theory is that while lisp may have plenty of technical merits, part of why it's so great is that it's typically being used by people who are having fun.

That's not to say that it would perform poorly if used under duress, but maybe there's some wisdom in not putting that experience that you enjoy anywhere near drudgery, lest it become contaminated.


Here's what my company experienced with lisp.

We had a small group (~20 devs all together) that decided they wanted to do a lot with clojure. So they started several projects throughout the company and, for the most part, they all very much enjoyed using clojure.

However, as those projects shifted into maintenance mode and that group moved on to greener pastures, we were left with a bunch of projects that were inscrutable. The big issue we ran into is that lisp LOVES to give programmers the ability to metaprogram and programmers love metaprogramming. However, when it comes to maintaining a metaprogrammed monster... that's basically just learning a new language used by nobody but this single project.

The end result was that people dreaded taking charge of lisp projects. They were hard to ramp up on. Hard to maintain. And they had really strange and hard-to-diagnose bugs. We've since spent the money rewriting most of the projects in java. I believe there are 2 that were simply too big and complex to rewrite.

My takeaway is that lisp is much like perl. When you know what you are doing it can be a lot of fun, but heaven help the person that has to later read what you wrote.


> programmers love metaprogramming

and (many of them) hate writing documentation.

> that's basically just learning a new language used by nobody but this single project

and there is no source to learn the language from, other than the "self-documenting" code, which often isn't.

I am not opposed to using abstraction. Sometimes abstractions are powerful. But the more abstract you go, the more effort you need to spend to make sure you communicated it properly to the rest of the team -- including its future members.


Are you sure you didn't just Google up the Lisp Curse and use it as an excuse to avoid engaging what was mostly idiomatic Clojure?

Rampant metaprogramming is not that common in Lisp codebases, and what of it there is tends to be pretty shallow, like bits of syntactic sugar here and there.

Do you have some example of the inscrutable metaprogramming?


I have a few examples of it.

One dev came up with their own macro that linked together the http client and json parsing into one magic macro which allowed them to also format the json, log it, and a bunch of other conveniences, mostly for that dev. Unfortunately, it was highly tailored to where it existed (with hard-coded params and such). Its invocation ended up looking something like `(ht-j g foo 'log')` (because this dev also liked shortening everything).

Another example of this: some devs wanted to make a super generic data fetching... thingy... so they came up with this wild macro where you could feed in http urls and a custom configuration and it'd expose calls to those downstream services. It allowed for a sql-like interface into several different microservices. Interesting in concept, impossible to maintain. That project was ultimately scrapped.

I was unaware of "the lisp curse" before my comment. This was just what my company experienced.


Something like (ht-j g foo 'log') could be a function. Someone could do that in Javascript as htj(g, foo, 'log') or whatever, which is no more or less clear. You wouldn't have a code generation step to deal with, but you can expand macro calls in situations when the macro body isn't readable.
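
A minimal sketch of both options in Common Lisp (HTTP-GET and PARSE-JSON are hypothetical helpers standing in for whatever client and parser the real project used):

  ;; As a plain function: no code generation step at all.
  (defun ht-j (client url &optional log-tag)
    (when log-tag
      (format *trace-output* "~a: fetching ~a~%" log-tag url))
    (parse-json (http-get client url)))

  ;; Had it been a macro, you could still inspect a call site:
  (defmacro ht-j-macro (client url &optional log-tag)
    (declare (ignore log-tag))
    `(parse-json (http-get ,client ,url)))

  ;; (macroexpand-1 '(ht-j-macro g foo "log"))
  ;; => (PARSE-JSON (HTTP-GET G FOO))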


I spent quite a bit of effort seeking Lisp Enlightenment (TM) because of the happy people using Lisp that you mention[0]. The fact that Lisp is not used more commonly for practical applications is actually a great source of frustration for me now, and I think I share that experience with many other people who enjoy using Lisp.

The thing about Lisp is that when you get used to it, you realize how needlessly obtuse so much of programming is today. So much effort has been spent making programming harder and harder to understand.

At this point I truly think that if Lisp had won, the world would be quite a different place, for the better. We live in a world where the average person thinks that a terminal emulator is an error window, and coding is something they identify as not being able to do. Lisp is so nice because when you understand the (extremely simple!) grammar, any vocabulary is correct. You just have to make sure that an implementation of that vocabulary is defined in terms of the vocabulary provided by the system. This is to say: the language lets you express things with the exact fidelity that you think them. That's why it's so appealing. I think that if Lisp had won, general programming would be as approachable as using a graphing calculator.

[0] FWIW I found it in the SICP lectures.


I have met so many programmers who travel from job to job looking for functional programming enlightenment like an itinerant martial artist from a manga, never finding it. I've seen cults around Lisp and Haskell lead so many coders into the wilderness.


reflects on all that code written for a paycheck

Someone please lead me into the wilderness.


> This is to say: the language lets you express things with the exact fidelity that you think them. That's why it's so appealing.

Different people think differently. Having to decipher someone else's thoughts is less appealing than expressing your own.

Some people are great thinkers, and their thoughts-made-code are elegant. But there is no guarantee that your colleague who started a project that you join three months later is one of them.


People do think differently, so they should be allowed to. The problem of deciphering your colleague's thoughts should be solved by abstractions and modules.

The goal of programming on Lisp is to define a new vocabulary to further build on. If you don't like the vocabulary defined by your colleague, you can easily use (or define) a different one at the same level of abstraction.


Programming in anything builds a new vocabulary. In Lisp, most of your vocabulary should be ordinary entities like functions, classes and methods. Not macros.

It's not a goal of any reasonable Lisp developer to write as many macros as possible.

The macros that do get written don't have to be earth-shattering new languages; it's okay to just make some syntactic sugar needed only in one file and whatnot.

I wrote an accounting system for my self-employed business activities. In the entire codebase, there is one macro:

  time.tl:43:(defmacro def-date-var (name . date-range-val-triplets)
def-date-var defines a global variable whose value depends on the date. Not the current date, but the date established by context of a financial calculation. E.g. if we are adding a new transaction across the ledger, which represents an invoice to a customer, the code doing that would install the transaction's date as the date variable. Then all the date vars referenced will take on the value from that date. This is useful for things like tax rates and whatnot.
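
A rough sketch of how such a mechanism could look in Common Lisp (the original is TXR Lisp, and everything below is a guess at the idea, not the author's implementation):

  ;; Date established by the surrounding calculation, e.g. a transaction date.
  (defvar *context-date* nil)

  (defmacro def-date-var (name &rest date-range-val-triplets)
    "Define NAME so a reference yields the value whose START..END range
  contains *CONTEXT-DATE*. Triplets are START END VALUE (universal times)."
    `(define-symbol-macro ,name
         (date-var-value
          (list ,@(loop for (start end val) on date-range-val-triplets
                        by #'cdddr
                        collect `(list ,start ,end ,val))))))

  (defun date-var-value (triplets)
    (loop for (start end val) in triplets
          when (and *context-date* (<= start *context-date* end))
            return val
          finally (error "no value for date ~a" *context-date*)))

  ;; (def-date-var vat-rate
  ;;   (encode-universal-time 0 0 0 1 1 2019)
  ;;   (encode-universal-time 0 0 0 31 12 2023)
  ;;   19/100)
  ;;
  ;; (let ((*context-date* (encode-universal-time 0 0 0 15 6 2021)))
  ;;   vat-rate)   ; => 19/100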

Everything else in the program was done without custom macros. Of course macros from the programming language are used; you can't do anything without them.

I could have made a mini language for, say, defining an invoice. But nope, it's just normal syntax for constructing a new object.

You should almost never write a Lisp macro to do something a function can do. This is a FAQ for Common Lisp:

https://www.cs.cmu.edu/Groups/AI/html/faqs/lang/lisp/part1/f...

      - Never use a macro instead of a function for efficiency reasons.
        Declaim the function as inline -- for example, 
          (DECLAIM (INLINE ..))
        This is *not* a magic bullet -- be forewarned that inline
        expansions can often increase the code size dramatically. INLINE
        should be used only for short functions where the tradeoff is
        likely to be worthwhile: inner loops, types that the compiler
        might do something smart with, and so on.

      [...]

      - Don't define a macro where a function definition will work just
        as well -- remember, you can FUNCALL or MAPCAR a function but 
        not a macro.
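
That last point in a couple of lines of standard CL:

  (mapcar #'evenp '(1 2 3 4))   ; => (NIL T NIL T) -- EVENP is a function
  ;; (mapcar #'and '(t nil))    ; error: AND is a macro, #'AND is illegal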


  The source code of the Viaweb editor was probably about 20-25% macros. Macros are harder to write than ordinary Lisp functions, and it's considered to be bad style to use them when they're not necessary. So every macro in that code is there because it has to be. What that means is that at least 20-25% of the code in this program is doing things that you can't easily do in any other language.
From: https://paulgraham.com/avg.html


I've committed most of my errors in a terminal window, so yes, it is an error window.


>One thing that stands out about them is that they're all so happy. Try it. Search up a HN post about a Lisp. They'll be using words like "joyful".

Feels like it is relative

If you spent your whole life in C, then anything sane will feel joyful.


I rather enjoy python, but somehow the culture is different. We don't spend much time talking about how much fun we're having. (I assume I'm not the only one having fun.)


On a good day I find Java is joyful.


Also related: https://www.youtube.com/watch?v=QyJZzq0v7Z4

Network effects likely explain a big part of it. Many of the "popular" languages today aren't necessarily great languages, nor were they adopted for their technical merits. Often they were the only language available on an exclusive platform everyone wanted to develop software on. Other times it was because folks wanted to write a particular kind of application and there just happened to be this framework in this weird language that made the task easy.

We probably have more programming languages now than ever before but I think the market for new languages in the mainstream is relatively static. It shifts... but very slowly.


Reluctant to say this in the current climate of "yay static typing", but I am starting to find some things easier to express in SML than in scheme. The type-driven pattern matching is a really big thing.

Currently liking lisp for exploratory programming and ML for precise communication of fully formed ideas to the machine.

Anyone happen to know a lisp/ML pair that target the same VM/IR for seamless interop between the two?

(in this context ML is metalanguage, ideally StandardML, though ocaml/miranda/haskell are the same sort of idea)


What about Coalton?


What a long and meandering post just to say "sometimes practical and theoretical needs are different"


Raku (www.raku.org) does a surprisingly good impression of Lisp:

  (sub (:&is-even = (sub (\n)
                      {([or] ([==] 0, n),
                             (is-odd (pred n:)))}),
        :&is-odd =  (sub (\n)
                      {([and] (not ([==] 0, n)),
                              (is-even (pred n:)))}))
     {is-odd 11})()
https://www.codesections.com/blog/raku-lisp-impression/


It's quite an entertaining rant, but the scattergun approach means it's one of those things where it's impossible to address the points made because there are so many. But to just look at the first paragraph:

"There is only so much space available in a city."

Is there a limited number of possible LISP practitioners in the world?

Looking at the broader point: yes, there are a lot of factors to take into account, but perhaps we should be looking at the factors that specifically involve LISP.


Well, yes, and the discussion about the "factors that specifically involve LISP" has been going on since forever with no end in sight. So maybe there are at least some extra forces involved here. Who knows. I'm just exploring.


I actually really liked the rhetoric of the essay. The only problem is that the title may imply that this essay is about Lisp, while it's more about the ambiguity caused by words like "best".

The key takeaway of the essay is this: when someone questions why something supposed to be "best" isn't widely used, the answer is not solely technical. It also has sociological, philosophical, economic, and political aspects.


Indeed, it is. And yes, Lisp is only marginally relevant to the point being made. It's just that the question "if X is the best, why hasn't it won" and similar variants of it spring up so often around Lisp and are not quite resolved or even close to being resolved.

The essay is thus attacking the question from a different angle. Thanks for the kind words.


Is there a modern, non-JVM lisp that cleanly threads types through the expressions? I'm thinking like Haskell or Roc, where the type definitions are optional, but can be written down for clarity.

Lisp was by far my favorite language in college, but these days the idea of sloshing around in all those untyped s-expressions just gives me the willies.


LISP is postorder evaluation of a tree. We can go further and be more mind-expanding than this!

The future I am trying to design is a language where traversal order is arbitrarily repetitive and arbitrary and corecursive and has term rewriting traversals.

I think algorithms are just reified traversal evaluation orders and joins - of algebra and mathematics.


"If X is so great, whe aren't everyone using it" implies that programming languages exist in some kind of well working marketplace of ideas where the best on their own merits win out. But it's not like that at all.


A curious counter-example to 'metaprogramming is essential' is Zig, which explicitly rejects macros and "abstraction towers"; there is no operator overloading or OOP methods.


> their querty keyboards

Is this an unintentional misspelling, or an intentional one? That it’s italicized makes me think it’s an intentional reference or joke—if so, what is it?


Unintentional. Fixed.


If you look at the examples https://en.wikipedia.org/wiki/Paradigms_of_AI_Programming I'd say that these would be a struggle to implement in FORTRAN or BASIC, possible in C or C++, easy in Python, and (for me) joyful in Java. One of the most important features is having hashtables in the standard library which the last two languages have but the first four don't.


I enjoyed the article. I wish the author put up an RSS feed so it's possible to follow and read new articles as they are published.


No static typing...


(Talking about CL) Isn't gradual typing with a typed standard library enough? Because that's kind of what SBCL provides.
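
For example, a minimal sketch of that gradual typing under SBCL (the exact warning text varies by version):

  (declaim (ftype (function (fixnum fixnum) fixnum) add2))
  (defun add2 (x y)
    (+ x y))

  ;; SBCL's type inference then complains at compile time:
  ;; (defun oops () (add2 "one" 2))
  ;; => warning: "one" conflicts with the asserted type FIXNUM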

Anyway, some other reasons:

* Baby ducks who can't get over the parentheses (which quickly become invisible to the eye; you read Lisp via its indentation).

* Baby ducks who can't get over too weird/big differences from C/C++/C#/Java/etc... like CL's compilation model.

* ML typing being /the/ fad these days.

* Low-level fetishism. I know a lot about this, since I've had my own C weenie phase (you know, the kind to spit on GC by principle) as a young university student, before turning smug lisp weenie ten years later; Tcl was actually my gateway drug into useful homoiconicity.

* Some hard technical limitations:

  * No user accessible parametric deftype.

  * No recursive deftype (so no typed lists/trees).

  * Gimped hash tables (untyped, lacking literals and thus read/write transparency).

  * CLOS being bolted on instead of truly integrated in the language; would need a JIT and something like https://github.com/marcoheisig/fast-generic-functions on system classes to go fast enough, Julia kinda does this (but it hurts my eyes).

  * Lack of LSP; no, SLIME/Sly isn't the same, as you're lacking the lexical information that allows you to complete/rename stuff.

  * And hundreds of other rust spots sometimes fixed by extensions (e.g. gray streams, extensible sequences) or libraries (loop -> iterate, trivia), very often crutches in look and feel.


I've been meaning to give Coalton a try: https://coalton-lang.github.io/20211010-introducing-coalton/ (found via a previous HN post)


Optional static typing. Always strongly typed.


We tend to hope so, but the CLHS is full of stuff like "the consequences are undefined if the value of the declared variable is not of the declared type", which gives a lot of leeway for generating code that blows up horribly.
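
A small illustration of that leeway, assuming SBCL (other implementations differ):

  ;; At high safety, the declaration is checked at runtime:
  (defun checked (x)
    (declare (type fixnum x) (optimize (safety 3)))
    (1+ x))
  ;; (checked "oops") => signals a TYPE-ERROR

  ;; At low safety, the compiler may simply trust the declaration:
  (defun trusted (x)
    (declare (type fixnum x) (optimize (safety 0) (speed 3)))
    (1+ x))
  ;; (trusted "oops") => undefined behavior; may crash or corrupt memory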



