In my long journey to make computers do what I want, spanning several decades, systems, languages and trends, I've come across a certain class of geeks.
This group would swear by Lisp. And by that I mean, they would treat Lisp as the gold standard. Sometimes, the only standard that matters. They would go to unimaginable lengths to explain why Lisp is really the purest form of computer science expression; how it sharpens your mind and elevates your skill; how it is akin to poetry, only better, because it's poetry in code.
And after all this was done, and repeated several times over, I would gently nod at this exalted madman who had been blessed by the touch of the Lisp God. And then we'd both go back to whatever language we were actually using at the time.
Same for FORTH. I don't think I've ever seen a FORTH programmer and a LISP programmer in the same room; I can't imagine the discussion terminating.
Over the years, as a left-handed person, I've come to see this as evidence of underlying diversity in thought patterns that can't simply be learned/trained around. Certain people find certain metaphors and modes of computing much easier to work with than others. Few people on either side can see this. The result is that you have people who, having felt like they were fighting scissors all their lives, suddenly discover that a different sort of scissors exists that gives better results, and start evangelizing it to everyone. The rest of the world tries it, finds it impossible to use, and concludes the weird scissor advocates are mad.
Neither the LH people holding LH scissors nor the RH people holding RH scissors are "wrong". It's just the RH people trying to use LH scissors who are having a bad time.
(The limit case of this is a few people who are using languages which are uniquely tailored to themselves and absolutely nobody else; colorforth, the late author of TempleOS, Urbit, etc)
I actually think it's a lot more about initial training and corporate sponsorship.
People would rather die/retire than change how they program. The assembly guys died/retired rather than learn structured languages. The imperative people died/retired rather than learn OOP. The OOP people are currently dying/retiring as languages shift to be much more functional.
ALL of these things have existed for 50+ years, so why did they take so long to be adopted (as people now almost universally agree these things are better)? It's all about what they are taught and use early on in their career. With the exception of 5-10% of outliers, everyone takes what they learned in school, adds a little more past that as they are learning to code, then cements their synapses into that pattern for the rest of their career.
A conflating factor here is corporate backing. If you look at the popular programming languages, only a couple became successful without a large corporate sponsor. Corporations tend to be conservative in their investments too. This creates a secondary effect where even when you encounter a superior language pattern, you cannot use it because your company isn't likely to pay you to rewrite complex libraries in a new language.
JS pushed the world so far toward functional programming because even though it was different, it was the only standard. OOP devs in the 90s and 00s would rather die/retire than actually learn JS and more functional programming paradigms. But because it was the standard, big companies like Google and MS were forced to pour in resources and make it accessible. In turn, that has led to a glut of functional features being bolted on to other languages now that a generation of programmers has adopted them.
I'd hypothesize that if Eich had made a Scheme instead of JS (as he originally intended), we wouldn't be having this debate today. It would have forced corporate backing of a lisp and of a functional language. The current crop of devs would have been introduced to both of these ways of thinking and nobody would question the merits of a lisp-like language compared to ALGOL/C derivatives that have been foisted into mainstream education and corporate support for the last 50+ years.
> The OOP people are currently dying/retiring as languages shift to be much more functional.
I don't know what programming language landscape you're looking at, but to me it doesn't look like that at all. I grant you that more and more languages are getting functional bits glued on. That doesn't make them FP languages, though. And my perception is that they aren't being used as FP languages. They're being used as other-paradigm languages that can do a bit of FP when desired. If people actually wanted an FP language, they'd switch to one. And they aren't.
> This creates a secondary effect where even when you encounter a superior language pattern, you cannot use it because your company isn't likely to pay you to rewrite complex libraries in a new language
I think the software dying or retiring is perhaps a more important factor than the developers, who can be flexible if the money/kudos is there for them. Rust has been picked up in various places to "forcibly retire" C/C++ programs which have had too many CVEs in their lifetimes; the only reason they were not retired earlier is the lack of a language with the required properties. If a bug-free, highly performant implementation of SSL existed in Lisp, that might be a thing that encouraged people to take a look at adopting it despite their lack of familiarity.
> ALL of these things have existed for 50+ years
Especially Lisp. Lisp has been around for a long time. At this point I think it's suffering from a "true communism has never been tried" argument which overlooks that people have tried it and takeoff has not been achieved, and its advocates continue to blame everyone else (as in your post) rather than engage in a little introspection or reflection.
> I'd hypothesize that if Eich had made a Scheme instead of JS (as he originally intended)
As the world turns, WASM now has a canonical format in S-expressions.
I’ve seen the programmer battles over programming paradigms in years past. They’ll pick inferior tech simply because it’s what they know.
I’d argue that alternatives to C/C++ exist, but they aren’t so C-Like, so nobody wanted to try them.
Unlike communism, Lisp has been successful at pretty much every place it’s been tried until someone showed up and demanded everyone change to their preferred language.
Ironically, WASM is the least usable “lisp” ever created because of its stack design.
> Neither the LH people holding LH scissors nor the RH people holding RH scissors are "wrong". It's just the RH people trying to use LH scissors who are having a bad time.
True, and good insight, but it doesn't mean that all scissors, or all languages, are equal. They can all perform computation, but a person with a particular language might create incredible things that another person with another language would have a hard time with.
Let's appreciate the difference in tastes, which reflects our differences in thought processes and approach to the world, but it does not mean that every language is the same, nor every person is the same. A Lisp virtuoso will build tools which are completely alien in design and operation compared to tools built by a Java virtuoso.
I really don't want to take this off-topic, but this fallacy is very present in our modern approach to human diversity, where instead of celebrating our differences, we simply reduce them to a one-size-fits-all approach. There is enough space for everybody, the result is effectively the same, but you simply can't replicate a Lisp-wielding Paul Graham in any other language.
>, but you simply can't replicate a Lisp-wielding Paul Graham in any other language.
But anyone who wants to really dig deeper will wonder about replicating which particular aspect of PG's productivity.
If the aspect they care about is the ViaWeb ecommerce site that he sold to Yahoo for a few million, then all of PG's essays[1] about (paraphrasing) "Lisp syntax being their secret weapon to agility and flexibility compared to others using inferior blub languages" ... isn't going to be convincing to others who notice that Amazon ecommerce was built on Perl/Java/C/C++ instead of Lisp. And later, Shopify was built with Ruby on Rails and PHP instead of Lisp.
This is why language discussions focusing on "superior syntax" really don't move the needle when it comes to mass adoption. Yes, Lisp's "code-as-data" is powerful etc, etc but there may be other factors that make that syntax feature not so important in the bigger picture.
Wild guess here (not a Lisp programmer). When I hear people talking about the practical benefits of Lisp, not just the nice theoretical stuff like homoiconicity and macros, I hear about interactivity. In Lisp it sounds like you can be pretty successful with a more experimental, on-the-fly development process. That might help a crack team push out features faster, but they might produce code that is proven to work more by experiment than by logic, which might mean it's harder to understand and keep building on. This leads to small Lisp shops that beat their competition in a niche but have trouble scaling it to Amazon size.
Lisp code tends to be built from the inside. Working in a REPL, modifying the system state as you go, storing code into a buffer then recompiling and loading it into the image with a keystroke. Recompiling functions as you modify them, without recompiling the rest of the file, updating the code behind some classes without taking the system down, and the interactivity you get from the tooling for debugging and error handling. It all adds up.
By itself this doesn't seem like a distinctive advantage of Lisp anymore. Lots of languages have REPLs. But I'm told that Lisp is more advanced in some way. Better debugging, breakloops, everything can be modified in-flight, etc.
Interactive interfaces (command line interfaces, etc.) have long been available: BASIC, Smalltalk, Prolog, APL, UNIX shells, etc. Some of them also used source level interpreters.
One simple difference is that the READ EVAL PRINT LOOP of Lisp works with code and data in similar ways. It makes it easy to process lists which are data and also to process lists which are programs. READ reads lists, EVAL evaluates lists & other values (and can be rewritten in itself) and PRINT prints lists. Code is lists and data is lists, too.
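A tiny illustration of that symmetry (my own sketch, in Racket, where eval wants an explicit namespace):

  ;; the same list is both data and code
  (define form '(+ 1 2 3))              ; READ gives us a four-element list
  (length form)                         ; => 4 -- treated as data
  (car form)                            ; => '+ -- still data
  (eval form (make-base-namespace))     ; => 6 -- the same list, evaluated as code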
When the REPL uses a source level interpreter, debugging can go very deep, incl. modifying the running source code. That's kind of second nature when one is interactively developing Lisp code: every error stays in the error context and provides a break loop to explore/change/repair the current state - including that each additional error in a break loop just gets us into another break loop level, another level deeper.
> A Lisp virtuoso will build tools which are completely alien in design and operation than tools built by a Java virtuoso
Crossing with the "build what, exactly?" thread: https://news.ycombinator.com/item?id=36195055 - does the special alien-ness translate into either greater user satisfaction or greater commercial success?
It is possible that there are Lisp virtuosos. It is also possible that you have to be a virtuoso to write Lisp, as is definitely the case for J. Every J program is a tiny work of art. That's why there aren't very many of them.
> you simply can't replicate a Lisp-wielding Paul Graham in any other language
You can't replicate Tolstoy in any language other than Russian, either. I'm not sure that proves anything other than the existence of unique talents that are also bound to their circumstances?
FORTH is (was?) absolutely fantastic in its niche. As an alternative to assembly when you were developing for, and probably more importantly WITH, underpowered computers, it was great. Even better, you could roll the tools together yourself with minimal effort.
Today you're rarely in a position where you have to bootstrap your development environment from almost nothing, so there are better alternatives.
But if you're stuck on a desert island with a computer and a CPU databook, bootstrap a FORTH compiler and then use that to make your LISP. :)
As someone who has been learning forth over the last like... 2 weeks... yeah. Somehow my brain was like 'this', and it just sorta clicks really nicely in a way that no other language has so far. I also really like Lisp fwiw, so if you see me in a room, you've seen a Lisper and Forther in the same room ;)
What's interesting to me is that I see Lisp and Forth as extremely similar-in-spirit languages, though FORTH is definitely "lower level" (pointer arithmetic everywhere) than Lisp. Depending on your implementation, I bet you could squeeze out nearly every bit of performance to be had on a given system with forth (given enough time), but I'd be really surprised if you could with any lisp.
I came to a similar conclusion about how some people strive to reduce what they call "visual noise" on their programming languages, by removing things like semicolons, parentheses, curly braces, etc. and others, like myself, who like the punctuation.
I think I read code visually. I understand its literal 2D-visual shape first, and use punctuation as visual markers to help me along. Then I backfill the gaps in the structure with the actual "content" of the code.
For the longest time I was baffled by the other camp's attempts to get rid of punctuation. What are they thinking? I now believe what they're thinking is of code as linear text. If you read the code linearly first, as text, and build the structure in your head as you go then yeah, all; of(that) seems {kinda} pointless && annoying.
Now guess which camp is more likely to write blog posts about how theirs is the One True Way? Ah, the humanity.
I've actually gone one step further and I'm pretty sure I can adapt to anything now.
I coded in Java for 20+ years and then switched to Kotlin. Semicolons were gone, it was a breath of fresh air.
And then I learned Rust and I faced what I thought were two absolutely insurmountable obstacles: semicolons AND snake_case. I considered Rust ugly for these reasons and really dragged my feet. But I was curious about the language, so I persevered.
One week later, I wasn't noticing any of these any more. I still do think that semicolons are a manifestation of a language developer who favors their time over mine, but it's become more of a pet peeve than a deal breaker.
> Certain people find certain metaphors and modes of computing much easier to work with than others.
Absolutely.
I took a couple of Lisp classes in my computer science program, and I saw this almost instantly. Some people who had a very hard time catching up in most other languages suddenly shined. However, some students who were otherwise very successful had a harder time with functional languages.
I always wondered how this worked. Is it that we're just wired differently?
> ...concludes the weird scissor advocates are mad.
I just wanted to clarify I don't think this. I appreciate when people are passionate about something like Lisp. At least, I'm happy for them!
Very much so. Richard Feynman once said that even the simple act of counting is perhaps completely different between different people. Some people count by visually seeing a count, some use a voice, and maybe there are others who have a different system.
Every year, or so, I'll design a programming language on paper to work through my "latest thoughts" as impacted by languages and ideas I've picked-up since. Each time the design is different, with different priorities etc.
I think what I'm doing is exactly what you describe: clarifying my mental model of programming languages -- so that I arrive at something which "feels right".
I can then come back to actually-existing languages and express my "prelinguistic" ideas using whatever syntax they make available.
Absent this activity, I think I gradually end up too conceptually confused -- blending a mixture of whatever languages I'm working in.
The power of a single radical paradigm solves this problem for people, like me, who require a theory to feel comfortable; but without all the effort I go to.
(Though for me, of course, it's a hobby --- I like to see how close I can get to designing a language which, at that moment, is the one I'd wish to program in).
Naur is probably the closest computer scientist to my world view -- on many fronts.
I'm reminded of his genuine attempt to go into neurobiology, and via William James, take seriously biological adaption and plasticity.
I think over the last 20 years, engineers and mathematicians seem to have "taken over" computer science -- against the tradition of "scientific philosophy" which Naur represents.
> Every year, or so, I'll design a programming language on paper ...
Where do I subscribe?
> Each time the design is different, with different priorities etc.
Classic trilemmas. Nice.
Scott McCloud's triangle for style of illustrations was a eureka moment for me. The tips are ideographic, textual, and realism (IIRC). All comics lie somewhere within that triangle (solution space). Mind blown.
There are many either-or tradeoffs: closer to the metal vs abstractions, explicit memory management vs garbage collectors, etc.
But there's probably also a few trilemmas, like functional vs imperative vs declarative.
Anywho. Just a casual notion. I'd love to see a big list of design decisions for programming languages.
Today's embarrassment of riches has reduced the enthusiasm for language jihads. But it'd still be nice to have something more than esthetics and vibes to guide our choices.
This morning I was sketching how I'd do syntax for affine and linear types, which are (very basically) where variables can be used "at most once" or "only once".
In sketching I iterated various designs; my notepad is below. It began with trying to think about how to express scopes visually, or using tags/labels -- then moved into how that integrates with other language features etc.
By doing this I understand much more about what really a language feature is trying to do -- I understand the tradeoffs, etc.
program a:
let x:a = 10
if 10 < 5 b:
let x:b = x
print(x)
program UseyTypes:
x : once = new Resource() # at most once -- rust's default
y : never = ...
z : once! = ... # only once
q : many = ...
program MoveSemantics:
a = new 10
x = 10
y = move x
z = copy y
i0 = new(auto) 10
i1 = new(memo.alloc.heap) 10 # default
g1 = new(heap) Graph.init({...})
g2 = copy(Graph) g1 # uses Graph's (deep)copy method
g2 = copy(Graph.copy) g1 # eqv to above
program :
global heap = Allocator.tmp(...)
if True:
local xs = Vec.new(heap) { 1, 2, 3, 4 }
memo.lifetimeOf(x) # eg., local scope a = lineno 10 - 20
memo.lifetimeOf(x.data) # eg., global via heap allocator
repeat x <= xs:
x = 10 # error
print(x)
repeat i <= int.range(10):
x[i] += 1
repeat:
const answer = input("What's 2 + 2?").to(int)
print(compiler.scope)
print(compiler.lineno)
const int z = 10
const ref(int) y = ref(z)
print(y, deref(y))
const x : global = 10
if 10 < 5:
int x : local = 10
int y : parent = 10
# polymorphic .to
fn str.to(int):
parseInt(.)
program EffectSystem:
fn[IO] println(const ...data: ...any):
repeat d <- data:
IO.writeline(d.to(str))
pure fn calc(): # pure = no effects
return 10 + 20
# by default, all fns are impure, ie., have IO auto-passed ?
println("Hello", "World")
with IO = open("output.txt", "w"):
println("Into some file!")
println[open("output.txt")]("hello world")
Out of curiosity, have you ever tried developing DSLs in Racket? One of its explicit reasons for existence is to enable fast development of custom DSLs.
The art of designing a language is expressing semantics in intuitive syntax -- it's an art because "intuitive" is essentially a psycho-social cultural condition. (ie., I reject Lisp)
C was "intuitively mind-expanding" to assembly developers and PDP-machine programmers -- and so on.
My aim is always to express a semantic concept in syntax so that it is so obvious that the originating language's developers will be shocked.
You can do that both with, eg.,
map fn over collection
and
xs/fn
and
repeat x from xs: fn(x)
and
{ fn(x) st. x <- xs & st. x > 0 }
etc.
In that each syntax resonates with a certain programming culture.
For novices, I suppose the following might be "consciousness raising",
set result :=
repeat:
set x := next xs:
save fn(x)
I'm a Lisper. One of my good friends at Apple was a hardcore FORTH guy. I had other friends there that were Lisp or Smalltalk enthusiasts.
We got along great.
The early meetings at Apple for the design of the Dylan language definitely had both Lisp and Smalltalk folks participating. I wouldn't be surprised if some of the participants were FORTH folks, too.
We used PdB to develop HyperLook for NeWS, integrate The NeWS Toolkit components into HyperLook, and implement the SimCity user interface on SunOS/SVR4/X11/NeWS.
This discussion thread revolves around the concept of implementing Lisp-like macros in PostScript for creating more efficient drawing functions. The user "DonHopkins" highlights their work on the Open Look Sliders for The NeWS Toolkit 2.0, where they leveraged a Lisp "backquote" like technique to optimize special effects drawings. The user explains that this approach accelerates drawing at the expense of increased space utilization. They also propose a potential solution to space conservation by only expanding and promoting macros during tracking, then demoting them upon tracking completion.
DonHopkins shares several resources on NeWS, LispScript, and the PostScript compiler, and also refers to window management tasks in Forth and PostScript for comparison. Additionally, they discuss a paper on syntactic extensions to support TNT Classing Mechanisms and share a demonstration of the Pie Menu Tab Window Manager for The NeWS Toolkit 2.0.
Another user, "gnufx", appreciates the shared resources and brings up the metacircular evaluator in HyperNeWS or HyperLook as a potential speed bottleneck in the system.
DonHopkins responds by explaining the use of a metacircular evaluator (ps.ps) they wrote for debugging. They clarify that speed was not a concern as the evaluator was not used in any speed-critical situations. DonHopkins also discusses the technique of "PostScript capture," likening it to partial evaluation of arbitrary PostScript code with respect to the imaging model. They relate this concept to Adobe Acrobat's "Distiller" and Glenn Reid's "PostScript Distillery".
I honestly love posts about Forth (or Factor) and Common Lisp (or Lisps in general). I love both languages. On top of that, I use C a lot, along with OCaml, Lua, and Erlang (and rarely Ada). I find each one of them beautiful. :)
The Forth programmer walked into the room backwards, like a moron. The Lisp programmer already in there had a perfect opportunity to get him in the back. Stupidly, his weapon was lying in a heap of stuff, and was boxed; he couldn't get it out in time. Moreover, it blew up in his face because he imported it, he had falsely declared it to be of toy type.
I suddenly stopped worrying about both LISP and FORTH, when my CS professor mentioned (around 1995) that it would be trivial to write a transformer between LISP and FORTH.
Lispers get a mildly bad rap for some historical, unpersuasive "holier than thou" evangelism, but the truth is, almost all programming languages du jour have louder and fiercer evangelists. In fact, you can be paid to evangelize Rust, Python, and other languages.
As far as the last decade is concerned, the most productive open-source and professional Lispers have done some advertising of Lisp, but mostly through a combination of technical blog posts and actually making things.
I like Lisp a lot, especially Scheme. It's simple, it's pure, it's expressive, it's powerful, and it's fun to write. If I could, I would use it for most things.
I can't convince anyone to use it. They just don't like the look of it, they stumble at the brackets. I know that I can't write something in it and expect other people to use it, build it or maintain it.
Lisp advocates think that "everything looks the same" is an advantage, whereas most people strive desperately to make things look different in informative ways with syntactic features (e.g. different brackets) and syntax highlighting.
One of the reasons it has those data structures is because it sits on the JVM and to maintain compatibility with Java apps it must use them and provide a syntax for them. That’s a practical choice, of course, but also less of an idiomatic lisp.
In (pure) lisp there is only one main data structure and it’s everywhere.
If they had an actually good IDE for it that was LispWorks I would give it a go.
Forcing the overhead of learning CommonLisp, which is actually a fairly big language despite the syntax, on top of learning how to use emacs is a big ask.
But other than a Skyscanner predecessor (VIA), a computer algebra engine, and Grammarly, are there any other modern high profile lisp powered products?
Okay, other than Skyscanner, Grammarly, HackerNews, Emacs, CircleCI, Metabase, Crash Bandicoot, Nubank, other Clojure projects etc, what has Lisp done for us?
Or, to lose the Monty Python snark, "Aside from the same handfuls of projects, counted on the digits of two hands, and always reiterated anytime somebody asks for high profile Lisp projects, what other high profile code is there from a language whose proponents always advertise its huge productivity gains?"
Crash Bandicoot was Lisp, but the greatest PSX game (Metal Gear Solid) was written in C, so I’ll use that to justify my programming opinions to others.
Seriously, you really aren't aware of the association of Lambda with gay rights? Are you a native English speaker or an American? It's easy to google, widely known, and well documented. I'm glad for the opportunity to educate you!
I've also heard conservatives try to implausibly deny they ever heard of such a thing as the "gay lisp", too, but that ignorance-based excuse doesn't hold any water, either.
But I suppose there are some home-schooled Fred Flintstone conservatives living under a rock in Bedrock (or Florida or Texas) who have carefully cultivated their ignorance about gay history and culture, and who have never met any gay people (or are so openly homophobic that most gay people refuse to come out to them out of fear), and that their deep ignorance untainted by the facts is part of the basis for their rampant homophobia and terrified moral panic.
>The Lambda Legal Defense and Education Fund, better known as Lambda Legal, is an American civil rights organization that focuses on lesbian, gay, bisexual, and transgender (LGBT) communities as well as people living with HIV/AIDS (PWAs) through impact litigation, societal education, and public policy work.
>Lambda: In 1970, graphic designer Tom Doerr selected the lower-case Greek letter lambda (λ) to be the symbol of the New York chapter of the Gay Activists Alliance.[5][6] The alliance's literature states that Doerr chose the symbol specifically for its denotative meaning in the context of chemistry and physics: "a complete exchange of energy–that moment or span of time witness to absolute activity".[5]
>The lambda became associated with Gay Liberation,[7][8] and in December 1974, it was officially declared the international symbol for gay and lesbian rights by the International Gay Rights Congress in Edinburgh, Scotland.[9] The gay rights organization Lambda Legal and the American Lambda Literary Foundation derive their names from this symbol.
>The Encyclopedia of Homosexuality has the following entry on Lambda:
>In the early 1970s, in the wake of the Stonewall Rebellion, New York City's Gay Activists Alliance selected the Greek letter lambda, which member Tom Doerr suggested from its scientific use to designate kinetic potential, as its emblem. (Curiously, in some ancient Greek graffiti the capital lambda appears with the meaning fellate, representing the first letter of either lambazein or laikazein.) Because of its militant associations, the lambda symbol has spread throughout the world. It sometimes appears in the form of an amulet hung round the neck as a subtle sign of recognition which can pass among unknowing heterosexuals as a mere ornament. Such emblems may reflect a tendency among homosexuals toward tribalization as a distinct segment of society, one conceived as a quasi-ethnic group.
>In More Man Than You'll Ever Be by Joseph P. Goodwin (Indiana University Press:Bloomington, 1989) on page 26, Goodwin writes:
>The lowercase Greek letter lambda carries several meanings. First of all, it represents scales, and thus balance. The Greeks considered balance to be the constant adjustment necessary to keep opposing forces from overcoming each other. The hook at the bottom of the right leg of the lambda represents the action required to reach and maintain a balance. To the Spartans, the lambda meant unity. They felt that society should never infringe on anyone's individuality and freedom. The Romans adopted the letter to represent "the light of knowledge shed into the darkness of ignorance." Finally, in physics the symbol designates an energy change. Thus the lambda, with all its meanings, is an especially apt symbol for the gay liberation movement, which energetically seeks a balance in society and which strives through enlightenment to secure equal rights for homosexual people.
And then of course there's the purple (another classic gay color) cover of Structure and Interpretation of Computer Programs, with the two magic dudes dressed in drag with a lambda symbol floating between them.
> I've found that a lot of social conservatives tend to be unconsciously afraid and ashamed of Lisp out of moral panic due to its implicit associations with homosexuality (the gay lisp stereotypical speech attribute, and lambda being associated with gay rights).
This is hardly believable to me. How did you arrive at that conclusion? Isn't the community of Rust, a language I think that you can safely call a lot more popular than Lisp these days, also very vocal about supporting LGBTQ+ rights?
But Lisp has a much longer tradition of terrifying social and linguistic conservatives since 1959.
And look at all the social conservatives desperately fighting against the inclusivity of the Rust and other communities, which kind of proves my point that it terrifies and threatens them.
And gimp and AutoCad (ok, not anymore)
But I would like to say also: is "how many well known projects use it" a solid/sound metric?
There were extremely popular junk languages for decades that are mostly regarded as crap today.
That's not what I mean. It's up for debate whether Clojure is a real Lisp, but that aside, they're farther apart in paradigms than most languages are. One is an immutable functional language running on a JVM and the other is a hodgepodge of OOP concepts and low level programming capabilities.
I don't mean to say this as a dig on Lisp, but the reason you didn't just list Clojure projects or just CommonLisp projects is because we would transparently see how few have actually worked for either of them. So when you lump them together, it comes off like we're scraping the bottom of the barrel for examples, and that's not even considering how much these companies actually use Lisp or have continued using it.
Fairly certain Raytheon uses CL in their signal processing pipeline for simulating ICBM missile defense, so if you do not melt down in WW3 you have Lisp to thank for that, at least partially.
Fully expect the first HN thread while we climb out of the ruins to be "what has lisp ever done for us, it doesn't even run my web app", though.
The big one that I always remember is Crash Bandicoot - Naughty Dog I think had their own version of Scheme and then switched to Racket at some point. Nubank are also a semi-high profile company who use a lot of Clojure.
Of course, if you ditch the "modern" requirements, I'm sure there is more web infrastructure and scripts supported by Common Lisp than people would want to admit. . .
according to Andy Gavin, the cofounder of Naughty Dog and an MIT AI lab alumnus, crash 1, 2 and 3 were written in GOOL/GOAL[0], which was a home-brewed lisp. According to Franz themselves, the language was hosted on allegro common lisp[1]. the language gave him the ability to push the ps1 platform to its limits by leveraging the kind of thinking that's part of lisp lore: incremental recompilation of a running ps1 instance using a remote little language written in and hosted on a Common Lisp dynamic environment. the previous sentence describes a poorly understood practice: that of dynamic-environment-leveraged development, which was part of the lisp machine and smalltalk machine traditions and a handful of other now forgotten approaches. in a sense crash was not just "written in lisp", it was written leveraging lisp-machine-like thinking, which Gavin would've been familiar with from his MIT AI days.
when naughty dog got sold, all the remaining Gavin lisp systems were eventually stripped, so that the company for all intents and purposes became a standard c++ shop. some time later some hipsters wired up plt scheme[2] as a scripting language for the Naughty Engine 2.0. unlike the original Gavin approach this is not some deeply leveraged architectural decision, and it being lisp is pretty irrelevant to the sort of capabilities it provides. imho selecting a scripting language for a game engine is a lipstick-on-a-pig kind of process, as demonstrated by the various basic-like potato languages that came with legendary triple-As.
It's the Reddit story all over. Lisp devs know Lisp + X, but everyone else only knows X, so we'll use X instead -- even if it's inferior and causes issues down the line.
This isn't really limited to Lisp though. It applies to quite a few languages with the excuse of "market forces" where "market forces" really means "we want to make sure our devs are easily replaceable cogs" (using a niche language actually pressures both sides to stick together because the dev can't easily find a new job and the company can't easily find a new dev).
It's slightly different: Naughty Dog had proven that they could deliver commercially successful applications (novel platform games on the Playstation with excellent content) using Lisp. They had their own IDE on top of Common Lisp and as a delivery vehicle a Scheme-like runtime.
They were bought by a much larger C++ shop (Sony) and were trying to get the benefits of a larger ecosystem. In the end they were bought for their talent, their experience, their brand - but not for their technology.
For Naughty Dog it could also have been the right moment, since from a certain point in time the game platforms are getting so complex that making custom inhouse game runtimes may no longer make sense for smaller game studios.
Reddit OTOH had never delivered anything commercially successful in Lisp, had little experience with Lisp, but had heard that it could be cool. They used unproven tech. Naughty Dog used proven tech and had enough experience to do low-level systems programming for a brand new game platform. Which is really an outstanding achievement. Reddit had only a rough prototype in Lisp, and then switched inhouse to other technology.
Naughty Dog only switched because they were bought out by Sony who then demanded that they change languages.
Reddit was merged with another YC company. That company used Python, so they switched everyone to Django. Last I knew, most of Reddit’s outage woes were still due to the outdated ORM they are stuck with. In any case, Common Lisp is hardly “unproven Tech”.
> Naughty Dog only switched because they were bought out by Sony who then demanded that they change languages.
To reuse a larger code-base, instead of working on their own new platforms for the next systems.
> In any case, Common Lisp is hardly “unproven Tech”.
Common Lisp is a language. Software is running on implementations and SBCL was relatively new then (2005).
They used SBCL which at that time was not used to implement such websites.
Naughty Dog used Allegro CL, which was already used in a bunch of 3d/OpenGL applications. Their own runtime was custom built and required real deep expertise in implementing a GC-less Scheme for an embedded platform.
Reddit could have switched to a paid Common Lisp variant without any trouble if they'd actually had issues. The people there said they moved to Python because that's what the other team knew. I don't see a reason to argue otherwise.
The argument behind the Naughty Dog switch was also pretty clear. Sony wanted to be able to move devs between projects easily and everything else used C++, so they'd rather force Naughty Dog to use C++ than tell everyone else to learn Common Lisp. To my knowledge, there was zero discussion on the merits of one or the other and it was a business call.
Further, the reams of custom code Naughty Dog now has written on Racket points to them still loving lisp and not minding if they have to invest a lot of effort into being able to use it in their designs.
> Reddit could have switched to a paid Common Lisp variant without any trouble if they'd actually had issues.
I thought they had issues. Didn't they?
Paid Common Lisp variants tend to get expensive and even for those, the main applications rarely were high-traffic websites with UI frameworks.
Take ITA Software / Google, they were developing the core business logic of the flight search engine in Lisp - the product that ran/runs on SBCL. They had a team of 100+ people and a bunch of the top Lisp talent of that time. They also invested a lot into improving SBCL.
> Sony wanted to be able to move devs between projects easily and everything else used C++, so they'd rather force Naughty Dog to use C++ than tell everyone else to learn Common Lisp. To my knowledge, there was zero discussion on the merits of one or the other and it was a business call.
A business call is based on assumptions: larger ecosystem, more libraries, shared runtimes, etc. That's all much more economical than doing it alone as a small studio.
> Further, the reams of custom code Naughty Dog now has written on Racket points to them still loving lisp and not minding if they have to invest a lot of effort into being able to use it in their designs.
Of course they love Scheme and they were then back creating their own content delivery tools. But they stopped implementing runtime things like core 3d graphics animation frameworks for new CPUs/GPUs, etc.
Things were more complicated than that with Reddit from what I've read (and from a now defunct blog post they wrote not to mention various talks and interviews from devs who were there at the time).
Their devs were using Macs in 2005 which ran on PowerPC. Their servers were x86, but running FreeBSD (honestly, that was a tall ask for most languages in 2005). They had an issue finding threading libraries that worked on that OSX/PPC and FBSD/x86 combo. They further complained that there weren't tons of libraries available for use either. Finally, they also made some bad architecture decisions unrelated to Lisp.
The switch is still a weird one if you move aside from the new team not knowing or wanting to learn Lisp. Python isn't threaded at all, so they could have stuck with non-threaded CL and still have had 100x faster code. Likewise, they wound up rewriting all the python libraries they were using because those libraries turned out to have tons of issues too.
Naughty Dog continued using Lisp for game development throughout the Uncharted series at least, I'm not sure about The Last of Us but I would be very unsurprised if that changed.
They just stopped writing the game almost purely in a custom Lisp dialect and instead wrote it only mostly in one - effectively switching from the GOAL setup of the Jak & Daxter runtime back to an approach similar to GOOL in Crash Bandicoot: a core written in C/C++ running a considerable portion of the game logic in Lisp (a variant of PLT Scheme in Uncharted).
Uncharted dev tools were also built in PLT Scheme (aka Racket).
To clarify, Naughty Dog has still been using LISP-based scripting and asset definitions in their recent games[1]. (Though I count The Last Of Us as recent so I guess that shows how often I play games.)
Some people do not realize this, but the thing you build when you compile an Emacs distribution is not a text editor. It's just a specialized lisp interpreter that has primitives for handling things like frames, buffers, and other UI-type stuff like detecting key chords, plus some math and stringy stuff, and a few other bits and bobs. The editor itself is written in Elisp.
To illustrate, this is what I get when I run `sloccount` on the source download of Emacs 28.2 from gnu.org:
It's basically 75% elisp, 25% C, and more or less all of that C code is either implementing the lisp interpreter itself, or interfacing to system libraries. Using Emacs is really the closest you can get these days to working on one of the old LISP machines from the 80s, except it's more fun.
> It's just a specialized lisp interpreter that has primitives for handling things like frames, buffers, and other UI-type stuff like detecting key chords, plus some math and stringy stuff, and a few other bits and bobs.
One day, it may actually include a decent text editor too! But for now, it's the end-luser's responsibility to cobble one together from the provided parts.
The blub paradox strikes the lispers hardest, for they have been reassured by PG and each other that there is no language better than lisp. Those of us using more powerful languages should learn from their hubris.
I can see a lot of axes for "better" that involve removing capability from lisp. Lisp abstracts over data representation, lifetime, and code layout, where other languages force you to make choices which may fit the domain better. There are also a lot around programmer ergonomics - compile time detection of various error prone constructs.
The only one that comes to mind for more powerful is abstraction over control flow. Instead of delimited continuations, one can go with unification and implicit control flow.
What are the increases in power you consider lisp users blind to?
(p.s. first class environments, first class macros are missing from common lisp and scheme, but not from all lisps)
Lisp programmers tend to think that every problem is best solved by layering abstractions. Macros, homoiconicity, quasi-quoting, and s-expressions facilitate abstraction. Some even think programming is abstracting.
But abstracting has a cost. Sometimes in terms of performance (perhaps mitigated with great effort or a sufficiently smart compiler), but always in terms of cognitive load. An elegant lisp expression is meaningless in isolation. When reading lisp code, you must recursively find and remember the definitions for each token on the page before you can understand what it does. Hopefully the author knew Naming Things is Hard and chose good names. Hopefully the names point you in the right direction. But you can't be sure unless you traverse each definition to its leaves. Lisp programmers are blind to the power and clarity of thought that comes from direct expression - all definitions visible at once, with no indirection.
A lisp programmer might look down on a Java programmer's reliance on IDEs. An IDE is a powerful tool for shoveling mountains of code, but a lisp programmer might say "I can solve this problem without mountains of code". Likewise an apl programmer might look at a lisp solution and say "I can solve this problem without defining any new terms". A forth programmer might say "I can solve this without parentheses". A C programmer might say "I can solve this without a heap". An assembly programmer might say "That entire program is equivalent to a single x86-64 instruction".
By crafting different bespoke DSLs for each new problem, lisp programmers lose the opportunity to hone one DSL to perfection. The knowledge that any problem could be solved with lisp lures them toward the fallacy that every problem should be solved with lisp.
> Lisp programmers tend to think that every problem is best solved by layering abstractions.
Nah, when I write Lisp programs, I have more of an appreciation for appropriate abstractions, not needless abstractions.
> A elegant lisp expression is meaningless in isolation.
Elegance is usually in relation to something. For example,
(list 1 2 3)
Is more elegant than
var list = new LinkedList<Integer>();
list.add(1);
list.add(2);
list.add(3);
> When reading lisp code, you must recursively find and remember the definitions for each token on the page before you can understand what it does.
You need to do this for any language. In Java and many other languages, having a "Go to definition" IDE function is very useful.
> Lisp programmers are blind to the power and clarity of thought that comes from direct expression - all definitions visible at once, with no indirection.
Not sure what this is referring to. Do you have an example?
> A lisp programmer might look down on a Java programmer's reliance on IDEs.
Plenty of Lisp programmers use Emacs, which has a great many tools to help developers including jumping to definitions, showing documentation, running and using a step debugger for code, etc. Not sure why Lisp programmers would look down on Java programmers because of an IDE.
> By crafting different bespoke DSLs for each new problem, lisp programmers lose the opportunity to hone one DSL to perfection.
This seems to be a Lisp meme. Just because Lisp can be used to make a DSL doesn't mean all Lisp programs are macro-implemented DSLs. There's plenty of Lisp code that looks just like Java code with function, struct, object definitions and calls, except it uses parens.
Fair, but if your language keeps growing, this task is never done.
> Not sure what this is referring to. Do you have an example?
Array programmers sometimes avoid abstractions because they prefer "idioms" like these[1]. So rather than curate a library of words like:
barchart: {x>\:!|/x}
and have the programmer use the word "barchart", they instead prefer to use the definition itself. The word "barchart" has a specific meaning (here, an ascii "bar chart" of 0s and 1s, showing the relative sizes of the values of input array x), but "{x>\:!|/x}" might be useful for more than just bar charts. This idiom contains smaller idioms like "count til max" (!|/) which in turn contains "max" (|/).
Being able to see the code makes it easier to explore and tweak to your specific needs. But more importantly, there are no "official" names for concepts like "count til max". That's just my personal name for it. A python programmer would call it "range". You could come up with your own name for (!|/) that makes perfect sense to you. But that name will probably be longer than its definition, and less flexible.
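For readers who don't read K, here's a rough Racket translation of the barchart idiom above (my own sketch, not the commenter's code):

  ;; for each value v in xs, emit a row of 1s and 0s: 1 where v > i,
  ;; for i from 0 up to (max xs) - 1
  (define (barchart xs)
    (for/list ([v xs])
      (for/list ([i (in-range (apply max xs))])
        (if (> v i) 1 0))))

  (barchart '(1 3 2))   ; => '((1 0 0) (1 1 1) (1 1 0))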
Ok, but Lisp doesn't force you to curate a library of words or require dealing with abstractions.
Maybe this example is relevant. Racket defines procedures `filter` and `map` for `list`. Also provided is `filter-map`, which I assume may satisfy your not-being-direct-expression concern about Lisp. But, `filter-map` exists for a particular reason:
> Like (map proc lst ...), except that, if proc returns #false, that element is omitted from the resulting list. In other words, filter-map is equivalent to (filter (lambda (x) x) (map proc lst ...)), but more efficient, because filter-map avoids building the intermediate list.
So it exists for performance concerns. It also exists in the "Additional List Functions and Synonyms" so it's not like it's being confused for a core, important function like `filter` or `map`. I still write Racket code where I just explicitly have something like:
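For instance, something along these lines (a hypothetical sketch of the kind of code meant, keeping the squares of the odd numbers):

  ;; written with plain filter and map, building an intermediate list
  (filter (lambda (x) x)
          (map (lambda (n) (and (odd? n) (* n n)))
               '(1 2 3 4 5)))
  ;; => '(1 9 25), the same result filter-map would give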
The thing is, a compiler could easily recognize the pattern (filter identity (map proc lst ...)) and rewrite it for efficiency (perhaps by using filter-map).
Writing it that way in the first place reduces the verbosity of the code, making it look better.
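For what it's worth, a Lisp lets you sketch that rewrite yourself at the macro level. A toy illustration in Racket (my own sketch, not a real compiler pass that finds the pattern automatically; `opt-filter` is a made-up name):

  ;; expands (opt-filter identity (map proc lst)) into (filter-map proc lst),
  ;; and leaves any other use as an ordinary filter
  (define-syntax (opt-filter stx)
    (syntax-case stx (identity map)
      [(_ identity (map proc lst)) #'(filter-map proc lst)]
      [(_ pred lst)                #'(filter pred lst)]))

  (opt-filter identity (map (lambda (n) (and (odd? n) (* n n))) '(1 2 3 4 5)))
  ;; => '(1 9 25), without building the intermediate list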
Static typing is a _decrease_ in expressive power in exchange for compiler diagnostics.
Linear types are probably not representable in lisp (bad interaction with reified continuations, not great interaction with environment capture), that is something I miss.
> Static typing is a _decrease_ in expressive power in exchange for compiler diagnostics.
This sentiment is at the heart of the blubbiness in question. Just like homoiconicity (relative to something like C) lets you say more about code in exchange for a reduced ability to implicitly move back and forth between syntax and operational semantics, a good static type system lets you say more about computations in exchange for a reduced ability to implicitly move back and forth between denotational semantics and syntax.
Lisp seems less expressive than C until you start thinking of programs as more than just sugar for machine code; ML-family languages seem less expressive until you start thinking of programs as more than just their implementations.
There is something here I think. The lisp/C example is useful in that there are things which C lets you express that lisp usually does not. Garbage collectors for lisp tend to be written in C, even when the lisp in question can directly munge the machine code if you wish. For that matter whether C is useful sugar or obstructive depends strongly on what you want the machine code to be.
It's tempting to handwave away compile time detection of missing cases in pattern matching as nice but inessential. However it is something which gets leaned on very heavily where it is available. It takes a collection of failure modes out of the mind of the programmer to encourage thinking about other things.
Expressive power is probably inconsistent as a concept. Whether introducing a constraint or removing it increases expressivity depends on what one is trying to express. For example, should code that does not typecheck prevent running code which does? Depends on context - it's deeply annoying during development, but helpful to avoid checking in code which no longer works on the paths you weren't looking at.
Lisp isn't any one thing. There are some with static type systems. PreScheme is a Scheme subset developed in the 90s with static typing via Hindley-Milner type inference. There's a wide world of PLT that has been explored with various Lisp implementations.
I am a big fan of "functional programming languages", including Lisp to an extent. That said, I appreciate the teachings they can impart - abstract structures, design sense / "design patterns", and ways of thinking etc. - more than any sort of dogma associated with "their way". Although, to be fair, I did go through my own period of foolishness in ... say, "venerating" this particular "methodology".
Your comment did prompt thought of Pratchett's book "Interesting Times". Specifically, passages like "... It had come as a revelation to Lord Hong when he looked at the problem the Ankh-Morpork way and realized that it might just possibly be better to give the job of Auspicious Dog-maker to some peasant with a fair idea about metal and explosive earths than to some clerk who'd got the highest marks in an examination to find the best poem about iron. In Ankh-Morpork, people did things."
The tricky bit in everything, and the difference between the true expert and the not-quite-expert, is often found in forms of "executive function" ... knowing what to apply and when ... when to change up strategies, what the value is in some methodology and how to bring that into what you are doing ... all these sorts of things.
I always know when I'm at more of an "advanced intermediate" level (a sort of "first black belt" level) when I know too many ways to go about doing something but have no idea which is likely to produce solid progress in a relatively time-efficient way.
So, bit of a digression off of your comment, but stimulating from my perspective and hopefully of some use to anyone who might read this.
In all fairness we need to note that it speaks of the most vocal followers. There are likely many others who are evangelizing it much less, but that's exactly the source of the bias: we don't "see" them.
I in fact agree with the observation, but I also think that the same can often be said of many passionate people. Scala? Blockchain?
Continuing the observations, mainstream approaches seem to have less passionate followers. Very few people are evangelizing Java as passionately as others do Lisp. I wonder what makes mainstream less attractive in this sense.
I've come across several classes. Sadly, a thing most have in common is derision towards any other class. Be it language choice, agnosticism, editor choice, paradigm choice, toolchain, whatever. It is frustrating, as so many of us seek to find where and how we disagree with each other, when we'd get far more mileage out of understanding where we agree.
Honest question: have you ever done something more than trivial in Lisp?
In my experience, people who have used many languages have some favorite in the end. Sometimes what they think is best depends on the task.
spanning several decades too, i have a version of your observation: i think the class of geeks is reminiscing, because "back in the day" common lisp was (like everything now) a world in which there were libraries for all the hip (at the time) buzzwords.