The Roots of Lisp (2001) (paulgraham.com)
124 points by afurrysolver on Jan 16, 2020 | 90 comments



> In 1960, John McCarthy published a remarkable paper in which he did for programming something like what Euclid did for geometry.

One could argue that Church's (1930s) lambda calculus, which underlies LISP, is a closer analogue to Euclid's distillation of the essence of Geometry.

With the minimal addition of pure binary IO, the lambda calculus is easily transformed into an untyped programming language [1]. On the other hand, full blown and richly typed pure functional programming languages like Haskell remain semantically faithful to their lambda calculus underpinnings.

[1] https://tromp.github.io/cl/Binary_lambda_calculus.html


Cool! You might enjoy this:

http://www.flownet.com/ron/lambda-calculus.html

[EDIT] Heh, I just realized that we corresponded on this back when I wrote it in 2014! Small world.


I don't think it's accurate to say lambda calculus underlies Lisp, is it?

Certainly lambda calculus was used to define functions in Lisp, as seen in the original Lisp paper (or at least the most famous one).

http://jmc.stanford.edu/articles/recursive.html

Lambda calculus was a tool, but it seems Gödel's and Turing's work on computing machines and recursive functions was more fundamental to the original Lisp.


The claim seems accurate to me, at least for Scheme: that is, lambda calculus does underlie Scheme.

Lambda calculus, as articulated in Church's 1936 paper [1], gives us a way to understand computability. But if we look specifically at Operation II in the paper (p. 347), what we have is the lambda function in Scheme. Given that every function in Scheme (including special forms) is equivalent to some lambda function, it seems that in virtue of Operation II we can say that lambda calculus underlies Scheme.

Whether we can push this further to say that lambda calculus underlies McCarthy's initial conception of LISP, or some arbitrary version of Common Lisp, I do not know for certain. But if all these versions treat lambda functions the same way as Scheme does (in relevant respects), then the claim holds for Lisp generally.

[1]: Church. An Unsolvable Problem of Elementary Number Theory. _American Journal of Mathematics_, Vol. 58, No. 2. (Apr., 1936), pp. 345-363.


Maybe "underlies" is a little strong, and I should have said "inspired"...


I recently had a conversation about how functional programming and declarative programming are somewhat linked, and how I've observed that languages that start down the functional road slowly add more and more capabilities for DSLs, and then slowly move toward becoming a Lisp. But that was just my feeling.

Is this theorized somewhere? Or am I just dreaming?


Anecdotal experience:

(Dynamic) functional programming enables very 'easy', succinct abstractions by factoring out concretions and composing functions from a general tool-set. Think code reuse in the small.

With some discipline you naturally end up with a very data-oriented codebase, where the specificity of your application/library ends up in your data-structures.

Now these data-structures are really just plain data, without specific behavior attached to them, so they embody the (maybe almost) declarative part of your program/system. The next step would be to polish them to make them more human readable.

An additional feature of Lisps is their homoiconicity and macros. These come into play when plain data-structures are not the right fit anymore. Typically you want to handle some user/client defined behavior. You can now transform the syntax of the language itself to clean up your API to reduce boilerplate and increase readability.

In a sense you have now programmed your own DSL.
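
For concreteness, here is a minimal Clojure sketch of that last step; the macro rewrites the syntax it receives before evaluation, which is the mechanism a home-grown DSL is built on (the example itself is deliberately tiny):

    (defmacro unless
      "Evaluates body only when test is falsey - something a plain
       function could not guarantee, since its arguments are always
       evaluated before the call."
      [test & body]
      `(if ~test nil (do ~@body)))

    (unless false
      (println "runs only when the test is falsey"))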


A well-known quote is that whoever doesn't understand Lisp is doomed to reinvent it: http://lambda-the-ultimate.org/node/2352


Greenspun's tenth rule may come into play here [1]

[1] https://en.wikipedia.org/wiki/Greenspun%27s_tenth_rule


McCarthy wasn't just interested in making Lisp for AI; he was thinking about the possible applications of AI, with Lisp as a platform to get to more advanced states of understanding. An example of this is his paper "Computer Control of a Machine for Exploring Mars," Stanford Artificial Intelligence Project, Memo No. 14, June 15, 1964; Author: McCarthy, John, 1927-2011; Stanford University Library collections.

https://stacks.stanford.edu/file/druid:qh147kq8662/SC1041_SA...


If I look at the operon structure of a genome, and squint a bit, it looks very similar to the LISP structure:

(func var0 var1 ... varn)

Could it be that the design of LISP is inspired by the structure of the genetic code?


I always thought this as well. In Lisp, data is the program, similar to how DNA is both data and program.
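
For what it's worth, the code-is-data half of that analogy fits in two lines of Clojure:

    (eval '(+ 1 2))    ;=> 3  - a quoted form is just a list...
    (first '(+ 1 2))   ;=> +  - ...that you can take apart like any other data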


We are all just mutated Lisp...and mutts.

https://xkcd.com/224/


Funny to see this after all these years. I read this paper just before starting college in 2003 and it inspired me to spend most of the summer writing a rudimentary Lisp interpreter in rudimentary C++. That was my first piece of code that actually worked and was more than a dozen or so lines. Good memories.


I get that Lisp is popular because of its meta-programming abilities, i.e. its macro system. And the reasoning was that you didn't have to wait for the language writers to implement the feature you wanted, since you could implement it yourself as a macro.

But this presented other problems, namely that you might end up coding your libraries on an island, and no one else could understand your code. This is probably why Lisp is mostly more powerful when there are only one or a few programmers working on the code base.

But with the increase in other programming languages, and the massive amounts of other libraries out there, then is this macro system really all that necessary anymore?


Clojure is a Lisp that runs on top of Java, JavaScript and a couple of other platforms, and has access to all of those libraries via npm and Maven.

Having more libraries than most languages doesn't nullify the use of macros. Macros are usually discouraged in application code because they don't compose well: don't make a macro if a function will do.

However, having macros meant that when Go popularised CSP concurrency at the language level, we didn't need to change the language (for all variants of Clojure); we could just make a library with macros to do CSP concurrency, and it works in the browser, on the server, etc.

It's completely opt-in and well documented as a library, and we didn't weigh down the language with this feature. If or when the next big concurrency thing comes along, we won't be looking to deprecate go channels from the language.

Macros are incredibly powerful, and you probably won't need them often, but for those small cases of scratching a particular itch they're a get-out-of-jail-free card.
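
A hedged sketch of what that looks like (assuming org.clojure/core.async is on the classpath; `go` is just a macro shipped by the library, not a compiler feature):

    (require '[clojure.core.async :as a :refer [chan go >! <!!]])

    (def c (chan))            ; a CSP channel, an ordinary library value

    (go (>! c (* 6 7)))       ; the go macro rewrites this block into a
                              ; state machine - no language change needed

    (println (<!! c))         ;=> 42 (blocking take; JVM flavor)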


It amazes me how @pg managed to conquer the business side of startups and start YC given his very technical background!

To me it is not very common for (very) technical people to be good at both tech and business!


I'm an electronics and embedded software developer, but I took a pause from that and founded a small web business startup with a few guys two years before PG founded Viaweb. The product was very similar to PG's Viaweb. Viaweb was more general and consumer-market oriented. We had a narrow B2B focus through our connections.

It's hard to describe how easy it was to make money in the late 90s with even a little technical skill and the tiniest amount of business understanding. We were making money hand over fist knowing next to nothing.

PG sold his company for $50 million, we sold ours for $5 million after we took the mountain of cash out of it (another $5 million). After reading how PG did it, I think his genius was paying $16,000 a month for a PR firm. He understood the value of PR, we didn't. http://www.paulgraham.com/submarine.html

His essays, Hacker News and Y Combinator are an expression of this understanding. He has created an expanding, self-reinforcing network of affordable PR and connections for startups. That's the real scarcity in the startup scene. Being accepted into a batch is valuable. Instead of you hiring a PR firm, the PR firm accepts you and stamps you with the Y Combinator brand.


Does PR stand for Purchase Request here?


Public relations.

See related: http://www.paulgraham.com/submarine.html


For some counter examples:

Bill Joy co-founded Sun - but also wrote ex, vi, the BSD UNIX TCP/IP stack, and other things.

Marc Andreessen wrote Mosaic and Netscape - and is a famous VC person now (of Andreessen Horowitz fame).

Bill Gates, also very technical (wrote early MS programs, BASIC interpreters etc).

Eric Schmidt (ex-Google CEO) wrote lex (the lexer generator, now better known by its free counterpart, flex) and was Sun's first programmer.


Early Microsoft tended to clone or buy successful products. They were not really into core R&D, only improvement R&D. They let others be the guinea pigs. Gates was a skilled poker player, which probably helped him be a shrewd business-person. When all was said and done, you would realize he walked away with all your best cards.


> Gates was a skilled poker player, which probably helped him be a shrewd business-person. When all was said and done, you would realize he walked away with all your best cards.

That doesn't happen in poker.


Insert better card-game metaphor here.


Yet some of the most successful business people seem to be good at tech. I heard that half of the CEOs of Fortune 500 companies have a STEM degree.


My guess: STEM degrees require and foster critical thinking, logic, and abstraction skills, which are all very conducive to being a good business person.


There might be some survivorship bias in there if STEM fields are more successful than others, naturally leading to more STEM leaders, or more opportunities for leaders to form.


Often times it's a matter of just listening to customers. Even if you don't have a knack for the domain, if you get enough feedback and react to it, you can "organically" fit it well. It's sort of comparable to a genetic algorithm. Of course, having a knack for a domain could mean fewer iterations of rework.

Lisp is ideal for changing on a dime, at least for a small team.


It's not very common for anyone to be good at business. But for some reason when people are good at tech, suddenly that becomes the reason they are bad at business.


Which is the better introduction to Lisp: On Lisp (by pg), SICP, or another text (maybe on Common Lisp)?

Or is it better to just go straight to Clojure these days?


On Lisp is an advanced text about Lisp macros, not recommended as an introduction (but highly recommended if you've written Common Lisp).

SICP is great but not really about Lisp.

I recommend Norvig's PAIP [1] or Graham's ANSI Common Lisp [2].

Another book that's often recommended is Practical Common Lisp, but I think it hasn't aged well. It's also nowhere near as mind-expanding as PAIP or Graham's books.

[1] https://github.com/norvig/paip-lisp

[2] PDFs can be found on Google

Clojure is not recommended, as it is very different from Lisp.

Lisp code needs to be entirely rewritten before it will run on Clojure. That is not true for Common Lisp, Emacs Lisp and even Scheme.


SICP is not about Lisp; Scheme was chosen because of its simplicity, but the book is not about Scheme. If you've always wanted to go through the book, maybe also take a look at "How to Design Programs." It's a bit like SICP but written in a different style; you may prefer that one instead.

- If you want something practical, do try Clojure. Don't listen to JVM haters and Lisp "purists". The JVM is a pretty robust piece of tech and Clojure is a proper Lisp dialect. Besides, ClojureScript is a lot of fun.

- If you like Lua and want to make simple, fun games, try Fennel. If you're into math and enjoy playing with fractals and such, do try Racket.

- If you are serious about Lisp, sooner or later you may want to learn Common Lisp. Especially if you're already using Emacs and want to improve your Emacs Lisp skills.


Clojure is always the wrong option to get into the principles of Lisp.

* Clojure utilizes non-standard Lisp syntax like []{}, a different style of function declaration, and different semantics for some forms (like cond, a powerful species of Lisp's switch-case);

* Clojure doesn't have cons pairs, nor the car and cdr operators. Cons pairs are the fundamental data structure for building compound data, and they are explored thoroughly in SICP and in any Lisp textbook.

* Clojure has the batteries of Java, so you get a giant ecosystem for building complex software, but the ugly parts of the JVM are inherited too. For a first contact with Lisp, this can be an unnecessary pain in the ass.

I would recommend starting a Lisp journey with Land of Lisp or Practical Common Lisp (PCL), both focused on Common Lisp. Land of Lisp has a lot of history about the beginnings of Lisp, and the author writes in a fun way, with lots of cartoons and xkcd-like humor. The book has a collection of game projects, one per chapter, teaching the principles of the language. PCL is very useful for understanding specific parts of the language; I used it as a complement when Land of Lisp was not sufficient (the loop chapter in PCL is very good).


another angle:

While Clojure is a Lisp with its own distinct flavor, it gives you …

* a large community of practitioners and professionals (great for asking questions, finding collaborators, …)

* easy access to libraries in the js/jvm/.net ecosystems

* a style that relies more on data and transformations of data (illuminating simplicity)

In any case: it is worth digging deeper. One thing that kept me away was not knowing where to start (analysis paralysis). In hindsight, picking any Lisp would have been great (instead of postponing).

Find a thread and start pulling :)

Rich Hickey’s talks were a great entry point for me https://github.com/tallesl/Rich-Hickey-fanclub

e.g. https://www.youtube.com/watch?v=rI8tNMsozo0 (the “Simplicity Matters” keynote at Rails Conf 2012)

edit: Land of Lisp is a great book as well


> Clojure is always the wrong option to get into the principles of Lisp.

Yet somehow, out of all Lisp dialects, Clojure today remains the most pragmatic choice. ClojureScript is arguably the best alt-JS choice, while Elm, Reason and PureScript for various reasons still can't reach its level of simplicity and practicality.

Clojure has immutability by default, which alone has a number of great benefits. It is FP-focused. It can be learned à la carte, which greatly simplifies the process of learning.
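
A tiny sketch of what "immutability by default" means in practice (standard Clojure, nothing project-specific):

    (def prices [10 20 30])
    (conj prices 40)    ;=> [10 20 30 40], a new vector
    prices              ;=> [10 20 30], the original is untouched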

Clojure community is one of the friendliest and most diverse communities of developers.

In comparison, Common Lisp today is (sadly) becoming the Latin of programming languages. It is cool to learn and know it, but in many circles the practicality of that knowledge remains somewhat questionable.


Clojure is a better language, and a better Lisp, and a better tool for making software, but not "really" a Lisp. If you want to learn about Lisp, I think you should zero in on Scheme, Common Lisp, and Emacs Lisp. But this of course depends on what you want.

The cons cell thing is a big deal. Cons cells are a stupid language construct. But that's baked into the identity of what Lisp is.


I'd personally avoid SICP unless you have an interest in writing programming languages or solving math problems. I personally found it really boring, and couldn't get through it.

My own personal taste also leans towards Scheme (especially Chicken Scheme), so that's what I'd recommend. To me it feels more elegant, more modern, and of lighter weight than Common Lisp. I'm also kind of allergic to the JVM and Clojure's non-lispy innovations in syntax, so I'd personally avoid it too.

That said, I myself started with Common Lisp and then moved on to Scheme and then Emacs Lisp (which is a decent choice of Lisp to learn, in that, despite some shortcomings, it puts the entire Emacs ecosystem at your fingertips). Once I'd learned Common Lisp, I found learning the rest to be quite easy.


IMO Lisp is just overrated; it lacks visual cues, reads right to left, has horrible nesting, etc.

Sure, it has some good concepts. But fanboys on the internet make it seem like some God-tier thing.


"Some good concepts"?

It was the first language to add "if/else" constructs, GC, closures, first class functions, reference semantics, and recursion.

It took between 5 and 40 years before these became available in mainstream languages (conditionals like if/else were adopted early, GC not so much, closures took even longer). Add to that macros and the flexibility of runtime evaluation / code creation, which most mainstream languages still lack.

And of course, Lisp's rules remain (and will always be) the most succinct way to define a full blown programming language with meta-programming facilities to boot -- as opposed to a mere Turing machine like thing or assembler.

> IMO Lisp is just overrated; it lacks visual cues, reads right to left, has horrible nesting, etc.

This doesn't make any sense...

The nesting is the same as in almost any language.

"reads right to left" - huh?

You know that Lisp code can be indented, right?


I think they are referring to the nested nature of lisp function calls.

    (second-fn (first-fn val))
I think that is really just personal preference/what you were taught as opposed to good or bad language design. But it really doesn't matter, as one of the big benefits of Lisp is its ability to be metaprogrammed. That same example could be rewritten in Clojure as

    (-> val first-fn second-fn)
thanks to macros and doesn't suffer from the right->left reading.


> I think they are referring to the nested nature of lisp function calls. (second-fn (first-fn val))

How's this different than:

  secondFn(firstFn(val))
 
Other than the placement of parentheses, the two cases read exactly the same, left to right.


I'm on the side of lisp here, but in OOP you'd usually write something like

    val.firstFn().secondFn()


I've seen a lot of breaking that out into the following in imperative languages.

    var intermediaryVal = firstFn(val);
    var result = secondFn(intermediaryVal);


  (let [intermediary-val (first-fn val)
        result (second-fn intermediary-val)]
    result)
Or as one of the parents said

  (-> val first-fn second-fn)


Nesting can be sorted out via macros, e.g. Clojure's threading macros:

    (defn unique [str]
       (->> (str/split str #"\s+")
            (map #(Integer/parseInt %))
            (partition-by even?)
            (take-while #(not= (count %) 1))
            (first)
            (count)
            (inc)))

  (unique "2 4 6 8 7 10 12")
   => 5


One small detail - though COND and IF/ELSE are very closely related, they aren't quite the same thing. Early Smalltalk also started out with COND (a right arrow which made code look like modern switch/case) and later evolved to #ifTrue:ifFalse: which is still postfix, so feels backwards compared to other languages. Lisp and Scheme got more conventional if/else eventually.


Something I discovered about Clojure's cond recently.

It usually looks like this:

    (cond (< a b) (println "a < b")
          (> a b) (println "a > b")
          :else   (println "a = b"))
I thought the :else had to be :else, but it only needs to be truthy, so it can be anything that isn't false or nil (which makes sense as you want it to always execute that form if no others match).

So this is just the same:

   (cond (< a b) (println "a < b")
         (> a b) (println "a > b")
         :hotdog (println "a = b"))
Probably obvious to everyone else but it was a bit of a "duh, of course!" moment for me.


In traditional Lisp and Common Lisp, and similar dialects, the symbol t is customarily used as the catch-all last case.

There is an interesting situation in the case family of constructs, which match an input value against keys, in Common Lisp.

Inspired by cond, the t symbol also serves as the fallback when the key doesn't match the other cases. That is to say:

   (case (expr)
     (a ...)
     (b ...)
     (42 ...)
     (t ...))  ;; <-- this isn't matching on the t symbol!
Common Lisp also supports the symbol otherwise in place of t.

But the programmer may also sometimes have the t symbol as a specific key value; or likewise the otherwise symbol. That requirement is handled by putting the key into a list:

   (case (expr)
     (a ...)
     ...
     ((t) ...))  ;; <- match on t


In Haskell you usually use `otherwise` for the last guard clause, like so:

     f x | x < 0     = ...
         | otherwise = ...
It took me a while to realize that `otherwise` isn't a keyword or bit of syntax. It's just a variable bound to True: https://hackage.haskell.org/package/base-4.12.0.0/docs/src/G...


In Lisp usually all non-nil values mean true.

  CL-USER 13 > (let ((a 1) (b 2))
                 (cond ((> a b) 'foo)
                       (:hello :there)))
  :THERE
In some early Lisp dialects, objects other than lists and symbols were not always self-evaluating, so one would have to quote them.


It's conventional in Scheme to just use `#t` directly as the catch-all clause (and `t` in other Lisps).


> COND and IF/ELSE are very closely related

In earlier Lisp COND was the primitive and IF was a macro expanding into COND. In newer Lisp it's IF which is the primitive and COND which is the macro.
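
In Clojure, for example, you can watch that happen with a one-step macroexpansion (cond bottoms out in nested ifs):

    (macroexpand-1 '(cond (< a b) :less
                          (> a b) :greater
                          :else   :equal))
    ;;=> (if (< a b) :less (clojure.core/cond (> a b) :greater :else :equal))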


> It was the first language to add "if/else" constructs, GC, closures, first class functions, reference semantics, and recursion.

"Lisp is still #1 for key algorithmic techniques, such as recursion and condescension."


As far as "Lisp lacks visual cues" goes: after many heated debates on this, I've concluded that the ugliness and rigidity of "production" languages enforce certain visual and syntactic standards that make reading others' code easier. Indentation of Lisp won't "solve" this, because production languages can also be indented. It's not a difference maker.

Parameter lists are wrapped in parenthesis and separated with commas, while code blocks are wrapped in curly braces and separated with semi-colons.

Lisp won't do this because it's then harder to blur/merge/change the distinction between code and data, which is the very power of Lisp.

Building architectural eyesores is probably not a rational plan for a city, but dammit, those eyesores help you know and remember where you are.

Some people have a certain kind of eye and/or brain that allows them to read Lisp quickly, but this may not be universal. Some say "with enough time you'll get used to it", but the fact that Lisp has been around for 60-odd years without catching on in the mainstream is evidence that it doesn't work for everyone. Somebody would be rich by now off its alleged advantages if they were real (beyond a "write-only" startup language).

Functional languages are better at expressing ideas but slower at communicating them to other readers, on average. I also personally find them difficult to debug because they don't have enough "intermediate state" to x-ray for debugging purposes. Here's a conceptual illustration:

      // imperative:
      a = af(param);
      b = bf(a);
      c = cf(b);

      // functional
      c = cf(bf(af(param)));
In a debugger and/or with Write statements I can readily examine intermediate values "a" and "b". Not so with functional style. SQL presents a similar issue. Its new WITH statements help out, but only partly. Divide-and-conquer works better if you can examine the divisions.


You know your brain adapts so you can read something easier/faster the more you are exposed to it. For me, Lisp code couldn't be easier/faster to read and comprehend.

I'm convinced most of the people making jokes about parentheses or Lisp code being hard to read are superficially dismissing it without putting in even a minimal effort of working with it.

Things that don't immediately click are discarded. Individual curiosity ("Really smart people say great things about this, I wonder why that is..") leading to individual effort leading to deep understanding is not the prevailing attitude.

Sad state of affairs.


I think the syntax has some hurdles, but it's not the parentheses: it's mentally understanding when lists and symbols are data and when they are code. That's a problem not found in other programming languages, and at the same time it is an interesting feature. Lisp is not alone in having such hurdles - another example would be Haskell, which is also more difficult to learn than the average programming language (lazy evaluation, type system, monads, ...).

Often Lisp has been used as a teaching language for computer science concepts (recursion, evaluation models, algorithms, etc.), and thus students associated it with novel concepts they struggled with rather than with solving practical problems. A typical example is the SICP book. It's great, but mostly CS- and mathematics-oriented -> the result is that the feedback is mixed.


> Lisp won't do this because it's then harder to blur/merge/change the distinction between code and data, which is the very power of Lisp.

Though it has been tried a bunch of times.

> Some people have a certain kind of eye and/or brain that allows them to read Lisp quickly

Could be that, or that one just needs a bit of practice - as is the case with many things: riding a bike, driving a car, learning a foreign language, mastering a new dance, ...

> 60 odd years without catching on in the mainstream is evidence of this

That you observe two things does not mean that they are necessarily in a causal relation.

> I also personally find them difficult to debug

One just introduces variables like in your imperative example:

  (let* ((a (af param))
         (b (bf a))
         (c (cf b)))
    c)
Now you'd see the variables and their bindings in a debugger.


Re: Though [separating in Lisp] has been tried a bunch of times.

I haven't seen it done well. The attempts kind of end up with the worst of both worlds.

Re: Could be that or that one just needs a bit of practice

But the key is how much practice it takes, and how long, and how that varies per individual. There is no solid research that I know of, so it's just opinion and anecdotes either way.

I tried to get used to it and read it fast, but it just felt like progress was really slow. Most Algol-derived languages use punctuation and symbols that seem to make things stand out better than words alone. I'm not quite sure how to describe it, but standardizing the meaning of "funny symbols" helps my particular eyes work faster. My head parses words too slowly, and I have been reading words since kindergarten. Similar discussions:

https://wiki.c2.com/?LispLacksVisualCues

https://wiki.c2.com/?ChallengeSixLispVersionDiscussion

One can make that claim about any tool, I should note, even COBOL. Some seasoned COBOLers can crank out and read COBOL code really quickly. The top COBOLer can probably out-code an average Lisper, I'd bet.

Re: That you observe two things does not mean that they are necessarily in a causal relation.

True, but given the lack of formal studies noted above, speculative observations are all any of us have. It should be a curiosity why, after 60 years, it never caught on in the mainstream. Almost nothing else in IT has had that many shots at it.

Re: One just introduces variables like in your imperative example:

Yes, but the Algol-derived languages (like C, Python, etc.) do that "style" in an easier-to-read and more consistent way, at least in my opinion.

Side note: why am I getting a rotten score? What did I do wrong? I don't want bad scores. I'm just expressing my opinion openly. I don't feel I deserve punishment.


> Most Algol-derived languages use punctuation and symbols that seem to make them stand out better than words alone.

If we read a long novel, then we read mostly words without much visual structure. The only thing that guides us are occasional marks of sentences, paragraphs and chapters. Still many can read Lord of the Rings without much problem.

Lisp is slightly unusual, since the code is written as nested lists (trees). To make Lisp code readable there are a few things to follow: descriptive symbol names, standardized code layout and familiar code structures. After a while the reader/writer of Lisp code will recognize the symbolic hints and the usual visual patterns of the laid-out code.

> The top COBOLer can probably out-code an average Lisper I'd bet.

In terms of lines of code I would think that's true.

> It should be a curiosity why after 60 years it never caught on mainstream

Why should it be in the mainstream? Why should a programming language originally designed for certain types of meta-programming (remember: the symbolic computation and code as data ideas, which are central to Lisp) be especially popular among programmers and their customers?

> easier-to-read

Now you are shifting the goal. You originally claimed that functional code is hard to debug, because there are no variables you can inspect. But that's what we do in Lisp too: we use variables for that purpose.


> Still many can read [novels] without much problem.

Compared to what? I didn't say reading of Lisp comes to a stand-still, only that it's slower than block-type-indicator symbols. Maybe it is indeed possible to refactor written English for speed reading. I haven't surveyed the tests. Does anyone here want to claim English is optimized for quick reading?

It's kind of off topic, but I can envision using various symbols and block-markers to delineate parts of English sentences such as subject, verb, object, etc.: "(subject..) $verb {object...}" I think it would quicken MY reading, but I can't vouch for other humans. Maybe I'll patent it, or did Oracle beat me to it ;-)

But that's the idea: commonly occurring features are marked or surrounded with different symbols. You can't really do that in Lisp, because nouns are verbs and verbs are nouns depending on the context/usage of the libraries or deeper code, while Algol-derived languages mostly hard-wire the difference up front. You instantly know and don't have to stop and think or dig.

Indirection/abstraction can and does slow reading in many cases. I'm just the messenger. If YOU can read and process textual words fast, that's great but may NOT be universal across most humans.

> In terms of lines of code I would think that's true.

No, I meant features-per-hour, at least for business-oriented coding [in COBOL].

> Why should it be in the mainstream?

The standing implication, as I interpret it, is that functional programming is inherently superior in general, for a price of a slightly longer learning curve.

> Now you are shifting the goal. You originally claimed that functional code is hard to debug, because there are no variables you can inspect. But that's what we do in Lisp too: we use variables for that purpose.

I'm addressing TWO goals: easier-to-read and easier-to-debug.

> But that's what we do in Lisp too: we use variables for that purpose.

You add them just for debugging? Imperative style typically does that in the regular course of things. Thus, one doesn't have to alter the code as often just for debugging.

> After a while the reader/writer of Lisp code will recognize the symbolic hints and the usual visual patterns of the layouted code.

Indentation is not a difference maker in comparisons because both candidates can use it.


> I didn't say reading of Lisp comes to a stand-still, only that it's slower than block-type-indicator symbols.

That's an assumption of yours.

> No, I meant features-per-hour, at least for business-oriented coding [in COBOL].

Another assumption, for which you assume justifications.

> The standing implication, as I interpret it, is that functional programming is inherently superior in general, for a price of a slightly longer learning curve.

I think of 'functional programming' as a tool. 'superior' is an attribute that you use.

Implication of what? 'Functional programming'? Lisp is not 'functional programming'. Lisp at its root is a mix of full imperative programming (mutable variables + imperative control flow) with functional programming (first-class functions, higher-order functions, ...).

> You add them just for debugging? Imperative style typically does that as in regular course.

Imperative code uses operators (which are functions) and functions.

  a = 2 * b + 3 ^ b
  c = 4 * a
  r = c * pi * sin(a)
It's basically arbitrary which variables one introduces. We could write the thing down as one expression, add more variables, etc. Using variables has two purposes: save intermediate results for multiple use and naming intermediate results for documentation/code readability purposes.

+, ^, sin, ... are basically functions. An operator is a function with an infix notation. Thus any imperative code which uses functions and operators is ALREADY a mix of imperative elements (mutable variables and imperative control flow) and calling functions.

In Lisp one might introduce the variables first and then set them:

  (prog (a c r)
    (setf a (+ (expt 3 b) (* 2 q)))
    (setf c (* 4 a))
    (setf r (* c pi (sin a)))
    ...)
or use LET*

  (let* ((a (+ (expt 3 b) (* 2 q)))
         (c (* 4 a))
         (r (* c pi (sin a))))
    ...)
or I could use local functions.

  (flet ((e1 (b q)
           (+ (expt 3 b) (* 2 q)))
         (e2 (a)
           (* 4 a))
         (e3 (c a)
           (* c pi (sin a))))
    (let* ((a (e1 b q))
           (c (e2 a))
           (r (e3 c a)))
      ...))
etc...

Or I could write the code in some infix notation, since one can add infix syntax:

  CL-USER 8 > (ql:quickload "infix")
  To load "infix":
    Load 1 ASDF system:
      infix
  ; Loading "infix"

  ("infix")

  CL-USER 9 > #I( a = 3 , b = 5 , c = a * b, c)
  15

  CL-USER 10 > #I( a = 3 ,
                   b = 5 ,
                   c = a * b,
                   c)
  15

We can write Lisp code any way we want. We can write it in a basic imperative style in s-expression syntax and also in infix syntax. We can also write it in slightly more functional styles.

Every function introduces variables, and LET / LET* are nothing but binding constructs that are equivalent to function calls:

   (let ((a 10))
     (* a 4))
is basically the same as

   ((lambda (a) (* a 4))     ; anonymous function
    10)
The more functional style of variable-free calls is not the general way to write down code in Lisp. The extreme variant, so-called 'point-free' style, where functions are composed without naming their arguments, is also not used very much.

So the idea that one uses a strictly functional, variable-free style of programming in Lisp has no basis in reality. Lisp is very much an imperative language.

> Indentation is not a difference maker in comparisons because both candidates can use it.

Layout of code is more than indentation.


> That's an assumption of yours.

So is the opposite viewpoint. Neither of us has a solid study that's directly relevant such that anecdotal info is all we have here either way. A good study would probably require millions of dollars. Yours is NOT the default position given the lack of such studies.

> Layout of code is more than indentation.

Same issue: both can do it so it's not a difference maker in comparisons.

> We can write Lisp code any way we want...one might introduce the variables first and then set them...or use LET*...or I could use local functions...etc...

But that's part of the "lack of standardization" that contributes to Lisp being more difficult to read. Standards and conventions are somewhat counter to "flexibility". We typically don't want overly-detailed standards (hard to remember), nor excessive flexibility, because then everybody and every spot may do things differently.

There is an optimum balancing point in the standards vs. flexibility spectrum. Goldilocks. And the balancing point probably varies per individual.


One of the problems to learn a new programming language syntax is a mental block. One looks for all kinds of excuses. It's actually not that Lisp syntax is overly difficult, it's the pain of learning something new.

> Yours is NOT the default position given the lack of such studies.

Your view would be more interesting to me if you'd have spent some time learning to read and write Lisp code. Much of what you claim is just guessing.

> Standards and conventions

There are standards and conventions in Lisp code. You just don't know them.

You are just guessing how difficult it might be to fly an airplane. It might be more difficult than driving a car, but how difficult it actually is is not visible to you. You just see that there are far fewer aircraft pilots than car drivers, and guess that flying a plane must be extremely difficult...


> Much of what you claim is just guessing.

Same with you. It's not statistically safe to say that because some people do better with Lisp that everybody will given enough time. That doesn't tell us whether it's a personal fit or something more general.

Your line of reasoning appears to be "I did X and got result Y, therefore if everybody else does X, they will also likely get result Y". You hopefully should recognize the statistical fallacy in that pattern.

Anyhow, we are going in circles. There are no solid studies to back either of our viewpoints so we just have "anecdote fights" that don't get anywhere, which is quite common in Lisp-related debates. BeenThereDoneThat.

> There are standards and conventions in Lisp code. You just don't know them.

But they mostly rely on "parsing" words. I personally believe in the power of symbols. My eyes/head process most symbols much faster than words. I can't explain it, they just do. That's just my brain, although others have told me similar. If YOUR brain can process textual words as fast as symbols, that's wonderful, but may be specific to you (and other Lisp fans). Your head is NOT my head.

It's not like I'm new to words such that after reading a billion words I'll finally get better. People's textual reading usually plateaus after about 5 years as adults. Throwing time at the problem won't significantly change this. Why should Lisp be different? Words are words.

And sure there are standards/conventions in Lisp, but the competition also has standards/conventions such that it's not a difference maker in comparisons. The difference maker is WORDS alone versus words + symbols. (Well, Lisp has parentheses, but mostly only parentheses, unless you "invent" something custom, which makes it non-standard by definition.)

And I'm not against new things, but if they don't seem to be making sufficient progress after a reasonable amount of time, I abandon them. If Lisp is unique in that it has a hockey-stick shaped benefits curve (flat for long stretches, then goes up), then it stands out from most tools. It's hard to know if it has a weird learning curve up front since there are no decent studies on it. Often fans of tools claim "just try it long enough". That's not sufficient because it's a fan habit to claim that.


> But they mostly rely on "parsing" words.

And structure.

> If YOUR brain can process textual words as fast as symbols, that's wonderful, but may be specific to you

Most people can do that. Most text people read just consists of words. Actually reading text with special symbols is quite complex - especially if the meaning depends on a mix of prefix, infix, postfix with different operator precedence.

> And sure there are standards/conventions in Lisp, but the competition also has standards/conventions such that it's not a difference maker in comparisons.

You claim a difference. I claim, you are just unfamiliar with Lisp.

> The difference maker is WORDS alone versus words + symbols.

No, the difference is words and structure.

Take for example the usual imperative code:

  a := 3;
  b := a*4;
  if a > b
   print a
  else
   print b
  ...
That's just a vertical sequence without much structure. You can add more { and }, but the shape largely stays the same.

Lisp code would look like this:

  (let* ((a 3)
         (b (* a 4)))
    (if (> a b)
      (print a)
      (print b)))
That's much more tree-like:

  LET*
       binding
       binding
   BODY
   BODY
   BODY
From that we can easily see that this is a new scope, what modifies the scope and what extent the scope has.

Lisp users learn to parse these visual and structural blocks & patterns. Once one has learned the vocabulary of basic code patterns, it's getting much easier to read Lisp.


> And structure.

They both have structure so it's not a difference maker in comparisons.

> Most people can do that. Most text people read just consists of words.

Yes, but "do" and "do better than alternatives" are different things. I already gave examples of possible alternatives/enhancements to typical English text that would help at least my brain. I won't reinvent that sub-discussion here.

> That's much more tree-like:

Being tree-like and being easier to read are not necessarily the same thing.

> I claim, you are just unfamiliar with Lisp.

How long do you believe it's realistic to keep at it if the benefits come slow? For example, if I keep coding in Lisp heavily for 2 years and STILL find it sluggish to read, is it realistic to then give up in your book?

> From that we can easily see that...

Who is this "we"?

I've been reading words and symbols for multiple decades in various contexts (programming, regular books, etc.). I've concluded after these decades that, if used in the right spots, symbols GREATLY ENHANCE my ability to parse/grok/absorb written material REGARDLESS of the domain or the specific language.

For example, one thing I like about C# over VB.net is that C# uses square brackets for array indexes instead of parentheses like VB does. It improved my head's grokking speed for array-related code because parentheses have multiple meanings in VB. (I'm not claiming C# is overall better, this is just one aspect.)

Symbols MIXED with words (well) enhance the absorption of words. It's a yin-yang kind of thing. They COMPLEMENT each other. I truly doubt a billion years of using mostly just one OR the other will prove better than using them both.

Sorry, I believe Lisp is just plain lacking there. I want my yin-yang MTV. Go play your Wordy Gurdy in Jennifer's juniper garden.


> They both have structure so it's not a difference maker in comparisons.

Why not? Structure can be different, for example shallow vs. deep. That both have some structure doesn't mean that in both it's the same structure, or that it is expressed in the same way.

> Being tree-like and being easier to read are not necessarily the same thing.

I did not claim that it is the same thing.

> How long do you believe it's realistic to keep at it if the benefits come slow?

You look for reasons not to learn it. You already KNOW that it will take years and will show no result.

> I've concluding after these decades

That makes little sense. After long and intense training lots of things look easy: juggling, flying a helicopter, ...

Unfortunately you haven't programmed in Lisp, and thus you have no idea how difficult it would be to learn it.

I also first learned to code in hex codes, assemblers, BASIC, Pascal, Modula, etc. Still, I can read Lisp code just fine.

> I believe Lisp is just plain lacking there

You believe it before having even tried to learn Lisp and understand the difference.

You believe that a hammer is difficult to use, but you have only seen one, not hammered with it.


> I did not claim that [tree-like and easy-to-read] is the same thing.

Your example implied it. If you meant something different, it wasn't explicitly stated. There are multiple factors that play into "easy to read", and they often differ per person. Tree-ness helps in some cases, but not others, or can be over-done.

> After long and intense training lots of things look easy

But reading words PLATEAUS in most people. You can't throw time at it to speed it up much. "Speed reading" courses don't improve comprehension of details, only summary skimming speed.

> I have also learned first to code in hex codes, assemblers, BASIC, Pascal, Modula, etc. Still I can read Lisp code just fine.

I'm not sure what point you are trying to make here.

> Unfortunately you haven programmed in Lisp and thus you have no idea about how difficult it would be to learn it.

I dabbled in it many years ago, and just found it hard to read; I didn't see exposure time paying off in reading speed compared to learning/reading other tools/languages. Grokking it faster either has a long learning curve, or is the "hockey stick curve" I mentioned earlier.

You haven't explained why it has that comparatively slow ramp-up and why I should accept the slow grok start compared to other tools. If it does have the hockey-stick grok curve, it stands out as unique in that regard, and there should be a good reason behind that, yet strangely nobody knows why. You appear to be avoiding these key questions/puzzles. I don't understand why. If you want to promote Lisp, you'd better start caring about this decades-old conundrum: it's not going away. "Just keep at it forever" is NOT a good answer.

> You believe that a hammer is difficult to use, but you have only seen one, not hammered with it.

After many decades I have a pretty good feel for what kinds of tools, symbols, layouts, and UI's work best for MY eyes and brain. I'm not going to force things for 10+ years to see if my assessment rules of thumb are actually wrong for specific tools. That's not a rational use of anybody's time.

And MANY others have said the same about Lisp, and it has yet to catch on in the mainstream despite being around 60 years and tried in many projects. It does fine in certain niches and continues to do fine in those niches. But if you keep failing mainstream beauty pageants for 60 years, common sense should tell you that the public just plain finds you ugly. Get a clue already! Lisp has buck teeth and a big nose. You may have a thing for buck teeth, but your brain is not a representative specimen of humanity.

Further, there's nothing to force people to use good Lisp style and formatting. If it did go mainstream, more would probably abuse and misuse it. Fans treat their prized possession with care, others don't. Nice things that happen in Nicheville don't scale to big cities.


> Not so with functional.

Sure you can, if you have an editor and REPL worth their salt. Select af(param) and evaluate it. There's what you're calling "a". Select bf(af(param)) and evaluate it. There's what you're calling "b". How one carries out "select and evaluate" varies depending on the language, REPL, and editor in use, but there's almost always (always, in my experience) an easy way to do it.

Bonus: if it's a pure functional language (no side-effects) you can do it as many times as you like without messing anything up, and you didn't have to invent any otherwise-useless variables.


> In a debugger and/or Write statements I can readily examine intermediate values "a" and "b". Not so with functional.

You clearly have no idea. Lisp has a REPL. A true REPL. No, it's not a thing where you type something into it and it blurts out something in return like in most other languages. And it's better than the much-praised Jupyter notebooks.

The Lisp REPL is connected to your editor, and from your editor or IDE you can not only examine and evaluate any expression and sub-expression, you can do it on the fly; it gives you immediate feedback. You don't even need to think about a debugger (which Lisps have too). You can try things while writing them.

Imagine a landscape artist that changes the entire landscape in real-time with every stroke of his brush. That's how it feels to be programming in a Lisp.


Usually real-world code has a lot of context and references such that stand-alone expression evaluation is often of limited value. Maybe there are tricks of the trade to solve this, but they are not obvious to newbies.

It sounds like one has to throw away years of "imperative habits" and just do everything different: reading, debugging, structuring, etc. For us non-Sheldons, it's hard to overhaul our head in a month.


This may be a meme, but it's also very true (like all good comedy). It truly represents how you start to look at Lisp code after a while:

https://www.thejach.com/imgs/lisp_parens.png


> they are not obvious to newbies

Modern Lisps like Racket and Clojure are not that difficult to pick up. I think Clojure might be easier to learn than Python or Javascript. I'm serious.

Look, I understand the sentiment because I was on the same side not too long ago. I never had a formal introduction to a Lisp. I thought Lisp was a dying language, something like Fortran, COBOL, or Pascal. On a whim, out of the blue, I tried learning it. I don't remember exactly why. I was bored or something. I started reading a book on Clojure. Then I found a conference that was supposed to happen in a week. It was the first Clojure/Remote conf. I thought: "Well, it's remote. It's cheap. Let me try it out. What do I have to lose?" That conference changed my life. Quite literally. I was fascinated by seeing the awesome things people were building. After the conf, I sat in my chair, staring into the void of my blank screen for several minutes. The next day, I went to the office and told them that I was leaving. Without having written a single program in a language I did not know, I decided to find a job with a tech stack focused on Clojure.

There are lots of misconceptions circling Lisp. People look at Lisp code, and it just doesn't look "sexy" to them. And they immediately dismiss it, claiming it is hard to read. That's not true.

They say it's a dynamically typed mess, without ever knowing that Clojure has a type system called Spec, which can do things most other type systems cannot. There's a ton of research happening around type systems in Racket. The "Little Typer" book teaches dependent types using a Lisp.

They say Lisp codebases don't scale - that's not true at all. The truth is, programming in Lisp makes people so productive that you don't need large teams to maintain a massive amount of work. And there are companies with large Lisp ecosystems; one example: Nubank in Brazil has over 400 engineers, and their primary language is Clojure.

People say Lisp is not popular. Sure, Clojure's popularity might look dwarfed in comparison with massively popular Python, Javascript, C++, Java, etc. However, Clojure, compared to other "esoteric" languages, is quite popular - today, it has more books, podcasts, conferences, and meetups. More than languages like Haskell, OCaml, Rust, Elm, Purescript, ReasonML, F#.

Learning Clojure has opened a world of possibilities for me personally. My only regret is that I didn't get into Lisps earlier. Please don't take my advice negatively. My intentions are sincere. I wish someone had convinced me to try Lisp early in my career. Please do try, sometime, to get out of your shell and give it a go. I promise, if you give Lisp an honest and heartfelt try, you will find something unique. Even if you end up not using it, you will never regret learning it.


You should stop overselling things. Spec is not a type system.


Just because you can't do rigorous static analysis (which to a certain degree is possible; check out Spectrum) doesn't mean that Spec is not a type system.


That's misleading. Spec is a contract system. Read the page on the spec rationale: https://clojure.org/about/spec . It even mentions that spec is not a type system and that spec is similar to contracts in Racket, etc.

If we look at programming languages like Java, Typescript, C++, Haskell, etc., then we see that their type systems are completely different and have an entirely different purpose. Claiming to those users that spec is a type system, is just misleading them.
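
For a minimal sketch of what that looks like in practice (assuming clojure.spec.alpha is available): the checks run at runtime, on demand, which is the contract-like part.

    (require '[clojure.spec.alpha :as s])

    (s/def ::age pos-int?)

    (s/valid? ::age 41)    ;=> true
    (s/valid? ::age -3)    ;=> false
    (s/explain ::age -3)   ; prints a human-readable failure report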


Semantics. Spec is very close to dependent types. "Entirely different purpose" is kinda vague. If the end result is the correctness of the program, the purpose is the same. If someone can perform Vivaldi's Four Seasons on a ukulele, that's cool. It doesn't make the ukulele less of an instrument.


So, spec is the ukulele in the vast domain of Type Systems?


> But fanboys on internet make it seem like some God tier thing

Lisp is based on math and logic. And if there's a God, he probably speaks Math. So maybe those fanboys are onto something here.

But, seriously, once you learn a bit of Lisp, you realize that Lisp has influenced every single modern programming language. Essentially, no matter what PL you are programming in - we are all programming in a Lisp. Sometimes your language of choice adheres to core principles of Lisp, sometimes that connection is loose. Nevertheless, there's plenty of Lisp in every modern programming language. So, no. Lisp is not overrated.


I personally admire Lisp conceptually. It's a work of art. It's the practical use as a tool in a team environment, where you cannot carefully control hiring, that I'm skeptical about.


I started using Clojure a few years ago as my primary language of choice. I've switched jobs multiple times since then and never had a problem finding one. I once worked for a big fintech company. We never said that knowing Clojure was a requirement, but we admitted that preference would be given to people at least interested in Clojure. And you know what? It helped us build an incredibly strong team, even though not everyone used Clojure. Some teams used languages like Ruby, Go, and Javascript.

If I were given a choice to either hire five Javascript/Python/Go programmers and pay them each 100K/Y or hire only three Lispers and pay them each 200K/Y, I would go for the latter. ROI, in that case, would be much, much higher.

And Clojure is a very, very practical tool. Just watch Rich Hickey talks, you will see that the pragmatism of the language is paramount for everyone who uses it.


Re: If I were given a choice to either hire five Javascript/Python/Go programmers and pay them each 100K/Y or hire only three Lispers and pay them each 200K/Y, I would go for the latter. ROI, in that case, would be much, much higher.

If that were universally true, one could out-compete the entrenched software companies and be the next Bill Gates. The common "production" languages are successful, I believe, because they offer a semi-consistent way of doing things, and for large or fungible staff, that turns out to be more important than fewer lines of code or powerful abstractions.

In business, communicating between humans is more important than parsimony. This is based on direct and indirect experience. My own "grand abstractions" have also been rejected at times. The "best" programmers are often those who have a decent understanding of the domain, UI design, and various "team" skills. Somebody good at all those may not necessarily be the best abstractionist or parsimonist. The best programmers get B's on lots of topics rather than an A+ in just one, such as abstraction. I'm just the messenger. Scott Adams made a similar observation:

https://www.forbes.com/sites/carminegallo/2013/10/23/dilbert...


> if that were universally true, one could out-compete the entrenched software companies

What makes a software product a successful product? Would you agree that popularity and sold copies do not necessarily make it the right product? There's a lot of marketing that goes into success, and I admit sound engineering often lacks good marketing.

What you can't see behind your clouded judgment is that, despite being non-mainstream tech, Clojure (I can only speak for Clojure since that's the primary language I've been using for the past few years) has been quite successful.

- It is consistently rated as the top-paying language (alongside F#) in numerous surveys (e.g., Stack Overflow and StateOfJS)

- The community has fewer active members than the number of programmers working for Google, but they are continually innovating. They are figuring out interop with Python and R, writing machine learning libraries and books about them, running a Clojure-like Lisp dialect on BEAM and a Clojure-like Lisp that compiles to Lua, researching type systems, running Clojure on embedded devices, etc.

- Clojure teams at companies like Walmart and Apple are processing massive amounts of data. Can you imagine the amount of data Walmart has to deal with on Black Friday? Do you know how many people are on the team that built the pipeline for processing receipts? Seven. For comparison, the React team at Facebook has more developers, and they also get massive help from it being open source. I don't want to belittle the work they do; managing a massively popular UI library is not a simple task either. I just brought it up for lack of a better example.

- Jira is incredibly popular and sometimes vehemently hated by programmers. Please take a look at its less popular competitor Clubhouse.io. It's beautiful, clean, and nothing but a fantastic product. I believe they built it with a team of fewer than ten developers. And there are many examples like that: CircleCI, Grammarly, Pandora and many, many more.

I agree with what you say:

> The "best" programmers are often those who have a decent understanding of the domain, UI design, and various "team" skills.

That's why (after many years programming in other languages) I chose Clojure. It allows me to stay laser-focused on solving real problems. Like creator of Clojure once said: "solve real-world problems, not puzzles."


> Clojure teams at companies like Walmart and Apple are processing massive amounts of data.

Yes, I already agreed it does well in certain niches. Usually in query-esque niches. SQL is also functional-esque for similar reasons.

You seem to be contradicting yourself, saying that other factors overwhelm FP's alleged programmer productivity gains, yet point out niches where it does well and is common. It spreads there, but not elsewhere.

> top paid language

I'll leave salary discussion to lispm's (user) reply thread.

> There's a lot of marketing goes into success, and I admit - sound engineering often lacks good marketing.

Yes, but that's mostly independent of programming productivity. It's not a difference maker in comparisons. Just because programming isn't everything to the bottom line doesn't mean it's nothing to it. For a software shop, programming is still a notable portion of activities that affect the bottom line. If you are 1% more profitable than your competitors, that difference will compound over time. If you grow 4% a year but your competitor grows just 3% a year, you'll eventually swamp them. Do a spreadsheet on it if you don't believe me.


> consistently being rated as the top paid language

means: developers are rare and too expensive. -> companies will avoid it.


I was going to make a similar comment. Programmer salary and owner profits are not necessarily related. The more specialized a skill is, the more it pays in general. (From the worker's standpoint this is a mixed bag in that you get more money if you are employed, but may have less geographical choice.)


No, Lisp does not lack visual cues. But yes, very often people complain about Lisp readability. That stems from a lack of familiarity: most people are familiar with infix notation, but not the prefix notation used in Lisp. But I have seen many times that people who pick up a Lisp (Clojure, Racket, etc.) as their first programming language do just fine; as a matter of fact, they later find code written in other languages harder to read.

Unlike other languages, Lisp sometimes is difficult to read by just looking at it. Staring at Lisp code, examining its "static" properties without prior training indeed may pose a challenge. But there is a way to learn how to parse Lisp code mentally. And the best way of learning that is by editing it.

First, you need to find an IDE that supports structural editing of symbolic expressions (or s-exps). Basically, it would help if you learned a few operations:

slurp, barf, transpose, raise, kill s-exp, yank s-exp

Depending on the IDE you use, they may be called differently, and there may be more; these are the basic ones.
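
To make that concrete, here is roughly what "slurp" and "barf" do to a form (plain Clojure; the operation names follow Paredit-style editors, exact keybindings differ):

    ;; Cursor inside the inner form, before "slurp forward":
    (map inc) [1 2 3]

    ;; After "slurp forward" the next form is pulled inside the call:
    (map inc [1 2 3])     ;=> (2 3 4)

    ;; "Barf forward" pushes the last element back out again.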

Once you learn those basic operations, you are now ready to untangle "unreadable" Lisp. The trick is to change the code while you read it.

Similar to how Tony Stark looks at the hologram of a model and moves his hands outward, forcing the hologram to expand, then picks one of the parts, studies it, maybe completely removes it. He would then move his hands in a closing motion, and the hologram would "assemble." That's how you work with Lisp.

So, for example, you read a function, if necessary, extract things, pick an expression, evaluate it, etc. Once you gain a good understanding of what the function does, you can undo all the changes, or if you think you have improved it - do save.

In other languages, when the structure of the code is settled, it's more challenging to pick it apart, move things in and out. You are forced to read the code "top to bottom." But in Lisp, code is a living, breathing thing. You basically feel like performing a surgery - you can tweak things, try them, and see instantaneous feedback in the REPL.

In other languages, code is "dead," until you save it, (re)compile it and execute it. And REPL in those languages is "crippled" and not a fully-fledged "true" REPL like in Lisp. Being able to pick any [sub]expression and evaluate it, without any kind of prior ceremony is truly liberating and empowering.

It takes time, for some it's hours, for others - weeks, but eventually you will stop seeing parentheses, and instead, you will see structure, consistency, and meaning. And Lisp becomes more readable than any other language you've used before.


Re: It takes time, for some it's hours, for others - weeks, but eventually you will stop seeing parentheses, and instead, you will see structure, consistency, and meaning. And Lisp becomes more readable than any other language you've used before.

I'd like to see a university test this theory. I suspect it's subjective and varies greatly per individual, but testing on say 100 random subjects could better resolve the Great Lisp Dispute.


Of course, it's subjective. Do you think everyone is born with an innate ability to read Javascript?

I don't know anything about music, but I feel this debate is like guitar tabs vs. standard music notation. I've heard, for example, that Paul McCartney doesn't know how to read music notation. "Yesterday" is still a masterpiece, though. Now imagine if Paul all his life advocated young musicians against musical notation, because "It's hard to learn to read it. And it's overrated."

The truth is, Lisp is fundamental and absolutely essential for anyone who has chosen a lifetime career in computer science. And it's not going anywhere anytime soon. So please stop whining about it and at least get familiar with it, or stop calling yourself "a pro."

[Apologies for dramatization, the last part is not aimed at you personally. it's a general message for everyone.]


I didn't say get rid of Lisp. It's indeed a great teaching tool. What I'm against is the implication that if one doesn't like it or doesn't use it for production systems, they're doing something wrong or bad.





