The Roots of Lisp (paulgraham.com)
296 points by DanielRibeiro on Oct 24, 2011 | 56 comments



For nearly two decades I was a diehard LISP advocate. I even forced all my programmers to code three Crash Bandicoot and four Jak & Daxter games in custom LISP dialects that I wrote the compilers for (an article on one here: http://all-things-andy-gavin.com/2011/03/12/making-crash-ban...).

But by the mid 2000s I started doing the kind of programming I used to do in LISP in Ruby. It's not that Ruby is a better language; mostly it was the momentum factor and the availability of modern libraries for interfacing with the vast array of services out there. Using the crappy, unreliable, or outdated LISP libraries -- if they worked at all -- was tedious. Plus the LISP implementations were so outmoded. It was very hard to get other programmers (except a couple of enthusiasts) to work that way.

Ruby struck a decent compromise. And its type system and object model are better than CL's anyway. The syntax is more inconsistent, and the macro model nowhere near as good. But it turns out libraries and implementation matter a lot. Still, you can feel lots and lots of LISP influence in all the new runtime-typed languages (Ruby, Python, etc). And 30 years later, listeners still rule!


You? That... that was you? GOAL?

Sir. I have no idea what to say. You were my direct inspiration for a long time. I'm a gamedev (worked at a couple studios now) and I've wanted to integrate Lisp into gameplay for years. I even got a Franz CL evaluation. The discovery of GOAL/Crash Bandicoot further motivated my efforts.

Ultimately nothing came of it; Lua and Python are just so much easier to integrate into a realtime C++ simulation (a video game) and are nearly as effective. However, I became a better programmer for life. Discovering GOAL let me think, "if Naughty Dog could do it, then so could I," which sparked me along a path of learning that profoundly impacted my sense of programming aesthetics and quality. Thank you.


Thanks for all the fan love :-)

One of the interesting things about LISP is that it's actually a pretty easy language to parse, interpret, and compile. This isn't actually an accident, as I'm sure the S-expression syntax was initially chosen for its machine regularity (in those early days of underpowered machines). Newer languages are syntactically much more complicated. Ironically, most normal programmers, being human, seem to find the more complicated syntax easier and the "simple" S-expression syntax confusing (being backward much of the time relative to normal human convention). I always found it unambiguous, but go figure. It's also precisely this regularity that makes the awesome macrology of LISP possible. In Ruby you can manually build up strings and feed them into the interpreter, which is equivalent to simple backquote. But you can't do the kind of cool nested constructions that are trivial in LISP.
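To make the string approach concrete, here's a minimal Ruby sketch (a hypothetical Point class): the "template" is just text fed to eval, so nesting, quoting, and hygiene are entirely manual -- roughly what simple backquote gives you, and no more.

    class Point
      attr_writer :x, :y
    end

    # Build accessor definitions as strings and eval them into the class.
    %w[x y].each do |field|
      Point.class_eval "def get_#{field}; @#{field}; end"
    end

    pt = Point.new
    pt.x = 3
    pt.get_x  # => 3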


It's really not about the syntax. It's about writing in the AST, and the availability of a large set of libraries. And I especially find it very difficult to use for my day-to-day tasks (places where I use awk, sed, and Perl) -- in other words, everyday practical needs.

I still don't know how, as a beginner, to start learning Lisp and use it for my day-to-day needs.

I don't think anybody today has the time to study the science behind Lisp and then see where it's useful to them.

We want something that we can browse through and straight away go and use now!

Is there a recipe book?


I don't know about recipe books, but for a general introduction I've found Conrad Barski's _Land of Lisp_ to be very good and loads of fun, although the commentary gets a little triumphalist at times. You can grab it through No Starch or Amazon, IIRC.


If you want to learn Lisp from a more practical perspective, you should probably learn Racket or Clojure. Racket comes with excellent tutorials and documentation as well as an IDE with debugging, a REPL, etc. You can read How to Design Programs to learn the basics of Racket (Racket started out as Scheme and still contains an implementation of standard Scheme). Clojure is an even more practice-oriented Lisp. It is a bit harder to set up and get started with, but there are books on Clojure, e.g. Programming Clojure and The Joy of Clojure. Compared to Clojure, Racket is a bit more academic and has many advanced language features like contracts and first-class modules, but it also has a good standard library. Clojure's strong points are functional data structures, concurrency, and generic operations (meaning that you don't end up with vector-ref and list-ref and hash-ref, but just a single function to index into anything). Clojure's syntax is also a bit terser.
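(For a rough taste of what generic indexing buys you, in a language from elsewhere in this thread: Ruby's [] plays a similar role across containers, though as a per-class method rather than a true generic function.)

    [10, 20, 30][1]     # => 20  (Array)
    { :a => 1 }[:a]     # => 1   (Hash)
    "hello"[1]          # => "e" (String, Ruby 1.9 semantics)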

If you then decide you want to learn Lisp more deeply:

How to Design Programs is a book that teaches basic Scheme programming. It is oriented towards beginners, so if you want to understand recursion, higher order functions and data structures this book is for you. Conversely, if you already understand those things, this book is not for you. http://www.htdp.org/2003-09-26/Book/curriculum.html

Structure and Interpretation of Computer Programs is a more advanced and probably challenging book that goes through all aspects of recursion, data structures, symbolic mathematics, generic operations & implementing an object system in Scheme, continuations, writing a Scheme interpreter, writing a Prolog interpreter, and writing a hardware simulator and a compiler that compiles Scheme to that hardware's assembly language. http://mitpress.mit.edu/sicp/full-text/book/book.html

While Structure and Interpretation of Computer Programs is all about code=data, it doesn't actually teach you about macros. For that you could read Paul Graham's On Lisp, which is essentially a whole book on macros. It is a good book for learning about macros, which are useful in any Lisp, though it uses Common Lisp, which I'd not advise using.


Practical Common Lisp (http://www.gigamonkeys.com/book/) by Peter Seibel is an awesome book and free online.


But you can't do the kind of cool nested constructions that are trivial in LISP.

In Ruby you can [1] do this, and in Python more natively [2]. It is not pretty, though.

[1] http://bit.ly/sVYe8l

[2] http://docs.python.org/library/ast.html


And in Perl you can use Devel::Declare - https://metacpan.org/module/Devel::Declare


Um, you're my hero. Not sure if you know that you have admirers out there. After reading about you, I became confident enough to write a Lisp-based DSL for describing and valuing financial derivative contracts. Yes, those. It directly and indirectly made me richer than I've ever been (that's not saying much, though.)

In the end, we moved to Python for similar reasons.

But you inspired me. Thanks.


Oh, you can't just leave us hanging like that...!

We want to know all about your Lisp experiences; or at least, I would personally love to hear about them.


I am curious what you found superior in the Ruby object model compared to CLOS. From those familiar with both Ruby and CLOS (or Moose), I've always heard that the latter is more powerful and more convenient, and (although intermediate at best with all three) I found the same myself.


CLOS is perhaps more powerful, but the whole generic function thing is more awkward than object-based dispatching. It allows arbitrary overloading, but it was a pain. And I wrote a really big CLOS program: the GOAL compiler.


What do you think of Clojure? (if you've checked it out)


I keep meaning to check it out. But my pile of "keep meaning to" (which includes bills, taxes, and other boring things) is very large.


One day you'll die :)


>And its type system and object model are better than CL's anyway.

I couldn't disagree with this more. I don't see what is so different about their type systems, and SBCL is doing more and more type inference for you.

As far as the object model, generic functions are a superset of message passing. One good way to gauge the power of different languages is to look at the kinds of language-deficiency patterns you have to use. What drew me to Smalltalk was that it needed so few compared to, e.g., Java. What made me leave Smalltalk for Lisp was double-dispatch patterns. Double-dispatch patterns (e.g. Visitor) simply aren't needed in a language with generic functions.
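For readers who haven't hit this: a minimal Ruby sketch of the double-dispatch workaround (hypothetical Asteroid/Ship classes). A single-dispatch language picks a method by the receiver's class only, so the first method has to bounce to a second, name-encoded dispatch on the argument. In CL you'd just write one method per pair, specialized on both argument classes.

    class Asteroid
      def collide_with(other)
        other.collide_with_asteroid(self)  # second, hand-rolled dispatch
      end
      def collide_with_asteroid(other); puts "asteroid hits asteroid"; end
      def collide_with_ship(other);     puts "ship hits asteroid";     end
    end

    class Ship
      def collide_with(other)
        other.collide_with_ship(self)
      end
      def collide_with_asteroid(other); puts "asteroid hits ship"; end
      def collide_with_ship(other);     puts "ship hits ship";     end
    end

    Asteroid.new.collide_with(Ship.new)  # prints "asteroid hits ship"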

>And 30 years later, listeners still rule!

With this, do you mean the REPL or something else?


CL's generic functions are not a superset of message passing, and neither is message passing a superset of generic functions. Message passing allows an implementation of doesNotUnderstand or noSuchMethod or similar, whereas generic functions don't.

Not that I think that is a good trade-off vs all the advantages of generic functions. Besides, the things people accomplish with doesNotUnderstand are much better accomplished with macros anyway.


>CL's generic functions are not a superset of message passing

I would say they are. Generic functions can dispatch on a single object if they want to.

As far as doesNotUnderstand, most OO languages don't support it anyway. Further, while I can't find it just now, I seem to recall MOP or CLOS itself providing doesNotUnderstand functionality. You could also either provide a doesNotUnderstand specialization on T, or set up a single handler for it.

If you insist on similar semantics to Smalltalk/Ruby/JS(?) then it wouldn't be hard to use the MOP to create a new kind of method dispatch that simply turned a failed specialization lookup into a does-not-understand call as a last step instead of signaling an error.


Message passing OO like in Ruby and Smalltalk means that an object receives messages that it can interpret in any way it wants. For example you can create an object that prints the messages to the console:

    class Foo
      def method_missing(meth, *args)
        puts "#{meth}(#{args.join(',')})"
      end
    end

    f = Foo.new
    f.foo() # prints foo()
    f.bar(1,2) # prints bar(1,2)
    f[2] = "hello" # prints []=(2,hello)
You can't do this with CLOS because methods are lexically scoped.

Why is this interesting? Because, for example, with this you can implement transparent remote objects. When you make a call on the object, it makes a network request; on the other computer the call is made and the result is sent back over the network, and finally the original method call returns. So the code would end up looking something like this:

    # server on ip 1.2.3.4
    shared_object = Array.new
    shared_object[3] = "good morning"
    serve_object("foo", shared_object)

    # client
    remote_object = RemoteObject.new("1.2.3.4", "foo")
    remote_object.length        # makes a network call to 1.2.3.4 and computes the length
    remote_object[3]            # returns "good morning"
    remote_object[3] = "hello"  # sets the index 3 to the string "hello"
    # code in the server can now access shared_object[3] and it will get "hello"
Ruby actually has such a library. It's called DRb.
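(For the curious, here's a minimal sketch of the same shared-array example with the standard library's DRb; the host and port are made up.)

    require 'drb/drb'

    # server
    shared_object = []
    shared_object[3] = "good morning"
    DRb.start_service("druby://0.0.0.0:8787", shared_object)
    DRb.thread.join

    # client (separate process)
    require 'drb/drb'
    remote_object = DRbObject.new_with_uri("druby://1.2.3.4:8787")
    remote_object.length        # network call; => 4
    remote_object[3]            # => "good morning"
    remote_object[3] = "hello"  # the server's array now holds "hello" at index 3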

All in all, I think that generic functions are better than message passing, but generic functions are definitely not strictly more powerful.


If you have any, I'd be interested in hearing the useful tidbits that being a long-term Lisper now grounded in Ruby may have brought forth.


Well, it certainly shows you how BAD/OLD the CMUCL and ACL garbage collection code/algorithms were (at least when I last used them, in 2006). In ACL you could get these LispMachine-like multi-hour GCs. Now we use tons of GC-based systems every day (like browsers) without having to reboot too frequently.


Libraries and implementation do matter, that's why I've stuck with Clojure. I couldn't imagine starting a project in Common Lisp today.

Are you using Ruby in game runtimes, or just tools? How do you feel the use of Racket worked out in the Uncharted games? It seems like an embeddable Lisp (like Racket) might just hit the sweet spot for enabling DSLs and dynamism without bogging you down in an outdated platform like CL.

(P.S. big Naughty Dog fan here... I've always wondered about these sorts of details. I think a lot of programmers are very curious about your war stories.)


I compiled some of these posts and some other thoughts on LISP (far far from completely formed, but perhaps amusing) into a blog post:

http://all-things-andy-gavin.com/2011/10/25/lispings-ala-joh...


In the 1990s I did mostly C, but in the 2000s I did mostly Kawa Scheme. Java class libraries, especially JDBC, are what led me to choose a Java-based implementation.


Out of curiosity - what made you go with Ruby rather than Smalltalk?


I can't answer for agavin here, but I can tell you one reason why I write more code in Ruby than in Smalltalk: integration.

Smalltalk, more than Lisp or Java, is just not suited to little programs that get the job done. It is really great for rapid development of things that live in the Smalltalk image and play with other Smalltalk objects... but stepping outside of those boundaries is often painful and frustrating. Squeak, for example, is plagued by arcane and undocumented classes for socket I/O or HTTP.

It just feels like its own isolated world compared to a simple Ruby hashbang script that reads from stdin and writes to stdout.


Thank you. :) As a fan of PG's articles regarding language power, I've wondered why he didn't make more comparisons of other languages to Smalltalk (whether positive or negative).


PG: "It seems to me that there have been two really clean, consistent models of programming so far: the C model and the Lisp model. These two seem points of high ground, with swampy lowlands between them."

We lost founders of both in the last couple weeks. Ritchie for C, McCarthy for Lisp.

What's cool about the paper Paul is talking about here: if you start with car, cdr, cons, quote, cond, atom, eq, and a notation for functions expressed as lists, you can write an "eval" function reimplementing the language, as sketched below. The rest can be built on top of it.
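To see how little that takes, here's a toy sketch in Ruby (nested arrays standing in for S-expressions, symbols for atoms; it omits lambda/label, so it's only a fragment of McCarthy's eval):

    # Evaluate a McCarthy-style expression built from the seven primitives.
    def l_eval(e, env = {})
      return env.fetch(e) if e.is_a?(Symbol)  # variable lookup
      op, *args = e
      case op
      when :quote then args[0]
      when :atom  then !l_eval(args[0], env).is_a?(Array)
      when :eq    then l_eval(args[0], env) == l_eval(args[1], env)
      when :car   then l_eval(args[0], env).first
      when :cdr   then l_eval(args[0], env).drop(1)
      when :cons  then [l_eval(args[0], env)] + l_eval(args[1], env)
      when :cond
        args.each { |c, r| return l_eval(r, env) if l_eval(c, env) }
        nil
      end
    end

    l_eval([:car, [:quote, [:a, :b, :c]]])  # => :a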

Graham calls this an "axiomatization of computation". The notion of a Turing machine was already well known before McCarthy's paper, and is its own fundamental description of computation. Then came this high-level language, which constitutes a new one. It isn't "THE axiomatization of computation", just "AN axiomatization of computation".

PG continues the story in a later essay, "Revenge of the Nerds":

"What happened next was that, some time in late 1958, Steve Russell, one of McCarthy's grad students, looked at this definition of eval and realized that if he translated it into machine language, the result would be a Lisp interpreter."

"This was a big surprise at the time. Here is what McCarthy said about it later in an interview:"

"'Steve Russell said, look, why don't I program this eval..., and I said to him, ho, ho, you're confusing theory with practice, this eval is intended for reading, not for computing. But he went ahead and did it. That is, he compiled the eval in my paper into [IBM] 704 machine code, fixing bugs, and then advertised this as a Lisp interpreter, which it certainly was. So at that point Lisp had essentially the form that it has today....'"

"Suddenly, in a matter of weeks I think, McCarthy found his theoretical exercise transformed into an actual programming language-- and a more powerful one than he had intended."

http://www.paulgraham.com/icad.html

Pretty cool story :)


"Steve Russell said, look, why don't I program this eval..., and I said to him, ho, ho, you're confusing theory with practice, this eval is intended for reading, not for computing. But he went ahead and did it."

Here is the point in the story where we need Norvig's annotation from PAIP (p. 777):

"So the first Lisp interpreter was the result of a programmer ignoring his boss's advice."

I like to think that the "ho, ho" in the original quote means that McCarthy appreciated this.


Forth is the third model -- very clean and consistent, and still widely used.

(Forth is usually the first HLL to get working on a new chip due to its simplicity.)

If you're looking for a modern stack-language implementation, give Factor a try: http://factorcode.org/


I think Forth is particularly notable for providing metaprogramming facilities which are arguably on par with Lisp while taking a completely different approach. Lisp represents code and data the same way and can manipulate both with ease.

Forth has what might be described as an "interactive compiler" which can be easily extended during the compilation process. Since an immediate word can hijack the parser, you could theoretically build Forth out into a reasonable facsimile of any other language if you so desired. These capabilities are already used in most Forth distributions to provide things like flow control, comments, and string literals.


There are more clean, consistent and interesting models of programming besides the C and Lisp models.

The ML/Haskell family of languages. Logic languages (though these are somewhat subsumed by the former). Dependently typed languages.


Here on Hacker News, we can (and do) argue all day about how to categorize and rank programming languages. I'll readily confess that I do not have enough depth or breadth in the field to be any kind of authority when it comes to that.

The point I wanted to make was, if you are willing to give some benefit of the doubt to Paul Graham's statement in the article (which I quoted) singling out C and Lisp, then the last couple weeks are very significant indeed in terms of the loss to the field of computing.


The Lisp system is a meta model. Thus you can implement a Lisp that works like ML/Haskell/Prolog if you like. ML/Haskell don't seem to provide that kind of meta power as cleanly or consistently IME. Prolog has a good meta model, but it doesn't seem to subsume other models as cleanly, consistently, or efficiently as Lisp.


Implementing a Lisp that works like ML/Haskell will be as much work as implementing a compiler/interpreter of these languages outside of Lisp. The macro system is not a great compiler toolkit for languages with semantics radically different from Lisp.

Lisp thus does not really subsume these languages.


> The macro system is not a great compiler toolkit for languages with semantics radically different from Lisp.

I take it you're not familiar with all the evidence Racket offers against that statement.

In any case, that's not my real point. Lisp has a simple yet powerful meta system. ML/Haskell do not, but their models weren't designed to provide that kind of simple, consistent power.


Lisp and Haskell to my mind form the two leading languages of two fundamentally contradictory families. Lisp works by empowering programmers and building on that power, and Haskell works by limiting the programmer and building on those limitations. Despite what the word "limitation" may lead you to believe, both philosophies can lead to great results. For example, for the same basic reasons you can't even build an STM in C#, it's probably even harder to build one in Lisp, whereas in Haskell it's pretty easy, because of the limitations.

You can not just macro your way in a (conventional) Lisp over to Haskell, because Haskell is fundamentally built on limitations being not merely suggestions, but things you can then further build on. If you do, with something like Clojure and its data model, you've got something that isn't the same language any more, and you need all-new libraries to make it work.


You can build an STM in pretty much any language. There have been STMs in at least C++, Java, C#, Common Lisp, Clojure and OCaml.

Haskell may be one of the rare exceptions. Haskell's STM was not implemented in Haskell, and AFAIK cannot be implemented in type safe Haskell due to limitations of its type system unless you add specific extensions required for things like STM.

So to say that STM is enabled by the limitations in Haskell is exactly backwards.


"You can build an STM in pretty much any language."

Sure, but they don't work. STMs don't work because whenever a transaction is retried, anything that operated on "real state" gets redone, and attempts to deal with that with unrestricted side effects are generally agreed to have been a failure.

(Except Clojure, which built it in either from day 1 or at least very early, and in fact does manage effects.)

"Haskell's STM was not implemented in Haskell, and AFAIK cannot be implemented in type safe Haskell"

I just skimmed the entire STM library in Haskell and have no idea what you are talking about. I don't see anything terribly exotic in the LANGUAGE declarations: CPP, DeriveDataTypeable, FlexibleInstances, MultiParamTypeClasses, MagicHash. That may not be Haskell 98 but nobody cares, those are all either very standard at this point or completely boring (CPP ("use the C preprocessor"), MagicHash ("let me end variables with #")). unsafePerformIO doesn't even show up.

By the way, "skimming the entire library" is about a minute of your time. The whole thing's just shy of 16KB of source.


Where do I find this library? The only thing I've been able to find is a bunch of files that do no real work except build a nicer interface on top of more primitive STM operations.

As far as I know you cannot implement typed variables (monadic, of course) in Haskell in a type safe way. Note that I'm not even talking about efficient references, but any implementation of references. I could be wrong though.

A natural way would be to start with the State monad and make a State of Heap, where Heap is a data type that will hold all the variables and passing around this Heap will be hidden by the State monad. Unfortunately you're now stuck when trying to implement operations for allocating a new variable and for setting/getting the value of a variable. If Haskell were dynamically typed this would be easy: just make Heap an array or list.

I'd disagree with you that those STMs are broken. Yeah, you have to do all your state in transactional variables, but then the same is true for Haskell. True, Haskell enforces this in its type system, but that's hardly a requirement for a library. By the same logic Clojure's string library is broken, because it does not statically enforce that you actually pass strings to the library. Neither do all Haskell libraries enforce all their preconditions/invariants statically, or at all. And actually Haskell doesn't really enforce that you don't use external effects in STM, since you can always do unsafePerformIO. When working with other languages you should regard external effects as you would regard unsafePerformIO in Haskell.


Can you elaborate how Racket is evidence against that statement?


PLT Racket has implemented languages with different semantics within Racket - http://docs.racket-lang.org/

See the other languages section.


TILs (threaded interpretive languages) such as Forth.

Smalltalk.

What family does APL belong to?


Certainly its own family, along with its successors J and K.


You could argue those languages are not very clean, at least syntax-wise.


Don't confuse having lots of operators with being unclean. They're syntactically more internally consistent than Common Lisp, for example.



You can describe the execution semantics of a "C" machine or a "Lisp" machine in a paragraph. Can't really do the same for a Spineless Tagless G-Machine...


The semantics of System F can be explained in roughly a paragraph. Lisp is, more or less, untyped lambda calculus + code-as-nested-lists + compile-time computation mechanisms; Haskell and ML are System F + various type mechanisms (e.g. typeclasses or modules).

On the other hand, I suspect an abstract machine for executing Lisp—one that accounted for reader macros and the like—would be of roughly the same complexity as the STG machine.


Apples and oranges. Can you explain implicit continuation representation in a paragraph?

The simple description of lazy evaluation is graph reduction. ML doesn't even use lazy evaluation.


Everyone using functional languages (ML/Haskell) or logic languages (Prolog) should respect the considerable contributions McCarthy made to these areas throughout his entire life.


In a meeting, the person who suggests a good idea and the person who recognizes a good idea are usually two different people.



"As computers have grown more powerful, the new languages being developed have been moving steadily towards the Lisp programming model. A popular recipe for new programming languages in the past 20 years has been to take the C model of computing and add to it, piecemeal, parts from the Lisp model[.]"

Hey! That's what I think, too!

Around 2007, when I'd had quite a bit of exposure to both Python and then Ruby, I'd started to think the same thing. Although I didn't have the foresight to say it in a public essay, I would tell my friends: "In five years Javascript will start to become a popular general (i.e. non-web-only) programming language, and Lisp in 20."

I now think the big Lisp (or Scheme) event will happen even sooner -- say, in 5 to 7 years. (Meaning you will see on HN the kinds of articles you see around Node today, but about Lisp.)




