Hacker News
How Lisp macros differ from static code-generation and metaprogramming (brandonbyars.com)
141 points by gnosis on Jan 26, 2011 | 80 comments



> Except, of course, that doesn’t work. The preprocessor only makes one pass through the file, meaning a macro can’t call another macro.

Huh? That's just blatantly wrong, and I (and many others) have used the C preprocessor to create multi-level macros in quite powerful ways. See for example http://blog.nelhage.com/2010/07/implementing-an-edsl-in-cpp/ or the insanity implemented by Linux's tracing macros: http://git.kernel.org/?p=linux/kernel/git/torvalds/linux-2.6...

His article has a point, but he's not doing C macros justice -- they are quite powerful and can be used in really cool ways, even if people tend to look at you funny when you do :)


The reason the Linux trace macros are insane is that the C preprocessor is so primitive.

Think about it.

> they are quite powerful

For small values of "quite". C macros are strictly less powerful (and by a large margin) than Lisp macros. One can't even use #ifdef inside #define!


Note, the power of the macro system is not equal to its general good.

The C macro system was already too powerful, and one of the things that hurts languages is a preprocessor of virtually any kind. The badness of the C preprocessor starts much earlier with things like include files, and the fact that you can't statically analyze the code in any reasonable way.

The benefit from the preprocessor is absurdly small. With that said, I think the benefit of the Lisp preprocessor is pretty small too. C would be harmed more by its omission due to its compilation model, but as we've seen with languages in its family, like C#, it's unnecessary.

In a decade or less, the Lispers will see this light too.


The C preprocessor can't do metaprogramming, by definition - there's no way to take apart and examine the arguments passed to macros.

The reason you can't statically analyze C code that uses macros has to do with the preprocessor's broken design - it is a separate stage from the compiler, and there's no way for the two to communicate.

All this actually means that Lisp macros can be used for static analysis (via macroexpand), in a user-extensible way, without needing to add extensions to the compiler or stages to the compilation process.

I don't understand why you cite C# as a good example of something that doesn't have code preprocessors. Visual Studio has T4 (Text Template Transformation Toolkit), which is a dumb joke.


First, the use case for T4 is fundamentally different than language macros. You wouldn't use Lisp macros or C macros for what you'd do with T4. For example, I've seen T4 used to do some elaborate documentation generation.

But back to the point, I partially agree with what you say about C, but I think you misunderstand where the crux of the problem lies. It's not that they're separate stages -- I believe they are in most Lisp implementations (but I may be wrong) -- but rather that you can't do macro expansion w/o the context of the full program, i.e., within the context of the build system -- yet the effect is global.


"First, the use case for T4 is fundamentally different than language macros. You would't use Lisp macros or C macros for what you'd do with T4. For example, I've seen T4 used to do some elaborate documentation generation."

I don't think you know anything about the use-cases for Lisp macros. It's a standard practice to use them to generate documents (most CL HTML generating libraries are just sets of macros), configuration information (SQL schemas, XML) for other systems, and programs in other languages (for example, this is how Parenscript compiles Lisp to JavaScript).

"rather that you can't do macro expansion w/o the context of the full program, i.e., within the context of the build system -- yet the effect is global."

I don't understand this.


*I don't think you know anything about the use-cases for Lisp macros. It's a standard practice to use them to generate documents (most CL HTML generating libraries are just sets of macros), configuration information (SQL schemas, XML) for other systems, and programs in other languages (for example, this is how Parenscript compiles Lisp to JavaScript).*

Maybe I don't know the Lisp use cases for writing conceptual docs, but I must say that this sounds horrible. But clearly to each their own... when all you have is a hammer. I'd much rather use just about any templating system (and there are many) than any language (or preprocessing system) that I've ever encountered.

"I don't understand this."

In C, the only way to know the impact of macros on code is to build it (preprocess it). And macros have global effect on the code, so you don't know, when looking at any piece of code in C, what it actually does, unless you look at it preprocessed -- for the most part. This means pulling in the build system, since obviously you don't know what is being pulled from where otherwise.


"I think the benefit of the Lisp preprocessor is pretty small too"

Lisp doesn't have a preprocessor, Lisp is a preprocessor. You can program with it, but thinking about it as a programming language (ASCII or Unicode Strings to be Interpreted) is at least partially wrong.

"In a decade or less, the Lispers will see this light too."

It hasn't happened in the last half century, so how much do you want to bet on it?


"Lisp doesn't have a preprocessor, Lisp is a preprocessor. You can program with it, but thinking about it as a programming language (ASCII or Unicode Strings to be Interpreted) is at least partially wrong."

This would be news to McCarthy. McCarthy on several occasions, including the History of Lisp, refers to Lisp as a programming language. I've never heard him say Lisp is a preprocessor. I certainly am no expert compared to McCarthy. Maybe you are?

"It hasn't happened in the last half century, so how much do you want to bet on it?"

Its importance has ebbed and flowed over time. This is natural for language features. A decade ago, template metaprogramming in C++ was all the rage. Now, outside of a few library writers, most people don't touch it. This isn't to say it won't be popular again some time in the future.

I should probably amend my statement in that light and say that we'll likely see a major ebb in the coming years.


"It's importance has ebbed and flowed over time."

Are you talking about the importance of Lisp macros here? That's what you appear to be doing. I don't see a decline in enthusiasm for use of Lisp macros among Lisp programmers.

"It's importance has ebbed and flowed over time. This is natural of language features. It was a decade ago where all the rage was template metapgrogramming in C++. Now, outside of a few library writers, most people don't touch it."

The reason that few touch template metaprogramming in C++ is because it is a painful, limited, and in pretty much every way sucky experience. Many of the simplest tasks you wish to perform are either nearly impossible, or often drastically increase the amount of code you have to write. Many of us tried (present company included) and moved on because we figured out what a waste of time it was.

Lisp macros, on the other hand, are a very powerful tool that can reduce the amount of code you have to write (and provide new abstractions to do useful things with, and probably at least a dozen other things I haven't mentioned). They are not perfect, and they can be abused, but they shouldn't even be mentioned with C++ templates in the same breath (unless you are talking about how much better Lisp macros are :-D ).

I see your amended statement/prediction and raise you with this one: I predict that Lisp macros will continue to be at least as important as they have ever been (at least to Lisp programmers) in the next ten years.


   In a decade or less, the Lispers will see this light too.
Right. It's just been invisible for the last 25 years. Or maybe the last 50. Whatever. One of these days...


> Note, the power of the macro system is not equal to its general good.

It's hard to quantify the "general good" of something as complex as a programming language macro system with all consequences included.

But it's quite easy to see which one is more powerful wrt expressiveness.

Your primitive macro system _forces_ me to write code like this:

  #ifdef CONFIG_AUDITSYSCALL
  #define INIT_IDS \
    .loginuid = -1, \
    .sessionid = -1,
  #else
  #define INIT_IDS
  #endif

  #define INIT_TASK(tsk) \
  { INIT_IDS }

instead of the more natural:

  #define INIT_TASK(tsk) \
  { \
  #ifdef CONFIG_AUDITSYSCALL
    .loginuid = -1, \
    .sessionid = -1, \
  #endif
  }

> The badness of the C preprocessor starts much earlier with things like include files, and the fact that you can't statically analyze the code in any reasonable way.

Hardness or easiness of static analysis has _nothing_ to do with preprocessing, because a static analyzer can preprocess code in the very same way the compiler will, so both will analyze the very same stream of tokens.

> The benefit from the preprocessor is absurdly small.

This is subjective and unquantifiable.


It took five years for LINQ to be added to C# because it was too painful for anyone other than Microsoft to add any syntax to the language. Macros would have fixed that.


It's interesting that you say this, because Nemerle, another language on the .NET platform, implemented LINQ in the language using macros.


Lambdas, type inference, extension methods, etc... all good stuff. LINQ itself they could have skipped. And if macros help accelerate more stuff like the syntax in LINQ -- well I think that proves my point.


Does everyone just troll now?


Apparently so.


By the way, don't confuse macros with backquote; the backquote facility is probably closer to C's preprocessor. I haven't heard this claim before, and I might not be entirely right, but it's what provides for the "fill in the blanks" type of symbolic computing.

For example:

  (defun sum (x y)
    `(,x + ,y = ,(+ x y)))
This is a SUM function which looks like this when run:

  (sum 4 5) ==> (4 + 5 = 9)
The backquote (`) says the following form and its sub-forms (if any) are all quoted. But you can poke holes in the quote and fill in with the value of outside variables, using comma (,). So in the example, everything is quoted in the body of sum, except three things: x, y, and (+ x y). Each of those is then evaluated yielding 4, 5 and 9, then, finally, the template is filled in (4 + 5 = 9).

If you want to confirm this, you can change the + and = to any other symbol and it would still display. E.g:

  (defun sum (x y)
    `(,x frob ,y != ,(+ x y)))

  (sum 4 5) ==> (4 FROB 5 != 9)


Don't forget that `, (backquote-comma) can also be adapted for pattern matching/destructuring (http://www.cliki.net/fare-matcher does this).

I think combining backquote-comma's s-exp-making and s-exp-destructuring powers is a viable way to abstract s-exps from their representations as linked lists/arrays/whatever sequence you want to use.


I like Conrad Barski's take on explaining quasiquote:

http://www.lisperati.com/looking.html

"Flip, flop, flip, flop..."


You can also splice values into backquoted expressions using ,@foo rather than ,foo - which I seem to remember being rather useful.


It's incredibly useful. The actual definition of until from Arc:

  (mac until (test . body)
    `(while (no ,test) ,@body))
It's really useful for bodies of code like this.


So Lisp macros provide compile-time code generation, but in doing so, you have access not just to the code-generation instructions – which are just Lisp data – but also to the entire Lisp language and environment.

The same can be said for classic Forth. Indeed, the enthusiastic way that Lispers describe macros I find eerily similar to how I used to hear Forthians describe their language. Except in Forth the relevant term isn't macro but "defining word". Indeed, I think programmers in both camps see the proper use of macros/defining words as a sign that one has passed the beginner stage of using the language.

A classic compiled Forth word (i.e., function) would consist of the word's name followed by a string of addresses to execute; how those addresses were laid down, where they pointed, etc. were totally open to programmer control -- both during compilation and afterwards. It's been a couple of decades since I looked at Forth code, but I don't think there's anything that could be done in Lisp macros that couldn't also be done easily in Forth as well.

I think the big difference between the two -- at least, so far as the macro topic goes -- is that Lisp starts with an assumption that code and data can and should be interchangeable while Forth was initially developed for the machine control (embedded systems before they were called embedded systems) and thus code was merely a vehicle for creating applications.


Forth is almost the complete opposite of Lisp, and thus very similar.


You've gotten a number of up-votes, yet this statement makes as much sense to me as saying bananas are the complete opposite of apples. You can make a case of similarity or dissimilarity depending on which language aspect you focus on. But, "opposite"? I have no idea what that means.


Let me guess at some of what (s)he means:

Forth is an interpreter that can be switched into compilation mode, Lisp is a compiler that can be switched into interpreter mode. [edit: thinking of that, this more reflects how I have used the two than what they are. Lisps start in an eval loop, too, yet they feel different to me]

Forth is postfix, Lisp is prefix.

Forth programmers know their stacks are the environment, but rarely think of it as an environment. Lisp programmers know their environment is in some stack-like data structures, but rarely think of those stacks.


Lisp and Forth are both very simple systems in terms of basic concepts, and push them to their limits.

Both are quite flexible.

Forth often runs on the bare metal, and exposes everything below. Lispers are customarily more shielded.

Both encourage code to be treated as data and vice versa (without even looking at macros), but Lisp builds on the lambda calculus to do that, while Forth just sees code and data as being contents of memory.

In Forth if there's a bunch of data flowing into a word, you might put the number of arguments on the top of the stack and the other arguments below. Lispers would run away screaming from this manual and error-prone approach.

Lispers like to build e.g. their own object system, if necessary. And in Forth it's easy to tag on a garbage collector---from within the language.

I hope that's some meat to back up my assertion. I hope I got some good examples. What I really want to say is that the languages feel related in spirit. But less like brothers, more like two generals from opposing armies who still respect each other.


Sorry for the belated response, but for a sense of closure, so to speak, I'll add that I agree with your general assessment. Both languages started with a simple core that allowed developers to take some basic ideas and see just how far they could go with them. It just didn't occur to me to describe that as "opposite".

Interestingly, both languages wound up with a schism of framework enthusiasts (CLOS & "big Forths") vs. minimalists (Scheme & Moore loyalists (for want of a better term)).

As an aside, your bunch of data example would run afoul of Forth catechism. The normal response would be to use a dictionary address or even address and offset to access a data structure. Antagonism to deep use of the data stack is so prevalent that Moore eventually reached the point with his chip designs of implementing the top of stack as a couple of on-chip registers -- if what you want isn't within the top couple of stack entries, so the logic goes, then you've failed to factor your design properly. Perhaps this preoccupation with very low level operations is an aspect where Forth programming might be considered the opposite of normal Lisp programming.


I wish HN would give me the option of bubbling up old threads that got new comments.

And you are right about my example with the deep stack. I probably did not remember correctly. The variable array was probably something with passing around an address plus length on the stack---while Lispers prefer that as one object.


A couple points on the metaprogramming space (these aren't about the article, but the article reminded me of them):

1. Macros aren't unique to lisp. They're most developed and used in the lisp family, but you don't need to have a homoiconic language to have macros. The Mirah programming language--statically typed Ruby on the JVM--has non-hygienic macros, I know there have been a number of efforts to add macros to coffeescript (so far without success), and there are a number of academic languages that also have them. None are in widespread use, but they exist.

2. Metaprogramming's power is primarily limited by the restrictions of the host language. The best example I know of is Io where pretty much anything other than commas and parentheses is up for grabs. Example: http://anttih.com/blog/2010/10/29/json-in-io.html

3. User definable mixfix operators are in the sameish domain. Unfortunately, they're almost always mentioned in passing by people who know what they are so I don't have a good reference for introducing what they are and don't have direct experience working with them. I'm just mentioning them as something to be aware of if you're interested in general metaprogramming concepts. Here's the closest thing to an intro I could google up: http://maude.cs.uiuc.edu/maude1/manual/maude-manual-html/mau...


Macros actually saw a great deal of use in the assembly-language space for as long as people were writing large programs directly in assembly. (Typically, the assemblers would let you define "pseudo-operations" which could appear in place of an actual opcode, and whose arguments were used to fill in templates.) In fact, one of the raps on Unix among "big iron" programmers was that its assembler was so primitive --- meaning, in particular, that it had no useful macro facilities.


1. Macros aren't unique to lisp.

Perl6 also has macros (http://en.wikipedia.org/wiki/Perl_6#Macros) though they haven't been implemented in Rakudo yet. Also Ioke states that it has macros.

2. Metaprogramming's power ... The best example I know of is Io where pretty much anything other than commas and parentheses...

Touched on Io introspection/metaprogramming before on HN:

* http://news.ycombinator.com/item?id=1804599

* http://news.ycombinator.com/item?id=1810480


The question isn't so much whether other languages have something they call "macros", but whether their macro systems approach Lisp's in power, flexibility, ease of use, integration with the language, and natural fit onto the language representation. Or is the macro system in question more of a Turing tarpit?


Prolog's macro system does, but given its model of computation (search/pattern matching are built in), it ends up being used far less than Lisp's.


I believe Perl6 will fulfil those requirements because the language is built & hosted upon Perl6 Grammars which makes it totally reconfigurable.

ref:

* http://en.wikipedia.org/wiki/Perl_6_rules

* http://news.ycombinator.com/item?id=1280021

* http://perlcabal.org/syn/S06.html#Macros


1. Yet, still to this day, Lisp remains one of the few languages with powerful macros that normal people can not only use, but enjoy. Also the Scheme/Racket community actively researches how to make them more robust and easier to write (yes, equally powerful macro systems do exist for other languages -- however, ease of use is often sacrificed).

2. Io is great, but the depth of its runtime meta-programming facilities makes it horrifically inefficient.


I don't disagree and I'm not trying to argue with the article or anything. I was just hoping to point people interested in the topic towards things that took me a while to realize/run across.

As for Io's efficiency, I think that's more about the implementation than in the language. Javascript (prototypal inheritance) and Smalltalk (message passing) aren't that far off and both have decent performance. I know Steve Dekorte was working on getting Io running in javascript in December but I don't think it's a high priority project for him.


For (1) look at OCaml's P4 (very integrated) preprocessor for some macros in a non-homoiconic statically typed language.

For (3) look at how Haskell defines new operators. I find Haskell's use of completely new operator-squiggles plus type classes for something faintly similar to C++-style overloading, much more sensible than say, C++-style overloading.


There is also Template Haskell. Nemerle is another non-homoiconic language with powerful metaprogramming abilities using hygienic macros. It really is quite impressive what those guys are doing. And it is such a contrast (not at all inferior, IMHO) to the lisp way. A dual approach, maybe.


  > Except, of course, that doesn’t work. The preprocessor 
  > only makes one pass through the file, meaning a macro 
  > can’t call another macro.

  macro.c:
  #include <stdio.h>
  #define LOOP(n) for (int i = 0; i < n; i++)
  #define PRINT10 LOOP(10) { printf("%d", i); }
  int main(void) { PRINT10; printf("\n"); }

  gcc -Wall -std=gnu99 macro.c  && a.out 
  0123456789
Of course macros can call other macros: http://gcc.gnu.org/onlinedocs/cppinternals/Macro-Expansion.h...

Ignoring his poorly protected macros, and taking nothing away from Lisp, am I missing some subtle point he's making, or is the author just flat out wrong here? I think he's just wrong, and this makes it hard for me to even read the rest of his argument.


A C macro can't directly -- nor indirectly -- refer to itself, even through other macros. You can't do loops, you can't do recursion.

(a nonsensical example, as I can't think of a proper one right now)

  #define PRINTFOO PRINTBAR
  #define PRINTBAR PRINTFOO

  PRINTFOO

The ANSI C standard says a macro, while it is being expanded by the preprocessor, ``is painted blue'' (is not subject to further expansion).

I'm not sure, but it probably follows that C macros are not Turing-complete.

EDIT: on the other hand, a LISP macro system is a Turing-complete programming language that's geared towards manipulating lists of symbols and values (which, incidentally, is what a LISP program is).


Do you have a link for the "painted blue" reference? I agree with your explanation, but had never heard that particular phrasing and was intrigued by it. But searching on Google for things like 'expand ansi macro "painted blue"' returns only your comment and a scraped spam site of someone answering a similar question.


Indeed, it's surprisingly hard to find a first-hand reference. But http://duckduckgo.com/?q=%22painted+blue%22+expansion returned a link to the Wikipedia article, which, in turn, links to some meta-document about the new C standard. There's also a post on the Boost (library) mailing list: http://duckduckgo.com/?q=macro+%22painted+blue%22+boost -> http://lists.boost.org/Archives/boost/2006/01/99264.php

Perhaps I was wrong; perhaps the phrase was in K&R C? Can't find the book online @_@;;


correct, not Turing-complete


Note that this was, in fact, by design. Anyone who seeks to make C more complicated than this should be shot on sight.


I noted this too. Seems the author has a broken C compiler.


Nice try blanco niño, but the following program runs fine. This doesn't detract from your overall point, but if you're trying to win over people who think the Boost macros are super awesome, this argument won't do it.

  #include <stdio.h>

  #define LOOP(n) \
    int i; \
    for (i = 0; i < n; i++)

  #define LOOP10 \
    LOOP(10) printf("%d\n", i)

  int main(int argc, char *argv[]) { LOOP10; }


Yesterday I was writing code that kept messing up my Lisp's signal handling, so I wrote a FOR-DURATION macro that arms a timeout and makes sure whatever runs in its body gets killed after N seconds.

Here it is:

  (defmacro for-duration ((seconds) &body body)
    `(handler-case
         (bt:with-timeout (,seconds)
           ,@body)
       (bt:timeout () nil)))
Five lines to alter the evaluation model of your language. Not bad.

Use as:

  (for-duration (10)
    (loop 
      (print "Infinite loop! where is your Godel now?")))

"bt" is the bordeaux-threads package, a portable Lisp library for thread programming.


C99 + GCC extensions + POSIX:

  #define for_duration(seconds, body)                     \
    {                                                     \
      pthread_t tid_task, tid_watcher;                    \
                                                          \
      void* task(void* arg) {                             \
        body ;                                            \
        pthread_cancel(tid_watcher);                      \
        return NULL;                                      \
      }                                                   \
                                                          \
      void* watcher(void* arg) {                          \
        sleep((seconds));                                 \
        pthread_cancel(tid_task);                         \
        return NULL;                                      \
      }                                                   \
                                                          \
      pthread_create(&tid_task, NULL, &task, NULL);       \
      pthread_create(&tid_watcher, NULL, &watcher, NULL); \
      pthread_join(tid_task, NULL);                       \
    }                             
Use as:

  for_duration(10, {
      while (true) {
        printf("Infinite loop! where is your Godel now?\n");
      }
    });
  
Or with single brace-less statements:

  for_duration(10, while (true) printf("something\n") );
  
Lexical scope is sane:

  int i = 0;
  for_duration(10, {
      while (true) {
        printf("%d\n", i++);
      }
    });
My comparison isn't quite fair, because your threading library provides 'with-timeout', while pthreads doesn't. If you factored this out, say with a signature like:

  void with_timeout(unsigned int seconds, void (*func)(void));
(in analogy to the common lisp function), then the macro part becomes just four lines:

  #define for_duration(seconds, body) {			\
    void func() { body ; }				\
    with_timeout((seconds), &func);			\
  }
I acknowledge that the lisp solution is rather more elegant. Also, my C macro is potentially dangerous because it is unhygienic. (It shadows outer declarations of 'task()', 'watcher()', 'tid_task', 'tid_watcher', and 'arg', in the body of 'body'.)


Is there a C macro that can do this?

    (defmacro execute-in-reverse (&body body)
      `(progn ,@(reverse body)))

    (execute-in-reverse (print "Hi") (print "Middle") (print "Bye"))
Prints...

    "Bye"
    "Middle"
    "Hi"
Point: You can do whatever you want with the symbols sent to the macro, whatever their contents. Mahmud's wrapper macro is trivial (it could be done with lambdas). Lisp's macro system allows you to create new syntax, including changes in flow control.

But my example is trivial, too. I could go on to swap individual parts of my forms around, remap them to other forms depending on various conditions, etc.

In addition, people often talk about Lisp macros, but they neglect to mention the power of reader macros, which allow you to go beyond Lisp's basic AST look-and-feel.


>Is there a C macro that can do this?

No, of course not. C macros only see tokens; you need to parse a syntax tree to do your reverse example (as Lisp macros do).

No argument from me.

>Lisp's macro system allows you to create new syntax, including changes in flow control.

Well, C macros seem to be able to manipulate control flow. You can't break up an expression to do that (they're not smart enough to parse), but you can rearrange expressions that are given whole.


Wonderful example of what macros can do.

Just for shits & giggles I thought I'd give it a go in Io (http://www.iolanguage.com):

    executeInReverse := method (
        m     := call argAt(0)
        stmts := list()              // list of statements (ie. messages)
    
        loop (
            rest := m next           // rest of messages after current 
            stmts append(m setNext)  // get current message
            m := rest
            if (m == nil, break)     // exhausted statements when "nil"
        )

        stmts reverseForeach (n, doMessage(n))
    )

    executeInReverse( writeln("Hi") writeln("Middle") writeln("Bye") )
And I think with a bit more work I can get it to amend its AST rather than rerunning the loop each time it sees executeInReverse.


I don't believe nested functions like you're using there are valid C. They're a nonstandard extension that some compilers implement because they're so handy. So this basically illustrates that C gets closer to Lisp's usability when you add Lispy features to it.


You're right, it's not even valid C99. My mistake.


GCC extensions make me want to hurt people.


    (defun for-duration (seconds body)
      (handler-case
          (bt:with-timeout (seconds)
            (funcall body))
        (bt:timeout () nil)))
    
    (for-duration 10
        (lambda ()
          (loop 
            (print "Infinite loop! where is your Godel now?"))))


I think Greg is trying to say "why the hell do we even need macros for this"?


With functions you have to make a conscious effort to quote arguments; otherwise strict evaluation will evaluate the arguments before the function runs. Also, with macros you can introduce your own new "implicit" control and structure semantics; it's very common for forms to have an "implicit progn", or a block named NIL, or similar.


Good post, but I still think that C's macros are pretty powerful. I've created a foreach loop in C (GNU C at least, though I could probably port it using C1x's _Generic) just to prove this: https://gist.github.com/632544.


Lisp macros remind me of dynamically generating SQL statements.

If you've ever built a string in T-SQL or PL/SQL and then exec'd it, you're probably much closer to being productive with macros than you think.

Of course, this is probably because dynamically generating SQL is something that I've done. Other programmers probably have their own mental models that warp into lisp.


Yeah, I think that's a fair way to start thinking about them. But Lisp macros, to extend your example, are more like if you could generate the SQL using SQL. Which sounds like a totally horrible thing actually. :) But with Lisp it makes much more sense because lisp programs are lists. If SQL programs were tables, then it'd be a better fit...


You'd have to stretch the analogy to not only generating SQL with SQL, but extending SQL with SQL, to create constructs that weren't even possible to write in SQL before you extended it.


> If SQL programs were tables, then it'd be a better fit...

Now there's an idea...


Thanks for presenting Lisp macros in terms that I (as a C++, Python et al programmer) can understand.

I can see why they are as powerful as claimed.


Nice post, though I find the characterization of SICP and The Little Schemer as academic texts inaccurate. The paradigms covered in them are of greater importance than macros -- that is, your macros are only as powerful as the abstractions your language gives you.


Seeing how this guy comments about loop, be a little careful when using it. There is no formal standard for how the loop clauses combine, so what you write might produce different answers depending on which Common Lisp implementation you use. Evaluate these loop expressions and see what happens (from ANSI Common Lisp by PG):

    (loop for y = 0 then z
          for x from 1 to 5
          sum 1 into z
          finally (return (values y z)))
          
and then evaluate this:

    (loop for x from 1 to 5
          for y = 0 then z
          sum 1 into z
          finally (return (values y z)))


What aspects of the actual behavior of loop are unspecified?


How to combine them is poorly defined and complex. The loop clauses by themselves are not the problem.


Metaprogramming is a more general concept than the author claims. For example, lisp macros and eval/quote (at any level of compilation or runtime) are particular metaprogramming facilities. You can even metaprogram using only explicit code quotation if you prefer (with no implicit reflection exposure of regular code).

I agree with the definition at http://en.wikipedia.org/wiki/Metaprogramming


In dialects that support fexprs such as Kernel, PicoLisp and Eight, you get something almost like first-class macros. Macros (or fexprs, rather) are then even more powerful, because they can be passed around as arguments to functions or even to other fexprs.

A key tradeoff is that they must be expanded at run-time, and in doing so they incur a performance penalty that doesn't exist with compile-time macros.


Are natural languages homoiconic? (If you get my drift.) Does that question even make sense?


Hmm, I don't know about in general, but in English we have the double quote, used to refer to a statement itself. For example, I can talk about the use of "for example" by quoting it like that. So (written) English can embed English in a reasonably standard format. Very like the Lisp quote operator, no? The name is no coincidence.

However, we have no useful English operators to take the quoted words and change them into something else meaningfully. Though a non-English set can be argued to exist: http://creativeservices.iu.edu/resources/guide/marks.shtml If those marks were English themselves, that would arguably complete the loop, but I would consider that stretching the point; they were explicitly invented to avoid the use of English and also translate to any other character-based language.


I can't think of a natural-language equivalent for the comma operator, other than (maybe) incredibly awkward and confusing uses of parenthetical statements.


Even early stabs at x-bar theory indicate that natural languages are highly homoiconic.

http://en.wikipedia.org/wiki/X-bar_theory


Can you please explain more?


Homoiconicity means that program and data have the same representation. But after you Lisp for a while, the boundary blurs entirely.

X-bar theory says that in all natural languages, Noun Phrases, Verb Phrases, and (depending on who you read) other types of phrases all have the same form: a single Head (which may be composite) and a Tail with zero or more elements. Never mind that they've just discovered S-expressions. The point is that in the natural-language program:

    multiply the radius squared and pi
Both the data-like parts and the program-like parts share the same structure.


Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo

But seriously, take a look at quines and Hofstadter's Gödel, Escher, Bach.


I was re-reading Doug Hoyte's Let Over Lambda today, and came across this quine:

  (let ((let '`(let ((let ',let))
                 ,let)))
    `(let ((let ',let))
       ,let))
Originally from: http://www.scribd.com/doc/47702904/Bawden-Quasi-Quotation-in...


The factual errors concerning the C preprocessor aside, the author also ignores C++-style metaprogramming. Granted, C++ templates have their flaws, but they do provide a Turing-complete compile-time metaprogramming environment...


I like run-time metaprogramming, which I assume Lisps are capable of. Speed excites me not at all, and I consider having a compile time to be a disadvantage.

I wish authors didn't craft their arguments around C/Java.



