I could have done that, but I couldn't be bothered to look up the interpolation strings for format (yes, we have it) or the one-level flatten function, and splicing unquote is less efficient than cons in this case. But yes, in real code, I'd probably go in that direction.
There is one key difference between my code and yours: in mine, the functions, values, and symbols referenced in the macro are automatically renamed, thus guaranteeing no namespacing conflicts. Due to the way the CL package system works (IIRC), CL provides almost the same guarantees, at least in this case. But it is an important semantic difference.
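A rough sketch of the CL side, as I understand it (hypothetical package and names; note that local variables still need gensym):

    ;; Hedged sketch, hypothetical names: symbols in a macro's backquote
    ;; template are read in the macro's home package, so HELPER below
    ;; names MY-LIB::HELPER even if the expansion happens in a package
    ;; with its own HELPER.  Local variables still need GENSYM, though.
    (defpackage :my-lib (:use :cl) (:export :with-trace))
    (in-package :my-lib)

    (defun helper (form) (format t "trace: ~s~%" form))

    (defmacro with-trace (form)
      (let ((result (gensym "RESULT")))   ; avoid capturing a user's RESULT
        `(let ((,result ,form))
           (helper ',form)                ; MY-LIB::HELPER, not the caller's
           ,result)))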
And by 'Lisp' you mean Common Lisp specifically. Despite how you may feel, Scheme, Clojure, PicoLisp, NewLisp, Racket, Interlisp, LeLisp, EuLisp, and others are just as much Lisps as CL.
Well, then, if you don't have macros, then you don't have procedural macros, which was my point.
And I didn't say it was a legitimate reason to use cons. I said it technically had better performance. As I stated above, the real reason I used it is because it fit my mental model of what was happening.
> Well, then, if you don't have macros, then you don't have procedural macros, which was my point.
Trivially.
Anyway, I don't consider them to be Lisp dialects. They are new languages, Scheme dialects, scripting languages with parentheses, whatever. The name 'new'LISP already says it.
> I said it technically had better performance.
You thought it had, without actually knowing it.
CL-USER 36 > (let* ((bar '(1 2 3))
                    (baz `(foo ,@bar)))
               (eq bar (rest baz)))
T
So the list is not copied, and no traversal is needed.
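To spell it out (a sketch; the exact expansion is implementation-dependent, but a trailing splice typically becomes a single CONS or LIST*):

    ;; `(foo ,@bar) with a trailing splice is effectively (cons 'foo bar):
    ;; one new cons cell, the tail is shared, nothing is traversed.
    (let ((bar (list 1 2 3)))
      (list (eq bar (rest `(foo ,@bar)))       ; T here (shared tail)
            (eq bar (rest (cons 'foo bar)))))  ; T by definition
    ;; => (T T)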
What actually traverses the code is your IR-macro mechanism (ir-macro-transformer), and it does so twice. That makes it slower in both the interpreter and the compiler.
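Roughly, per macro use it has to walk the whole form like this (a simplified sketch in CL, not the actual Chicken code):

    ;; Simplified sketch of the renaming walk an ir-macro-style
    ;; transformer performs over the input form, and again, inverted,
    ;; over the output:
    (defun rename-tree (form table)
      (cond ((symbolp form)
             (or (gethash form table)
                 (setf (gethash form table)
                       (gensym (symbol-name form)))))
            ((consp form)
             (cons (rename-tree (car form) table)
                   (rename-tree (cdr form) table)))
            (t form)))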
Not considering them to be Lisp is ridiculous. Picolisp is closer to Lisp 1.5 than CL is. CL took many ideas from Scheme, and vice versa. The claim that CL is The One True Lisp is tenuous at best, and absurdly ridiculous at worst. It's like a Catholic claiming that they're the only true Christians (not a great analogy, but not the worst in the world). And quite frankly, given that these languages, particularly PicoLisp, fit every definition of Lisp I've heard, it takes us straight into No True Scotsman territory. Haggis, anyone?
It's good to know that splicing unquote optimizes for that case in Lisp. I wasn't sure, and so assumed the general case. In any case, that's not the reason I didn't use it, as I've explained multiple times now.
Yes, I know ir-macro does code traversal. Yes, sc-macros are more elegant. But that's the mechanism we picked, and I have no issue with it. Furthermore, it doesn't traverse all the same code twice. It traverses the inputs and the outputs.
>Btw., the code won't win any beauty contests.
...Says the CL user. Actually, I agree, it won't. But it works, it's reasonably clear, and it doesn't do anything obviously stupid. It's okay.
...Unless you're talking about my code. You want me to clean that up? Okay. I will.
> Not considering them to be Lisp is ridiculous. Picolisp is closer to Lisp 1.5 than CL is.
How so? CL runs a lot of Lisp 1.5 code unchanged.
PicoLisp does not. The PicoLisp evaluator is not compatible with Lisp 1.5. It doesn't even have LAMBDA.
>CL took many ideas from Scheme, and vice versa.
Certainly not. The main idea CL took from Scheme was 'lexical binding by default'. Other than that, the Scheme influence was minor.
CL is based on Lisp Machine Lisp, NIL, S1 Lisp and Spice Lisp. All coming from Maclisp, which was developed out of Lisp 1.5.
> The claim that CL is The One True Lisp is tenuous at best, and absurdly ridiculous at worst.
I never said that. But it is the most widely used Lisp, and the one I mostly use - minus some minor use of Emacs Lisp.
> It's good to know that splicing unquote optimizes for that case in Lisp. I wasn't sure, and so assumed the general case.
You could have looked it up or tried it, before claiming it. I did it for you.
> Yes, I know ir-macro does code traversal. Yes, sc-macros are more elegant. But that's the mechanism we picked, and I have no issue with it. Furthermore, it doesn't traverse all the same code twice. It traverses the inputs and the outputs.
Yeah, but you claimed that the CL code was less efficient. Great move.
I looked that up in the Chicken Scheme sources to see what it actually does. It traverses inputs and outputs during macro execution. You could have mentioned that.
Sorry, I don't trust your judgements; your claims are simply not backed up by the source or by how things actually work.
> Says the CL user
I can't remember seeing such ugly code for macro expansion in a CL implementation.
Take make-er/ir-transformer. That function's code is thoroughly obfuscated. It bundles several utility functions as sub-functions that don't belong there; some access lexical variables defined several dozen lines above, others don't. The resulting code is over a hundred lines long, even though the basic mechanism could be written much more compactly. Each sub-function can only be understood by referring to the surrounding code above or below it.
It also takes a parameter selecting between two different expansion mechanisms. From that, two new closures are created, which are then given to the user under, again, two different names. The code itself contains lots of debug code that simply prints intermediate results, which will overwhelm any human reader for any non-trivial macro expansion.
>The PicoLisp evaluator is not compatible with Lisp 1.5. It doesn't even have LAMBDA.
However, it keeps a lot of ideas from 1.5 that were later dropped by CL and others. Names don't matter: ideas do.
>You could have looked it up or tried it, before claiming it. I did it for you.
It wasn't really relevant here until I mentioned that I thought cons was more performant. Thanks for trying it; I don't have an excuse, but thanks.
>Yeah, but you claimed that the CL code was less efficient. Great move.
I didn't. I said that splicing unquote had to traverse the resultant list, making it slower than cons, which was pretty much irrelevant in this case. I then explained the real reason I used cons.
>I looked that up in the Chicken Scheme sources to see what it actually does. It traverses inputs and outputs during macro execution. You could have mentioned that.
Quite honestly, I didn't see how it was relevant. We weren't discussing macro system internals until just now. It's not great, but it gets the job done, and that wasn't the point.
I'm starting to get really really frustrated here. You seem to miss every point I make, to the point that I'm very nearly wondering if it's deliberate.
> However, it keeps a lot of ideas from 1.5 that were later dropped by CL and others. Names don't matter: ideas do.
That's what I say: vague ideas don't matter much when forming language families. Code does. Books. Libraries. Communities.
What were those ideas that were dropped? Fexprs would be one; they were dropped when compilers came into use and fexprs were found not to be compilable. That happened in the '70s, before CL existed. Pitman published his paper on macros in 1980, which summarized the view of the Maclisp / Lisp Machine Lisp developers. What else?
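For reference: a fexpr received its arguments as unevaluated source at run time, which is what defeats compilation. A sketch of CL's replacement, the macro, which gets the raw source at expansion time instead:

    ;; Sketch: a macro gets raw, unevaluated source forms, like a fexpr
    ;; did, but at expansion time rather than run time:
    (defmacro show-args (&rest args)
      `',args)
    ;; (show-args (+ 1 2) foo) => ((+ 1 2) FOO)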
The Lisp 1.5 manual gives an extended example: the Wang algorithm.
What were the 'ideas' that were dropped, even though somehow old code still runs?
> We weren't discussing macro system internals until just now. It's not great, but it gets the job done, and that wasn't the point.
The point was that you claimed a 'slower compilation process' due to splicing backquote usage, while in fact the whole compilation of the example you gave was the slower one, because you used a slower macro system that traverses code for renaming and re-renaming.
> You seem to miss every point I make
EVERY POINT? Are you really sure I miss EVERY POINT you make?
Personally I would only claim that you miss SOME of my points, not every one. In some cases I would say that we simply have different opinions, for example about what makes a language versus a dialect of it.
But I would not claim that you miss all my points.
Well, maybe not EVERY point. It just often feels like you emphasize the parts of what I write that I focus on least.
>What were the 'ideas' that were dropped, even though somehow old code still runs?
Well, fexprs and dynamic scope by default are the big ones, but also the idea of functions as lists, which is why it doesn't have lambda.
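For example, dynamic scope survives in CL only as opt-in special variables; a minimal sketch:

    ;; Sketch: Lisp 1.5 and PicoLisp bind dynamically by default;
    ;; CL keeps that behaviour only for declared special variables.
    (defvar *depth* 0)                 ; DEFVAR proclaims *DEPTH* special

    (defun report () (format t "depth = ~a~%" *depth*))

    (defun descend ()
      (let ((*depth* (1+ *depth*)))    ; dynamic rebinding, seen by REPORT
        (report)))
    ;; (descend) prints "depth = 1"; afterwards *DEPTH* is 0 again.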
>The point was that you claimed a 'slower compilation process' due to splicing backquote usage, while in fact the whole compilation of the example you gave was the slower one, because you used a slower macro system that traverses code for renaming and re-renaming.
I appreciate the irony, but as I've now said several times, that wasn't my justification for using cons. I even said that the hypothetical speed increase would be negligible and unlikely to be noticed, before you showed that the speed increase wasn't even there. This is one of the things it seems like you missed.
>That's what I say: vague ideas don't matter much when forming language families. Code does. Books. Libraries. Communities.
That's not entirely true. Sure, code matters a bit, but Java definitely comes from the C family, and the code doesn't transfer at all. As for communities, see for yourself: Scheme was born from the Maclisp community and retains strong ties to that community's modern equivalent, Common Lisp.