JavaScript Isn't Scheme (stuffwithstuff.com)
197 points by munificent on July 19, 2013 | 108 comments



I learned JavaScript before "The Good Parts" was printed and more importantly after I had been writing a lot of Common Lisp. I remember being really shocked and excited when I realized how much JS had in common with lisp given that my expectation when learning it was that I was in for an awful time.

I think the origins of the "JavaScript is Scheme!" meme have more to do with the fact that you can pretty trivially re-implement all of the code from the Little Schemer (including the applicative order y-combinator) in JavaScript, than any of the reasons mentioned here. JavaScript still, imho, has nicer closures and lambdas than most mainstream programming languages.
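
For the curious, here's roughly what that exercise looks like in JavaScript (my own transliteration, not code from the book):

    // Applicative-order Y combinator, written out longhand.
    var Y = function (f) {
      return (function (x) {
        return f(function (v) { return x(x)(v); });
      })(function (x) {
        return f(function (v) { return x(x)(v); });
      });
    };

    // Anonymous recursion: factorial without a named self-reference.
    var fact = Y(function (recur) {
      return function (n) { return n === 0 ? 1 : n * recur(n - 1); };
    });

    fact(5); // => 120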

Actually, thinking back on it, I remember having a conversation with Prof. Daniel Friedman about how similar the two languages are and I believe his response was something along the lines of: if you have the lambda calculus, what else do you need?

Final nitpick: most of the "Y isn't really a lisp!" (another meme) articles I've read always miss what I think is the biggest thing missing from all languages compared to any lisp: symbolic expressions. To me this is the single most underrated feature of lisp that almost no other programming languages offer.


Your comment still doesn't address the points in the article. The exercises in TLS can be programmed in pretty much any of the mainstream, dynamically-typed, lexically scoped languages (beginning with Ruby, which I have partially done). The author is right; the hype around the phrase is unjustified, and JS programmers should focus on the true merits of Javascript alone.


That's true, but that any "dynamically-typed and lexically scoped languages" with closures and first-class functions could be considered mainstream is fairly new. Crockford wrote "JavaScript: The World's Most Misunderstood Programming Language", which I think is the source of the idea that JavaScript is like Scheme, back in 2001. At that point, Ruby was almost unknown, Python was fairly minor, PHP didn't have closures or first-class functions; the other major programming languages were statically-typed things like C++ and Java.

So I think it's worth remembering that it's really a very recent development that we can now treat it as obvious that a language like JavaScript would contain the things it has in common with Scheme.


> That's true, but that any "dynamically-typed and lexically scoped languages" with closures and first-class functions could be considered mainstream is fairly new.

When I first started thinking about writing this post, I was considering going from the angle of "JS was Scheme when Crockford said that but now every language has caught up and we're all Scheme now" but somehow a different post fell out of my head when I started writing.


Perl was big in 2001 and had closures and first-class functions.

And let's not forget the great book Higher-Order Perl: Transforming Programs with Programs http://hop.perl.plover.com/


Well, Perl is on a level not many languages dare touch. If I'm not mistaken, you can execute arbitrary code during compilation and generate arbitrary source. Which in turn might be executed to generate more stuff...


So, just like Lisp, then.


You can do that with any language that has an eval function. Eval takes a string and feeds it into the interpreter.


For some reason I didn't see that he'd said, "during compilation". Perl is an interpreted language, so there's no compilation stage. A compiled language is modified in this way using a preprocessor. It doesn't really make sense to claim it's possible to inject code during compilation; the entire purpose of a compiler is to generate machine code from a syntax tree.

But code generation and execution during runtime, which is what I think you meant, is possible in any language with eval().
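
A trivial sketch in JavaScript (the same idea applies to any language with an eval):

    var n = 3;
    // Build source text at runtime, then evaluate it to get a function value.
    var triple = eval("(function (x) { return x * " + n + "; })");
    triple(5); // => 15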


> Perl is an interpreted language, so there's no compilation stage.

Perl claims to have a compilation stage, by which the documentation means that the lexer and parser produce an optree, which the runtime phase traverses. During that compilation stage, it's possible to run code which changes how the parser will treat subsequent syntactic elements.


I actually agree that the "JavaScript is Scheme!" meme is overused today. My comment was more pointing out where I felt the meme's origins had come from. It really wasn't that long ago that mentioning dynamic programming languages would get scoffed at by "real" programmers. Today, with underscore.js being pretty standard, it's trite to say "Hey! JavaScript has lambdas, closures and first class functions!" But that was not always the case.


>The exercises in TLS can be programmed in pretty much any one of the mainstream, dynamically-typed and lexically scoped languages (beginning with Ruby, which I have partially done).

That's spurious. Anything that can be implemented in one Turing complete language can be implemented in any Turing complete language. The question is how easy it is.


I learned Lisp after I learned Javascript, so I had the opposite experience of discovering that a bunch of Lisp-y things worked just as I expected them to.

I would never claim that Javascript is scheme (and it doesn’t sound like that’s what you’re claiming), but the things I love about Javascript are also things I love about scheme.

Is it the homoiconic-ness of the s-expressions you like?

Personally, I'd love to see continuations in Javascript, despite the potential for horrible abuse. Deeply nested callbacks are a pain in the ass. Of course, promises are a pretty decent solution.
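
Roughly the contrast I mean, with toy getUser/getPosts stand-ins (hypothetical names, just for illustration):

    // Two toy async functions, defined here only so the example runs.
    function getUser(id, cb) { setTimeout(function () { cb(null, { id: id }); }, 10); }
    function getPosts(user, cb) { setTimeout(function () { cb(null, ["a", "b"]); }, 10); }

    // Callback style: each dependent step nests one level deeper.
    getUser(1, function (err, user) {
      if (err) { return console.error(err); }
      getPosts(user, function (err, posts) {
        if (err) { return console.error(err); }
        console.log(posts);
      });
    });

    // Promise style (ES6 Promise or any library with the same shape): the flow stays flat.
    function getUserP(id) {
      return new Promise(function (resolve) { getUser(id, function (e, u) { resolve(u); }); });
    }
    function getPostsP(user) {
      return new Promise(function (resolve) { getPosts(user, function (e, p) { resolve(p); }); });
    }
    getUserP(1).then(getPostsP).then(function (posts) { console.log(posts); });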


Google "brendan eich javascript scheme". Result #1 is from Brendan Eich's website:

I was recruited to Netscape with the promise of “doing Scheme” in the browser...

I’m happy that I chose Scheme-ish first-class functions and Self-ish (albeit singular) prototypes as the main ingredients.

https://brendaneich.com/2008/04/popularity/

So "JS=Scheme" is not very crazy. Maybe it oversimplifies. But it's not baseless. And it's short.


The main reason the comparison between JavaScript and Scheme is helpful is that it breaks the connection between JavaScript and Java. The first thing many Java programmers do when trying to write JavaScript is to try to create a class hierarchy (rather than using prototypical inheritance) using idioms from Java. Although possible, it ends up awkward and confusing in most cases. Understanding that functions are first-class and seeing functional idioms fall out from there is a big paradigm shift for Java devs.
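
For example, a rough sketch of going with the grain of the language instead (illustrative names only):

    // Prototypal inheritance: objects delegate directly to other objects.
    var animal = {
      describe: function () { return this.name + " says " + this.sound; }
    };

    var dog = Object.create(animal);   // dog's prototype is animal
    dog.name = "Rex";
    dog.sound = "woof";
    dog.describe();                    // => "Rex says woof"

    // First-class functions: behavior is just a value you can pass around.
    var loudly = function (speak) {
      return function () { return speak().toUpperCase(); };
    };
    loudly(function () { return dog.describe(); })();  // => "REX SAYS WOOF"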

Obviously JavaScript isn't Scheme using the author's "top 10 defining characteristics of Scheme." Crockford's analysis is on track - and John Resig and Bear Bibeault highlight the same idea by pointing out that functions, more than objects, are what's really fundamental to the language (in "Secrets of a JavaScript Ninja").

The language's history does imply the comparison as well. Eich's original mandate was to "write Scheme for the browser" but he was later directed to give his language a C-like syntax and to ride on the coat-tails of the hype surrounding Java.

"JavaScript isn't Scheme" might not be accurate for folks schooled in the finer points of Scheme. It is very useful to those who cut their teeth learning Java.


We might just as well claim that Javascript is Perl then. Javascript has a lot more in common with Perl than it does Scheme; those things that both Javascript and Scheme have in common are also common in many other languages.

Anyone trying to draw parallels between Javascript and Scheme isn't crossing some pedantic line, they're off in crazyland. Or, more likely, they don't understand both languages as well as they're implying.


    it breaks the connection between JavaScript and Java.
This is a good point. The defining characteristic of Scheme in the minds of many is really just closures. Almost every modern language except Java has those, so "Scheme" has inadvertently turned into a weird shorthand for "inverse of Java".

I think that's too blunt of an instrument to help you reason about any of the languages in question, but I do appreciate encouraging people to think functionally.


> Scheme is even more Lisp than Lisp

That's not true. In fact, there are some prominent people in the Lisp community who do not think that Scheme is a Lisp.

See this thread from 2002 on comp.lang.lisp (especially Pitman and Naggum's comments) for more. If you are too young to have posted on C.L.L when Usenet was still popular, and want to see what a full fledged flame war looks like, look at the end of the thread.

Some people who posted there (including the OP) are on HN.

https://groups.google.com/forum/#!topic/comp.lang.lisp/Bj8Hx...


There are people — like Pitman and Naggum — who argue for defining "Lisp" to mean "Common Lisp" for political rather than technical reasons, essentially because they want to stop the evolution of Lisp. (Those who haven't read the thread might think I'm exaggerating.) Fortunately, they are relatively irrelevant today, and Lisps like Clojure, Hy, and Racket are carrying on the further development of Lisp.


But languages like Clojure, Hy or Racket are not a Lisp. They are Lisp-derived/inspired/... languages. That's also why they don't have Lisp in their name.

OTOH Lisps like Emacs Lisp, Common Lisp, ISLisp, and some others are still carrying the core of the first Lisp. Basically you can run Lisp 1.5 code in those without many changes.

Just like Java is not the new C. OTOH languages like C++ and Objective C still support much of C. You can port C to C++ easily.

Over its history, many other languages have been trying to replace Lisp: ML (functional programming), Dylan (object-oriented programming), Logo, Scheme, ... all tried to modernize/improve Lisp. Now it is Clojure and some other languages.

Still there are core Lisp languages which carry on the genes.

But not Clojure:

    Clojure 1.5.0-RC2
    user=> (car '(1 2))
    CompilerException java.lang.RuntimeException: Unable to resolve symbol: car in \
    this context, compiling:(NO_SOURCE_PATH:3:1)
What?

    user=> (append '(1 2) '(3 4))
    CompilerException java.lang.RuntimeException: Unable to resolve symbol: append \
    in this context, compiling:(NO_SOURCE_PATH:4:1)

What?

    user=> (assoc 'foo '((foo . 3)))
    ArityException Wrong number of args (2) passed to: core$assoc  clojure.lang.AFn\
    .throwArity (AFn.java:437)

    user=> (atom 'foo)
    #<Atom@2285af04: foo>
???

Does not look like it supports Lisp processing as defined by McCarthy:

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91....

Additionally the Java error messages are especially irritating. I enter Clojure code and get Java error messages.

Compare that with a real Lisp:

    bash-3.2$ java -jar abcl.jar
    Armed Bear Common Lisp 1.1.1
    Java 1.7.0_25 Oracle Corporation
    Java HotSpot(TM) 64-Bit Server VM
    Low-level initialization completed in 0.628 seconds.
    Startup completed in 1.968 seconds.
    Type ":help" for a list of available commands.
    CL-USER(1): (car '(1 2))
    1
    CL-USER(2): (append '(1 2) '(3 4))
    (1 2 3 4)
    CL-USER(3): (assoc 'foo '((foo . 3)))
    (FOO . 3)
    CL-USER(4): (bar 10)
    #<THREAD "interpreter" {6EEE1E2}>: Debugger invoked on condition of type UNDEFINED-FUNCTI\
    ON
      The function BAR is undefined.
    Restarts:
      0: CONTINUE     Try again.
      1: USE-VALUE    Specify a function to call instead.
      2: RETURN-VALUE Return one or more values from the call to BAR.
      3: TOP-LEVEL    Return to top level.
    [1] CL-USER(5): 
We even get Lisp error messages. The compiler is written in Lisp. http://svn.common-lisp.net/armedbear/trunk/abcl/src/org/arme...

Clojure may be a cool language, but the core of Lisp has been replaced in parts with something else. List processing is different. It's now based on 'persistent' lazy sequences...


Rainer, are you seriously complaining that Clojure is not a Lisp because it spells "car" as "first", "cdr" as "rest", and "append" as "concat"? That makes as much sense as complaining that Interlisp is not a Lisp because it doesn't have FEXPR.

In Clojure, your examples are spelled:

    (first '(1 2))
    (concat '(1 2) '(3 4))
    (get {:foo 3} :foo)       ; or (get {'foo 3} 'foo)
    (symbol? 'foo)            ; or (not (coll? 'foo))
And lots of things in the world irritate me for reasons that have nothing to do with whether they are Lisp or not.

I'd like to point out that your examples wouldn't have worked in Lisp 1.5 either. You would have had to type, for example,

    CAR (QUOTE (1 2))
interactively.


Clojure does not implement Lisp lists. Just plain and simple. It may partly look like Lisp lists, but they aren't. Functions randomly either look compatible or don't at all. ATOM does something completely different. CAR does not exist. FIRST exists, but works on persistent lazy sequences (wonderful, but it is not using Lisp-style chained cons cells).

Btw., LispWorks, plain:

    CL-USER 23 > CAR (QUOTE (1 2))
    1
A more complex example:

http://www.informatimago.com/develop/lisp/com/informatimago/...


You're mostly mistaken, and you've mostly been fooled by changing names.

Clojure lists aren't lazy, and they support the Lisp operations, under slightly different names, and with the same asymptotic performance characteristics you're used to, because they're made of chained cons cells. The cons operation is implemented at https://github.com/clojure/clojure/blob/master/src/jvm/cloju.... Your "more complex example" is a 1960 program written, originally, in M-expressions; here's the first definition on the cited p.32:

    th1r[v;a1;a2;c1;c2] = [atom[v] → member[v;a1]∨
        th[a1;a2;cons[v;c1];c2];T → member[v;a2]∨
        th[a1;a2;c1;cons[v;c2]]]
As far as I know, there hasn't ever been a Lisp system that could parse and run that, and certainly not LispWorks. But you can straightforwardly transliterate it either into Lisp 1.5, or Common Lisp (although the post you link doesn't bother; instead they wrote an interpreter for the incompatible Lisp 1.5 syntax in Common Lisp), or into Clojure:

    (defn th1r [v a1 a2 c1 c2]
        (cond (symbol? v) (or (some #{v} a1) (th a1 a2 (cons v c1) c2))
              true        (or (some #{v} a2) (th a1 a2 c1 (cons v c2)))))
The only weird thing here is using the set-membership test instead of MEMBER.

Compare the above to the LISP 1.5 version:

    (TH1R (LAMBDA (V A1 A2 C1 C2) (COND
        ((ATOM V) (OR (MEMBER V A1)
        (TH A1 A2 (CONS V C1) C2) ))
        (T (OR (MEMBER V A2) (TH A1 A2 C1 (CONS V C2))))
        )))
Clearly Lisp changed a lot between LISP 1.5 in 1962 and CLtL in 1984, not to mention current Common Lisp practice; what you'd write in any modern Lisp looks a lot more like the Clojure version than the LISP 1.5 version.

The set thing, though, points to a real difference in Clojure: it has a set data type, and you're expected to use it instead of the list data type when what you want is a set. It's still immutable, though, and supports efficient nondestructive update, so if you decide to rewrite this as

    (defn th1r [v a1 a2 c1 c2]
        (if (symbol? v)
            (or (a1 v) (th a1 a2 (conj c1 v) c2))
            (or (a2 v) (th a1 a2 c1 (conj c2 v)))))
you can be assured that you're not totally hosing your program's performance. It might get better, in fact, if the sets are large.

You could argue that adding sets (and finite maps) to Lisp is violating the underlying essence of Lisp, in which you use conses for everything, but in fact we already have vectors, hash tables, classes, structs, and closures in all of the other Lisps you're citing, none of which were present in the Ur-Lisp in 1960.

ATOM doesn't exist in Clojure, although you can define it. atom does, and yes, it does something totally different from what ATOM did historically.

By "persistent" Clojure means "immutable". That is, there's no RPLACA or RPLACD. But RPLACA and RPLACD weren't present in McCarthy's original proposal, and they're hardly central to Lispiness. You could even argue that their absence is Lispier, since it encourages functional programming and enables the compiler to optimize it better, and indeed Racket already abandoned them a few years back for that reason.

BTW, I was wrong about what you would have had to type at LISP 1.5's read-apply-print-loop. You would have had to type

    CAR ((QUOTE (1 2)))
for obvious reasons.


I'm not mistaken. Look closer.

Of course Clojure sequences are lazy.

    (defn map
      ([f coll]
       (lazy-seq
        (when-let [s (seq coll)]
          (cons (f (first s)) (map f (rest s)))))))
Inside a LAZY-SEQ the operations are just that.

They are also not acting like cons cells:

    user=> (cons 1 2)
    IllegalArgumentException Don't know how to create ISeq from: java.lang.Long   clojure.lang.RT.seqFrom (RT.java:496)
> As far as I know, there hasn't ever been a Lisp system that could parse and run that

What you saw in the file was the source code used by the first Lisp implementation.

> an interpreter

That was no 'interpreter'. Mostly a different reader.

> what you'd write in any modern Lisp looks a lot more like the Clojure version than the LISP 1.5 version.

Not necessarily. See the Lisp source for http://www.shenlanguage.org/Download/download.html

Plain old-style Lisp code.

There are not many modern Lisp books. If you look at PAIP, that is old-style. PCL is more modern. But neither has the look or feel of Clojure.

But I'm not saying that Lisp's don't contain new stuff. I'm saying that they contain the old stuff + new stuff.

Clojure gets rid of core old stuff:

* names are different

* concepts are removed (mutable cons cells, ...)

* syntax is different from every other Lisp

* semantics is different (persistent data structures, ...)

> By "persistent" Clojure means "immutable"

immutable and persistent are different, but related concepts. A data structure can be immutable, but it does not need to be persistent. 'Persistent' means that an immutable data structure can be updated, keeps all its versions and provides certain performance/space characteristics (one does not do full copies for example).

Imagine a list (f o o b a r). Now I want to insert 1 between f o o and b a r.

Immutable: I need to make a new list with elements from the old list copied.

Immutable and Persistent: I make a new list, but I possibly reference parts of the old data structure. Thus I don't need to make a full copy.

Lisp's lists/vectors/... are not immutable and also not persistent.
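
To make that concrete, here is a rough sketch with cons cells in JavaScript (illustration only; this is not how Lisp or Clojure is actually implemented):

    function cons(head, tail) { return { head: head, tail: tail }; }

    var tail = cons("b", cons("a", cons("r", null)));      // (b a r)
    var whole = cons("f", cons("o", cons("o", tail)));     // (f o o b a r)

    // Persistent insert of 1 after (f o o): copy only the first three cells
    // and point them at the existing tail. The old list is left untouched.
    var updated = cons("f", cons("o", cons("o", cons(1, tail))));

    updated.tail.tail.tail.tail === tail;  // => true: the (b a r) part is shared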

Clojure replaces that. Just read http://clojure.org/sequences


Some Clojure sequences are lazy, but Clojure lists aren't. And you are right that there's a typing constraint that prevents you from making improper lists, and that's not the traditional behavior of Lisp conses.

As for the 1960 code in the manual, no, that wasn't the source code used by the first Lisp implementation.

In the Shen case, I assume you're talking about things like this, in primitives.lsp?

    (DEFUN eval-kl (X) 
      (LET ((E (EVAL (shen.kl-to-lisp NIL X))))
          (IF (AND (CONSP X) (EQ (CAR X) 'defun))
              (COMPILE E) 
              E)))
What I see here:

    - Indentation to show structure;
    - DEFUN;
    - ';
    - IF;
    - LET;
    - some lowercase letters.
To me, that looks a lot like the modern Clojure code, and not much like the 1962 code.

Your taxonomy of immutability and persistence is interesting; thank you. I thought you might have meant that Clojure lists were automatically serialized to stable storage on, for example, program exit. Lisp lists have always been persistent, then, except when you mutate them? Because you can make your (f o o 1 b a r) from (f o o b a r) without copying the whole thing in any Lisp.

Lots of Lisps have been backwards-incompatible with previous Lisps. Scheme, Common Lisp, Emacs Lisp, and even MACLISP and LISP 1.5 were all significantly backwards-incompatible with their predecessors. That didn't make them non-Lisp. Common Lisp was not the end of Lisp development.


> Lots of Lisps have been backwards-incompatible with previous Lisps. Scheme, Common Lisp, Emacs Lisp, and even MACLISP and LISP 1.5 were all significantly backwards-incompatible with their predecessors.

Right. That's what I'm saying. Clojure does not care to be backwards compatible with Lisp.

> Your taxonomy of immutability and persistence is interesting

That's not mine.

Clojure took its base data structures from Haskell and modern ML.

Not Lisp.

See:

http://www.cs.cmu.edu/~rwh/theses/okasaki.pdf

http://en.wikipedia.org/wiki/Persistent_data_structure

The book comes with examples in ML and Haskell.

http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...


I don't think we're going to get anywhere further in this conversation, although I really appreciate everything you've posted so far, and I heartily second your recommendation of Okasaki's book, even though I haven't finished it myself. And I hope that I have avoided being anything like Erik Naggum in this conversation, despite my frustrations.


This conversation was incredible!


> But languages like Clojure, Hy or Racket are not a Lisp.

How are they not? Okay, I see your argument with Clojure, but how does that apply to Racket?

  Welcome to Racket v5.3.4.
  > (car '(1 2))
  1
  > (append '(1 2) '(3 4))
  '(1 2 3 4)
  > (assoc 'foo '((foo . 3)))
  '(foo . 3)
  >


Racket has some other differences: Immutable data structures, more emphasis on Functional Programming and less on imperative programming, symbols are not that important, slightly more static and less interactive. It's basically developed from R6RS on. The Scheme community is divided about that. The R7RS under development goes a bit more back to the older spirit and style - which is also controversial.


Sure, Racket has differences from CL, but I don't see how having any of the things you mention makes it "not a Lisp". Clearly, it's not CL, but you don't seem to be making the "CL is the only Lisp" argument.


I got to the end of the page and thought, "Well that wasn't so bad..." -- then I saw that I was at the end of page 1...of 22.

Epic.


Civil wars over distinctions without a difference are the ugliest kind.


Holy smoke...

I started posting on Usenet ~1989. I've seen a number of flame fests that I considered epic, but this one raised the bar considerably.


And he started it with this: "I hesitate to ask this question because I really don't want to start a flame war."


I asked on the author’s site as well, but I don’t understand the argument that JavaScript doesn’t have lexical scoping.

http://c2.com/cgi/wiki?LexicalScoping


Javascript has C syntax but does not follow the C tradition of having one scope per block, instead having one scope per function. This is really unintuitive if you are used to how lexical scoping is done in other C-like languages and is the source of many common annoyances, like the closures-in-for-loops bug or needing to wrap your code in IIFEs[1] if you want to create a local scope or prevent variables from being global.

[1] - Immediately Invoked Function Expressions: (function(){ ... }())
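
For anyone who hasn't hit it, here's a rough sketch of both annoyances (illustrative code, not from the article):

    // The classic bug: `var i` is function-scoped, so every closure sees the same i.
    var fns = [];
    for (var i = 0; i < 3; i++) {
      fns.push(function () { return i; });
    }
    fns.map(function (f) { return f(); });    // => [3, 3, 3], not [0, 1, 2]

    // The IIFE workaround: a new function scope per iteration captures the value.
    var fixed = [];
    for (var j = 0; j < 3; j++) {
      (function (k) {
        fixed.push(function () { return k; });
      }(j));
    }
    fixed.map(function (f) { return f(); });  // => [0, 1, 2]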


I assume he means that Javascript has lexical scope at the function level, rather than at the block level. It is much nicer at the block level, both for human users of the language and for optimizing JITs.


Ah, right, cobbal mentioned this, too. I hadn’t thought of that — I’m used to languages without that feature.

(To be strict, that seems to be more a matter of what a scope is, rather than if the scoping is lexical or dynamic.)


It probably refers to the fact that variables by default are global. If you prefix them with var they're function scoped, but there's no way to make them local to the curly braces of an "if" statement.

In scheme, lexical scoping is easily accomplished through the let form, and it's pretty hard to make a variable global by mistake.

So yes it does have lexical scoping in the sense that function arguments are lexically scoped, but that's the only scoping environment, and you can't hide that fact behind macros like you could in scheme if it was needed.


>If you prefix them with var they're function scoped, but there's no way to make them local to the curly braces of an "if" statement.

There is; it's just ugly. You just have to use an immediate function as the body of the if statement.
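
Something like this, roughly:

    function demo(flag) {
      if (flag) {
        (function () {
          var local = "only visible in here";  // scoped to this function, not to demo
          console.log(local);
        }());
      }
      // console.log(local);  // would throw ReferenceError: local is not defined
    }
    demo(true);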


in the words of will smith in men in black, "please keep that kinda shi* to yourself"


I don't understand what you're trying to say, but I was describing a common and useful pattern.


i was complimenting you.. ahh nevermind


Did you notice that "let" is coming in through ecmascript?

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...


That’s roughly what I was thinking, but it seemed a little ridiculous to me — Javascript has lexical scoping, it’s just not required.

It seems like this argument conflates lexical scoping with the requirement that variables be lexically scoped. Lexical scoping (even if optional) is extremely useful for closures. Requiring lexical scoping would be very useful for safety.


JavaScript doesn't have complete lexical scope. Consider:

    function foo() {
      console.log(whereDidThisComeFrom);
    }
    window.whereDidThisComeFrom = "the damn global object";
    foo();
And:

    function foo() {
      var a = {
        whereDidThisComeFrom: "a, of all places"
      };
      with (a) {
        console.log(whereDidThisComeFrom);
      }
    }
    foo();
In both cases, you can't lexically resolve the identifier whereDidThisComeFrom.


In the first case, the lack of a whereDidThisComeFrom declaration in scope would imply that it meant `window.whereDidThisComeFrom` (in normal mode) or a reference error (in strict mode), wouldn't it? Lua does the same thing and I never saw anyone complain about its lexical scoping.


Scheme does the same thing too:

    scheme@(guile-user)> (define (foo) (write where-did-this-come-from))
    ;;; <stdin>:1:14: warning: possibly unbound variable `where-did-this-come-from'
    scheme@(guile-user)> (define where-did-this-come-from "the damn global scope")
    scheme@(guile-user)> (foo)
    "the damn global scope"
    scheme@(guile-user)>

Munificent's assertion that this doesn't represent lexical scoping is wrong, but his assertion in another thread that with() is an exception to lexical scoping is correct, and therefore JS is not lexically scoped, just mostly lexically scoped.


Lexical block scope is broken in (at least) three ways in JS: the global object and with() break lexical scoping, and hoisting breaks block scoping.

Strict mode in ES 5 does address the first two, and "let" in ES 6 addresses the latter, so that's good. But at that point, you're talking about the future and not JavaScript as it is today.
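
For concreteness, a small sketch of the hoisting problem and the `let` fix (the `let` half is ES6, so still "the future" as of this writing):

    function withVar() {
      if (true) {
        var x = 1;         // hoisted: x belongs to the whole function
      }
      return x;            // => 1
    }

    function withLet() {
      if (true) {
        let y = 1;         // block-scoped: y exists only inside the if block
      }
      return typeof y;     // => "undefined" — no binding out here
    }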


On one hand, it's true that Scheme has a lot of things JS doesn't, and vice versa; and the feel of the languages is very different. On the other hand, it's completely clueless to claim JS doesn't have lexical scoping. Only someone who didn't understand what the alternatives to lexical scoping were could claim that.


Quite the contrary. Javascript has a strange, ugly hybrid of lexical and function scope that is not that of Scheme, which I suspect that Bob understands very well given his interest in languages (and that he works on a language that compiles to JS).

This is not lexical scoping:

    function example(k) {
        function ugly() {return i};
        for (var i = 0; i < k; i++) {}
        return ugly;
    }
    example(4)() // => 4


Yes, the scoping in your example is lexical. That's why ugly always returns the final value of i (which equals k) regardless of where you call it from; we can reason lexically to show that.

The i in ugly is lexically resolved to refer to a particular declaration, the one on the line below it. Scheme's lexical scoping rules are not the only possible lexical scoping rules. A lexical scoping system does not cease to be lexical simply because it is not the same as Scheme's lexical scoping. For it to be non-lexical, the scoping would have to depend on something other than the lexical structure of the program. For example, you could just as well have a dynamically-scoped JS in which ugly's return value would depend on its caller's binding for i, and in which calling a function with a free i from within example would make example's binding for i visible to that other function, but that isn't the way JS works.

It's quite reasonable to argue about whether that scoping is sane, and indeed with "let", the ES folks are giving us a saner binding construct; but it is certainly the case that the relationship between the declaration and the use of a variable is established by the lexical structure of the program text, not the dynamic structure of the executing program.


The "WAT" for me is that the `var i` is hoisted _out_ of the `for`, which IMHO, breaks the principle of least astonishment. I would expect the `i` to be lexically scoped to the body of the `for` block, instead of being hoisted out into the parent scope of the `for` block.


I speculate that this is because Javascript takes stuff from Self where loops are functions.

So probably Javascript was meant to be written something like: while(function(k) { ++i; return i < k; });

where the while function would call the closure until it got false, at which point the while function itself would return. Having var declarations be hoisted was not a problem in this syntax, because everything (including if and while) would have had its own scope due to being a function.


You could have designed a language like that, but Brendan didn't.


In the post, I specifically said Scheme had lexical block scoping. So, even if JavaScript had full lexical scoping (which it does not, thanks to with() and the global object), it still wouldn't have block scope (thanks to hoisting, though addressed in ES6), which I think is a defining feature of Scheme's approach to scope.


You're right about with(), which I'd forgotten about — JS with with() is not lexically scoped!

Access to the global object is still statically decidable. It just might raise an exception. This is not really different from Scheme.

I agree that block scope is a defining feature of Scheme's (and ALGOL's, and C's) approach to scope. However, Scheme's block scopes are all functions, at least if you accept the macro-definition of let that's been there since R5RS, rather than treating let as a separate primitive construct.


I would argue that Javascript is not lexically scoped because the scope of a variable definition does not depend on the lexical structure of a program (with the sole exception of function boundaries). This is true even to the point of multiple definitions having no effect on the scope of a variable.

I think that what you are arguing is that Javascript is statically scoped.


"Statically scoped" is the same thing as "lexically scoped".


What is it that you don’t like about this? Is it that the var is hoisted?

http://www.adequatelygood.com/JavaScript-Scoping-and-Hoistin...

Hoisting generally seems ugly to me.

Edit: I think there is some confusion here about scoping. One axis is dynamic versus lexical scoping (are there other options?). Another axis is function, block, global, etc. scope. They seem orthogonal to me.


This is not hoisting because it doesn't involve an alpha rename to avoid capture.

Have JS developers taken a bunch of compiler theory terms without understanding them and applied them to Javascript? I have to say, I'm not really surprised...


In a lexically scoped language I would assume that the scoping follows the lexical representation of the code. The given example goes against this assumption. The variable i is not even defined before it is referred to in the inner function.

The level of scope (global, block, ...) interacts with the type of scoping in many ways, so I would not say that they are orthogonal. This example would behave very differently under dynamic and lexical scoping.

  function foo() {
    var i = 1;
    function bar() {
        var a = i;
        var i = 2;
        return a + i;
    }
    return bar;
  }
  foo()() // => ?


    In a lexically scoped language I would assume that the scoping follows the lexical representation of the code.
In a lexically scoped language the variables belong to the containing lexical scope. That that lexical scope can be a function rather than a block doesn't make Javascript any less lexically scoped (in cases other than the dynamic this).

    The variable i is not even defined before it is refered to in the inner function.
The order that variables appear within a scope has no effect on the scoping rules. For example, in C#:

    void Foo() {
        i = 10;
        int i;
    }
Gives "error CS0841: Cannot use local variable 'i' before it is declared". It knows exactly what variable "i = 10" is referring to because it's declared in the same scope, but the language designers have decided to add the rule that you're not allowed to reference it before it's declared (because doing so is most likely a bug).

Incidentally, Javascript made the opposite decision; you can refer to variables before their declaration, they're just "undefined". But in both cases the compiler knows what "i" it's referring to by the lexical scope.
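
For example (a tiny illustration):

    function demo() {
      console.log(i);   // undefined, not a ReferenceError: the declaration is hoisted
      var i = 10;
      console.log(i);   // 10
    }
    demo();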


I would argue that the order of declarations is part of the scoping rules. In your example the variable i is not yet in scope when the assignment is done, and thus the compiler complains. But this is of course somewhat philosophical.

My first argument was that the order of statements is part of the lexical representation of code, and thus should affect the lexical scoping.


I'd say the order of declarations isn't part of the scoping rules. The scoping rules are used for name resolution. In my example, both i's belong to the same scope, so the name resolution says "oh, that first i is the one in this scope, not any parent scope".

In languages like C#, a separate analysis is then done to find out if the variable is definitely assigned at each reference. This can include some sophisticated reachability analysis (how sophisticated depends on the usefulness/complexity tradeoff).

I guess it's all semantics really, but I prefer to keep the scope (ha ha) of lexical scoping rules closer to name resolution than start mixing in definite assignment and reachability analysis and the things that come with that.

It makes understanding and expressing the commonalities and differences between different languages' lexical scoping a lot easier. C# is lexically scoped with blocks introducing new lexical scopes. Javascript* and Python are lexically scoped with functions introducing new lexical scopes. As an orthogonal concept, C# and Python disallow references before definite assignment, while Javascript allows it.

(*) With the dynamic 'this' caveat.


The point isn't that I don't like it (although I really don't), it's that Scheme doesn't work the same way as Javascript, and the Javascript-is-Scheme slogan contributes to non-understanding of those differences.


In particular I was bothered by the author's separation of scoping and closures. For example, if you look at what Objective-C has in terms of closures, and what JS has, then you're practically looking at completely different animals. But if you just say "closures ✓", "perfect lexical scoping ✗", you completely miss that point.


Take a look at the author's toy language for the rationale for the separation: http://magpie-lang.org/


I'm pretty sure I understand lexical scoping, and this ain't it:

    function foo() {
      window.c = "really?"
      d = {};
      function bar() {
        var a = "var";
        {
          with (d) {
            var a = b;
          }
        }

        console.log(a);
      }

      d.b = c;
      bar();
    }

    foo();


You are of course correct about with() being dynamically scoped; thank you. I had forgotten about with() when I wrote what I wrote. I must weaken my claim to "JS is lexically scoped, except for the with() construct."


Part 1 of Douglas Crockford's On JavaScript gives a really nice history of the many programming languages leading up to JavaScript.

Chapter 1: The Early Years: http://www.youtube.com/watch?v=JxAXlJEmNMg

Chapter 2: And Then There Was JavaScript: http://www.youtube.com/watch?v=RO1Wnu-xKoY


I was thinking the same thing while reading the article.

JS has inherited from both Scheme and Self.


Haskell doesn't have first-class continuations, it has syntactic sugar that allows you to write continuation passing code in a direct style.


This is absolutely insane. I don't even know how it got to the point that an article was necessary to explain this.


I started coding on a Timex 2068 back in 1986, and I keep getting surprised by what one needs to explain nowadays to younger generations, given the amount of available information about computing in general.


This article is based on the assertion that there are people who think Javascript is Scheme with an ugly makeover. But do those people exist? Most people who make the comparison aren't saying that they're equal to each other, but that they share similarities. Maybe some people hear that out of context and take it too far, but I don't think they're the majority.

I think the thing that drives the comparison is more the first-class functions than the closures. The closures assist the first-class functions, but the functions are the star.

Not a whole lot of non-functional programming languages support first-class functions with the simplicity and completeness of javascript.

http://en.wikipedia.org/wiki/First_class_function#Language_s...

In javascript, passing functions as parameters is a day to day thing, and that style of being able to pass around different functions as building blocks is what gets it compared to functional languages. Sure, that's not the only thing that functional languages have going for them, but it's the most important thing, and what they're even named after.
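
For example, the kind of everyday thing I mean (nothing fancy, just functions as values):

    var numbers = [1, 2, 3, 4, 5];

    var isEven = function (n) { return n % 2 === 0; };
    var square = function (n) { return n * n; };
    var add = function (a, b) { return a + b; };

    // Functions passed as arguments, composed into a pipeline.
    numbers.filter(isEven).map(square).reduce(add, 0);   // => 20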


Very well and funnily written, among other things. Thoroughly enjoyed :)


Couldn't agree more; the author's writing style is amazing (especially considering the topic). The use of humor to make what could be a dry subject pretty damn good is a rare talent.


Thank you! I try very hard to make it worth the reader's time to read it. Humor and interesting phrasing is a big part of that.


I am glad you saw that; a coworker and I were discussing how you should keep on blogging. Good work!


It's true because it's funny.


Beautiful.


JS might look like Scheme (can't imagine what I'd have to smoke) only to those who never managed to learn the latter.)

But people like to repeat slogans without understanding. This is how everyone is calling Clojure a Lisp, for example, while it is a language of its own, which happens to use parenthesized syntax.)

It is not parentheses that make a language a Lisp. It is not first-class functions that make it Scheme.) Lisp/Scheme is a set of conventions/features, and when some are broken and others are missing, that means it is something else.


At the risk of feeding a troll, a bit of reading comprehension and critical thinking 101: regardless of whether or not you personally consider Clojure to be a Lisp, this is outside the discussion of the article. The home page for Clojure explicitly states "Clojure is a dialect of Lisp."

This is in contrast to JavaScript, whose creators never claimed it was Scheme or even a dialect, although the author of the article suggests it was influenced by Scheme. In addition, he has perceived that lately, many people claim JavaScript is Scheme, and he rejects that claim.

Disputing the validity of Clojure as a Lisp is orthogonal to the subject of the article.


I agree, but I'm wondering what you think is missing or broken in Clojure to prevent it from being a Lisp? Sure, it sits on top of the Java type system, and one major thing I can think of that is lacking is the condition system (although, isn't this only a Common Lisp thing?). I don't think it has to be Lisp all the way down to be a Lisp dialect, if that's what you're getting at.


http://karma-engineering.com/lab/wiki/Clojure but this is only a quick review.

It is possible, for example, to go through some books, especially "The Joy Of Clojure" which contains 20 lines of marketing slogans for 1 line of code, and make explicit commentaries on all the subtle differences, but I'm not going to perform such a tedious task for free.)


I am familiar with this line of reasoning and I have read the Joy of Clojure. I can't say that I agree though. The article you linked is just opinion and doesn't back up its arguments at all. Afaict, it really boils down to "it's not CL" and "it's not built on cons cells". I agree that the fact that Clojure sits on top of the Java type system is a bit of a mess but it's a language to get shit done and not satisfy some purists.

> "The Joy Of Clojure" which contains 20 line of marketing slogans for 1 line of code

> but I'm not going to perform such a tedious task for free

Well, obviously there's no point in discussing this further and we have to agree to disagree. Have a nice day anyway.


Let's make it simple.) Here is classic homework code in two different dialects of Lisp:

  (define (cross xs ys)
      (cond ((or (null? xs) (null? ys)) '())
            ((atom? xs) (cons (list xs (car ys)) (cross xs (cdr ys))))
            (else (append (cross (car xs) ys) (cross (cdr xs) ys)))))

  (defun cross (xs ys)
      (cond ((or (null xs) (null ys)) nil)
            ((atom xs) (cons (list xs (car ys)) (cross xs (cdr ys))))
            (t (append (cross (car xs) ys) (cross (cdr xs) ys)))))
Could you, please, provide the equivalent code in Clojure?


Well, you could do it this way:

    (defn cross2 [xs ys]
     (cond (or (and (sequential? xs) (empty? xs)) (empty? ys)) '()
           (not (sequential? xs)) (cons (list xs (first ys)) (cross2 xs (rest ys)))
           true (concat (cross2 (first xs) ys) (cross2 (rest xs) ys))))
which is pretty much exactly homologous, allowing for the detail that you can't ask if an atom is empty? in Clojure, cond (Arc-like) takes alternating conditions and consequents rather than condition-consequent pairs, and the spellings of the list operations no longer refer to IBM 709 machine instructions. Also, it works on any kind of sequences, not just lists, with of course a punishing performance overhead on sequences whose `rest` operation is slow.

But I would argue that this interface is poorly designed, since you can say (cross2 '(a b c) '(1 2 3)) or (cross2 'a '(1 2 3)) but not (cross2 '(a b c) '1), and worse, (cross2 '(a (b c) d) '(1 2 3)) implicitly flattens the (b c) into individual items, which is probably a latent bug rather than desired behavior. So I would argue for writing it in this form instead:

    (defn sc [x ys]  ; scalar cross
          (if (empty? ys) '() 
              (cons (list x (first ys)) (sc x (rest ys)))))

    (defn cross [xs ys]
          (if (or (empty? xs) (empty? ys)) '()
              (concat (sc (first xs) ys) (cross (rest xs) ys))))
which avoids those irregularities and makes the code easier to understand by removing misleading false symmetries.

Except really, if this isn't a homework problem, I think you should write it like this in any of these three Lisps:

    (defn cross [xs ys]
          (map (fn [x] (map (fn [y] (list x y)) ys)) xs))


With the last two lines you won.) My point was that to make a correct translation one must use (recur ...), which will mess everything up.

One more subtle thing: your solution produces (((a 1) (a 2)) ((b 1) (b 2)) ((c 1) (c 2))) while the contract was to produce "list of all possible pairs".


Oh, that extra nesting was stupid of me! Thank you. It should have been

   (defn cross [xs ys]
          (mapcat (fn [x] (map (fn [y] (list x y)) ys)) xs))
and maybe in CL one would prefer

   (defun cross (xs ys) 
          (loop for x in xs
                appending (loop for y in ys
                                collect (list x y))))
which of course has no equivalent in Clojure, Scheme, or really any other language I can think of.

I'm not sure I agree on (recur...). You would need to use (recur...) if you were translating tail-recursive code that iterated over something other than a data structure and didn't produce new live objects on every iteration. But the code you gave wasn't tail-recursive, and what it iterated over was a data structure, and every iteration produced live objects that can't be garbage-collected. Even if you rewrote it to be tail-recursive, it wouldn't run out of stack for reasonably-sized output lists anyway; and for unreasonably-sized output lists, it would be likely to run out of heap for the output before it ran out of stack. I'm interested to hear if you manage to get it to stack-overflow. (It seems likely to be possible, but perhaps a bit of a challenge.)

Regardless, I don't think it's reasonable to claim that languages that don't have tail-call elimination — which I suspect you may be on the point of doing — aren't Lisps. Many popular Lisps have had TCE, but many more Lisps haven't, and the CL standard doesn't require it.


> ... which of course has no equivalent in Clojure, Scheme, or really any other language I can think of.

Python, for example: def cross(xs, ys): return [[x, y] for y in ys for x in xs]

LOOP in disguise :) (at least to my (very possibly faulty) understanding.)


A number of languages have listcomps that can do this, and which are actually more useful for this than CL's LOOP macro, but the thing I meant to point at was the APPENDING bit. I guess (loop for x in xs append (loop for y in (f x) collect y)) is awfully similar to [y for x in xs for y in f(x)], though.


Including Clojure for those who might not know:

    (defn cross [xs ys]
      (for [x xs, y ys]
        [x y]))


> http://karma-engineering.com/lab/wiki/Clojure but, this is only quick review.

This definition seems to exclude Common Lisp from being a Lisp. That might be a defensible position, but it does call into question how your definition of "Lisp" is useful to you. It also seems to exclude languages like Dylan, which are commonly regarded as Lisps by people far cleverer than me.


Every other Lisp happens to have TCO, so if you try to port other Lisp programs to Clojure you generally get stack overflows.


But afaik TCO is only a requirement for Scheme implementations, not for e.g. Common Lisp (i.e. you don't need to implement TCO to satisfy the Common Lisp standard). I don't see how the lack of TCO prevents it from being a valid Lisp dialect.


Do you need to implement the common lisp standard to be "a lisp"? What is a lisp?


Exactly my point. I was just using CL as example because I think it would be hard to argue that it is not a Lisp yet the standard doesn't guarantee TCO.

> What is a lisp?

It's a very good question. I think PG sums it up quite nicely in "Revenge of the Nerds" [1]. Although, now that I'm thinking about it, I'm not quite sure there's any point in classifying something as a Lisp or not...

[1] http://www.paulgraham.com/icad.html


Emacs lisp doesn't guarantee TCO (and in practice does not perform it at all), and yet few people claim that it is not a dialect of lisp.


Without TCO you have stack overflows, and without proper numeric tower you have integer overflows. Roughly speaking, one of the aspect of why it isn't Lisp is underlying Java stuff.


I think we have already established that TCO is not necessary to be a Lisp (CL, Emacs Lisp). While it's true that Clojure doesn't have a proper numeric tower it does have bignum support and arbitrary precision math operators which will not overflow. But either way, imo this is not a defining feature of a Lisp dialect.


In fact, all non-toy Lisp implementations provide TCO - http://0branch.com/notes/tco-cl.html - because, it seems, it's a natural feature of a Lisp system (Scheme just requires it).

Again, Lisp could be defined as a limited set of conventions/features. As long as other features, such as CLOS, are added there is no problem, but if some features are broken, then it is not Lisp anymore. It just doesn't walk like a duck.

Let's say that Clojure was developed with a "put everything useful together" or Ruby-approach, if you wish, which is very popular for scripting languages, while development of Scheme and other Lisp dialects was founded on "put only what is absolutely essential, and done right".

The first approach, "stuff anything in", you can see almost everywhere. The second approach, "research first, and do the best", is unpopular for obvious reasons and is rarely seen outside masterpieces such as Gambit-C, nginx, and old-school marvels such as Informix.

So, in my opinion, Clojure is much closer to Ruby than to Lisp (let's not be deceived by parentheses) - it is a scripting language (to quickly put everything together with a variety of clever special syntax and fancy data structures, without much thinking about implementation details). This is, of course, the most productive approach to coding - this is why people love scripting languages so much.


Your manner of dismissing Lisps without TCO is something of a No True Scotsman argument, enabling you to proclaim counterexamples toys by merit of their being counterexamples. Yes, most Lisps have it. I don't think anyone denies emacs-lisp or AutoLISP were Lisps due to their lacking it.

Your suggestion of "let's say ..." is based on what appears to be a complete lack of familiarity with all of the languages involved. Providing useful libraries doesn't preclude having done things right. Supplying a bare minimum of libraries does not preclude having made them miserable. There are plenty of awkward moments in using Common Lisp libraries that have made this plain to me.

Your suggestion that "research first, and do the best" is unpopular for "obvious reasons" is just hand-waving. The "obvious reasons" that are left unstated here are that "research first and do the best" languages generally suck, hard. They suck because they sit in toy environments for years while the "release early and iterate" languages flourish under constant adaptation to real world usage. Both will have warts. The latter will be worth using.

Suggesting that "Clojure is closer to Ruby than Lisp" is just silly. What Lisp? Scheme and Common Lisp, both definitely Lisps, are easily as different from each other as Clojure is from either. Ruby's insane class monkey patching is closer to the type of advice you find in Common Lisp than the immutable datatypes and carefully conceived concurrency primitives found in Clojure. Common Lisp's many different name classes are a horror found in few modern languages. There's nothing in Clojure's "clever special syntax" that many developers did not toy with using reader macros and other abominations. Your suggestion that the older Lisps' data structures, usually cobbled together with a pattern of lists and a prayer, are somehow more thought through than Clojure's is both ignorance and meanness combined. Yes, Common Lisp had many built-in and library-added datatypes. No, it didn't stop alists and structures built from underlying alists from being its fondest love.

As for classifying Ruby and Clojure as "scripting" languages, please define "scripting" language. It's a meaningless term for nearly anything other than `bash`.


One of the correct but subtle analogies with Ruby is that prior to 1.8.x there was nothing but the reference implementation. To the question "what is Ruby?" the answer was "this MRI".

The differences between, say, Scheme and CL are few and subtle - #' and funcall syntax, the behavior of nil, etc. All the foundational special forms and the general function application rule are the same.

Of course, CL is a much bigger language, but all its features never broke the basis on which everything is founded - a few special forms, list structure, general evaluation rule with exceptions only for these special forms, each of which follow its own rules.

Most of CL's features are macros and libraries, so they do enrich the base language without breaking it.


Clojure currently needs to be implemented and extended in another language: Java. The Clojure compiler is written in Java. The runtime and its core are written in Java.

https://github.com/clojure/clojure/blob/master/src/jvm/cloju...

Most Common Lisp implementations have their compiler written in Common Lisp. The language is implemented and extended in Common Lisp.


> This is how everyone is calling Clojure a Lisp, for example, while it is a language of its own, which happen to use parenthesized syntax.

Well, the official website doesn't help much, as it itself calls Clojure a (dialect of) Lisp.

http://clojure.org/

The only other time I heard someone saying "Clojure is not Lisp" it wasn't in order to say something nice about the former.



