Response to 'Reasons why Lisp games suffer' (techsnuffle.com)
106 points by etiam on Dec 12, 2018 | 88 comments



Instead of using a vague term like "power", perhaps the LISP fans should promote the concept of "leverage", whereby you can talk about getting a task done with the fewest words of code. Small LISP programs can be incredibly clever, and transformation languages like LISP and FORTH are well known to win every program-shortness contest. However, Assembler has always been more powerful in the sense that you can do things in Assembler that you can't do in higher-level languages, because there is no direct mapping from a higher-level language to many of the special instructions that exist on modern chips. So using "power" is a lamentable word choice. LISP does have problems, though: many consider it a "write-only" language which transfers very poorly to other programmers. You can measure the transferability of a codebase by assigning a new person to it, giving them a task, and seeing how long it takes and how much is broken in the process. LISP is avoided by most large companies for this reason alone. Leverage and other factors are heavily outweighed, in overall cost to the organization, by the difficulty of transferring a code base among people.


I thought APL (and its successor J) was the "winner" for doing the most with the fewest words of code.

(~R∊R∘.×R)/R←1↓⍳R


Yes, there are degrees. But everybody wants their language to be a winner, I guess.


I like "power." It's evocative, and it maps to the feeling you get with a good language - when your intent is made manifest with little effort, you feel powerful (and _are_ powerful).


I think these are fairly good reasons why the term should not be used. Appeal to the developer's power fantasy?


then should we aspire to... weakness, fragility, and impotence?


How about things like readability and ease of maintenance and modification? To those people making real products (and not just coding for a hobby), enabling collaboration is far more important than whether it lets the programmer feel clever and powerful.

(Hobby coding is cool and important too, but the aims are different.)


what a buzzfeed worthy comment


Is Assembly powerful?


I get your point, but it seems unnecessary to optimize verbiage like "power" or "leverage" if there's a popular view that the language isn't easily readable. If you can't convince people the language is readable, it seems like it isn't going to get use outside of hardcore fans or trivial applications anyway.


I came in here hoping for more information on developing games in Lisp/Scheme. I liked the recommendation for implementing old games. I might take a swing at the original StarCraft (just for kicks).

I've been poking around trying to sort out how to bind Chicken Scheme [0] to the Godot [1] C API.

The idea is to work in Godot using C files compiled by Chicken. This should work, in theory. The engine can handle the important timing stuff (mentioned in the article), and Scheme can be used for actual game-logic and algorithms.
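
For anyone curious what the glue might look like, here is a minimal sketch of CHICKEN's C FFI, which is the mechanism such a binding would build on. Note that engine_log_message is a hypothetical C function standing in for whatever the engine-side glue actually exposes - it is not part of the Godot C API - and the file would be compiled with csc and linked against that glue.

    ;; Hypothetical sketch: calling engine-side C glue from Scheme.
    (import (chicken foreign))

    ;; Declare the C function we assume the host/glue code provides.
    (foreign-declare "extern void engine_log_message(const char *msg);")

    ;; Wrap it as a Scheme procedure taking a string, returning nothing.
    (define engine-log
      (foreign-lambda void "engine_log_message" c-string))

    ;; Game logic stays plain Scheme and just calls the wrapper.
    (define (on-player-hit damage)
      (engine-log (string-append "player took "
                                 (number->string damage)
                                 " damage")))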

I know a Scheme binding for Godot exists, but it hasn't been updated in quite a while.

[0] https://www.call-cc.org/ [1] https://godotengine.org/


> When you speak of non lisps as ‘blub’ you disregard the actual characteristic of those languages and how they fit to the task.

What does 'blub' mean?


It refers to Paul Graham's article about Lisp (http://www.paulgraham.com/avg.html). Blub basically means an average mainstream language.

> I don't want to hurt anyone's feelings, so to explain this point I'm going to use a hypothetical language called Blub.

> As long as our hypothetical Blub programmer is looking down the power continuum, he knows he's looking down.

> But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up. What he sees are merely weird languages.


That whole "Blub" section reads to me as a snowglobe version of the Lisp community's long-standing failure to communicate with everyone else.

Graham starts with his "shockingly controversial statement" that "programming languages vary in power", which, I don't know that that is controversial, but OK. I'm with him that far. But he immediately loses me, because all of his exposition on this idea consists of beating around the bush, talking about characteristics of programming languages that are most definitely not the same thing as power, before moving on to this rather arrogant-sounding parable that makes it sound like he thinks of this "power" idea in remarkably similar terms to how colonial-era Europeans thought about "civilization" when it came to understanding why not everyone did things the way they did.

I would argue, on the contrary, that, when we're talking about programming languages, "power" is a characteristic that no reasonably well-traveled programmer should feel comfortable thinking of as an objective characteristic, let alone something that can be crammed into a linear continuum. Lisp has some powerful features, yes. So does C. And the differences that a Lisp fan might say make Lisp more powerful than C correspond remarkably closely to the things that a C fan might say make C more powerful than Lisp.

They're both right, too. Because any flat statement that "X is a powerful language" is incomplete. There's an unstated major premise, there - what problem are you trying to solve? That will be a big factor in determining what powers you need and don't need.

Which is why, around the time that Graham was writing this essay, I was implementing my first really big solo programming project in a mix of Scheme and C.

tl;dr: Yeah, what TFA's author said.


I always just assume they mean expressive power when “power” is mentioned.


Which is a very dubious "power" in a language not designed for the academic exploration of programming itself.

General principle: the more "clever" a thing is, in terms of making intricate use of the language's expressive power, the more incompatible, confusing, buggy and slow it is.

Another general principle: if your language gives you wheelwright tools but no wheels, nobody's wheel will fit anybody else's axle until conventions are established. And those conventions will make the open freedom of tools obsolete in practise. And they will be a hidden burden of learning to use the language.


Expressiveness can mean several subtly different things. The one I'm most interested in is, bringing the way things are expressed in the programming language as close as possible to the way you'd describe the process in a natural language. Any definition of "expressive" that permits code golf is not one I want to use. For "expressive" to be a useful concept, its opposite needs to be "unreadable."


Just to pile onto that a little bit more - the Lisp family of languages is not unambiguously more expressive than other languages. I think that, when Lispers claim that Lisp is really expressive, what they really mean is that it has macros, and the supporting features that make macros work so well in Lisp.

The problem is, most dialects also have some horribly unexpressive things, too. Dynamic scope, cadadr, fifty bajillion words for "equals", stuff like that. My least favorite is that, in most Lisps, code and data look exactly the same. No M-expressions means that you can't reliably understand the basic structure of a blob of code by simply skimming. You've always got to be carefully reading it. Which gets exhausting if you're working in an unfamiliar codebase. Perhaps that's why Lispers have less of a tendency to travel in packs.

There's really just one Lisp dialect that gets to unironically wear an "I'm super-duper expressive!" t-shirt, in my book, and it's a young one: Clojure.


> the Lisp family of languages is not unambiguously more expressive than other languages

Common Lisp is a local maximum in language programmability (code as data, macros, programmable reader, EVAL, COMPILE, Meta-object Protocol, etc.).
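
To make the first part of that list concrete, here is a tiny sketch (an illustrative snippet, not from any particular codebase) of the code-as-data plus EVAL/COMPILE combination:

    ;; Code is just a list; it can be built, inspected, EVALed or COMPILEd.
    (defvar *form* '(lambda (x) (* x x)))

    (funcall (eval *form*) 7)          ; => 49, via the evaluator
    (funcall (compile nil *form*) 7)   ; => 49, via the native compiler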

> The problem is, most dialects also have some horribly unexpressive things, too. Dynamic scope, cadadr, fifty bajillion words for "equals", stuff like that.

The opposite is true. Common Lisp is its own code generation target - as such it has both low-level (blocks, go to, basic data structures, ...), mid-level (structures, streams, ...) and high-level (CLOS + MOP, macros, ...) programming constructs. Something like the additional dynamic scope expands the expressiveness of the language.
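
For example, a special variable lets a caller rebind behaviour for the dynamic extent of a call without threading an extra parameter through every function in between (a minimal sketch; the names are illustrative):

    (defvar *log-stream* *standard-output*)   ; dynamically scoped

    (defun log-line (msg)
      (format *log-stream* "~a~%" msg))

    (defun run-task ()
      (log-line "working..."))

    ;; Redirect all logging inside this call tree, restored automatically.
    (with-output-to-string (s)
      (let ((*log-stream* s))
        (run-task)))
    ;; => "working..."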

> No M-expressions means that you can't reliably understand the basic structure of a blob of code by simply skimming.

That's unrelated. You can easily read s-expression code as a human. It may only be unfamiliar to you - just like riding a bicycle is unfamiliar when you have never done it. Once you learn it, it's no issue.

   (defun foo (a)
     (+ a 10))
is no less readable than

   defun foo (a)
     a + 10
It's only a matter of a bit of training. Humans are actually very good at adapting to different sign systems (think about the differences between English, Arabic and Japanese) - each speaker of the various language groups thinks that the other language is difficult and their own is natural.

There is a bit of fear of Lisp code - this is partly justified, because of the extensible syntax - but that's unrelated to s-expressions: one can have extensible syntax in M-expression based variants, and this creates the same problem of unlimited syntactic structures.

M-Expressions are not easier to reliably understand once you have macros in M-Expressions - which current versions of syntaxes similar to M-expressions actually support. See for example the RLISP of REDUCE.


Other posters have given serious answers. I would add my own: it's an excuse for a language superiority complex. It assumes that everyone using "less powerful" languages does so because they just don't understand "more powerful" languages, while excluding the idea that they might just find that language X is the right fit for their domain. (e.g.: asm is probably more "powerful" than python, but I'd rather do string manipulation in the latter)


eh, it seems you got it backwards: pg's main point seems to me to be productiveness. He even talks about python being one of the best (lispier) of the mainstream languages, so in that context python would be more powerful than asm and not the other way around.


Substitute "productivity" there for power and I think the point still stands. PG sees a hierarchy of languages based on some inherent goodness of the language - and surprise, surprise, his favourite language, Lisp, is on top. I think that's BS. How good a language is depends more on the problem than some inherent quality of the language. I think it's pretty hard to get past the fact that a term like "Blub" clearly indicates that PG feels that he's a superior, higher class programmer than those who use "Blub" languages.


You say it depends on the problem you're solving, but if you turn around and look at the problems people are solving, you could easily see them falling largely into a few common classes.

Like, we all will admit that no natural language is inherently better than any other, but yet we still recommend that programmers learn English -- the quality of the language may not be inherent to the language, but it could be inherent to the context in which the language exists.


It's not what PG precisely said that matters, it's the intent and notion behind it, which is misguided (and different people could use it to attribute it to asm or idris or whatever -- the notion of the hierarchy of languages that is concrete and task-independent).


as in blub ~= verbose and boilerplatey


I think it can't be reduced to that; a more important aspect is expressiveness, which isn't exactly terseness. I think some historical memory is pertinent here, as even classic "blub" languages such as Java have adopted plenty of functional characteristics since then, so the options available were very different.


The problem with 'expressiveness' is that beyond a certain point, the cognitive load imposed by reading the code exceeds any benefit from the increased density and abstraction. You basically have to decompress the code in your head in order to comprehend it in any useful way.


Only if it's written so badly that it leaks, or you need to modify that particular abstraction. In all other cases, you should be able to read the code easily and understand what it expresses without the need for exploring underlying layers of abstraction.

I agree it's hard to write good abstractions, but the problem isn't really in Lisp per se - it's just that being able to express any abstraction you want cleanly makes you realize that coming up with correct abstractions is very hard mental work. You may refrain from doing it, but Lisp at least gives you the option.


true, but I also think that blub is not really about smugness or being condescending, hence my comment above; expressiveness / hackability / self tooling


> asm is probably more "powerful" than python, but I'd rather do string manipulation in the latter

I don't think that's quite what PG meant by power. That being said, I do not think I would be able to close the distance between our understandings, sadly.


>That being said, I do not think I would be able to close the distance between our understandings, sadly.

Or you could just say what you think PG meant, e.g. "by more powerful he means a language more easily manipulated and expressive".

How about that, instead of the current comment's ending, which can be understood as "I've given up hope that you'll ever understand what I mean"?


It comes from a mistaken argument in an essay by Paul Graham. Others have explained the idea a bit. I'm going to explain why I think it is mistaken.

The essay says that Lisp is at the top of the power curve. Lisp programmers, looking at Haskell, are sure that they are looking down. They say, "How can you get anything done in Haskell? It doesn't even have (Lisp-style) macros."

But the Haskell types (pun intended) are also sure that they're at the top of the power curve. When they look at Lisp, they're also sure that they are looking down. "How can you get anything done in Lisp? It doesn't even have a decent (Hindley-Milner) type system."

Here you have proponents of two different languages, both sure that they're at the top of the power curve, and both sure that the other language is beneath their language. Something's wrong here...

What's wrong is the idea that languages can be placed on a one-dimensional axis labeled "power". That idea is mistaken.

To see why, think about hardware. We know what we mean by power in hardware - MIPS. But then someone says, "Well, we've got that floating point code, so we have to care about FLOPS. And then there's that data set that won't fit in cache, so we have to care about main memory bandwidth... except that sometimes you can trade off MIPS for larger cache size...." Now you've got at least four axes - MIPS, FLOPS, memory bandwidth, and cache size.

Then some psychopath comes into the room and says, "What I mean by 'power' in hardware is battery life." That is, their definition of power is something like 1/watts, which, for battery-operated hardware, is not as clearly unreasonable as the physical units might lead you to think.

Back to software. "Power"? Power for what? For writing programs. OK, which programs? For "general programming"? But I've never written a general program in my life. I've written a bunch of specific ones, though. So what I actually care about is power for writing this program. For this program, I have to ask what makes it hard to write the program, and pick the language that goes the furthest toward solving those problems for me.

High performance computing (or, as the article says, games)? I might need something that gives me control of memory layout. I don't want to try to deal with that in either Lisp or Haskell. Real-time computing? Let's skip garbage collection, thanks. (Yes, I know, it can be done. It sure doesn't make the problem of deterministic response time easier, though.) Let's also skip recursion and laziness.

On the other hand: Anytime we can, let's skip manual memory management.

Pick the language based on the characteristics of the problem. Don't blindly pick any language because it's "the most powerful".


I learned a definition of power that seems pretty unambiguous, at least to me: how easy it is to build big things by combining small things.

I've seen it very clearly in languages with a severe lack of power. In VHDL there are familiar structures, like "if" blocks. But then I found that you can't nest certain structures and expect them to work.

I've seen it in more powerful languages. You can build functions and modules, but sometimes it's very difficult to abstract certain behaviours. They tried to fill that gap with AOP and annotations, but I find them patchy and inelegant.

I won't say that one language is "superior" to all others, but "power" is not so vague that it can't be measured.


How about a more direct measure of "power", the ability to get things done? Take a group of developers and have them build something of value, then measure the value of that thing. How do you measure the value? Well, there are a couple easy ways to do that. Widespread adoption, especially people building other valuable stuff on top of what the team built. That's a strong endorsement. Also paying money for something represents an obvious signal of value, of course.

So you ask yourself, what sorts of languages are the things of obvious value built out of out there in the world? When you look you find a wide panoply of answers: C/C++, javascript, python, php, even bash scripts (even, ugh, batch scripts). You see a few examples of purportedly "advanced" languages like erlang or lisp (and relatives) in actual, shipping, functional, highly used software, but not that many. The arguments that there aren't enough brave pioneers out there learning these languages and trying to build stuff with them (there are tons) or that they just need critical mass of tools support to take off or something are fundamentally not serious. Devs work in all sorts of niche stacks constantly, and they're shipping stuff in all sorts of crazy languages all the time as well.

Ultimately, it turns out that the "power" of a language encompasses a lot more than expressiveness or "functionality". It may be really nice to have macros and higher-order functions and all that jazz but to some extent there's a law of diminishing returns. This is true in hand tools as well. Sometimes it's actually faster and easier to just put a workpiece in a vice and do some hand work on it with a file instead of trying to figure out all the right G-code to get the 6-axis CNC to do the work for you.


> How about a more direct measure of "power", the ability to get things done?

Honestly, I don't think that's a more direct measure - on the contrary - though it's a good measure of value. The question you're really asking is more like "is power really what we need?".

I would say yes, we need power, but not only. I actually agree with AnimalMuppet: some factor (like battery life) can render useless an otherwise impressive device.

My objection is that you can't call just any factor that makes you more productive "power".

It's not just nitpicking. If you redefine power to mean value in general, you lose a more precise definition and take real power out of the equation for creating value.

One of the little insights that I've found useful when thinking about all kinds of problems is the often overlooked importance of negative factors. People usually look closely at power, acceleration and readiness, but fail to consider factors that are absent, or that prevent a process from getting traction.

I believe that the problem with Lisp is not that power is unimportant, or that we should call other things power, but that power is not sufficient by itself, and there are other factors that prevented it from becoming the dominant language.


Yes, power can have objective meaning. You still can't order languages along an axis, but powerful features make a powerful language, and features that compose well are more than additively powerful.

Overwhelmingly, the most important criterion for the power of a feature is the degree to which it enables wrapping important semantics in a library, without creating overhead that makes you worry about whether you can afford it. It's better when the user doesn't need to understand the feature to use the library, although some enable better library interfaces; some of those, used in the interface, still don't demand the user know why. That's power.

Libraries are more powerful, and make their host language more powerful, when they (1) can themselves be composed, with one another, and with powerful language features, and (2) exist.

This is why the libraries you want to use always seem to be in some other language.


> The essay says that Lisp is at the top of the power curve. Lisp programmers, looking at Haskell, are sure that they are looking down.

The article never mentions Haskell, or any similar language. It compares Cobol and Python, and it compares Perl 4 and Perl 5. It never compares two languages where people on both sides think they're on the more powerful side.

> What's wrong is the idea that languages can be placed on a one-dimensional axis labeled "power". That idea is mistaken.

It's also not in the essay. It says: "Note to nerds: or possibly a lattice, narrowing toward the top; it's not the shape that matters here but the idea that there is at least a partial order."

> The essay says that Lisp is at the top of the power curve.

True, but he carefully prefixes it with "Where they fall relative to one another is a sensitive topic." Along with the lattice comment, he leaves plenty of room for other languages to also be "at the top".

> High performance computing (or, as the article says, games)? I might need something that gives me control of memory layout.

This is addressed in the essay, too: "When you're writing desktop software, there's a strong bias toward writing applications in the same language as the operating system."


> > The essay says that Lisp is at the top of the power curve. Lisp programmers, looking at Haskell, are sure that they are looking down.

> The article never mentions Haskell, or any similar language. It compares Cobol and Python, and it compares Perl 4 and Perl 5. It never compares two languages where people on both sides think they're on the more powerful side.

That's because the article doesn't think that can happen. I think it can, and I give an example.

> > What's wrong is the idea that languages can be placed on a one-dimensional axis labeled "power". That idea is mistaken.

> It's also not in the essay. It says: "Note to nerds: or possibly a lattice, narrowing toward the top; it's not the shape that matters here but the idea that there is at least a partial order."

I fail to see how being a lattice allows there to be two elements where each is greater than the other. My criticism doesn't depend on partial vs. strict ordering; it depends on only one axis vs. multiple axes. Note that with multiple axes, you can't do even a partial order without imposing some outside criterion.

> > High performance computing (or, as the article says, games)? I might need something that gives me control of memory layout.

> This is addressed in the essay, too: "When you're writing desktop software, there's a strong bias toward writing applications in the same language as the operating system."

While games can be desktop software, high performance computing usually isn't.


> That's because the article doesn't think that can happen.

I don't see what else he could mean by "lattice" or "partial ordering", except an admission that this can and does happen. Help me understand.

> I fail to see how being a lattice allows there to be two elements where each is greater than the other. My criticism doesn't depend on partial vs. strict ordering

Isn't that exactly what a lattice allows? The absolute ordering of Lisp and Haskell is undefined, while both are greater than some (like Cobol), and probably less than others. Users of either might think theirs is greater, but each has features the other doesn't, which is what makes the ordering merely partial.


If I understand correctly, a lattice or partial ordering means that A = B is possible for different A and B, but if A < B, then "B < A" is false.

But even if I'm wrong about a lattice, my claim is still true for programming languages. Perl > Haskell... for certain situations. Even assembly > Lisp... for certain situations. And yet assembly and Lisp are so far apart on the "power curve" that, if Lisp is not > assembly always, then... do we have any kind of ordering at all?

I view programming languages as being a tree, with branches spreading in all different directions. Some languages reach further in certain directions than other languages do. Your program is a vector. (Really, it's a bunch of different vectors, of different lengths, pointing in different directions, but one direction is the problem that makes it the hardest to write the program that you're trying to write.) Pick the language that reaches the farthest in the direction of whatever makes it hard to write the program that you're trying to write.

This means that your problem imposes an order on languages... for that problem. It's not a universal order, though. Others, with a different kind of problem, may view your order as completely wrong. And they are right - for their problem, your order is in fact wrong.


You are thinking of a total order (in fact, total orderings are often defined with the '>=' relation). A partial order is like a total order except that there may be some elements that are not comparable, so given elements {A, B, C} we can have C < A and C < B, but whether A < B or B < A can remain undefined. A lattice is a kind of partially ordered set.
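
To make "not comparable" concrete, set inclusion is the standard example of a partial order. A quick illustration (in Lisp, since we're in that thread):

    ;; Neither set is a subset of the other: they're simply incomparable.
    (subsetp '(1 2) '(1 3))      ; => NIL
    (subsetp '(1 3) '(1 2))      ; => NIL
    ;; ...yet both sit above '(1) and below '(1 2 3).
    (subsetp '(1) '(1 2))        ; => T
    (subsetp '(1 2) '(1 2 3))    ; => T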


> It never compares two languages where people on both sides think they're on the more powerful side.

Which is precisely why adding Haskell into the mix is a useful extension of the discussion.



I think we see more posts about Lisp community drama than we see posts about people actually doing things with Lisp.


Except for the Clojure branch of the family tree.

Eg: there was this vaguely clojurey statically typed non-GC language, being targeted for games, presented at a recent Clojure conf: https://github.com/carp-lang/Carp


Ayup. I don't really see many Clojure drama posts, here.


Honestly, I can't do anything but put blame for this on people like Paul Graham.

Some people come to Lisp and expect to discover a magic wand to abolish all blub languages that was prophesied by the legendary almighty Lisp Prophet, and they rush out to seek it.

Then, when they realize that Lisp is just another programming language - equivalent to all others by means of Turing completeness, with its particular features, strengths and weaknesses - they feel cheated and disenchanted, and go vent their frustrations by means of articles like that.

Which is directly why articles like Baggers' happen.

The question is: why do people actually consider Lisp to be the Holy Grail By Which All ALGOL-Like Languages Must/Shall/Will Die? And the answer that instantly comes to my mind is all the unfair PR that people like Graham give it. Calling Lisp a magical all-fixing wand is all fun and games until someone actually believes it (who wouldn't believe The Y Combinator Guy, after all?), and then discovers that reality doesn't really look that straightforward, and that writing Lisp doesn't suddenly mean you get to do less work than you'd otherwise do. As pointed out in Shinmera's response, https://reader.tymoon.eu/article/370 :

> I believe Lisp allows me to be quicker about developing these tools than other languages, but making an actual game would be even quicker if I didn't have to make most of these tools in the first place.


The lisp community has a long history of not being a very welcoming place, with a lot of people looking up to the nice attitude of people like Erik Naggum.

Things have gotten a lot better though.


Erik Naggum was not 'the Lisp community', though there were some people worshipping him on comp.lang.lisp - both from inside Lisp and outside. comp.lang.lisp was a popular forum, but not THE Lisp community. I always thought that his online behavior was kind of an ugly personality disorder.


I know. I should have put it more eloquently. But as you said, there were some people that idolised him. I found that for a long time, people trying to learn lisp were confronted with a really bad attitude from a loud minority.

It really didn't make people feel welcome. I am but a visitor in lisp land (I work in CL a couple of times a year), and things have become noticeably nicer since 2010.


YES! I created an account after a year of lurking just to affirm your post.


I think that's because there's just actually more new Lisp community drama than there is actual Lisp programming.


Lisp has been used in AAA games for scripting. https://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp


GOAL wasn't just scripting, it was the main language for the engine. In the one example I've ever seen of GOAL code, you can clearly see inline ASM for instance.

Interestingly Halo used Lisp for scripting, and that rarely comes up.


Naughty Dog abandoned GOAL during the PS3 era, but still uses PLT scheme for scripting.

https://news.ycombinator.com/item?id=9813111 https://www.gamasutra.com/view/feature/134566/the_secrets_of...


Yeah, they got bought by Sony to become a first party developer and share their code with other devs, necessitating the switch to C++.


> GOAL encourages an imperative programming style: programs tend to consist of a sequence of events to be executed rather than the functional programming style of functions to be evaluated recursively.

That alone is a pretty hard condemnation of LISP/functional for games.

It's not really LISP/Scheme at that point, but a LISP-ish syntax for an imperative language.

Some tasks are more innately imperative. Games (and game scripting) are in that category. Trying to force game programming to work in FP is like trying to build a computer in Minecraft: It's cool to see that it can be done, but it doesn't say anything about whether it should be done.

Proving Turing Completeness doesn't mean you would want to actually write code in a Turing Machine.


Lisp never was about functional programming (even if, in some way, it has invented it, or at least introduced the important building blocks). The Lisp family grew to be about symbolic computation, interactive development, and ability to modify the language to fit your needs (via macros) - which essentially eliminates whole classes of boilerplate, and allows you to directly express abstractions that cannot be specified cleanly in other languages.

Imperative is normal for most Lisps.


Just curious, but what do you mean by symbolic computation? Do you just mean evaluating mathematical expressions in a symbolic fashion (as opposed to numeric calculations), or do you mean something more general?


Yes, both. AFAIK symbols in Lisp were originally intended to support AI research, but part of that is being able to do symbolic manipulation and computation, be it analysis, algebra or logic.


Lisp isn't a functional language (in the sense that you appear to mean; it has functions as a first-class type, of course, but that's different). This code is imperative:

    (tagbody
      start
        (decf x)
        (format t "~a~%" x)
        (when (plusp x)
          (go start)))
Lisp works just fine in imperative contexts.


In what sense is Lisp not an imperative language?


That isn't really Lisp -- no cons, no garbage collection, no dynamic typing. It's closer to C than it is to Lisp, at least in terms of language semantics.


There are about a dozen or so "Lisp dialects" with similar properties.

Even in GCed Lisps, there were programs written which managed memory manually.


I haven't programmed in Lisp yet, but I love the idea: it seems very directly inspired by lambda calculus. However, I tend to think of it as the opposite pole of expressiveness from ASM. Lisp excels at abstraction, but is not good at bit-twiddling, or (from my limited perspective) interacting with anything that isn't Lisp.

That said, I know that my perspective is incomplete at best, so I would listen very attentively to anyone who might disagree with it.


I made a Lisp dialect called TXR Lisp. I gave it a nice, ergonomic FFI. I took the MSDN "Your First Windows Program" which is a C sample, and translated it expression for expression:

https://rosettacode.org/wiki/Window_creation#Win32.2FWin64

The C original is here:

https://docs.microsoft.com/en-us/windows/desktop/learnwin32/...

The Lisp program recreates all of the needed C structure definitions, and binds to the needed functions from user32.dll and other libraries.

Here is another TXR Lisp program which parses TCP/IP packets right out of the "pcap" style output from tcpdump:

https://unix.stackexchange.com/a/379759/16369

Lisp dialects tend to be close-to-the-metal languages whose workings can be explained using bits and bytes (even if such an explanation is not always accurate due to issues of optimization and whatnot).

When I'm debugging Lisp, I think of word-sized arguments on a stack, and pointers and all those same kinds of things like when debugging a C program.

Historic Lisp implementations were bootstrapped from assembly code. Lisp primitives were written as a library of machine language routines. With that, plus I/O and memory management, you just need an eval routine for interesting things to start happening. The car and cdr terms are inspired by machine instructions, and they help keep us grounded in reality.


Thank you for the response. The article at hand suggests:

> Performance (beyond the efficiency gained from appropriate algorithms) is going to be dictated from your ability to control what is being processed and when. Maybe I can’t afford a GC in the middle of my frame, maybe to get the entity count I need I really have to maintain data locality.

> ANSI CL doesn’t have many provisions to allow for this kind of control. Certain implementations do have some and you can do a lot with CFFI, but at what point do the restrictions you impose on yourself making another language a better candidate?

This to me suggests that while certain operations may be "close to the metal" as you say, this is not necessarily descriptive of the whole, or perhaps that "close to the metal" may mean a number of different things. Other languages seem to offer more control, perhaps at the expense of abstract expressive power.

Also, while I believe your description of history is accurate as far as it goes, I had rather thought that the Lisp Machines were invented to run Lisp instructions in hardware. I'm not sure if that might argue against your point, nor if those were actually significant in the development of Lisp.

Thank you once again for helping to rectify my ignorance.


If I use a garbage collector in an assembly language program, then I get pauses; the assembly language is still "close to the metal".

There are ways to avoid or minimize garbage collection pauses in Lisp, ranging from program design and coding techniques, to compiler optimizations that avoid the consing that leads to garbage collection, to garbage collectors that work incrementally so as to conceal the pauses by amortizing them over time.
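
As a rough sketch of the "program design and coding techniques" end of that range (SBCL-flavoured; the names and numbers are made up for illustration): a per-frame update that works over preallocated typed arrays and declares its types allocates nothing, so it never gives the collector a reason to run mid-frame.

    (defun integrate-positions (positions velocities dt)
      (declare (type (simple-array single-float (*)) positions velocities)
               (type single-float dt)
               (optimize (speed 3)))
      ;; No fresh conses or boxed floats: everything stays in the
      ;; preallocated arrays, so this loop produces no garbage.
      (loop for i of-type fixnum below (length positions)
            do (incf (aref positions i)
                     (* (aref velocities i) dt))))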

Lisp was implemented on IBM mainframes initially and then various other mainstream hardware. Research on Lisp-oriented hardware began in 1973 or so, and wasn't commercialized until 1979 or so; the "boom" in Lisp machines could be regarded as a 1980's phenomenon.

Lisp machines allow the cost of type checking to be significantly reduced. The hardware instructions themselves can perform a check such as whether the two operands to an add instruction are integers. A Lisp machine can have idealized type representations as well, so that Lisp programs aren't disadvantaged on it. E.g. it's hard to stick a type tag into floating-point numbers on conventional hardware (doing so involves a loss of mantissa bits), but we could design Lisp hardware which has type bits as part of the floating-point representation.

Run-time type checking overhead can also be reduced in compiled Lisp programs on conventional machines thanks to type information which is either inferred to some extent, or comes from declarations, or a combination of both.


Just to be entirely clear, are you disagreeing with the author about the suitability of Lisp for video game development?


The author is noted for using Lisp for game development.


The author says:

> I am not going to make any bigger claims on the community or industry as a whole so this is just where I’m at right now. I’ll also touch a little on why I’m not using lisp for my current games.


Also says:

> CL is a great language that still has a lot to give. I’m gonna be using it for a long time including for games. It lives in the place of being fantastic and performant for scripting and also having the control to be truly viable for higher performance code. The implementation specific extensions to the language open up a ton of possibilities for squeezing more out of the machine and the abilities that macros afford us means there is a lot of meaningful stuff we can do trivially that require complex and unwieldy systems in other languages.

And:

> I’d heartily recommend making a non-trivial game in lisp.


Kay. But they did have some criticisms, and we may take it that you disagree with those?


Common Lisp is fine for bit-twiddling; you can see it for yourself if you read this chapter in Practical Common Lisp: http://www.gigamonkeys.com/book/practical-parsing-binary-fil...
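
For a flavour of it without reading the whole chapter, CL's LDB/BYTE operators make bit-field extraction quite direct (a small illustration of my own, not taken from the book):

    ;; Unpack an RGB565 pixel using byte specifiers.
    (defun rgb565->components (pixel)
      (values (ldb (byte 5 11) pixel)    ; red:   bits 11-15
              (ldb (byte 6 5)  pixel)    ; green: bits 5-10
              (ldb (byte 5 0)  pixel)))  ; blue:  bits 0-4

    (rgb565->components #b1111100000011111)   ; => 31, 0, 31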

The FFI of Common Lisp is on par with more mainstream languages like Python.


Does anyone have any background on the author of this article?

Sounds like they make niche games; I'd love to try some out. Any recommendations?


Documentation for his library CEPL has his full name and email. http://quickdocs.org/cepl/

I'm not up to date on actual games from him.

He's pretty active writing public source code https://github.com/cbaggers

He sometimes publishes cool video tutorials of live-coding 3D graphics. https://www.youtube.com/playlist?list=PL2VAYZE_4wRKKr5pJzfYD...


I wrote a 2d shooter in Scheme. It was decent. Ran fast as the dickens on Windows.


This was, hands-down, the single best article I have ever read about Lisp, after three decades of reading all kinds of stuff. It so far outclasses anything Paul Graham ever wrote on the subject that he (Paul) would best never mention Lisp ever again.

If only it were not ("just") a reply to a shitpost, I would print it out (yes, some of us old-timers still do that) and hang it on the wall, so I can point to it when somebody mentions Lisp and say "read that first -- if you disagree, then take it up with him, and don't mention Lisp to me ever again".

My Hat Is Off To You, sir.


[flagged]


Please don't post unsubstantive comments here. We particularly don't need programming language flamewars.


What? I don't think the majority of people interested in lisp are going into work looking down their noses at people. They're probably mostly people who like to try different tools and learn new things. I don't use lisp but I loved reading through PG's ANSI Common Lisp book because it really challenged me to think about programming and problem solving in a way I never had before. I think I write better code because I did learn lisp.

https://www.amazon.com/exec/obidos/ASIN/0133708756


Go and read http://wiki.c2.com/?BlubParadox . I've heard many lisp people reference this; it basically says lisp is so great that it is the best language that could possibly exist, and that people who don't think so only think that because the languages they like are too stupid to let them appreciate lisp.


Yup. I love Lisp - very glad I learned it, but I find PG's Blub Paradox essay enormously offensive.


These sorts of "lispers are hipsters", "haskellers are hipsters" comments are very destructive to our community, yet I see them in every single lisp thread. Very unfortunate. I only wish it would stop one day.


In the meantime while they argue, the neanderthals in the games industry are building their products in C++ and making millions of dollars. Heathens!


Does this comment imply the only (reliable) way to make millions of dollars is using C++? My company uses (mostly) python and we're making millions of dollars too. It's almost as if for every different task there might be a better tool for solving the problem.


No it doesn't, I think you may be reading way too much into my comment.


I have a very, very good Haskell-programming friend, and quite a few times I've caught him start speaking in technobabble to impress a couple of blubers.

I also have a few Scala friends, and I can see the obvious Haskell envy they have, they feel the need to become good at it to prove their alphaness.


So what? My whole point is that discussing these sorts of social "issues" in the lisp/haskell community adds nothing to the discussion at hand. It has no implications for lisp or haskell; we learn nothing from these sorts of comments. They're just visual noise, almost put there as bait to trigger an ad hominem comment chain.



