
> When you speak of non lisps are ‘blub’ you disregard the actual characteristic of those languages and how they fit to the task.

What does 'blub' mean?




It refers to Paul Graham's article about Lisp: http://www.paulgraham.com/avg.html. "Blub" basically means an average mainstream language.

> I don't want to hurt anyone's feelings, so to explain this point I'm going to use a hypothetical language called Blub.

> As long as our hypothetical Blub programmer is looking down the power continuum, he knows he's looking down.

> But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up. What he sees are merely weird languages.


That whole "Blub" section reads to me as a snowglobe version of the Lisp community's long-standing failure to communicate with everyone else.

Graham starts with his "shockingly controversial statement" that "programming languages vary in power", which, I don't know that that is controversial, but OK. I'm with him that far. But he immediately loses me, because all of his exposition on this idea consists of beating around the bush, talking about characteristics of programming languages that are most definitely not the same thing as power, before moving on to this rather arrogant-sounding parable that makes it sound like he thinks of this "power" idea in remarkably similar terms to how colonial-era Europeans thought about "civilization" when it came to understanding why not everyone did things the way they did.

I would argue, on the contrary, that, when we're talking about programming languages, "power" is a characteristic that no reasonably well-traveled programmer should feel comfortable thinking of as an objective characteristic, let alone something that can be crammed into a linear continuum. Lisp has some powerful features, yes. So does C. And the differences that a Lisp fan might say make Lisp more powerful than C correspond remarkably closely to the things that a C fan might say make C more powerful than Lisp.

They're both right, too. Because any flat statement that "X is a powerful language" is incomplete. There's an unstated major premise, there - what problem are you trying to solve? That will be a big factor in determining what powers you need and don't need.

Which is why, around the time that Graham was writing this essay, I was implementing my first really big solo programming project in a mix of Scheme and C.

tl;dr: Yeah, what TFA's author said.


I always just assume they mean expressive power when “power” is mentioned.


Which is a very dubious "power" in a language not designed for the academic exploration of programming itself.

General principle: the more "clever" a thing is, in terms of making intricate use of the language's expressive power, the more incompatible, confusing, buggy and slow it is.

Another general principle: if your language gives you wheelwright tools but no wheels, nobody's wheel will fit anybody else's axle until conventions are established. And those conventions will make the open freedom of the tools obsolete in practice. And they will be a hidden burden in learning to use the language.


Expressiveness can mean several subtly different things. The one I'm most interested in is bringing the way things are expressed in the programming language as close as possible to the way you'd describe the process in a natural language. Any definition of "expressive" that permits code golf is not one I want to use. For "expressive" to be a useful concept, its opposite needs to be "unreadable."


Just to pile onto that a little bit more - the Lisp family of languages is not unambiguously more expressive than other languages. I think that, when Lispers claim that Lisp is really expressive, what they really mean is that it has macros, and the supporting features that make macros work so well in Lisp.

The problem is, most dialects also have some horribly unexpressive things, too. Dynamic scope, cadadr, fifty bajillion words for "equals", stuff like that. My least favorite is that, in most Lisps, code and data look exactly the same. No M-expressions means that you can't reliably understand the basic structure of a blob of code by simply skimming. You've always got to be carefully reading it. Which gets exhausting if you're working in an unfamiliar codebase. Perhaps that's why Lispers have less of a tendency to travel in packs.

There's really just one Lisp dialect that gets to unironically wear an "I'm super-duper expressive!" t-shirt, in my book, and it's a young one: Clojure.


> the Lisp family of languages is not unambiguously more expressive than other languages

Common Lisp is a local maximum in language programmability (code as data, macros, programmable reader, EVAL, COMPILE, Meta-object Protocol, etc.).
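To give a rough feel for the code-as-data / EVAL / COMPILE point outside Lisp, here's an illustrative Python sketch (my own analogy, not anything from the thread): Python's standard `ast` module stands in for Lisp's reader, and `compile`/`eval` stand in for COMPILE/EVAL.

```python
import ast

# Build the expression (+ a 10) as a data structure rather than as text,
# roughly analogous to constructing the list (+ a 10) in Lisp.
expr = ast.Expression(
    body=ast.BinOp(
        left=ast.Name(id="a", ctx=ast.Load()),
        op=ast.Add(),
        right=ast.Constant(value=10),
    )
)
ast.fix_missing_locations(expr)

# compile/eval play the role of Lisp's COMPILE and EVAL.
code = compile(expr, filename="<ast>", mode="eval")
print(eval(code, {"a": 5}))  # prints 15
```

In Common Lisp the equivalent expression is just an ordinary list, e.g. `(eval (list '+ 5 10))`; the sketch above shows how much heavier the same manipulation is when code isn't plain data.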

> The problem is, most dialects also have some horribly unexpressive things, too. Dynamic scope, cadadr, fifty bajillion words for "equals", stuff like that.

The opposite is true. Common Lisp is its own code-generation target - as such it has low-level (blocks, go to, basic data structures, ...), mid-level (structures, streams, ...), and high-level (CLOS + MOP, macros, ...) programming constructs. Something like the additional dynamic scope expands the expressiveness of the language.

> No M-expressions means that you can't reliably understand the basic structure of a blob of code by simply skimming.

That's unrelated. You can easily read s-expression code as a human. It may only be unfamiliar to you - just like riding a bicycle is unfamiliar when you have never done it. Once you learn it, it's no issue.

   (defun foo (a)
     (+ a 10))
is no less readable than

   defun foo (a)
     a + 10
It's only a matter of a bit of training. Humans are actually very good at adapting to different sign systems (think about the differences between English, Arabic and Japanese) - speakers of each language group think that the other languages are difficult and their own is natural.

There is a bit of fear of Lisp code - this is partly justified, because of the extensible syntax - but that's unrelated to s-expressions: one can have extensible syntax in M-expression-based variants, and this creates the same problem of unlimited syntactic structures.

M-expressions are not easier to understand reliably once you have macros in them - which current M-expression-like syntaxes actually support. See for example the RLISP of REDUCE.


Other posters have given serious answers. I would add my own: it's an excuse for a language superiority complex. It assumes that everyone using "less powerful" languages does so because they just don't understand "more powerful" languages, while excluding the idea that they might just find that language X is the right fit for their domain. (e.g.: asm is probably more "powerful" than python, but I'd rather do string manipulation in the latter)


Eh, it seems you got it backwards. PG's main point seems to me to be productivity - he even talks about Python being one of the best (lispier) mainstream languages - so in that context Python would be more powerful than asm, not the other way around.


Substitute "productivity" there for power and I think the point still stands. PG sees a hierarchy of languages based on some inherent goodness of the language - and surprise, surprise, his favourite language, Lisp, is on top. I think that's BS. How good a language is depends more on the problem than some inherent quality of the language. I think it's pretty hard to get past the fact that a term like "Blub" clearly indicates that PG feels that he's a superior, higher class programmer than those who use "Blub" languages.


You say it depends on the problem you're solving, but if you turn around and look at the problems people are solving, you could easily see them falling largely into a few common classes.

Like, we all will admit that no natural language is inherently better than any other, but yet we still recommend that programmers learn English -- the quality of the language may not be inherent to the language, but it could be inherent to the context in which the language exists.


It's not what PG precisely said that matters, it's the intent and notion behind it, which is misguided (and which different people could apply to asm or Idris or whatever - the notion of a hierarchy of languages that is concrete and task-independent).


as in blub ~= verbose and boilerplatey


I don't think it can be reduced to that; a more important aspect is expressiveness, which isn't exactly terseness. Some historical memory is pertinent here: even classic "blub" languages such as Java have adopted plenty of functional characteristics since then, so the options available at the time were very different.


The problem with 'expressiveness' is that beyond a certain point, the cognitive load imposed by reading the code exceeds any benefit from the increased density and abstraction. You basically have to decompress the code in your head in order to comprehend it in any useful way.


Only if it's written so badly that it leaks, or you need to modify that particular abstraction. In all other cases, you should be able to read the code easily and understand what it expresses without the need for exploring underlying layers of abstraction.

I agree it's hard to write good abstractions, but the problem isn't really in Lisp per se - it's just that being able to express any abstraction you want cleanly makes you realize that coming up with correct abstractions is very hard mental work. You may refrain from doing it, but Lisp at least gives you the option.


True, but I also think that blub is not really about smugness or being condescending, hence my comment above: expressiveness / hackability / self-tooling.


> asm is probably more "powerful" than python, but I'd rather do string manipulation in the latter

I don't think that's quite what PG meant by power. That being said, I do not think I would be able to close the distance between our understandings, sadly.


>That being said, I do not think I would be able to close the distance between our understandings, sadly.

Or you could just say what you think PG meant, e.g. "by more powerful he means a language more easily manipulated and expressive".

How about that, instead of the current comment ending, which can be understood as "I've given up hope that you'll ever understand what I mean".


It comes from a mistaken argument in an essay by Paul Graham. Others have explained the idea a bit. I'm going to explain why I think it is mistaken.

The essay says that Lisp is at the top of the power curve. Lisp programmers, looking at Haskell, are sure that they are looking down. They say, "How can you get anything done in Haskell? It doesn't even have (Lisp-style) macros."

But the Haskell types (pun intended) are also sure that they're at the top of the power curve. When they look at Lisp, they're also sure that they are looking down. "How can you get anything done in Lisp? It doesn't even have a decent (Hindley-Milner) type system."

Here you have proponents of two different languages, both sure that they're at the top of the power curve, and both sure that the other language is beneath their language. Something's wrong here...

What's wrong is the idea that languages can be placed on a one-dimensional axis labeled "power". That idea is mistaken.

To see why, think about hardware. We know what we mean by power in hardware - MIPS. But then someone says, "Well, we've got that floating point code, so we have to care about FLOPS. And then there's that data set that won't fit in cache, so we have to care about main memory bandwidth... except that sometimes you can trade off MIPS for larger cache size...." Now you've got at least four axes - MIPS, FLOPS, memory bandwidth, and cache size.

Then some psychopath comes into the room and says, "What I mean by 'power' in hardware is battery life." That is, their definition of power is something like 1/watts, which, for battery-operated hardware, is not as clearly unreasonable as the physical units might lead you to think.

Back to software. "Power"? Power for what? For writing programs. OK, which programs? For "general programming"? But I've never written a general program in my life. I've written a bunch of specific ones, though. So what I actually care about is power for writing this program. For this program, I have to ask what makes it hard to write the program, and pick the language that goes the furthest toward solving those problems for me.

High performance computing (or, as the article says, games)? I might need something that gives me control of memory layout. I don't want to try to deal with that in either Lisp or Haskell. Real-time computing? Let's skip garbage collection, thanks. (Yes, I know, it can be done. It sure doesn't make the problem of deterministic response time easier, though.) Let's also skip recursion and laziness.

On the other hand: Anytime we can, let's skip manual memory management.

Pick the language based on the characteristics of the problem. Don't blindly pick any language because it's "the most powerful".


I learned a definition of power that seems pretty unambiguous, at least to me: how easy it is to build big things by combining small things.

I've seen it very clearly in languages with a severe lack of power. In VHDL there are familiar structures, like "if" blocks. But then I found that you can't nest certain structures and expect them to work.

I've seen it in more powerful languages, too. You can build functions and modules, but sometimes it's very difficult to abstract certain behaviours. They tried to fill that gap with AOP and annotations, but I find them patchy and inelegant.

I won't say that one language is "superior" to all others, but "power" is not so vague that it can't be measured.


How about a more direct measure of "power", the ability to get things done? Take a group of developers and have them build something of value, then measure the value of that thing. How do you measure the value? Well, there are a couple easy ways to do that. Widespread adoption, especially people building other valuable stuff on top of what the team built. That's a strong endorsement. Also paying money for something represents an obvious signal of value, of course.

So you ask yourself, what sorts of languages are the things of obvious value built out of, out there in the world? When you look you find a wide panoply of answers: C/C++, javascript, python, php, even bash scripts (even, ugh, batch scripts). You see a few examples of purportedly "advanced" languages like erlang or lisp (and relatives) in actual, shipping, functional, highly used software, but not that many. The arguments that there aren't enough brave pioneers out there learning these languages and trying to build stuff with them (there are tons), or that they just need a critical mass of tool support to take off, are fundamentally not serious. Devs work in all sorts of niche stacks constantly, and they're shipping stuff in all sorts of crazy languages all the time as well.

Ultimately, it turns out that the "power" of a language encompasses a lot more than expressiveness or "functionality". It may be really nice to have macros and higher-order functions and all that jazz but to some extent there's a law of diminishing returns. This is true in hand tools as well. Sometimes it's actually faster and easier to just put a workpiece in a vice and do some hand work on it with a file instead of trying to figure out all the right G-code to get the 6-axis CNC to do the work for you.


> How about a more direct measure of "power", the ability to get things done?

Honestly, I don't think that's a more direct measure - on the contrary - though it's a good measure of value. The question you're really asking is more like "is power really what we need?".

I would say yes, we need power, but not only power. I actually agree with AnimalMuppet: some factor (like battery life) can render an otherwise impressive device useless.

My objection is that you can't call just any factor that makes you more productive "power".

It's not just nitpicking. If you redefine power to mean value in general, you lose a more precise definition and take real power out of the equation for creating value.

One of the little insights that I've found useful when thinking about all kinds of problems is the often overlooked importance of negative factors. People usually look closely at power, acceleration and readiness, but fail to consider factors that are absent or that prevent a process from gaining traction.

I believe that the problem with Lisp is not that power is unimportant, or that we should call other things power, but that power is not sufficient by itself, and there are other factors that prevented it from becoming the dominant language.


Yes, power can have objective meaning. You still can't order languages along an axis, but powerful features make a powerful language, and features that compose well are more than additively powerful.

Overwhelmingly, the most important criterion for the power of a feature is the degree to which it enables wrapping important semantics in a library, without creating overhead that makes you worry about whether you can afford it. It's better when the user doesn't need to understand the feature to use the library, although some enable better library interfaces; some of those, used in the interface, still don't demand the user know why. That's power.

Libraries are more powerful, and make their host language more powerful, when they (1) can themselves be composed, with one another, and with powerful language features, and (2) exist.

This is why the libraries you want to use always seem to be in some other language.


> The essay says that Lisp is at the top of the power curve. Lisp programmers, looking at Haskell, are sure that they are looking down.

The article never mentions Haskell, or any similar language. It compares Cobol and Python, and it compares Perl 4 and Perl 5. It never compares two languages where people on both sides think they're on the more powerful side.

> What's wrong is the idea that languages can be placed on a one-dimensional axis labeled "power". That idea is mistaken.

It's also not in the essay. It says: "Note to nerds: or possibly a lattice, narrowing toward the top; it's not the shape that matters here but the idea that there is at least a partial order."

> The essay says that Lisp is at the top of the power curve.

True, but he carefully prefixes it with "Where they fall relative to one another is a sensitive topic." Along with the lattice comment, he leaves plenty of room for other languages to also be "at the top".

> High performance computing (or, as the article says, games)? I might need something that gives me control of memory layout.

This is addressed in the essay, too: "When you're writing desktop software, there's a strong bias toward writing applications in the same language as the operating system."


> > The essay says that Lisp is at the top of the power curve. Lisp programmers, looking at Haskell, are sure that they are looking down.

> The article never mentions Haskell, or any similar language. It compares Cobol and Python, and it compares Perl 4 and Perl 5. It never compares two languages where people on both sides think they're on the more powerful side.

That's because the article doesn't think that can happen. I think it can, and I give an example.

> > What's wrong is the idea that languages can be placed on a one-dimensional axis labeled "power". That idea is mistaken.

> It's also not in the essay. It says: "Note to nerds: or possibly a lattice, narrowing toward the top; it's not the shape that matters here but the idea that there is at least a partial order."

I fail to see how being a lattice allows there to be two elements where each is greater than the other. My criticism doesn't depend on partial vs. strict ordering; it depends on only one axis vs. multiple axes. Note that with multiple axes, you can't do even a partial order without imposing some outside criterion.

> > High performance computing (or, as the article says, games)? I might need something that gives me control of memory layout.

> This is addressed in the essay, too: "When you're writing desktop software, there's a strong bias toward writing applications in the same language as the operating system."

While games can be desktop software, high performance computing usually isn't.


> That's because the article doesn't think that can happen.

I don't see what else he could mean by "lattice" or "partial ordering", except an admission that this can and does happen. Help me understand.

> I fail to see how being a lattice allows there to be two elements where each is greater than the other. My criticism doesn't depend on partial vs. strict ordering

Isn't that exactly what a lattice allows? The absolute ordering of Lisp and Haskell is undefined, while both are greater than some (like Cobol), and probably less than others. Users of either might think theirs is greater, but each has features the other doesn't, which is what makes the ordering merely partial.


If I understand correctly, a lattice or partial ordering means that A = B is possible for different A and B, but if A < B, then "B < A" is false.

But even if I'm wrong about a lattice, my claim is still true for programming languages. Perl > Haskell... for certain situations. Even assembly > Lisp... for certain situations. And yet assembly and Lisp are so far apart on the "power curve" that, if Lisp is not > assembly always, then... do we have any kind of ordering at all?

I view programming languages as being a tree, with branches spreading in all different directions. Some languages reach further in certain directions than other languages do. Your program is a vector. (Really, it's a bunch of different vectors, of different lengths, pointing in different directions, but one direction is the problem that makes it the hardest to write the program that you're trying to write.) Pick the language that reaches the farthest in the direction of whatever makes it hard to write the program that you're trying to write.

This means that your problem imposes an order on languages... for that problem. It's not a universal order, though. Others, with a different kind of problem, may view your order as completely wrong. And they are right - for their problem, your order is in fact wrong.


You are thinking of a total order (in fact, total orderings are often defined with the '>=' relation). A partial order is like a total order except that some elements may be incomparable: given elements {A, B, C} we can have C < A and C < B, while whether A < B or B < A remains undefined. A lattice is a kind of partially ordered set.
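To make the incomparability concrete, here's a minimal Python sketch (the feature sets are invented purely for illustration): ordering languages by proper-subset inclusion of their feature sets yields a genuine partial order in which some pairs are simply incomparable.

```python
# Hypothetical feature sets, invented for illustration only.
features = {
    "Cobol":   set(),
    "Lisp":    {"gc", "closures", "macros"},
    "Haskell": {"gc", "closures", "hm-types"},
}

def less_powerful(a, b):
    """Strict partial order: a < b iff a's features are a proper subset of b's."""
    return features[a] < features[b]  # set.__lt__ tests proper subset

print(less_powerful("Cobol", "Lisp"))     # True: Cobol is below Lisp
print(less_powerful("Cobol", "Haskell"))  # True: Cobol is below Haskell
print(less_powerful("Lisp", "Haskell"))   # False: incomparable...
print(less_powerful("Haskell", "Lisp"))   # ...in both directions
```

Both "Lisp" and "Haskell" sit above "Cobol", yet neither is above the other - each has a feature the other lacks, which is exactly what makes the ordering merely partial.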


> It never compares two languages where people on both sides think they're on the more powerful side.

Which is precisely why adding Haskell into the mix is a useful extension of the discussion.





