A man, a Plan. No canals (2011) (davidkendal.net)
81 points by randallsquared on Jan 9, 2013 | 92 comments



This article is starting from completely the wrong point. The default state of a programming language is somewhere between "dead" and "niche". You wouldn't expect a language to take off unless it's got something special going for it. And Arc had very little. Basically lots of hype built around a cult of personality.

And in the other column, Arc had significant things going against it. It was a toy implementation by somebody who really had more important things to do with his time. It didn't bring anything substantial to the table -- to a first approximation it felt like a basic Scheme with an idiosyncratic function naming scheme. It didn't have any killer features. It didn't have a significant standard library. Compare that starting point to e.g. Clojure or Go. It's like night and day.

The question isn't why Arc didn't take off. It's why anyone would expect it to take off in the first place.


You're being a bit uncharitable. Even though I haven't spent a huge amount of time on Arc, there's more to it than just hype. I know Common Lisp and Scheme really well, and I would much rather program in Arc than either of those. Now that I'm used to Arc, I find writing code in CL or Scheme painfully clumsy.

And though language design (as opposed to implementation) may not seem "substantial," it does matter. Scheme itself was initially an exercise in language design -- a cleaning up of MacLisp. It didn't only become "substantial" when people started writing complicated compilers for it.


I don't think the convenience of Arc is a very widely held view among Lisp users. But that's also setting the bar way too low. If Scheme or CL didn't exist, and somebody tried to launch them as new languages today with a first-generation implementation, they wouldn't stand a snowball's chance in hell. Most probably CL or Scheme as new languages wouldn't get any traction even with a mature implementation + good library coverage right at the start. It's much easier for a language community to be self-perpetuating at a certain size than to actually grow to that size.

I agree that Scheme didn't only become substantial once the implementations became mature. But that's because it had a few fresh big ideas (lexical scoping/closures, CPS), not because it was new. Likewise it didn't become popular (as much as it ever became popular) just due to existing, but largely thanks to being used as the teaching language for a legendary intro course at a prestigious university.

As for the "hype" comment, that wasn't intended as a negative attribute. It's very hard for a language to succeed purely on its own merits. Tens of man years of effort had gone into Go before it was launched. It might still have failed had there not been such powerful marketing hooks in place (created by Bell Labs legends, having the appearance of being backed by Google). Arc had a good marketing hook as well. That would have made bootstrapping an initial community much easier than e.g. for Clojure.

And just to be clear this isn't intended as a personal criticism! There's just one YC, but hundreds of new languages and implementations appear every year, slowly advancing the state of the art.


Of course it isn't widely held, since hardly any have tried it. The question is whether it's true, and that is something I think I'm qualified to judge.


The discussion of PG's decision to use closures for web development seems off topic. That doesn't directly affect the popularity of the language; it's quite possible to do things in Arc without using closures for state on the web.

I played around with Arc quite a bit and the biggest problem was that it did not have a standard library with enough stuff in it. In contrast (and talking about a totally different type of language) one of the reasons Go has been successful is the excellent set of standard packages.

If you start your language with sufficient standard library kindling then others can build on that and write more and more libraries. But start with insufficient libraries and it's hard to take off.

So the initial releases of Arc were constrained by things that PG had already written. If you wanted to do the same things he had done, then it was just fine.

As a concrete example, on my old UseTheSource site (which was written in Arc) I wanted to grab a URL from a third-party site inside the code. There wasn't an easy way to do that, so I wrote an external script in Perl and did the following in Arc:

  ; If the user's profile already records an HN check, return t.
  ; Otherwise shell out to the Perl script; on success, cache the
  ; result in the profile.
  (def hn-valid (user)
    (if (uvar user hn)
        t
        (let v (system (+ "perl check-hn.pl " user))
          (if v (= ((profile user) 'hn) t))
          v)))
The Perl script grabbed a page on Hacker News, did some light parsing and returned either 0 or 1. That really put me off making any extra effort to use Arc (and I like Lispy languages) because I was either going to shell out to something else, or have to write everything from the ground up.

A similar thing happened with UseTheSource's Twitter integration where I could use Perl's Net::Twitter::Lite to trivially integrate.


I bypassed the library problem by calling Python from Arc and returning the results as s-exps. It took 40 lines of Python code.
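The gist, as a minimal sketch (a reconstruction of the idea, not the actual 40 lines): let Python do the work with its own libraries, and print the result as an s-expression for the Lisp side to read back in.

  import sys
  import urllib.request

  def to_sexp(value):
      # Render a Python value as a simple s-expression string.
      if isinstance(value, (list, tuple)):
          return "(" + " ".join(to_sexp(v) for v in value) + ")"
      return '"%s"' % str(value).replace('"', '\\"')

  if __name__ == "__main__":
      # e.g. python bridge.py http://example.com
      url = sys.argv[1]
      with urllib.request.urlopen(url) as resp:
          print(to_sexp([url, resp.status, resp.headers.get("Content-Type")]))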


If a language's popularity was closely dependent on library support, then (in your opinion) why hasn't Clojure gained mass adoption?


My comment boils down to the logical statement

  NOT has large initial library => NOT gain widespread use
You seem to be making a logical error based on a common problem with people's understanding of modus tollens: http://blog.jgc.org/2009/05/frequently-misunderstood-logic-m...

It is NOT the case that from my statement that it's possible to derive:

  has large initial library => gain widespread use


It's a lot simpler to say "large initial library is necessary but not sufficient for widespread use".


Or "all successful languages have large initial libraries, but not all languages with large initial libraries will be successful".


Clojure IMO is gaining great mindshare as a practical Lisp, especially given how recent it is. I don't think anybody expects it to become as mainstream as Java. But it does attract programmers unfamiliar with FP whose appetite for languages higher up the power continuum has been whetted by languages like Ruby.

My colleague Steve presented a talk at RubyConf India titled 'Why Clojure is my favourite Ruby' which might appeal to such people. http://www.youtube.com/watch?v=PCdEbUBk6a0


It's easily the most successful lisp in over 20 years. How popular do you want it to be?


Relative to Arc, it certainly has.


I would say "can be closely dependent". And for a LISP, it is doing pretty darn well.


People who would like to use Clojure or Scala aren't fans of Java. And if you aren't a fan of Java, chances are good that you aren't a fan of the JVM either.

I'm already using something else. Why should I use the Java eco-system?


I don't see why this should be. It sounds a bit irrational. From my perspective, Java is an iffy language and Java programs are often nightmarish to read. But the JVM? I see no reason why my feelings about Java and its culture should color my opinion of the JVM. The JVM is a technical marvel. It runs Ruby faster than Ruby's own VM, with far less effort. It supports a whole host of very different languages, many of them quite good. And it's just blazing fast (modulo startup speed).

There are other valid choices, of course, and the JVM is obviously not the optimal choice for every project, but I think it's fallacious to rag on the JVM just because its most popular language isn't to your taste.

In short: Java bad. JVM very good.


> People who would like to use Clojure or Scala aren't fans of Java.

I object to the word "fan", but I understand your meaning. Regardless, I love Clojure and like Java just fine.


I gained most of my expertise about the JVM running Ruby apps on it with JRuby (and after that, Clojure). I started out something of a Java-hater and eventually came around to kind of sort of liking Java (although I'd certainly rather write Ruby in the overwhelming majority of cases).


What a complicated answer to a question with a simple one. I spend all my time on YC now. The last release of Arc was years ago. So this case doesn't prove anything one way or the other about programming languages.


At the very least his point about continuation-based webservers is a good one.

I recently realized why the arc webserver needed to track IPs to ignore almost from day 1: every time somebody requests a page, memory usage goes up. This includes crawlers, drive-by users, everybody. The standard way arc encourages webapps to be programmed (http://paulgraham.com/arcchallenge.html) is by leaking memory. Every load of the front page creates 30 closures just for the 'flag' links. Ignoring IPs is just a band-aid to cover up this huge sink[1].
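To make the leak concrete, here's a toy sketch in Python (my illustration, not Arc's actual code) of the closure-per-link pattern: every render registers fresh closures under random ids so that link URLs can call back into them, and nothing ever evicts the entries.

  import uuid

  # fnid -> closure table. Nothing evicts entries, so every page render
  # (users, crawlers, drive-bys alike) makes the process a little bigger.
  handlers = {}

  def register(closure):
      fnid = uuid.uuid4().hex
      handlers[fnid] = closure
      return fnid

  def flag(story, user):
      story.setdefault("flaggers", []).append(user)

  def render_front_page(stories, user):
      # One fresh closure per 'flag' link, each capturing story and user.
      return ["/x?fnid=" + register(lambda s=story, u=user: flag(s, u))
              for story in stories]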

I've been programming in arc for 3 years and am still active on the arc forum. I love that arc is a small and simple language without lots of bells and whistles[2]. I really couldn't care less that it's 'not taken off'. But there's a difference between toys that encourage exploratory play and painting yourself into a corner design-wise. I now think of continuation-based webservers as an evolutionary dead end.

[1] Another ill-effect of the ip-ignoring is that every newcomer to arc who tries to run it behind a proxy server immediately runs into a gotcha: The arc webserver now thinks all requests are coming from the same IP address and so throttles everybody. http://www.arclanguage.org/item?id=11199

[2] If you spend any time with it, it literally begs to be tinkered with. And the experience of programming in a language while referring to its compiler in another window is something everybody should experience.


I'm not sure what you mean by continuation-based. The Arc server can use closures to store state, but it doesn't use continuations.

Using closures to store state is like using lists as data structures: it's a rapid prototyping technique. You're in no way painting yourself into a corner, and the current news software is proof of that. You have a very old version of it. In the years since we released the version you have, we've gradually squeezed most of the closures out of the more common things, and we do have a proxy in front of the Arc server. That's how we manage to serve 1.7 million pages a day off one core.


I see. Yeah, I stand corrected.

I was conflating continuation-based and closure-based webservers because both allow straight-line code instead of explicitly managing url handlers.

And the traditional rhetoric in favor of this technique has come from Seaside, which popularized the name 'continuation-based' (http://www.linux-mag.com/id/7424; http://www.bluishcoder.co.nz/2006/05/server-side-javascript....)

I'd never seen anybody say this is just an early-prototyping technique. But searching found Patrick Collison concur: http://www.quora.com/Whats-the-best-continuation-based-web-f...

It was too harsh to say it paints us into a corner; it is possible to replace fnid links with new defops. I went back and looked at the frontpage when not logged-in, and saw that there are 0 fnid links in that common case. (One possible way to gradually use a second server would be to just serve the frontpage off it. You'd need to move to a centralized store for user/profile/story objects, but the fnids table could continue to live on one server.)

It also turns out that there's at least one company trying to scale continuation-based servers (http://bc.tech.coop/blog/040404.html). So it was overly harsh to call it a dead end.


PG: It could be instructive to be able to compare your new version with what is currently available.


His point on continuations is invalid.

"The problem: because closures can’t be stored in databases, you really have to use a hash table on your web daemon."

This is not true. Closures can be stored anywhere you want.
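Whether that works out of the box depends on the runtime, though. As a sketch in Python: the stdlib pickle module refuses to serialize closures, but the third-party dill library handles them, captured state included, at which point a database is just another place to put the bytes.

  import dill  # third-party; stdlib pickle can't serialize closures

  def make_counter(start):
      count = [start]
      def step():
          count[0] += 1
          return count[0]
      return step

  c = make_counter(10)
  c()                    # 11

  blob = dill.dumps(c)   # bytes you could write to a database
  c2 = dill.loads(blob)  # closure restored, captured state included
  print(c2())            # 12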


Yea, I think Gödel had something to say about this.


Would it be fair to say that if Arc had had more success when you first released it, you would have been less willing to let the effort die? If so, then the full analysis could still have merit.


In retrospect YC would have taken over my life no matter what. It has pushed out essays too, mostly.

I don't consider Arc to have died, incidentally. You used it to say that, and I'm using it to reply. If I ever retired from YC I'd probably start working on it more actively again.


I didn't intend to say Arc died; rather, I thought I was stating that your effort died, but I can certainly see how it may come across that way.

FYI... I still use arc for certain projects and still have hope that one day it will get the attention it deserves. I know you've taken a lot of criticism about Arc, but I'll suggest there are quite a few of us that do appreciate the work you have done so far.


Do you have any timeframe for your involvement in YC?


Do you ever stop and think about working on it? After all, a language is a very personal project.


Without paying too much attention to the essay's content, the simplest explanation is that Common Lisp, Scheme and Clojure are more compelling.

Lispers are always tempted to blame Lisp's niche status on being too advanced, but it always sounds a bit like a smug non-answer to the job interview classic "what's your biggest weakness?" One friend who likes the syntax plus the existence of reader macros does not prove the syntax is not a big issue.

That said I think his conclusion is pretty sound, though you'd be hard-pressed to find much evidence of Unix-friendliness in C# or Java.


Maybe "friendly to a sufficiently powerful runtime" should replace "Unix-friendliness"?


That's good. I mean, people complained about using Git in Windows for ages because it was too Unix-friendly. At the same time, almost nobody actually likes SML/NJ, and I think part of that is because it has its own strange build system rather than autoconf/make/cmake etc. and until a few years ago really couldn't produce standalone executables.


The real problem is that many of those functional programming features touted by languages like Arc and Lisp have made their way into more mainstream languages like Python, and the types of problems that can better be solved with manipulating S-expressions are dwindling in cardinality.


I disagree. Homoiconicity and the metaprogramming it enables[0] can never be present in a non-Lisp.

[0] Deterministically, I mean - heteroiconic languages do provide for metaprogramming, but it is inherently less robust than metaprogramming in a homoiconic language: http://lists.warhead.org.uk/pipermail/iwe/2005-July/000130.h...
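To illustrate the difference: in a heteroiconic language you manipulate a separate AST type, not the code itself. A minimal Python sketch (my illustration) using the stdlib ast module:

  import ast

  # Source must first be parsed into a distinct AST object;
  # code and data are different kinds of things here.
  tree = ast.parse("1 + 2", mode="eval")

  class AddToMul(ast.NodeTransformer):
      # Rewrite every addition node into a multiplication node.
      def visit_BinOp(self, node):
          self.generic_visit(node)
          if isinstance(node.op, ast.Add):
              node.op = ast.Mult()
          return node

  tree = ast.fix_missing_locations(AddToMul().visit(tree))
  print(eval(compile(tree, "<ast>", "eval")))  # 2, i.e. 1 * 2

In a Lisp, the "AST" is just the list (+ 1 2), and the rewrite is ordinary list manipulation.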


Your comment is correct but it doesn't contradict what you were replying to. An Algoloid language with closures and lambdas and tail-call optimization etc. can't do 100% of what a Lisp can do and maybe not even 80%, but it can still do "enough" for many definitions of "enough".


Smalltalk, Io and (possibly) Perl6 & Rebol are good examples of non-Lisp homoiconic languages which have full & robust metaprogramming.


You can have metaprogramming, but it's not at the same level by any means.

Take Perl, for example. You can't have robust metaprogramming in a language with an undecidable grammar. Read through the above link and you'll see the difference.


If you define "metaprogramming" as "homoiconic code generation" or "hygenic macros" or "the language I use to manipulate the AST is the same language as the AST", sure. I'm sure that's not the only thing people mean when they say "metaprogramming" though.


Minor niggle: the word is hygienic.


Indeed it is. The one time I don't proofread....


> You can have metaprogramming, but it's not at the same level by any means.

In the languages I've mentioned it is at the same level. Here are some examples - http://news.ycombinator.com/item?id=3125375

> Take Perl, for example. You can't have robust metaprogramming in a language with an undecidable grammar

I'm referring to Perl6 here, which is (intended to be) self-hosted via Perl6 grammars and also comes with hygienic macros:

- http://en.wikipedia.org/wiki/Perl_6_rules#Grammars

- http://en.wikipedia.org/wiki/Perl_6#Macros

While perl5 won't get this good, you'd be surprised what metaprogramming can be done with it.

NB. For an example of macro-like things here is a list of CPAN modules that make use of Devel::Declare - https://metacpan.org/requires/distribution/Devel-Declare?sor...]

> Read through the above link and you'll see the difference.

I have and I remember reading it back in 2005 :) This doesn't affect the list of languages I mentioned earlier.


Well, there's Prolog.


Can you give three examples?


Languages need momentum to gain a foothold. A language needs a community of dynamic, experimental, evangelistic pioneers to gather around it and start building.

When I go to the Arc site [http://arclanguage.org/], it seems pretty obvious to me why Arc has not "taken off": a drab HTML 2.0 site with a tiny font describing how it's unfinished, an install that requires another Lisp implementation, and no indication that there is a community of developers behind it.

It's almost like they don't want anyone to use it. To be fair, it seems Paul Graham does not care about popularity. But not caring about popularity means it's dead in the water.

It's not like lots of people don't want a Lisp these days. Clojure has become very popular, after all; it has been able to hit the sweet spot in terms of modernity and lispiness; good, practical technology with a solid community.


I really think the biggest problem that new languages have is contending with the colossal hegemony of three stacks: C, Java, and Javascript.

There are a few exceptions, but for the most part, your language is going to be successful to the degree that it can interoperate with these environments. Common Lisp, Racket, Haskell, OCaml -- all of these languages have FFIs, but they're "begrudging" FFIs. They don't really care about your "legacy" systems or the titanic number of C and Java libraries that already exist.

Clojure and Scala do care about that, as does Lua. And Ruby and Python (and Perl) care as well.

I see beautiful new languages every day. I can tell whether they're going to live or die primarily on the basis of their attitude toward the dominant library environments in modern programming.


And Clojure can target and interoperate with at least two of those stacks: Java (standard Clojure), Javascript (ClojureScript). And there is experimental support for C/C++ (clojurec, Ferret, etc). Not to mention in progress support for targeting CLR/C# (ClojureCLR) and python (clojure-py).


Right. Clojure isn't the first Lisp for the JVM, but it's definitely the first one to care so deeply about other languages and existing libraries.

If you ask about C interop or creating standalone binaries on a Common Lisp board, you'll get an answer (because you can do this). But you'll also get this whole, "Oh, but why would you want to infect our beautiful language/runtime with the fallen world of imperative code, UNIX conventions, etc. Free your mind!"

That. That right there. That's the problem.


Well, there are a few things.

Not all "hackers" are language geeks, plenty of smart people will not necessarily invest their time in learning new exotic programming languages as language learning for the sake of it is not interesting for everyone.

Other considerations for language choice will certainly include documentation and the accessibility of that documentation. Looking up a few pieces of code to do familiar tasks is much easier than reading a grand language design document as a way to get a feel for a language.

Having usage in some large commercial setting is also a consideration. For example, knowing that Google uses Python extensively can give one some confidence that Python is unlikely to suddenly die out one day, because Google wouldn't let that happen.


Totally off topic, but if the author of the post is reading this, please reconsider the fonts you've used on your blog. It's quite difficult to read as it is. Compare and contrast your current design with this: http://www.readability.com/articles/4xlr1wov.


While I didn't think much of the article, I found the font to be attractive and easy to read.


I wonder if you were seeing the first choice "Mrs Eaves" or the backup Georgia? Or perhaps you're on a high resolution display? The combination of font and size does seem a bit extreme to me.


I'm seeing "Mrs Eaves". And I'm on a Thinkpad T60, so definitely not high resolution. Of course I increased the font size from the site's default, but I usually do that, so that I don't need to use the reading glasses that I need to use to read books (ah, to be young again!).


I think that when you design a language you can take the low road or the high road. The low road is to design your language to explicitly interop with an existing language/platform so that you can hijack its libraries.

C++ took the low road by co-opting C. Scala and Clojure took the low road in co-opting Java. Languages like Arc, Smalltalk, Haskell and Eiffel took the high road and as a result they don't have as many users.

Having said that, I don't want to imply that language designers are doing a bad thing when they take the "low road." They get adopters and they bring powerful tools into the world. It's just that when they do that they have to sacrifice coherence a bit.


If you didn't want to imply something bad by your analogy, you should have chosen a different one. The metaphor of the High Road and the Low Road carries the implication that the High Road is the good path and the Low Road is the bad one.


I find it strange to read an essay on the (un)popularity of Lisp which does not mention Clojure.


The essay isn't actually that recent.


Agree with the conclusion, but some of the reasoning is unsound. This statement is patently untrue: "This doesn’t mean you can attract smart hackers to your language just by running an advertising campaign though. Sun tried that with Java, but a: their marketing was too explicit, and it just seemed like one company trying to push its own technology, and b: Java itself is a detestable language. So, in addition, it seems like language quality is not something you can make up for in marketing."

It implies that Java failed in the marketplace. It did not. It has taken and kept huge market share. Sadly, it is the language most employers are hiring for. I would rather poke myself in the eye with a pencil than program in Java, but it has most certainly succeeded in the enterprise.


It hasn't taken off because it has not been marketed. Languages are products which need to be marketed like everything else. If PG wanted it to take off, then he could market it without much problem. But he has better things to do.


This, sadly, may be true. I remember reading PG's essays, thinking, "huh, I'll have to look into arc" then never hearing about it again. It's not packaged for Debian stable (one of my litmus tests), and just about every week I'm hearing about clojure, go, scala, etc, etc, but nothing about arc.


arc would not take off even if it were more-aggressively marketed, IMHO. it has too few differentiating features compared to other lisp dialects. 'I made a LISP' is not enough, even if it is a nice elegant one.


You are right. Though Javascript with its weirdness took off...


Curiously, this is one the things he disliked about Java. http://www.paulgraham.com/javacover.html


Although in reality it probably has no bearing, most people's casual exposure to Arc is the HN website - which at the best of times is slow. So the mindshare being developed here is "arcane AND slow" - not a great advert.


I think (and this is total subjective speculation) one of the problems is that Arc is so beautiful in its simplicity that the sort of people attracted to it think "man that looks like fun to implement, I think I will work on a lua/js/python/jvm port! then it will REALLY take off" instead of building things (apps, tools, libraries) WITH it.


Hands up if you saw the headline and thought about Objective-C Automatic Reference Counting...


ARC vs. Arc. (Which is why it should be Lua and not LUA)


FWIW, the modified title "Why Arc hasn’t taken off" was more informative than the original.


Are we in some kind of Lisp day or something? Hahaha


Arc is beautiful, but Paul Graham was wrong about one thing. For language adoption-- at least circa 2013 (which he couldn't have foreseen)-- libraries do matter. This is to Clojure's advantage and Arc's detriment. A typical IT boss is not going to let you spend 6 months writing libraries, so a lack of libraries is a problem. The people who pick languages in all but the most daring startups tend to see "lack of libraries" as more of a problem than it actually is.

Paul Graham was building a web company in the mid-1990s, so rewriting a bunch of libraries was not an impediment. First, they had top-notch programmers who were willing to work with FFIs, write tools they were used to in other languages, and understand technology deeply enough to write good libraries. Second, the state-of-the-art for libraries in the 1990s was, if nothing else, less Big. Third, they didn't need to sell their language choice to a boss; they just needed to sell the product to the market and investors, neither of which cares what language you use as long as it works.

Arc is prettier than Clojure but, in my experience, Clojure is more than attractive enough... and it has the familiar JVM libraries.


A big reason why PHP was as successful as it was is that it came with a fairly complete and correct standard library for writing webapps.

This way you could spend zero time writing your own urlencode function (and screw up a corner case) or searching through maven to find functions to do very simple things and finding later other people on your team imported similar functions from four other projects.
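The urlencode point generalizes beyond PHP; as a toy illustration in Python of the kind of corner case a hand-rolled version misses:

  from urllib.parse import quote_plus

  # A naive hand-rolled urlencode that handles the one case its
  # author thought about...
  def naive_urlencode(s):
      return s.replace(" ", "+")

  # ...and silently mangles values containing reserved characters.
  print(naive_urlencode("a&b=c"))  # a&b=c     -- '&' and '=' unescaped
  print(quote_plus("a&b=c"))       # a%26b%3Dc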


"with a fairly complete and correct library for writing webapps"

Some of which were dangerous and unwieldy, actually. Then proven dangerous and unwieldy. And then most of the things that made up idiomatic PHP were removed from the language or discouraged, and nobody writes that way anymore.


Suggesting that, at least for language adoption, a standard library that is "good enough" is sufficient, and can be fixed later.


This probably depends on the environment (including alternatives) at the time.


It didn't matter; the reference frame was aligned with PHP for most things. It seems to me that it was more "valuable" to have bad glue between curl, mysql, foo, than to have a good linguistic substrate and a good stdlib.


Where on earth did you get the idea PG thinks libraries don't matter? I don't think I've ever seen PG say anything like that. Here's where, in fact, he says the exact opposite:

Of course the ultimate in brevity is to have the program already written for you, and merely to call it. And this brings us to what I think will be an increasingly important feature of programming languages: library functions.

http://www.paulgraham.com/popular.html

In fact, read the whole section 6, on Libraries.


Great link. I remember him saying something to the effect of libraries not mattering in his decision to use Common Lisp for Viaweb, but that's not the same thing as him saying that they're unimportant for language adoption. So I guess my memory is inaccurate.

Where there is disagreement is that most CTOs are afraid to adopt "weird" languages out of library FUD, which PG asserts doesn't matter. On that, I think PG is right.


And of course Clojure addresses another issue raised in this article of regex as a first-class citizen. I'm pretty new to Clojure, but that has already given me a few "ah" moments.


I wonder if the embedded-language angle of attack is more promising today. Lua and Javascript did not need big libraries. They are embedded into larger programs and invoke useful functionality there instead. Nevertheless, in both communities people nowadays try to write standalone programs, and standard libraries will emerge at some point.


Lua more so, I think, as it has easy C interfaces built in.


I don't believe the explanation in the article.

Lisp variants can be seen as a scripting language equivalent -- with efficient compilation. It was (and still is, to a large degree) an obvious win compared to Perl/Ruby/etc.

Academic stigma and/or culture clash with scripting users is what I'd guess. Or maybe it just is lack of hype?

But I don't expect to see an explanation. It is a larger mystery than dark energy, to me at least -- why didn't lisp take over the world 10-15 years ago when the scripting languages started going?!

Edit: pjmlp, I talked about the last 10-15 years and compared with scripting languages, so hardware/AI winter/IDE are earlier/later problems. (Today, a lack of libraries might be the worst problem, except for Clojure(?).)


- Parentheses

- Lack of a proper IDE

- The AI Winter

- Mainstream hardware wasn't powerful enough

- Most blue collar developers don't understand FP concepts

Just some possible reasons off the top of my head.


"- Most blue collar developers don't understand FP concepts"

HAHAHA! The vast majority of FP advocates are either unemployed or work as math teachers. Like it or hate it, FP is absolutely nothing more than a pseudo-programming paradigm (largely emulating some concepts from abstract math and using notation somewhat similar to math notation) that attracts people who can't wrap their heads around OOP, rich frameworks and other associated stuff. Sorry folks, computers are neither abstract nor stateless. And the same holds true for software, which often deals with real world stuff, which again, is neither abstract nor stateless. Virtually everything that can be done in a functional language can also be done in a procedural or OOP language. The opposite, on the other hand, is totally untrue. It's really funny to see how FP advocates struggle even with some extremely basic things. Using languages/platforms like C/C++, Java, .Net - there's always an increase in performance compared to any functional language (yeah, including Scala, F#, Clojure, OCaml and so on). The "elegant code" argument is one of the most ridiculous things FP advocates come up with, since it's almost always synonymous with crappy, cryptic code that no one wants to read besides its authors (maybe not even them after a few weeks or months) :D So I'm afraid that the FP advocates are far worse than real blue collar workers.


"Sorry folks, computers are neither abstract nor stateless."

It's true. But you know what computers are first and foremost? They're deterministic.

And you know what's one of the biggest problems programmers face in the Real-World [TM] when the shit hits the fan (and most devs' job is to fix shit that just hit the fan)? It's being able to recreate the state and then deterministically reproduce the shit that did hit the fan, so as to prevent it from hitting the fan again.

Why do we see market makers using 90 OCaml programmers and raving about it?

Why do we see investment banks moving from Java to Clojure and, in doing so, slashing their codebase by a factor of ten? And then explaining how much easier their life became in the face of changing requirements (e.g. new laws/regulations coming in)?

Do you really think that a codebase ten times smaller is "harder to read"? Do you really think that making it easier to reproduce the state is not a goal worth achieving?

I realize you feel insecure in your Java/C# + ORM + XML + SQL hell but don't worry: there's always going to be lots of real-world jobs for code monkeys like you ; )


"It's true. But you know what computers are first and foremost? They're deterministic."

That's like saying that computers have mass and are made of matter.

"And you know what's one the biggest problem programmers do face in the Real-World [TM] when the shit hits the fan (and most devs' jobs is to fix shit that just hit the fan)? It's being able to recreate the state and to then be able to deterministically reproduce the shit that did hit the fan. As to prevent it from hitting the fan again."

That's false. Programmers don't have to recreate the exact same state; actually, it's not necessary to recreate the error at all in many cases. There are more tools than you can imagine for identifying errors, from logging to memory dumpers and analyzers/profilers...

"Why do we seen market makers using 90 Ocaml programmers and raving about it? Why do we see investment banks moving from Java to Clojure and slashing their codebase by doing so by a factor of ten?"

Well, I'm afraid that happens in your imagination only. I also happen to be a trader. Almost NO ONE uses functional languages (fewer than 0.01%) for financial trading. The main languages are C/C++ (especially for high frequency trading) and, of course, Java and also .Net.

"I realize you feel insecure in your Java/C# + ORM + XML + SQL hell but don't worry: there's always going to be lots of real-world jobs for code monkeys like you ; )"

You're pretty delusional about how secure or insecure I feel (haha!) and how much of a "codemonkey" I am. LOL! You don't even know me, but you already pretend that you know me. Unfortunately for you (and all those like you), this is a typical characteristic of FP advocates: you live in an illusory world, have a totally distorted view about software engineering and of course about the people who do make real world software. Anyway, it's always funny to see the reactions of FP advocates when they're left without any objective, verifiable, real-world arguments. :D


- The fact that most tutorials cover topics like "rolling your own object system" before topics like "writing a string to a file".

Lisp examples also always cheat at things like quines... the program will return itself instead of writing itself to stdout.
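For contrast, a quine that genuinely writes itself to stdout is short in most languages. The classic two-line Python one (a well-known folk example, not mine) works by holding a template of the whole program in a string and re-inserting it with %r so quotes and escapes come out intact:

  s = 's = %r\nprint(s %% s)'
  print(s % s)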


Another I'd add is lack of a clear execution model.

There is something to be said for being able to just run 'python foo.py'.

Whereas in many lisp variants there is no clear entry point, and often not really an "interpreter".


Interesting. I didn't realise that I'm a 'blue collar' developer. What do I need to do to upgrade to white collar?


Your parent implies that not getting functional programming concepts makes one a "blue collar" (i.e., less sophisticated, more commodity) programmer. Presumably, understanding functional programming is one necessary requirement of upgrading to "white collar" (i.e., more sophisticated, less replaceable) programmer.


I'm guessing he's implying that "white collar" programmers have a degree in Computer Science that is heavy in theory (as opposed to being primarily vocational, like, say, the University of Phoenix). Most such curricula will include at least one class that focuses on functional programming. (UC Berkeley teaches their 101 CS class using Scheme!)

The people who care about this distinction are the sort of people who give job interviews where they ask you to implement a balanced binary search tree on the whiteboard and then grill you on the order of growth of each function that you wrote. :-)


"Blue collar" is a common expression to describe developers without CS background, that usually develop CRUD applications and are expected to be easily replaced by management.


"What do I need to do to upgrade to white collar?"

Develop a stubborn willingness to ignore the at-times unsatisfactory performance of immutable data structures.



