Hacker News
Haskell vs. Ada vs. C++ an Experiment in Software Prototyping Productivity (1994) [pdf] (yale.edu)
102 points by aaronchall on May 4, 2017 | 59 comments



Please stop linking this paper. It is too old, endlessly debated, and does a disservice to any perceived motive of promoting or demoting any of the mentioned languages.

previous discussion:

https://news.ycombinator.com/item?id=13275288

https://news.ycombinator.com/item?id=7050892

https://news.ycombinator.com/item?id=7029783

https://www.reddit.com/r/programming/comments/1v0z02/haskell...


And C++ has come a long way since then. A long long way. It feels almost like a completely different language now.


It would have been, if not for the endless baggage of old C++ code to support and maintain...


clang-tidy can be useful to modernize code.

http://clang.llvm.org/extra/clang-tidy/index.html


Template metaprogramming still sucks. And will keep sucking with concepts.


For a limited set of problems templates are fine: char/wchar_t for strings, double/float for vector algebra, template containers, etc.

Also, other parts of the language have evolved and don't suck, such as the standard library: when the article was written, there was no STL _at all_.


Off-topic: I've never posted any links on HN before, but is there no confirmation page upon submitting that tells you the link has been submitted before?


If you post the same link as someone else shortly after them, you'll just get redirected to the current discussion page, and I believe it counts as an upvote for the article.

If sufficient time has elapsed since the last posting, HN allows links to be re-posted since peoples' viewpoints may have shifted, modern events may influence discussion, etc.


Here is how I would run this experiment.

For each language, I would have two experts in that language. Call them P and D: programmer and documenter.

They would not be permitted to communicate.

P would be required to write strictly code, without a single comment or a shred of documentation.

D would not be permitted to view the problem description: only the input data (along with documentation describing what the data means) and the code written by P.

D would have to determine what the code is doing and specify the problem being solved, relying on his or her expertise in the implementation language to unravel how the code is treating the input data, and hence what is its meaning.

Then evaluate the time spent by P, the time spent by D, the size of the code, the degree to which the code actually solves the problem, and the degree to which D's documentation actually reverse-engineers the specification not revealed to D.

Then we could say meaningful things: like a given solution is very small and was developed quickly, but it took great effort to unravel what it's doing. That's not bad: there are situations when that is acceptable, namely the development of one-off solutions that will never need to be understood or maintained, and which are needed ASAP. Other situations would reveal themselves in which the program takes long to write, and isn't particularly short, but is a breeze for D to understand. Usually we would prefer that situation in permanent production code.

The all round winner would be: small code written by P in a short amount of time, quickly reverse-engineered by D to recover the spec. (With bonus points for time performance, small memory use, etc).

There is also a degree of realism in this setup. How often are we called upon to understand some code which is utterly undocumented? That's when we feel the differences in languages, programming approaches and styles.


Tradeoffs in speed vs maintainability could be the intentional result of choices made by P though. What instructions would they be given regarding how to handle those tradeoffs?


> What instructions would they be given regarding how to handle those tradeoffs?

"Try to win this contest to make your language look good."

P can gamble with their task at the risk of blowing up D's time.


Hm, this paper is pretty obviously promoting Haskell. The main authors wrote the Haskell variant, did not even execute the other prototypes, and drew conclusions without any sort of comparison except LOC.

I like Haskell, but this paper is doing a poor job of representing scientific honesty.


The people conducting the test also, without the authors' knowledge, got a grad student, gave him eight days to learn Haskell (with no formal training, but the option to ask a more experienced Haskell programmer questions), and then handed him the problem spec and told him to implement it in Haskell. He did it in eight hours with a bit under twice as many LOC as the paper's authors' version. The paper's authors also point out issues with the study, and describe it as showing the possible worth of functional languages, not Haskell specifically.


They added many disclaimers saying that LOC alone is a bad metric and that it's more interesting to look at characteristics of the code for anecdotes. The article is full of advice to take the findings with a grain of salt.

They don't even mention Haskell in the conclusion; they just say that functional programming is well-suited to rapid prototyping.


Someone (I think John C. Peterson) mentioned this paper at the Paul Hudak memorial symposium last year. His comment was that the comparison was unfair, because the Haskell program was written by Mark P. Jones, who was a good enough programmer to outshine his competitors no matter what language. :)


https://news.ycombinator.com/item?id=14269305

See this for a pretty powerful rebuttal.

The second Haskell program in this paper, which also outcompeted everyone, was built in a week by a grad student who did not know Haskell beforehand, without talking to the first implementer.


> Paul Hudak

Ah :-)

> memorial symposium

Oh no :-(


Are any of those languages even remotely similar after 25 years? For example, C++11 is practically a different language from pre-98 C++.


One counter argument is that you're still building on the same foundation.

Likewise, C++ is never going to be as concise as Haskell.

So the paper still holds some truth, if it isn't mostly valid.


The conclusion mentioned some interesting truisms about prototyping, team-size-induced complexity, and "long bugs" vs "short bugs". Good stuff. If the PDF were not a collection of images of the text, I'd have copy-pasted that bit here. :)


Haskell changed way more than C++. I don't think Ada changed a lot.

Still, they are so different from each other that I don't think all that change makes much of a difference.


The paper is dated 94, which is the same year as the original implementation of the C++ STL (https://en.wikipedia.org/wiki/Standard_Template_Library#Impl...). So it's a fair bet they didn't have access to it. I would think the entire difficulty of the exercise would change with that.


Ada has had 3 major revisions since 1994. It's changed at least as much as C++.


I don't think you can compare the changes between Ada 83/95/2005/2012 to what has happened to C++. The Ada language seems much more stable and backwards compatible to me, and given its nature, it makes sense to avoid introducing major changes. However, I'm not an Ada expert, so correct me if I'm wrong.


Oh, I bet you can given it's been done in detail more than once. Here's a recent one:

http://www.electronicdesign.com/embedded/c11-and-ada-2012-re...


Not to mention that the SPARK subset of Ada, for doing proofs on parts of your code, has improved a lot.


If I were to jump into a random c++ project, which version would I expect to find?


If it was written before 1998 I expect you'd find a version of C++ that existed before 1998 :)

If it was written today, I don't know; that's a really good question. I think it's a mistake to ignore the advances that 2011 and 2014 bring to the table, though.


I do not think anyone wants to use C++11/14 to make an app prototype. There are a lot more languages now which, in my opinion, are better suited to prototyping than the languages mentioned in the article. Most people will use Python, JS, or some other scripting language.


There might be many biases in this research. If one is able to write code in Haskell it tells a lot about them, and he might be a better programmer than one who writes in Ada or C++.

To make this research unbiased we need to do the following:

* Take people who are proficient in Ada, Haskell, and C++

* Randomly assign them to different teams

* Compare their results


Even this is probably not very meaningful.

Seasoned and experienced Ada programmers are likely to come from a high-level (software) engineering background in some large company, e.g. in the aviation industry or military. Haskell programmers will more likely work in academia, e.g. a professor of computer science. And the C++ programmers could come from any field or industry.

These people will likely shine with certain tasks and be less productive with other tasks, depending on their varying skill sets. So you're probably only measuring the overall level of proficiency of programmers of certain languages in a particular programming domain. For example, I'd bet that the average experienced C++ programmer is slightly less skilled than the average experienced Ada programmer (if both of them are mostly working with their language professionally), simply because C++ is more widespread whereas the few Ada jobs left require a relatively high level of expertise. On the other hand, if you're writing programs in Ada for Airbus, you're probably accustomed to a slower development pace than if you write C++ code for a small gaming company, so the "productivity" will appear to be fairly low in comparison.

Then there are measures like LOC that are fairly meaningless across languages and perhaps even in general.

Finally, languages are usually chosen for their tool and library support and on the basis of available developers anyway.


"There might be many biases in this research. If one is able to write code in Haskell it tells a lot about them, and he might be a better programmer than one who writes in Ada or C++."

I'm going to bite as an Ada fan and say this is probably true to a degree, since Haskell is a high-level, functional language. Haskell similarly gets slammed in productivity by Common LISP, which is high-level, less safe, has fast iterations, and good tools. However, if we're talking apples to oranges, Haskell already failed at doing the kind of low-level, efficient, and/or real-time apps that Ada was designed for, whereas Ada makes them straightforward. The House operating system was a celebrated achievement, and embedded work uses Haskell-based DSLs extracted to C for good reasons. Meanwhile, the SPARK variant is allowing proof of the absence of key errors, without people needing to be experts in theorem provers and with most of the efficiency of Ada, vs provable ML/Haskell.

Yeah, I'd love to see a bunch of teams trying to write deterministic, safe, and fast software in each of these languages. Let's see about protocol engines, string libraries, real-time apps, high performance on single cores, lightweight containers, provable absence of problems in a component, and so on. I'm betting on Ada 2012/SPARK 2014, C++, and Haskell, in that order, given attempts to balance all those requirements.


> To make this research unbiased

I wonder if there's ever been even a single comparative programming languages study which would adhere to the standards routinely required in medical and epidemiological studies?


> which would adhere to the standards routinely required in medical and epidemiological studies?

For a long time, medical studies could change the topic of the study after the study had been completed (i.e. a study was initially done to prove effectiveness for one thing, but it was discovered that it was not effective for that thing but for another, and happened to not kill anyone, so the study was now for the other thing), so I would definitely not use the field of medicine as an example of unbiased studies.


> If one is able to write code in Haskell it tells a lot about them, and he might be a better programmer than one who writes in Ada or C++.

And here's the arrogant, self-important attitude that turned me off of Haskell.

When I was using it, I didn't see any evidence that Haskell users are any better or worse than programmers in any other language.

If it were true, why aren't Haskell developers taking over the world with their superior software?

In reality, any new, interesting language is going to attract better than average developers at first, because those developers are always learning new stuff and trying different things.


> And here's the arrogant, self-important attitude that turned me off of Haskell.

Haskellers are usually out to convince people that Haskell is for most programmers, and that "you need a PhD to use it" is a myth.

It is people frustrated by the (difficult) learning curve of Haskell that say such things.


I started to learn Haskell several times and decided every time that it's not really worth the effort and time spent. Technology should adapt to humans, not humans to technology. IMHO Haskell exemplifies the latter.


Haskell is quite well adapted to the humans that like using it :-)

Partially it's a matter of taste.

I hated Haskell syntax in the first few weeks ("Why did they have to differ on this?!") but later I realized it's objectively nicer for expressing the things Haskell expresses, and now I much prefer it.

Then there are a lot of bad tutorials out there.

And finally the tooling / IDEs aren't great for discovering the language and type system.

I don't see significant ways Haskell itself could be more geared towards humans than it already is. It brings simple mathematical concepts and abstractions to the forefront - and simple is not easy, but is worth it.


> It is people frustrated by the (difficult) learning curve of Haskell that say such things.

And there it is again. I haven't drank the Haskell kool-aid, so I must not be smart enough to use it properly.


> If one is able to write code in Haskell it tells a lot about them, and he might be a better programmer than one who writes in Ada or C++. - solomatov

> And here's the arrogant, self-important attitude that turned me off of Haskell. - jlarocco

> It is people frustrated by the (difficult) learning curve of Haskell that say such things. - peaker

> And there it is again. I haven't drank the Haskell kool-aid, so I must not be smart enough to use it properly.

Firstly, reading "people frustrated by the difficult learning curve of haskell say such things" as "I haven't drank the Haskell kool-aid, so I must not be smart enough to use it properly." is a very uncharitable interpretation.

Secondly, I believe solomatov and Peaker's comments could be written more unambiguously as follows (though I thought they were precise enough):

solomatov comment rewrite: Someone who can write Haskell might be a better programmer than programmers who use other languages because Haskell is difficult to learn.

peaker comment rewrite: People frustrated by the difficulty of the Haskell learning curve say things such as "Someone who managed to learn Haskell to the point they can write code in it have proven their worth more than those who use easier to learn languages"

So jlarocco, would you still classify these comments and the thought processes behind them as arrogant and self-important? Is there anything that would turn you off of them?


Haskell is difficult to learn! For virtually everyone. Especially people who already learned more mainstream languages, because it is so different.

When I learned Haskell I was frustrated by its difficult learning process until I stuck with it long enough, as did almost all who passed the learning curve.

How do you get any reference to "smart enough" out of this?

"persistent enough" is definitely relevant for learning Haskell, though.


Because companies want replaceable cog programmers that they are able to pay peanuts for. They don't want to pay for quality.

Just look at Go design goals regarding target audience, as explained by Rob Pike himself.


> If it were true, why aren't Haskell developers taking over the world with their superior software?

Haskell doesn't make superior software; it just (as far as I understand) allows you to make software of equivalent quality, faster. (As is true of functional languages generally; and of strongly-typed languages with type inference generally.)

Quality in all software projects sort of approaches perfection asymptotically, because nobody is all that interested in "perfect software." People just put in less and less QA effort as software "stabilizes", so the remaining bugs in a codebase (even one of static size! even bugs equally easy to find!) are found more and more slowly over time.

One could likely take a "military discipline" approach to line-by-line code reviews, and get a constant output of bugs found per man-hour; but nobody except NASA—and possibly the military itself—is ever going to bother.

---

All that being said, when people say "Haskell programmer", they're more-often-than-not referring not to anyone who codes in Haskell, but to the very specific type of person who creates Haskell libraries encoding new category-theory abstractions—which is to say, mathematicians.

Programmers who happen to also be mathematicians do write "better" software than programmers who are not mathematicians, I think. This is just because mathematicians have much broader mental libraries of abstractions available to them, where any one of those abstractions may be the key to reducing an ugly, complex solution to a simple and elegant one.

Being a mathematician doesn't force you to code in any particular language, but a lot of them happen to be drawn to Haskell when testing out constructions of new abstractions in category theory, computability theory, information theory, etc. Other Haskell programmers can then take these abstractions and—if they can understand them—reuse them in regular programs, such that those programs then become more elegant, in the same way they would be if a mathematician wrote them themselves.

Of course, just because these libraries happen to be brought into existence in the Haskell ecosystem, is no reason for them to stay there. You can port such abstractions to any language you like (if it can support them—but many languages do!)

It's just not often done, because it is kind of hard to "see the point" in such abstractions without reading this-or-that paper... and all the people who are willing to do that gravitate toward Haskell, leaving the rest of the language ecosystem with very few people interested in expanding their language-of-choice's abstraction capabilities. (Another sad network effect.)


> Haskell doesn't make superior software; it just (as far as I understand) allows you to make software of equivalent quality, faster.

But the same question applies to that claim: If it is true, why aren't Haskell developers taking over the world with their ability to deliver the same software, faster?


Much easier question to answer: there aren't that many of them, and network effects are important.

There's a reason code-bases like Facebook Chat get successful, and then get rearchitected from niche languages (Erlang) to popular languages (C++), and it isn't ease of development or quality of result. It's the ability to hire people to maintain the thing. And, in large corporations, that reason is usually considered early-enough on that nobody lets development happen in such languages in the first place.

As such, it's only small ISVs that will end up using niche languages. (Sometimes small ISVs that get crazy successful—like WhatsApp—but still, small.)

Even then, the problem is that a code-base that's full of elegant abstractions... is still a codebase you have to learn. The fact that not everybody understands those abstractions is a bad thing for hire-ability, just as much as it's a good thing for ease-of-development and maintainability.

It's just like use of macros in a code-base: they (usually) make the resulting code better once you have taken the time to understand it—but the code starts off more opaque to anyone who hasn't yet taken that time.

Haskell's elegance happens when people who read math papers write software. People who haven't read those math papers can't maintain that software. Thus: Not Viable In The Enterprise.

And, to bring the point home: if you write better software, at the same speed, you can win as an ISV, because your software is, well, better. But if you merely write the same software faster (or for less cost), your bigcorp rival is going to win. No customer cares about how little time or money it took you to build the software. They just care about the result. If FooCorp can spend 10x the resources to make a product that's 10% better than yours, they'll still capture your whole market.


An interesting side note: I work with several mathematicians. We are an algorithmic management company, they come up with optimization algos, we (software guys) make it scale. They seem to prefer imperative code over functional style.


I enjoyed the paper, thanks. I am now inspired to rewrite some C++ in Haskell.

Anyone have a modern example of this type of research? Opinions are welcome!


Basically 'Relational Lisp' won. Just three hours of development time.


That is insanely low. Odd to see that it gets basically a passing mention. They acknowledge the absurdly fast development time, gave the language an "on par" evaluation (though how they account for a C in prototyping support in what was hands down the winner here is interesting), and then move on.

For those that didn't click through, it was about half the time of the next fastest solution.

Edit: I am a really forgetful person. Seems I've made this same observation on this same paper every time it comes up. My apologies if the same people are reading it. :)


And a write-only prototype.

I'm sure it could become more readable with some more time invested.


Can you let others know where you were able to download this?


What it says about the Lisp implementation:

> The documentation size is also quite low, which may partially account for this [low implementation time].

It then got a B in understandability, which is the lowest grade on that dimension.


> The documentation size is also quite low

You can guess that it is very declarative code, given that they used a Lisp with high-level facilities on top.


Anything declarative is pretty much going to be unreadable to the average coder, who equates readability with "able to follow the Fortran-like step-by-step recipe of what the computer is being told to do or calculate".

If you've had a 30 year career writing imperative, OOP and functional code, with zero declarative experience (logic programming and that sort of thing), it's not magically going to be readable to you. You have to, ... ahem! ... backtrack a bit, and go down a different learning path.

It's somewhat telling that the Lisp shared an A score in the "Executable Specification" area with Haskell and Proteus.

The subjective panel could tell that there is a wonderful executable specification there; they just didn't quite grok its notation or semantics. :)

Or maybe they did grok it: they started with an A+, and then docked points down to a B for all the meddlesome parentheses. :)


I'm not going back into the article, but there's a "code reads like documentation" dimension on the evaluation. I don't remember it getting a top score there.


But it got an A in the table for "Executable Specification", and that's what 'declarative programming' actually means: one specifies/declares WHAT to do and the system figures out HOW to do it. Relational Lisp supports this on a 'logical'/'relational' level.


BTW... in the diagram showing the regions of interest for a given set of craft and their positions, some of them are referred to as "doctrines". Anyone know what "doctrine" means in this context?


I'd like to see a more recent evaluation of this.



