
What I like about Haskell is that it is unashamedly a language for programming language research, by programming language researchers. "This change needs to happen because otherwise the mathematics don't work out" would never fly in golang or Ruby if it broke backwards compatibility, but it happens regularly in Haskell.

This has a couple of effects:

- It does make it harder to maintain code for businesses, since keeping up with language updates means that you will have to do relatively more maintenance work to keep up with these breaking changes.

- It slowly makes the core libraries more and more elegant over time, and this paves the way for new advances in e.g. type systems and whatnot. Linear types would have been way harder to add if the existing system had been (even more of) a giant mess of hacks maintaining backwards compatibility with decisions from 20 years ago.

- GHC extensions make it relatively easy to extend the base language in some way and back out if it doesn't turn out to work. This makes experimentation way cheaper than if every core language change had to be "permanent". (A sketch of this opt-in mechanism follows below.)

These changes combined mean that Haskell itself will probably never be a mainstream language for business applications, and that is fine. Because many (most?) programming language implementors have had some exposure to Haskell in university, and because they all speak to one another at conferences etc., many of the ideas first explored by Haskell (and its predecessors in academia) are "leaking out" if they are good (like list comprehensions in Python or type classes in Rust), and they don't get adopted if they turn out to have been mistakes (like lazy I/O). The true value of Haskell is having a language in which to experiment with new concepts, so they can be proven useful (or not) before they make their way into the wild.
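
For a concrete picture of the opt-in mechanism from the third point, here is a minimal sketch (module and function names are made up) of a single module enabling the LambdaCase extension. Modules that don't enable it are completely unaffected, and backing the experiment out is as simple as deleting the pragma and the syntax that uses it:

  {-# LANGUAGE LambdaCase #-}   -- opt in for this module only
  module Example where

  -- \case is syntax that only exists because the extension is enabled here
  describe :: Maybe Int -> String
  describe = \case
    Nothing -> "nothing here"
    Just n  -> "got " ++ show n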




> It does make it harder to maintain code for businesses, since keeping up with language updates means that you will have to do relatively more maintenance work to keep up with these breaking changes.

To be honest, I think that even in my 90KLOC Haskell codebase, handling these breaking changes is cheap and easy because of the compiler.


Yeah, refactoring the whole codebase in a language like Haskell is, maybe not trivial, but something the language lets you lean into, systematize, and make SOP.


That's not really true. There was a huge outcry against the removal of (/=). There are still lots of warts in Prelude and base (head being partial, the lazy foldl being in the Prelude but not the strict foldl'). So yeah, language evolution is still a hard problem.
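
To make those two warts concrete, an illustrative GHCi session (output abbreviated, numbers picked arbitrarily):

  -- head is partial: the empty list is only rejected at runtime
  ghci> head []
  *** Exception: Prelude.head: empty list

  -- the Prelude's lazy foldl builds a long chain of thunks on large inputs;
  -- the strict foldl' avoids that but has to be imported separately
  ghci> import Data.List (foldl')
  ghci> foldl' (+) 0 [1 .. 10000000]
  50000005000000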


You are right that it's not "really" true, but I do think that at least it's not wholly untrue. Foldable/Traversable got through, and so did Monad Of No Return, the Functor-Applicative-Monad Proposal and several more that I can't name off the top of my head. It does happen, even if we both would like progress to be quicker and more drastic :)

IMO, the existence of the Haskell Report and the inability of the community to update it in a reasonably timely manner is the biggest cause of the persistence of the biggest warts like partial head and foldl. I don't think anyone wants to keep those but "The Haskell Report specifies that they are in the prelude and with the exact implementation they have" tends to kill any discussion. Let's hope the HF makes some progress on that soon!
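
For readers who haven't followed these proposals: the practical effect of the Functor-Applicative-Monad Proposal is that since GHC 7.10 Applicative is a superclass of Monad, so a minimal Monad instance now looks roughly like this sketch (the type here is just for illustration):

  import Control.Monad (ap)

  newtype Wrap a = Wrap { unWrap :: a }

  instance Functor Wrap where
    fmap f (Wrap a) = Wrap (f a)

  -- post-AMP, this instance is mandatory before Monad can even be declared
  instance Applicative Wrap where
    pure  = Wrap
    (<*>) = ap

  instance Monad Wrap where
    Wrap a >>= f = f a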


> What I like about Haskell is that it is unashamedly a language for programming language research, by programming language researchers

Wait, what? This runs completely counter to my experience of Haskell. I use it whenever I can, and I’m pretty sure I’m not a PL researcher. Lots of other programmers write actual, real-world programs in Haskell as well. Much of the discussion I see in Haskell communities concerns areas such as performance, toolchains and libraries — areas which PL researchers are famous for ignoring. I will admit that we often talk about GHC extensions and type theory and whatnot, but the discourse around those areas is not all that mathematical; it tends towards ‘how is this useful for writing programs?’. In other words, exactly like every other real-world programming language out there.

(That being said, maths is fun, and I regularly see people defining weird and wonderful abstractions. But this rarely gets in the way of writing programs. If anything, every now and then someone comes up with an abstraction which turns out to be incredibly useful in practice: lenses, free monads, applicatives, HKTs…)
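
As one example of those abstractions earning their keep: the core of the lens idea is just ordinary functions parameterised over a Functor. A simplified sketch (the real lens library generalises the type considerably):

  {-# LANGUAGE RankNTypes #-}
  import Data.Functor.Const    (Const (..))
  import Data.Functor.Identity (Identity (..))

  -- the van Laarhoven representation: a lens is a function, nothing more
  type Lens s a = forall f. Functor f => (a -> f a) -> s -> f s

  view :: Lens s a -> s -> a
  view l = getConst . l Const

  set :: Lens s a -> a -> s -> s
  set l x = runIdentity . l (const (Identity x))

  -- a lens focusing on the first component of a pair
  _1 :: Lens (a, b) a
  _1 f (a, b) = fmap (\a' -> (a', b)) (f a)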


I didn't mean to imply that nobody but PL researchers can use Haskell; clearly a lot of people do. What my post intended to convey was that the language seems a lot more focused on research than (say) Ruby or JavaScript. This is not strange, given its history as a designed-by-committee language "to serve as a basis for future research in functional-language design". (see https://en.wikipedia.org/wiki/Haskell_(programming_language)... )

The point in your last sentence, that "every now and then someone comes up with an abstraction which turns out to be incredibly useful in practice", is exactly what I meant in the OP: these abstractions tend to be developed in Haskell first and leak into other programming languages later, far more often than the other way around.


Yeah, I'd say that Haskell has done a phenomenal job for most of its history of maintaining a balance between the interests of researchers, commercial programmers, educators, and enthusiasts/hobbyists. At different times, each of these communities has been inconvenienced by decisions made by the Haskell community, but the community has nevertheless been for the most part welcoming to all of them. On the other hand, some of the darkest chapters of the community have involved power plays where one of these groups feels entitled to sideline the others and decides it should be in charge.

By contrast, a typical mainstream programming language might, say, completely neglect one or more of these communities in favor of whatever is best for commercial programmers. Particularly when, like most mainstream languages, it's mainly funded by those interests.


Breaking changes happen regularly in many commercially adopted languages, e.g. PHP 5 vs 7, Python 2 vs 3, TypeScript, etc. Even Java has added some keywords over time.


A breaking change is only a breaking change to you if you switch to the updated language. If your business does not actually need anything the updated language offers you can often just stick with the old version.

There are plenty of sites still on PHP 5 for example.

If you are doing something where you rely on third party libraries, and you have to keep those third party libraries up to date (e.g., a third party library that uses a remote service and that remote service keeps changing their API), then you may be forced to update to the new version of the language because the library switches to the new version.

For a language that isn't really mainstream for business use, I'd expect that there aren't a lot of third party business libraries that you'd be using and so the "library made me do it" language update would not be necessary. That should allow staying on the old version as long as an OS upgrade doesn't kill the ability to run the compiler or interpreter.


To be fair, almost all of the keyword-ish things Java has added have been done in such a way that they can rarely change the meaning of any existing program. Assuming I didn't miss any, these are all the keyword-ish things which have ever been added:

  Java 1.4, 2002: assert
  Java 1.5, 2004: enum
  Java 10, 2018: var
  Java 14, 2020: yield
  Java 16, 2021: record
  Java 17, 2021: permits, sealed
Almost all were added in such a way that the extent of the breakage would be that a program which previously worked would now fail to compile (i.e. fail safely).

(The only exceptions I can see are for `assert` and `var`, and even then only when some parts of a program are compiled with older versions of the compiler, and even then only when various other conditions are met.)


Nit: Adding a keyword is not a breaking change, removing them is.

With PHP, you could argue that they were needed because it developed organically, instead of through a process by experienced language designers. I'm not sure about Python though.

Counter-example: Go hasn't had backwards-incompatible changes yet, and at the moment there are no compelling reasons to make a breaking 2.0 version - and any plans for a 2.0 version so far have minimal changes, so going to 2.0 should be a smooth and quick process.


> Nit: Adding a keyword is not a breaking change, removing them is.

Languages generally try to add contextual / soft keywords these days, but otherwise it’s absolutely a breaking change: any variable named the same will trigger a parse error. That is why languages try to either not add keywords, or find ways to make them opt-in somehow.


>Nit: Adding a keyword is not a breaking change, removing them is.

Of course adding a keyword is a breaking change. It will invalidate every use of that word as a variable name. Perhaps PHP is unaffected, as it has sigils on its variables, but most languages, including Haskell, do not.


> Perhaps PHP is unaffected, as it has sigils on its variables

Locals are prefixed but functions, constants, and classes are not.

Also barewords, but that horror was removed in PHP 8.


That’s breaking forward compatibility though, not backwards. The former is pretty cool but unrealistic and not as useful.


Very well summarised.

I like that Haskell has inspired Scala, my main language. While it lacks some of the elegance of Haskell, it is a nice compromise between an advanced and Haskell-inspired type-system and stability for business, plus having the whole JVM ecosystem is awesome.

Therefore, hopefully Haskell keeps advancing and bringing innovations and having them trickle into more mainstream languages.


Having written both Haskell and Scala professionally, I actually see Scala's dependence on the JVM as its biggest flaw.

If it weren't for the JVM, Scala could make do without an Any type and without having to deal with null values. I hope that alternative runtimes like Scala Native will become more popular, since Scala could then evolve away from its JVM roots.

I'm also not a fan of JVM performance optimization, since there are too many knobs to turn, and the knobs often have (undocumented) side-effects on other knobs. This is a lot simpler with the Haskell runtime (in my experience) since you mainly need to tweak the GC settings there.
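
For comparison, a typical round of RTS tuning in my experience boils down to a couple of flags (the sizes below are placeholders, not recommendations):

  # build with the threaded runtime and allow RTS options at run time
  ghc -O2 -threaded -rtsopts Main.hs

  # run on 4 capabilities with a 64 MB nursery, printing GC statistics
  ./Main +RTS -N4 -A64m -s -RTS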


You really should never tweak anything in the case of the JVM other than perhaps max heap size and, rarely, target pause time (for G1, so you can prefer either throughput or latency). Anything else you find online is likely already outdated and very specific to a given piece of code. Instead, just use a recent JDK and profile your code.

As for Scala, losing the Java ecosystem would pretty much decimate the language, no matter how cool it is.


I know others who share the same opinion as you, so I can relate.

That being said, the Any type would be there in any case - this is not JVM specific. "null" is, but I think it's not too much of a big deal with Scala 3's union types, see: https://dotty.epfl.ch/docs/reference/other-new-features/expl...

The only problem is that if you now use native Java libraries, you'll be confronted with everything being explicitly nullable (no runtime error problems though). But that's a small price to pay for having access to this huge ecosystem.


Let's hope Haskell never becomes commercially successful then :-)


I recall reading historical threads and being surprised at animosity towards FPComplete and some of the efforts they were making to standardize changes to make Haskell more commercially viable. At the time I thought being upset by that was absurd, but over time I guess I sort of understood the sentiment.


Too late. People have been making money with Haskell for quite a few years now.


> It does make it harder to maintain code for businesses, since keeping up with language updates means that you will have to do relatively more maintenance work to keep up with these breaking changes.

Haskell's pace of breaking changes is so incredibly slow that this point is moot, unless you are comparing it to C++.

You will have much more work keeping up with changes in the Haskell library ecosystem, more work keeping up with changes to the core of most mainstream languages, and orders of magnitude more work keeping up with the ecosystem of any other language.



