There is a problem: I know of no other language that manages to have, at the same time, a type system as powerful, abstractions as insightful, support for concurrency and parallel programming as good, an ecosystem as rich, and performance as strong.
Usable records are just icing on the cake, but it's the cake that matters.
Another way to solve this problem is to allow multiple (nested) modules per file, a bit like OCaml. Then standard Haskell syntax can already handle everything we need.
This article sounds a lot like the way that records are implemented for Scala (in Shapeless), which is a language that meets all your other criteria :P.
(Not that there aren't good reasons to use Haskell instead of Scala, but Scala has everything on your list)
Actually, Scala is the language I moved away from to Haskell. It turned out to be too messy and bloated for my taste. The syntax and the type inference are massively inferior to Haskell's. The vast majority of Scala's advanced features are simply copied from Haskell (typeclasses, STM, Scalaz, most of the Shapeless library that you mentioned). The promoted style applies its object orientation mostly just to encode a module system (e.g., the cake pattern). A lot of things are simply impossible to achieve because Scala is not purely functional.
So yes, Scala might have an edge over Haskell here and there, but IMO, as a person who happens to have extensive experience in Scala, it is in general markedly inferior to Haskell.
Ceylon's type system is based on Java, so no, it is nowhere near as powerful as Haskell's type system. The above post didn't mention it, but I would add purity to his list. Ceylon doesn't have that either.
It's not. Ceylon's type system is way better than Java's; it's actually the best type system I've seen in a language. I wouldn't say Haskell's is significantly more powerful, if at all.
What I like about Ceylon is that it's pragmatic: it's great for functional, object-oriented, and imperative programming. One thing that I disliked about Haskell is that it can be a bit cumbersome to do imperative and side-effect programming (which is often the most readable and understandable way to write code).
Hmmm, perhaps my cursory skim of Wikipedia and the Ceylon site led me to wrong conclusions there. I'll defer a more detailed type system comparison to experts in the field. The big difference I see on my second pass is purity. Purity in Haskell is huge IMO. I would argue that purity makes Haskell's type system more powerful because a function's type allows you to know exactly what kinds of side effects it might have, or more importantly, not have. But I suppose one could debate whether that's a type system thing or not.
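To make that concrete, here's a minimal sketch (the function names are made up for illustration): a pure function's type rules side effects out entirely, while IO in the type advertises them.

    -- A pure function: the type guarantees no side effects, so calling it
    -- twice with the same argument always gives the same result.
    double :: Int -> Int
    double x = x * 2

    -- A function that may do arbitrary I/O: the IO in its type says so.
    readConfig :: FilePath -> IO String
    readConfig path = readFile path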
> can be a bit cumbersome to do imperative and side-effect programming
On the contrary, I think Haskell is great for imperative programming. It allows you to operate at higher levels of abstraction and create more abstract imperative constructs than most other languages. If you want to mutate things, there is a small cost to construct IORefs, STRefs, and the like, but I think that's actually a good thing because it correctly incentivizes the reduction of side effects. And again, much of this boilerplate can be gotten rid of. See https://github.com/ekmett/lens/blob/master/examples/Pong.hs#... for an example of some really nice imperative code made possible by the lens library.
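To put the "small cost" in perspective, here's a hedged sketch using the standard Data.IORef API (the counter is just an illustration): one extra line to allocate the cell, and the IO in the type records that mutation is happening.

    import Data.IORef

    main :: IO ()
    main = do
        counter <- newIORef (0 :: Int)    -- allocate a mutable cell
        modifyIORef' counter (+ 1)        -- mutate it in place
        modifyIORef' counter (+ 1)
        n <- readIORef counter            -- read the current value
        print n                           -- prints 2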
I have very little experience with Haskell, so I may be wrong about this: I gave up on Haskell because it didn't feel pragmatic; it didn't seem to make trade-offs that maximize productivity. Does purity (and its effects on language design) really make the language more productive? Do people love Haskell because they are super productive with it, or because they find it elegant, etc.?
Ceylon on the other hand really clicks with me more than the many other languages I've looked at.
No language as drastically different from the mainstream as Haskell is will feel pragmatic to someone who is unfamiliar with its paradigm. So I think it's perfectly understandable that you feel that way. It takes substantial effort to get out of the local optimum to find a better global one.
If you define productive as the number of lines of code you are able to write in the next month after you start learning the language, then Haskell will probably not be more productive. But if you define productivity as the amount of time it takes an experienced Haskell developer to build an application with a certain defect level, then I think Haskell is indeed more productive. And I believe this phenomenon becomes more noticeable the larger the project. Purity completely eliminates entire classes of bugs and allows you to be much more confident about the behavior of code when looking at it in isolation. It also substantially improves code reuse for the same reason [1].
Writing new code is one thing, but maintaining existing code is an even bigger area where I think Haskell has higher productivity. Haskell allows me to fearlessly refactor code in a way that no other language I've used comes close to. Purity is a big contributor here too.
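A toy example of why the refactoring is safe (my own sketch, not from any of the linked code): purity guarantees that factoring out a repeated expression cannot change behavior, since no hidden effect can run a different number of times.

    -- Before: the subexpression appears twice.
    area1 :: Double -> Double
    area1 r = (pi * r * r) + (pi * r * r)

    -- After: factored out. Purity guarantees area2 is the same function
    -- as area1; the compiler and the reader can both rely on it.
    area2 :: Double -> Double
    area2 r = let a = pi * r * r in a + a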
There are probably certain classes of problems for which Haskell still needs improvement in the library ecosystem in order to compete. But I'm in this for the long game and am willing to deal with this in order to get to a better capability over the long term. The ecosystem has surprising depth already and is growing quickly even though the community is still relatively small. For instance, a relatively recent improvement is that Haskell now has more complete OpenGL bindings than any other language [2].
So in short yes, there are definitely people who use Haskell because they believe it makes them more productive. I personally believe this is related to its elegance.
Having looked into Ceylon and thought about this a little more...
Ceylon's subtyping is the big thing coming to mind right now that makes me prefer Haskell's type system. The purpose of a type system, in my mind, is to prevent bad programs from being written. But when you combine subtyping with generics you get a problem. Nothing keeps you from looking for an Employee in a Set<Int>, because both Employee and Int are subtypes of Object and the covariant use of Set<Int> can be promoted to Set<Object>. But that's a nonsensical thing to do and should obviously be a type error. This kind of thing comes up a lot in practice and is a place where Haskell's type system is safer.
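For comparison, a small sketch of how that lookup fails to type-check in Haskell (Employee is a made-up type here; Data.Set is the standard containers API):

    import qualified Data.Set as Set

    data Employee = Employee String deriving (Eq, Ord, Show)

    ints :: Set.Set Int
    ints = Set.fromList [1, 2, 3]

    -- Rejected at compile time: Set.member :: Ord a => a -> Set.Set a -> Bool
    -- forces the element and the set to agree on one type 'a'.
    -- bad = Set.member (Employee "Alice") ints   -- type error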
This shouldn't be a problem, I think. There's no reason to cast to Set<Object> and then look for employees there. If a function wants to accept a set of, say, employees and ints, it would take a Set<Int|Employee> argument.
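Haskell has no Ceylon-style union types, but for what it's worth, a rough analogue of that Set<Int|Employee> can be sketched with Either (Employee again being a made-up type):

    import qualified Data.Set as Set

    data Employee = Employee String deriving (Eq, Ord, Show)

    mixed :: Set.Set (Either Int Employee)
    mixed = Set.fromList [Left 1, Right (Employee "Alice")]

    -- The lookup type-checks because the element type says explicitly
    -- that the set may contain either of the two.
    hasAlice :: Bool
    hasAlice = Set.member (Right (Employee "Alice")) mixed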
> imperative and side-effect programming (which is often the most readable and understandable way to write code)
Oh come on, at least provide a couple examples with a claim like that...
> One thing that I disliked about Haskell is that it can be a bit cumbersome to do imperative and side-effect programming
From a Haskell web scraping library of mine that isn't quite ready for public release yet:
    main = runScraper $ do
        get "https://www.paypal.com/login"                        -- fetch the login page
        postToForm (Name "login_form") (Just creds) AllVisible    -- fill in and submit the login form
        get "https://www.paypal.com/myaccount/home"
        cursor <- liftM (fromMaybe (error "Couldn't get cursor")) getCurrentCursor
        liftIO . putStrLn $ "Your Paypal balance is: " <> getPaypalBalance cursor
      where
        creds = [ ("login_email", "email@example.com")   -- put your credentials here
                , ("login_password", "password")
                ]
An example of using lenses and imperative programming to create pong[0]:
    -- Update the paddles
    updatePaddles :: Float -> State Pong ()
    updatePaddles time = do
        p <- get

        let paddleMovement = time * paddleSpeed
            keyPressed key = p^.keys.contains (SpecialKey key)

        -- Update the player's paddle based on keys
        when (keyPressed KeyUp)   $ paddle1 += paddleMovement
        when (keyPressed KeyDown) $ paddle1 -= paddleMovement

        -- Calculate the optimal position
        let optimal = hitPos (p^.ballPos) (p^.ballSpeed)
            acc     = accuracy p
            target  = optimal * acc + (p^.ballPos._y) * (1 - acc)
            dist    = target - p^.paddle2

        -- Move the CPU's paddle towards this optimal position as needed
        when (abs dist > paddleHeight/3) $
            case compare dist 0 of
                GT -> paddle2 += paddleMovement
                LT -> paddle2 -= paddleMovement
                _  -> return ()

        -- Make sure both paddles don't leave the playing area
        paddle1 %= clamp (paddleHeight/2)
        paddle2 %= clamp (paddleHeight/2)
From "Program imperatively using Haskell lenses"[1]:
    battle :: StateT Game IO ()
    battle = do
        -- Charge!
        forM_ ["Take that!", "and that!", "and that!"] $ \taunt -> do
            lift $ putStrLn taunt
            strike

        -- The dragon awakes!
        fireBreath (Point 0.5 1.5)

        replicateM_ 3 $ do
            -- The better part of valor
            retreat

            -- Boss chases them
            zoom (boss.position) $ do
                x += 10
                y += 10
As for "which often the most readable and understandable way to write code"... let's find out! I'll trade some examples for common tasks if you'll do the same :)
> use a language which solves this problem by design from the beginning
Now you either give up everything else Haskell gives you, or you've created a new language. Do you see how both of those solutions are problems in and of themselves?
While the annoyances are real, they are just that: annoyances. If you change language every time a syntactic feature (or lack thereof) irks you, you're never going to get anything done.
In which case you still aren't going to get anything done, but for other reasons (lack of libs, playing with the language instead of doing anything productive, advocating for Lisp instead of doing anything productive, etc).
> In which case you still aren't going to get anything done, but for other reasons (lack of libs, playing with the language instead of doing anything productive, advocating for Lisp instead of doing anything productive, etc).
Curiously, Racket lacks most of these pitfalls, except maybe that the macro system is so good you'll end up toying around with it too much. With some discipline it's essentially the perfect engineer's Lisp, though.
It's not like having your own homegrown dialect of non-idiomatic code is a good thing anyway. On the other hand, the linked library looks more like a working proof of concept for a solution to a longstanding problem, with the goal of eventually merging it into the language.
What makes you think this is not what happens in other languages? A recent article made a pretty good case for Perl being the most productive language ever but I don't see people ditching their current favorite language for Perl.
>What makes you think this is not what happens in other languages?
Of course it happens in other languages. But I'm certain of two things, a strong claim and a less strong one:
1) A thing X that happens in all/most languages doesn't happen to the same degree in all of them. Of this, I'm 100% certain.
2) Playing with/melding the language instead of being really productive tends to happen in Lisp a lot more (this is a personal observation, based on 1).
>A recent article made a pretty good case for Perl being the most productive language ever but I don't see people ditching their current favorite language for Perl.
For that to happen, (a) the article would have to be correct, (b) it would have to be easy for people to verify that it is so, and (c) people would have to have a tendency to migrate to what's more productive (instead of what they like, are used to, etc.).
So I don't think the fact that people don't migrate to Perl, despite an article being published about Perl being the most productive language, means anything with regard to whether Lisp programmers tend to be unproductive in the real world.
This can be true, but with discipline you can use superior technologies that may not quite be as popular as Java/C/Javascript/PHP but have all the libs needed for your use case.
> I think I have a radical idea: use a language which solves this problem by design from the beginning. Much simpler.
I completely disagree. It's much simpler to use a language with a few simple, powerful constructs, which can be used to build everything else as libraries.
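A standard illustration of that point in Haskell (my own sketch; unless' mirrors unless from Control.Monad): even control flow is ordinary library code, not compiler-blessed syntax.

    -- A reimplementation of Control.Monad's 'unless' in two lines.
    -- Nothing about it is baked into the language.
    unless' :: Applicative f => Bool -> f () -> f ()
    unless' cond action = if cond then pure () else action

    main :: IO ()
    main = do
        let loggedIn = False
        unless' loggedIn $ putStrLn "Please log in first."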