My question is, how will he feel about this three years from now? When he is trying to hire someone? Or when the folks behind Elm don't update it as often as they should?
The problem with unpopular languages is twofold:
* lack of talent that can step right in and be effective
* lack of resources to push the language forward
The first can be remediated by planning to bring new hires up to speed, and just making that investment in them. (It can also be a useful filter, making sure you are hiring someone who is really interested in the company and is willing to make the time investment to learn what is probably not a very portable skill.)
The second is a bigger problem if the main sponsor of the language moves on. If the main sponsor is committed, then you're probably fine. (I have no idea who pushes Elm forward; it looks like there's a foundation, judging from the Wikipedia page.)
Worth noting that in our experience, hiring has gotten way easier for us since we became an Elm shop. We really struggled to hire React engineers (who have a zillion positions to choose among - why would they pick ours?), whereas there seem to be a lot more great programmers who want to use Elm than there are companies hiring for Elm positions.
Here's a verbatim quote from a cover letter (one I happened to be reading this morning; we see a lot of similar stories):
> Despite my valiant evangelizing of Elm, my company has decided to embrace React over Elm, so I am looking for opportunities to develop Elm professionally.
Our Head of Talent said she'd never seen an inbound pipeline as strong as ours, and the #1 reason people cite for wanting to apply is Elm. The "Python Paradox"[0] is real!
The communities for languages like Elm are definitely smaller, but they're also "passion languages", if that makes sense. Nobody is learning Elm because they've been or feel forced to (like you might with React, e.g.) or because they needed to maintain a legacy codebase at an old job or something. They're learning it because they're interested in it, like it, etc. So when it comes time to hire you have a smaller pool of candidates than a React position might draw, but the candidates are going to be people who've played around with Elm and enjoyed it and want to get a job writing Elm.
I'm kinda surprised more companies don't take the risk - it's not like these are bad languages, either. There's still a huge chunk of people (in terms of absolute numbers, not portion of total devs, of course) out there playing around with languages like Elm or Haskell or a Lisp/Scheme, or OCaml, F#, etc. who'd be super excited to use those languages professionally.
But are language enthusiasts really better hires in general? I've known many of them who loved the theory and toying with a language but were not any better at producing actual value than the average joe.
> But are language enthusiasts really better hires in general? I've known many of them who loved the theory and toying with a language but were not any better at producing actual value than the average joe.
I think they may be safer hires, anyway. They will be able to program themselves out of a wet cardboard box, which is more than you can say about the average Joe who may just be good at bluffing the interview.
Provided you dangle enough money and have a good interview process you can easily cull those people who can't code themselves out of a wet paper bag.
The thing that would worry me about niche languages is the amount of wheel reinvention you'd have to do. Not sure the cost/benefit calculation starts to look so great any more when you realize that your enthused elm developers will have to build a whole lot of stuff that you can just import in other languages.
Most niche languages have an FFI escape hatch for when you really need a library and don't want to reinvent a wheel. Elm, PureScript, BuckleScript, and GHCJS can all interoperate with JavaScript; Scala and Clojure work with most (all?) native Java libraries; F# has access to most (all?) .NET libraries. Haskell, Idris, ATS, Erlang, and SBCL Common Lisp all allow you to access native system libraries.
FFIs can have warts, but in the general case library support isn't the obstacle people make it out to be.
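To make the Haskell case above concrete, here's a minimal sketch of its C FFI, binding `sin` from the system math library (illustrative only; a real project would typically hide bindings like this behind a typed wrapper module):

    {-# LANGUAGE ForeignFunctionInterface #-}
    module Main where

    -- Bind the C `sin` function from the standard math library.
    foreign import ccall unsafe "math.h sin"
      c_sin :: Double -> Double

    main :: IO ()
    main = print (c_sin (pi / 2))  -- prints 1.0

The same idea scales up: the foreign call stays isolated behind a typed Haskell signature, so the rest of the codebase doesn't need to know it's there.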
I don't agree. I find that the impedance mismatch of an FFI is an incredibly fertile breeding ground for nasty and obscure bugs and deployment problems.
> Provided you dangle enough money and have a good interview process you can easily cull those people who can't code themselves out of a wet paper bag
Interviewing costs money, and good tech interviewing particularly so. The fewer people you have applying to start with, the more money you save on culling them by way of tech interviews.
Not sure how Elm interacts with JavaScript libraries, but JVM languages can generally use any existing Java libraries. So a language built on a popular runtime (JVM, .NET; I guess there are exactly two) can still have excellent library support.
Aren't you just as likely to get people who excel at hyping and hello-worlding new languages but can't actually make applications? Or people who just read the Python paradox (with Python scribbled out and Elm written over it) who think they can look like a great developer who understands the value of Elm but really don't?
You'll get those, yes. I know because I was one of them... But I think you underestimate how bad the average interviewee can be.
Someone who hello worlds new languages may not know the syntax or idioms to write fizzbuzz in any specific language, but they will be able to characterise the rough shape of the problem, and not just sit in silence...
I have noticed a lot of language/framework collectors, where every new project on their CV is in a new framework or language. While it will expose you to a lot of different ideas, I doubt you will code very well in any of them if you are swapping every year. I have seen plenty of Python written like it's Java, and it's just less elegant code. And Django code that doesn't follow best practices seems to be the norm. If you take the time to master a few things you can produce some quite elegant solutions. If you only know the basics, you usually reinvent the wheel in a crappy way.
The best ideas are those that you can share between languages. When you get hung up on idioms, or when you recreate idioms that don't matter in another language, that's where the problems are. Exposure to a lot of languages allows a person to figure out the union of computational ideas that benefits all languages.
Like I say, when I see Python written in a Java style it's verbose and doesn't take advantage of much of the expressive syntax that Python provides over Java. You have given up the speed benefit of Java and kept code that is nearly as verbose.
And don't get me started on the way people misuse Django.
In my experience they're quite correlated (but not 100%). It seems to be a strong signal for passion about development, which I'd say is important for hiring someone who really cares about doing their job well (which I also find is an important and rare attribute in a good dev).
I think this depends on what you mean by “actual value”. A working product powered by a tangled yarn of hastily assembled spaghetti code has highly dubious long-term value, and what's worse, your “average-Joe” coders might not even realize/see the value of what that “one overly-careful coder” brings to the table.
They're a great way to seed your recruiting pipeline. Enough of them ought to be good hires to make this useful. After that, you just have to trust your hiring process, just like you would for any other source of candidates.
Because you get all of the problems mentioned in the article. And on top of that, JavaScript land is fad-heavy, and what's cool today might not be cool two years from now.
I personally know of a story where a startup started with Datomic and Clojure, and it was a bad idea because Datomic could not effectively delete things. Eventually they went the standard Java and standard Cassandra/Postgres/Redis type route.
He was working at a health startup, and this was something like a couple of years ago. He said that Datomic was immutable, so actually deleting things to HIPAA standards was not possible, I think. They might have fixed it by now.
Having installed a medical records system for live-fire use by an optometry practice, I can say that it's exceedingly rare that you ever want to delete anything that is logged in a medical records environment. It's so rare that it's never been done in the four or so years I've been maintaining the system.
You're correct that Datomic considers immutability a feature. That said, Datomic on-prem supports excision which completely removes data for exceptional cases. I suspect Datomic Cloud will sometime in the future.
With EU's GDPR (General Data Protection Regulation) looming on the horizon I think Cognitect would do themselves a disservice should they not include excision in their Datomic Cloud platform sooner rather than later.
F# is a bit surprising, since it’s already on top of a mainstream platform. If all else fails you can glue in C# if you ever find that you really can’t do something with F#.
F# and C# compile to the same intermediate language, are interpreted on the same runtime, and are produced by the same organization for the same IDE... Their libraries are compatible and they rely on shared access to the common .net ecosystem.
Your absolute worst-case scenario would involve decompiling your compiled F# assemblies to C#. More realistically, you would use dependency management to gracefully transition out of one of the languages.
Transitioning piecemeal works exceptionally well: it's exactly how I've transitioned numerous projects away from C# onto F# ;)
Ummm, PureScript compiles to JS and shares the same runtimes. Both languages have support from popular text editors, and their libraries are compatible and rely on shared access to the common JS ecosystem. In what way is it an apples-to-oranges comparison?
I have to ask: do you guys have the catch-22 of "we want you to have worked in it professionally before"? I'm being totally serious.
I worked in Elm, and loved it. I tried to get it in at several positions I worked at. Applied to a few Elm positions, but was told I needed prior professional experience. So I moved all my personal projects to React so I could be more marketable. My day job still largely uses JavaServer Faces or Django templates.
As noted above there are pros and cons. But if you're using a fringe language/tech and allow your team to contribute back upstream, that helps immensely. One of the other standouts I can think of is Jane Street and OCaml.
Just tinker with Elm at work on your lunch break and say you used it at work. If you're awesome, no one will know and/or care where you picked up your awesomeness.
Please don't do this. If you're not found out in the interview, you'll be found out on the job eventually due to your obvious lack of experience. Some of my most personally loathed coworkers have been people who have bullshitted their way into positions by claiming skills they don't have or aren't qualified in, making them a nightmare to work with.
Good developers should be able to pick up new technologies on the job though. And you are conducting interviews in such a way as to select for good developers, right? To the extent that that's possible anyway?
I'd rather just hire for good developers in general who are interested in working in Elm than to hire only for Elm experience. As you've pointed out, there are problems in doing the latter.
Yep, I'd rather take the person who says "I've never worked with Elm but I've heard it does X, Y, and Z well because those are things I've always found lacking in [insert JS framework here]" or even "I've never heard of Elm, but based on your description it sounds like it might mitigate X, Y, and Z problems because of [reasons], cool!"; over the person who says they've used Elm before but gives vague answers to follow-up questions (even if they aren't lying and have actually used it).
This can get really messy. Using it on your job creates a one-off orphan that no one knows how it works. And I've seen a number of people claim they used something when they didn't.
I've also found this to be a partial negative. Positive: you took initiative and want to learn. Negative: you've now written a product that only you can maintain, and it suggests you don't listen well.
I go down the honest path: I did a proof of concept or scratch project in this language, but didn't get buy-in from the rest of the team. Which sometimes leads to "why am I not good at selling new technologies?"
As an aside, at least my side projects can be pushed to GitHub. I feel like committing to publishing a repo is the only reason I even finish anything, like writing a solid README, stubbing out a few issues, and then eventually closing them.
I never know if someone has seen my repositories but it's been invaluable to have projects I can confidently link to. Until I got to that point, I remember an anxiety of "gee, I sure hope they take my word for my skills." Talk about imposter syndrome.
But yeah, I agree with what you were trying to say. Whether you are honestly representing your skill level doesn't have much to do with whether your experience was paid or done in free-time.
The aspects in which they differ are numerous and substantial. If a candidate expressed the sentiment that they are the same in an interview, I would not take them seriously since it would indicate that they are either ignorant or naive, and either apathetic about work, or likely to favor personal coding pursuits over the job they're applying for.
And you've managed to try your best to denigrate me without actually stating any of the reasoning. If you would, try again, but hopefully without the ineffective sense of superiority.
* Technical failure in a professional setting can lead to loss of your source of income and therefore, in the worst case scenario: loss of access to food, shelter, comfort and other basic life necessities. Failure at a personal project has no such comparable consequences. Techniques to minimize failure are quite literally a matter of survival in a professional context; they are often an afterthought in a personal project. Fault prevention and tolerance are often the most difficult characteristics to achieve in any sufficiently complex software system, and their prioritization or lack thereof has an enormous impact on virtually every aspect of working on such a system.
* In a professional setting, you often have little choice of who you work with. Therefore the technical choices you make in a professional setting must take into account the possibility that the next person who works on your project may be someone who does not understand, like, or care about the way you do things, and that the way they interact with your work may have significant negative repercussions for both you and the project. This is often a tedious, difficult, and time-consuming task that requires different techniques and approaches than one would use when working on a personal project. Conversely, in a professional setting you will also inherit the work of others which may be done in ways you don't understand, like, or care about, with few or no options to change those ways, and you must develop and adopt technical strategies to deal with this as well.
* Documentation, readability, and testing take much more precedence in a professional setting since they serve as guardrails for the long-term integrity of your work as it gets passed on to others to maintain and expand.
* Working with others also requires agreement, and usually mutual compromise, on basic standards, conventions, and processes that can significantly affect your daily workflow and may greatly differ from how you would work on a personal project.
* The end goal of personal projects is usually personal enjoyment, satisfaction, and learning. The end goal of the vast majority of professional projects is to generate revenue, or otherwise serve some need that your employer deems sufficiently important to fulfill by compensating you. The intersection of techniques that are personally appealing to you and the ones that will generate value for your employer may be very small or nonexistent much of the time.
* Your employer may not care about letting technical debt pile up indefinitely as long as you can keep cranking out new features and meeting deadlines, code quality be damned. Alternatively, your employer may impose onerous processes to ensure code quality and integrity that aren't really necessary or effective, and which may even be counterproductive. In fact, there's a good chance your employer will understand very little about what you do at all and make demands that are incongruent with technical realities, which you must deal with, sometimes by performing pointless tasks solely to appease them.
* Personal projects are usually started from scratch, and often dropped or 'completed' once they reach a certain level of complexity, due to the increased level of difficulty brought on by said complexity. Rarely does one have the luxury of dropping projects on a whim for such reasons in a professional context. In general, working in a professional context often involves dealing with a lot more annoying complexity and thorny problems that simply cannot be ignored. Even successful, enjoyable professional projects may become boring after a while since their success will draw continued investment by the employer, oftentimes in areas that are uninteresting or largely irrelevant to the aspects that were originally exciting.
> Techniques to minimize failure are quite literally a matter of survival in a professional context; they are often an afterthought in a personal project. Fault prevention and tolerance are often the most difficult characteristics to achieve in any sufficiently complex software system, and their prioritization or lack thereof has an enormous impact on virtually every aspect of working on such a system.
See, I almost see this as backwards. My personal projects often depend on only myself to maintain survival, and I didn't put all that time in to see them go by the wayside so easily. This means I build in tools and resources to help put them up and keep them up, because I'm the only one who can. In a professional context, I've often seen team members not consider these issues because once it's out, it's someone else's problem- if it causes downtime, DevOps, if it's a simple bug, whoever takes the ticket.
> Your employer may not care about letting technical debt pile up indefinitely as long as you can keep cranking out new features and meeting deadlines, code quality be damned. Alternatively, your employer may impose onerous processes to ensure code quality and integrity that aren't really necessary or effective, and which may even be counterproductive. In fact, there's a good chance your employer will understand very little about what you do at all and make demands that are incongruent with technical realities, which you must deal with, sometimes by performing pointless tasks solely to appease them.
While certainly true, this is also something I would never expect someone on my team to be proud of. This is a shameful act, and done often enough it leads to the departure of good talent. I expect management (including myself) to do its job: manage. This means not taking advantage of your employees often enough for it to become the norm.
---
However, the thing I keep noticing about a lot of these is a certain level of assumption of what a personal project entails. That you may be producing something with significant technical debt, that you are coding in a way that isn't trying to work best with others, that you're not aiming for revenue or having to work under unrealistic constraints. Personally, I code at home the same way I code at work: with quality and hope that it will continue to have a life long into the future. Sometimes I don't return to projects for months- and I want to be able to pick them up the way I put them down. The other thing I recognize is the kind of experience that has to do with working with others- something most personal projects certainly don't approach, but I don't look for them to solve it. Team experience still matters, I just also think personal projects mean something for technical experience.
> My personal projects often depend on only myself to maintain survival, and I didn't put all that time in to see them go by the wayside so easily.
That's great, but when push comes to shove, you can and will ignore an intractable issue in a personal project until you feel like working on it again. With a job, consistently ignoring issues can endanger your career and thereby, eventually, your ability to provide for yourself. (And to me, being a parasite who punts issues like you describe is hardly a better existence even if you're able to get away with it.)
> While certainly true, this is also something I would never expect someone on my team to be proud of. This is a shameful act, and done often enough it leads to the departure of good talent. I expect management (including myself) to do its job: manage. This means not taking advantage of your employees often enough for it to become the norm.
Sometimes accumulating technical debt is a more pragmatic option in the bigger picture - what I'm describing isn't necessarily a universally bad thing (although it often is bad); the point is that professional and personal projects have fundamentally different priorities and that they require different skills and approaches to navigate effectively.
> However, the thing I keep noticing about a lot of these is a certain level of assumption of what a personal project entails. That you may be producing something with significant technical debt, that you are coding in a way that isn't trying to work best with others, that you're not aiming for revenue or having to work under unrealistic constraints.
You are validating this assumption with statements like this:
> Sometimes I don't return to projects for months- and I want to be able to pick them up the way I put them down.
When you are in production with real customers, you rarely have the option to do this. It's great to strive to adopt good practices in your personal projects, but when shit hits the fan at 3am, you're not going to get out of bed and fix it unless you have some real skin in the game, and you're not going to be truly motivated to code as if that's a real possibility (That means having: runbooks, paging, escalations, dashboards, metrics, alarms, gradual rollouts, time windowing, calendar blackouts, rollbacks, multi-step deployments, canaries, tech-ops, feature toggling, A/B testing, backup/restore, DNS safeguards, load balancing, SLAs, pentesting, failover, status reporting through 3rd-party channel, etc. etc.). I never said personal projects don't mean anything for technical experience; they certainly do. But that doesn't mean they're representative of professional experience either.
Don't you want people that care about your product instead of the technology behind it? I do. Because there may come a time when another technology is a better fit for your product and ultimately your customers' experience. Then what? Now you're not using Elm and your team leaves or is disgruntled. I hire on passion for what we are trying to accomplish, not the technology stack.
> Don't you want people that care about your product instead of the technology behind it? I do.
I would rather have people who care about their craftsmanship and are indifferent about the product than people who care about the product and are indifferent about the craftsmanship. The former ones do a good job regardless of what the product is about.
Depends how good you want the people on your team to be. If your bar is low, then you might find enough people above the skill threshold who care about your product, but there are plenty of good craftsmen who couldn't care less about your IPTV offering or some other SaaS for housewives, yet who would put their skills to great use for either, just because they do care about how solid the technical part is.
Seems like a false dichotomy to me. It's plenty possible to choose (and recruit based on) a suitable language at the time you start a project, and to have people psyched about making a great product.
If down the line you find language XYZ is a much better fit, well you still have choices about migration/etc. I wouldn't expect the devs to revolt against this, if it really is a better fit.
Time and technology marches on, that doesn't mean we can't try to make the best tool choices we can while still accepting that change is a fact of life and adapting the best we can.
I tried this. I bought into the company vision despite a lack of experience with/desire for RoR. The plan was to migrate to microservices. This happened too slowly and rarely, and I ran out of enthusiasm working with a legacy monolith. The scaling problems were interesting, although they could be sidestepped in other ways.
I find this whole line of thinking odd. First, people's preferences over technology change, and pretty much everyone's preferences or "passions" develop. Second, you don't need to be passionate about the product to produce good work. You need not to hate it. You need to like the position and the work you are doing, but that does not require passion for a contracted web page for a financial company (or whatever).
These are unbelievable fantasies. The time when another technology is a better fit comes a few years later, by which point a) team members might already have changed preferences multiple times, and b) given the average employee changes jobs once every two years, your original team members have left.
I think that hiring would be much better if companies hired less on applicant emotional state and more on calm, rational decisions about work.
The Python paradox is interesting in 2018. At the time he wrote this, Java was the go-to language to teach in university and Python was reserved for hobbyists. It seems that has almost completely flipped now, and everywhere wants to teach Python first. Evidently, there's an awful lot of not-so-smart Python programmers available on top of the smart ones now. It looks just like Java looked in 2004.
I definitely disagree that Java is looking like Python did in 2004, or even starting to; it still has a huge market share, even compared to Python. Yes, Python is definitely more popular now, enough so that the Python paradox doesn't even apply, but it will probably never apply to Java, or at least not in the current computing paradigm.
Maybe I read GP wrong, but what I took it to mean was that the market/talent position of python in 2018 is similar to java in 2004, not that java in 2018 is similar in any way to "2004 python".
Elm, Haskell, Clojure, etc, seem like they fulfill the role of "2004 python" in 2018.
Yes sorry, I worded that badly and didn't intend to mean that Java was in any way like Python was in 2004. Only that Python is no longer something you'd consider someone smart for using.
I saw the same at CircleCI with clojure. People want to use functional langs in real systems - we had a lot of Haskell lovers apply because "close enough". Way easier than hiring rails devs. One of the reasons I'm using OCaml/Elm in my new startup :)
> One of the reasons I'm using OCaml/Elm in my new startup
Where do I apply? ;-)
(To prove the point: yes, I'm one of those passion people, moving to Denmark to work in OCaml full-time, before working in Clojure full-time. Now someone give me an Idris job, heh!)
I'm not the quoted person, but we hire Elm devs (or people who want to become Elm devs - no prior Elm experience necessary; you can pick it up after joining!) all over the world.
Most of our team is remote, including one from Copenhagen!
You and your team (NoRedInk, Evan, et al.) could actually grow the number of amazing Elm devs out there. I think part of the reason is that there's a very narrow avenue to traverse to even build production-level Elm apps. One surefire discouraging factor for someone learning and wanting to embrace a language is being unable to see their work see the light of day at the hands of real users/consumers.
P.S. I'd love to learn and apply Elm across. I've applied at NoRedInk but did not get any response.
There is SimCorp and Issuu that I know of, but I've also been at a meetup hosted by a company doing ReasonML. And there are a couple of people doing OCaml at Zendesk, but I don't know if they use it in production or just for fun.
Strangely, if you switch Elm for Scala, you get the exact opposite result (at least where I work). We got mostly inexperienced candidates and the ones able to understand the codebase can get more interesting offers elsewhere.
Scala has a steep learning curve, so we're looking for devs with at least one or two real production projects (that aren't using Spark) under their belt. It turns out to be quite a challenge to find someone who has experience in the ecosystem and the language, AND who _likes_ working with it.
Plenty of developers have significant Scala experience and like working with it, but the demand greatly outstrips the supply - if you look at a chart of average salary by language Scala is a huge outlier.
Interesting. I wonder if there's a fade-out of the Python effect, where the talented devs that pick up new languages move on to something else. E.g., if you want to create a big new long-lived project you pick a new language to get a good team, but for maintenance you'll struggle if that language doesn't become popular.
I'm a little confused. React isn't a language, or at least I thought it wasn't a language. When I hire people I'm not the least bit interested in which JavaScript framework they have already used. I kind of assume that if you can learn one you can learn them all. The idea of labelling yourself a react engineer seems really limiting to me
> I kind of assume that if you can learn one you can learn them all.
Knowing C doesn't mean you can't learn React, but if you've only ever done C and low-level systems programming, you have a lot to learn to get to the level of someone who has specialized in knowing React and its environment. It's not just React; it's everything around it that also matters, such as browsers, HTML, CSS, and all the best practices there. And while I'm sure anyone can learn that, the question is, would you do it in reverse?
Would you hire someone who knew React, CSS, HTML, and web development to write systems level C and expect them to learn it all, and, most importantly, be effective in their role?
Labelling yourself as a react engineer isn't limiting. It's just one of many things. You can have many labels, and adding a label doesn't take away from other things you can do. However, it is an effective way to communicate what skill sets you have to people that would be good to work for and with.
I would look askance at someone who calls themselves a "C Developer" in much the same way I would at someone who calls themselves a "React Developer". The reason being that it just smells wrong, like they read a "C for Dummies" book and are trying to fake it til they make it or something. The correct label is something more general about the domain (eg. systems developer, front-end developer, etc), although I realize these labels are imperfect and subject to their own anti-patterns (eg. devops engineer), they at least signal you understand something about the broader landscape.
Where does that specialization stop, reasonably? Isn't this the same with JavaScript? If you have worked with any Algol-derived language you can learn them all.
In theory it doesn't stop. Most developers have the capacity to learn anything. In reality, though, I've found it's more that some devs are not willing to work with certain languages.
Specifically with respect to the hiring advantage it’s a little disingenuous to not explicitly state that the creator/designer of the language works there.
That’s a significant enough reason to pick one shop over another and also a reason to opportunistically apply.
“You get to work with Evan/Guido/dhh/Gosling” is a different sell than “we use Elm/etc”.
Part of the reason we hired Evan was because the hiring advantage had been so great already, we started to ask the question "how do we maintain this hiring advantage as Elm gets more popular and the Python Paradox eventually wears off?"
I don't think we're especially close to that happening yet, though. Obviously the opportunity to work with Evan is a big differentiator for us among Elm shops, but the cover letter I quoted exemplifies a person whose reason for leaving their current position was that they wanted to work with Elm, not React.
There are plenty of opportunities for companies to attract people like that. :)
You really ought to disclose that you also hired the language's BDFL when you make comments like these.
With that said, I sort of agree with the sibling comment here. Most other people involved with a language that I've seen are still able to make objective criticisms about it.
Yes, it's the ol' Haskell tax. As a part-time Elm enthusiast, that's why I'll probably use it to build my own stuff to open-source/sell, rather than try to get a job doing it, lest my family go homeless. Hyperbolic, but kind of true! However, it's great for the company hiring. They can get solid code at a solid price. I've seen the same with a Haskell/blockchain job. The company owner admitted in a chat channel that he would have said "Berlin salary" rather than "not quite SFO" in the job ad if he had known the response rate he would get.
Elm doesn't support code splitting or `insertRule` styling, both of which are important for performance. `elm-package` is currently piggybacking GitHub's infrastructure instead of having its own hosting. I think Elm should have more patch releases. I don't think `comparable`, `number`, or especially `appendable` should be in the language, and hope they get removed someday. I don't think `==` should be `a -> a -> Bool`. If I were in charge, ports would work with `Value` directly instead of doing automatic conversions at the edges.
Also, your comment was needlessly harsh. If you wanted to know what I consider Elm's flaws, you could have tried just asking.
I'm always impressed by your ability to keep cool when someone seems to be making a personal attack (though I'd like to assume they aren't). I think Elm/Evan is lucky to have you as a kind of evangelist.
You're right, my comment was needlessly harsh and I apologize for that. You didn't deserve that and it was rude of me.
The subject of Elm often gets me riled up because there's so much good about it mixed with what to me is a greater portion of frustration. I strongly disagree with how the language is managed, with the closed nature of its development, but mostly with what looks like the constant dismissal of any concerns about any of these issues any time they're brought up. By far, the most common response feels something like "everything is fine and your opinions are baseless."
There are people I know who, like me, have tried to convince employers to even consider Elm and have been shot down because, in management's words, "there doesn't seem to be any idea where the language is going", or it's "too unsafe, it's all dependent on one guy", or "looks like it's dead now anyway, no releases in over a year and apparently it sucks at basic things like dealing with JSON". And yeah, some of those things, like the JSON nonsense, shouldn't be showstoppers. But not having an idea of what the roadmap or timeline looks like is a big deal. No releases in a year is a big deal. So then you bring that up on the Elm forums or the Slack, and you end up getting blasted there too, because now you're seen as criticizing this thing that everyone loves.
Obviously I'm just a jackass on the internet with more mouth than brains and I wouldn't hold your breath waiting for me to ever make anything as interesting or yes, successful, as Elm is right now. I appreciate the work you've put into making Elm as good as it is.
> what looks like the constant dismissal of any concerns about any of these issues any time they're brought up.
I remember earnest (and exhausting) discussions of these topics from back in like 2014, when it was unclear what the best way to scale Elm's development would be. We tried different approaches, but the outcomes weren't great.
We're actually still experimenting with this. For example, Evan wanted to put a community member in charge of the debugger, so he did...about a year ago. That experiment hasn't been fruitful (the debugger has not had a substantial release in that period), so we're trying handing it off to someone else. We'll see if that works better based on what we've learned from the previous attempt.
It's easy to say someone else should have taken it over sooner, but at the time there wasn't anyone else who understood the internals well enough to manage it, while also being interested in taking it over. Now there is.
Relatedly, I get that some bosses want to see a higher bus number on the compiler, but functioning teams of compiler authors don't just drop out of the sky. It's a rare specialization, and even among the few people who have the training to do major work on a compiler like Elm's, most are academically oriented and didn't go through a Ph.D. program so they could performance-optimize Haskell code, make nice error messages even nicer, or find ways to remove language features rather than adding that cool new one they read about in a paper.
There's also communication. Evan used to give lots of updates about his progress on upcoming releases, and the main effect was to delay the release. He'd announce some progress and it would instantaneously become a Q&A - at best. Often people would pressure him to give up on whatever the latest impediment was and ship something sooner. If he didn't respond to those responses to his progress update, people would complain that he was unresponsive. So doing public updates strictly increased complaint volume, which of course also takes a toll on Evan, being a human and all.
> But not having an idea of what the roadmap or timeline looks like is a big deal.
The two options here are:
1) Be honest
2) Try to trick bosses
There isn't a world where Elm stakes out a roadmap and a release timeline and then hits it. Elm is trying to do a bunch of things differently than how they've been done before, which means each release is in some ways experimental, and also that each release changes substantially based on lessons learned from the previous release.
So yeah, it'd be possible to make up a bunch of "oh yeah, Feature A is gonna come out in April, and then B will land in July" but anyone could look back a year or two later and realize that the stated roadmap had barely any relationship to what actually happened. (People would complain about that too - that Evan wasn't sticking to the roadmap he'd laid out.)
We know from past releases that this is how it's gone (in my hubris, I told my editor at Manning I predicted Elm 0.19 would be out last summer) so there's no way to publish an official roadmap without knowingly misleading people.
If anyone considers it a red flag that there's no official timeline, well - that's because such a timeline wouldn't mean anything anyway. Evan's approach is to be open and honest about this. [0]
This is how we've ended up with a community of people who largely don't think it's a big deal: process of elimination. Everyone who considered it a deal breaker is still using JavaScript.
I don't know how helpful that all is, but maybe it sheds some light on the history of some of these things. :)
Maybe it would be useful to list your criticisms of the language? All Robert did was point out how NoRedInk's decision to go all in on Elm has been a net win for hiring; why such a negative comment without anything constructive being said?
His FrontEnd Masters course demonstrates this to be false. He was straightforward about the language and acknowledged shortcomings around things like JSON decoders, for example.
Furthermore, last I checked the source code doesn't live inside Evan's head like a family recipe. The functional principles and clear Haskell influences won't die if Evan were to go away.
At Real Kinetic we've actually been internally using Elm for 2 years. While at Workiva, I OKed some internal projects to use Elm as well. Your observations are absolutely correct.
We have actually found on-boarding engineers to Elm is fast and relatively easy. There is a very predictable learning curve engineers seem to follow. That's one of the things we like about Elm versus some other solutions. The even bigger observation I've made is that all of the engineers we've exposed have become fans, like Alex.
The lack of resources and other companies contributing to the community and libraries is a challenge, and a concern. As an example, we use Elm Native UI for some projects. It has a limited user-base, which has at times been frustrating. We're hoping to see more adoption of Elm to help mitigate this problem.
Yet another problem is that for an unpopular language it is easy for its developers to introduce incompatible changes. If one does not have resources to fix the codebase, it is not much different from the developers abandoning the language.
Elm is still at risk of such changes; witness the fundamental change to its event subscription model after 0.16.
Of course, such incompatible changes may happen even with popular languages. But at least you know that you are not alone and there will be companies that support your version. Take, for example, the case of Python 2, still supported and even installed by default on many systems instead of Python 3.
> The first can be remediated by planning to bring new hires up to speed, and just making that investment in them. (It can also be a useful filter
It can be a useful filter in both directions. I'm more impressed with prospective employers who have an onboarding plan including some reasonable model of a learning curve than prospective employers who assume "oh, we use popular language/framework/tooling x/y/z, so if we find someone who knows these specific things, onboarding will be a snap or practically take care of itself."
In the general sense (not talking about Elm), it's a risk going off the beaten path to leverage some new way of doing things that makes life easier. Sometimes that risk pays off, sometimes it doesn't. Sometimes it pays off big time early and then starts costing big time later (I experienced this using Meteor from early on in its life). I went with C# when it was still beta, and that paid off really well. I went with embedding Lua into an embedded system very early in Lua's life, and that was a great choice. I tried moving to F# from C# and it was just too painful for the devs to switch (though I really like F#, its advantages over C# weren't compelling enough). Now I'm considering trying again by converting C programmers to Rust for an embedded system where we are refreshing to a more modern chipset.
So mixed results, but just because it might not work out isn't reason to not take a risk, just have to work out how appropriate that risk is.
Hiring: Elm is quite easy to learn. It's a shaved down Haskell for the language, with a military style strict version of React as the view paradigm. So you just need someone who has done some ML type language or is keen to learn, which is many people. Also see my sister comment about Haskell tax. (The employee pays the tax to the employer!). We really need to get away from this idea of "She's a Java developer", because she isn't. She is a problem solver with programming skills.
Lack of resources could be an issue. I am hopeful about Elm though as the founder has plenty of cash and passion to keep pushing it forwards, plus a full time job doing just that. Plenty of people who can step in if required. The compiler is written in Haskell so is probably quite maintainable. Much of the functionality is in libraries maintained by a diverse group.
This is always a risk with new tools and techs. Thankfully there are sometimes enough early adopters to help keep the new thing in motion and improving.
Considering that a lot of code that gets written is thrown away or otherwise doesn't survive for years, a bit of risky experimentation isn't actually as risky as it seems.
Even though I'm one of the never-satisfied/always-seeking-newer-better group (and my list of used and discarded tools/languages is very long), I do sometimes wonder if we would be better off with fewer choices and more effort spent on the concepts and methods of software development rather than on the tools.
Absolutely. Whenever I see a new project, I look at business risk and technology risk. Either one is OK, but both are not. (I picked that up from someone, don't remember who).
I think that there's real value in being a Tom Bombadil (and really learning a language/framework/problem space deeply) as opposed to being a Gandalf. However, one fundamental issue is that it's easier to be a consultant and speaker if you are focused on the new shiny objects, and so much of what is written is from that perspective.
I have pondered about this depth vs breadth approach for some time. My path has made me a generalist who learns and does whatever I think is necessary in a given situation. This is great because I can do anything. But I envy the domain experts who do one (or a related few) things VERY well.
Perhaps it depends much on the mind of the person; I don't think I could just focus on one thing forever. But while the new shiny tech guys may seem to have more opportunities, I believe the domain experts may get paid more and have more of their career time spent as "recognized leaders".
It's probably good that people are all different, and both types (and all in between) exist.
To analogize, if you are a general carpenter, you can always find work, but it will be at a lower rate. However, it will be varied.
If you are a fine cabinet carpenter, your work will likely be more focused, possibly more repetitive, more lucrative, and harder to find (you'll have to seek out the folks who need really really nice cabinets).
Exactly. And maybe the grass is always greener, but sometimes I envy the COBOL guy who earns a small fortune doing something that doesn't involve learning a new language and framework every two years.
I work for a consultancy (both general consultancy and engineer), and the phrase they like to use is "a T-Shaped consultant". Meaning that you have a broad base of skills and one specialism. For us, most people fitting that model works pretty well.
> * lack of talent that can step right in and be effective
Competent people, not "talent". Talented people are very rare, and the chances are you haven't ever seen one in reality.
And there are several things to note about hiring programmers to write in less popular languages: (1) it's much more difficult to hire an incompetent idiot who knows Elm than a similar idiot who knows C#, Java, or Python -- the signal-to-noise ratio is much better; (2) there is such a thing as the Haskell tax, which pretty much means that writing in Elm can be treated as a job perk; and (3) you don't need people who know Elm, you only need people who can learn it in reasonable time.
Especially (3) is important, because learning yet another language of the same paradigm is not that difficult for a competent programmer.
By coincidence, I just had breakfast with someone whose company adopted Elm; they made a great case for it. I'm excited to check it out.
But I'd add that popular languages have related problems. Java, Python, and Ruby are all ones I've used in production, and I'd say every one has a big problem pushing the language forward. They all have failed me differently in that regard, and I'd still happily use any of them again in the right circumstances.
The talent situation is interesting as well. With a popular language I can find more people who claim to know the language. But of the people who show up, a lot more of them will not be particularly good. In practice, given the work necessary to get somebody up to speed on our domain, our chosen libraries and frameworks, and our own code base, helping them learn a new language doesn't seem like much on top of that.
The big things that keep me from picking novel languages for long-lived production code are libraries and production considerations. It's such a huge advantage to be able to download a decent library rather than having to code everything from scratch. And I don't want to be the company pushing a language into unknown performance territory. It's noble work, but it can be expensive and introduce a lot of volatility that I really don't need.
Regarding the second point: as a counter-argument, it’s not certain that languages and frameworks from major vendors will continue to be supported either. For example, Microsoft deprecated various UI frameworks, Firefox extensions are changing, and so on. (I’m sure there are better examples I can’t think of at the moment.)
Sure, large vendors deprecate stuff all the time. But you'll probably have plenty of notice and they'll have a path for your existing applications to follow. It may be painful, but it won't be catastrophic.
Contrast that with what would happen if No Red Ink went out of business and the author of Elm couldn't find a job that would allow him to continue to develop it. Things would be fine for months or years but eventually bitrot would set in.
Definitely not trying to spread FUD. I know some great folks that swear by Elm. It's just another risk factor (just like tech debt that might accrue should you use jQuery) to consider.
Actually, I used to work with Evan Czaplicki (the author of Elm) at Prezi around 2015. He was laid off with a bunch of other employees during a “reorg”. It didn’t seem to have affected Elm.
That's not a ringing endorsement of Elm's commercial viability. Prezi hired the Elm creator and funded Elm development, and then ran out of money to pay him.
> [the author of Elm couldn't find a job that would allow him to continue to develop it]'s just another risk factor (just like tech debt that might accrue should you use jQuery) to consider.
Nope. jQuery fit(ted) a different use case (spicing up server-side rendered HTML up to small browser-based app bits). Doing a browser-side app in Elm (what it's specifically made for) makes for a code base that is so much easier to work on some years from now than using jQuery to accomplish the same.
What I want to say is: the debt you're going to accrue applying jQuery to Elm's domain is not merely a risk; it is impossible not to be destroyed by that debt. That is where Elm provides a serious alternative to the browser app frameworks that are 10 generations younger than jQuery.
This is correct but sad. It's why better languages like Elm and Rust struggle to gain acceptance while inferior but well marketed efforts like Swift dominate.
Woah. I love Rust and program in it super often, but I've looked at Swift also and it seems like an incredibly capable language. It has similar capabilities for algebraic types, and a "typeclass"-like approach to parametric polymorphism. Also, if I'm not mistaken, Graydon, the creator of Rust, works on it now.
What makes you say Swift is inferior? It seems like a huge improvement over ObjC.
Most good devs can learn a new JS framework very quickly...hell I've done it about 10 times now. It's really not a big deal, most are pretty similar or variations on other popular frameworks from other languages.
What matters more is the quality of the language/framework and relative availability of libraries for the particular usecase (general popularity is not always necessary).
You're interested in solving problems you might have in 3 years, and Elm solves problems you definitely have now and will continue to have in 3 years. In 20 years of development I've never been happier, more productive, or more error-free.
I think the hiring woes problem for languages like this is a reasonable hypothesis, but I've never seen any actual evidence for it. I constantly hear from folks who are worried that they will have this problem, and frequently hear folks saying they tried it and it wasn't a problem (see sibling comments, for example). I'm not sure I've ever heard from someone who has actually had this problem.
Sometimes niche technologies have benefits, as you can be a big fish in a small pond.
When I need to do something new, I look for small companies from weird places with promising products. I get favorable terms, and usually rapid turnaround on features as we figure out which things we thought we needed versus reality.
This has been a pain point for us with a couple of the language updates. Not as bad as other pre-1.0 languages I've used in the past (like Rust), but it has been painful. The compiler is helpful enough that upgrades are mostly just tedious.
In the worst case, you just adopt the runtime yourself. I don't understand why people draw boundaries around certain frameworks, runtimes, and languages and say "I'm hiring that kind of engineer". A good developer can go up and down the stack as needed and fix any part of it --- including the language runtime.
People do tend to have a mental block where the possibility of fixing one of their dependencies is something that wouldn't even occur to them. To the point where they'll implement really complex workarounds in their own code instead of submitting a one-line patch (or sometimes even reporting it).
That said, as someone who is no stranger to contributing to upstream, the prospect of becoming the maintainer strikes me as something that is certainly not to be done lightly.
Haskell has enormous momentum now[1], and its speed of development is accelerating. I came to it not because it was cool, but because my experience maintaining and refactoring a big Python program had become really painful. Haskell lets me keep the codebase smaller, it's easier to be pretty sure things are working, it's easier to refactor, and I'm sure the language will only keep getting better. Those are all understatements.
That has exactly been my experience as well, and the reason why I decided to work with Haskell professionally instead of continuing to write Python, Elixir or what have you.
On to the topic of this submission itself, Elm turned out to be a gateway drug to Haskell for me. Similar to Paul Chiusano[1] I decided to switch from Elm to GHCJS (Haskell). I started with https://haskell-miso.org/ (based on the Elm architecture) but I'm currently developing using http://docs.reflex-frp.org/
I used Java, C and C++ before coming to Python. Unlike theirs, Haskell's type system is complete -- even a function of functions can specify exactly what kinds of functions it uses for inputs and outputs. That makes higher order programming much safer -- and higher-order programming might be the best way to move fast.
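A tiny sketch of that point (my illustration, not the commenter's): the type of a higher-order function pins down exactly what kinds of functions it accepts and returns.

    -- A function of functions: the signature says it takes a function on `a`
    -- and returns another function on `a`, nothing more and nothing less.
    twice :: (a -> a) -> (a -> a)
    twice f = f . f

    main :: IO ()
    main = print (twice (+ 3) 10)  -- 16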
Haskell is also astoundingly terse. In Java and C my data type declarations were too long to fit on a page, full of redundancies and boilerplate. In Haskell if you want to make, say, a data type that is either an X or an O (suppose you're writing tic-tac-toe), you could do it in four words: 'data XO = X | O'. (Notice that there's not even a natural way to do that in Java or C, because they don't have sum types; you'd have to make a type that has a flag to indicate whether it is an X or an O. That gets really complicated if they're supposed to have different data associated with it -- but in Haskell, if the X is supposed to carry a float and the O is supposed to carry a string, you just add two more words.)
Pattern matching also helps with terseness. I don't have time even to write the last paragraph so I'll skip this one.
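For illustration (an assumed extension of the XO example above, not taken from the comment itself), here is the payload-carrying variant together with a pattern match that consumes it:

    -- Each constructor carries its own data; pattern matching unpacks it.
    data Mark = X Float | O String

    describe :: Mark -> String
    describe (X weight) = "an X carrying " ++ show weight
    describe (O label)  = "an O carrying " ++ label

    main :: IO ()
    main = mapM_ (putStrLn . describe) [X 1.5, O "corner"]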
Purity keeps you from tripping up on IO-related errors. It lets you be much more certain that things are working. It also forces you to keep the IO in a thin top-level layer of your program, which might sound like a pain but once it feels natural you'll find yourself moving faster than you could before.
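A minimal sketch of what "IO in a thin top-level layer" can look like (my example, with assumed names): the pure core does the work and is trivially testable, while only `main` touches the outside world.

    -- Pure core: no IO, easy to test in isolation.
    summarize :: [Int] -> String
    summarize xs = "total = " ++ show (sum xs)

    -- Thin IO shell: read input, delegate to the pure core, print the result.
    main :: IO ()
    main = do
      input <- getContents
      putStrLn (summarize (map read (lines input)))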
To be sure, Haskell has features that I don't use. But purity, sum types, pattern matching, and the unusually rigorous (it's complete!) type system are all critical to its value to me.
That complicated type system with type inference does come with its own costs, usually bad compile speeds. It's what I've noticed when looking at languages like Swift, Rust, Scala, and Haskell.
I'm currently dealing with it in a large Swift project, and I would much rather go back to the extra verbosity of Objective-C than have type inference at this point.
The type system is probably not the bottleneck in any of those cases. As a sibling comment points out, OCaml has very good compile times, and its inference problem is basically the same as Haskell's.
In the case of Rust, I suspect one of the biggest issues is the way parametric polymorphism is implemented. Basically, if in your program you end up using e.g. Box<usize>, Box<MyType> and Box<Result<String>>, you're compiling Box three times.
I don't know enough about Swift to hazard a guess as to where its builds spend their time.
My experience with Haskell is that compile times are neither great nor terrible.
Check out OCaml (BuckleScript or ReasonML on the frontend, depending on which syntax you prefer). It has a super-fast compiler with almost total global type inference. The Facebook Messenger team report incremental builds of less than a second.
Data point: I’m working on a several hundred thousand line server using Scala. The incremental compile time is usually 2 or 3 seconds. A clean build of the project is around 2 minutes.
I've only used Swift at that size, but I've heard similar stories with all of those languages once you have a large code base.
I vaguely remember reading about people basically cornering compiler maintainers about compile speed at a Haskell conference, but that was several years ago.
When I have to recompile, it's usually because I've only modified a few files, and it's usually really fast. I particularly enjoy that if I refactor things without actually making any changes in the way it works, GHC won't blink; it knows there's nothing to do.
Yeah the codebases I'm talking about are something around 1 million lines over several apps, libraries, all of the tests and codegened models, mocks and network services.
That's been the story of Haskell for over 10 years. The world and other language ecosystems are changing faster than Haskell is growing.
Clojure and Scala give you alternatives to Python that have the power of the Java ecosystem to take your programming out of academic and toy projects.
And Python finally has static typing, 10 years after it was announced.
You're sort of painting a picture here of Haskell and all other languages skating towards some unknown target point, with Haskell skating the slowest. But, at the risk of sounding like a fanboy, it could be that Haskell simply started out significantly closer to that target point, so that other languages are effectively skating towards it.
In this way, both your statement and the parent's statement can be simultaneously true.
What changes do you think Haskell can't keep up with?
If you want to approximate the cool things in Haskell with a Python-compatible* language, there's Coconut. In addition to static typing, it offers algebraic data types (how I lived without sum types, I don't know) and pattern matching.
Why is this not more popular? Coconut seems like something many people, including me, would want to use for every Python project of sufficient complexity. What's the catch?
Fanboy here. I've used Coconut extensively and it's a joy. I wouldn't start a (personal) Python-targeted project without it. Pattern-matching, a non-horrid lambda syntax, lazy evaluation, TCO, a built-in partial function syntax, and finally the pipeline operator (|>) are all things I'd hate to be without now. A more detailed list of features (incl. MyPy integration) is here: http://coconut.readthedocs.io/en/latest/ . The docs are just great too.
The catches:
1. You have to be OK with a compilation step that isn't part of the Python world.
2. To the best of my knowledge the language has been and still is being developed by a single person, so there's the "Evan gets hit by a bus" risk built in.
3. Tooling is virtually nonexistent. There's a Vim plugin that understands the language syntax, but there's zero IDE support. Of course you can debug the generated Python with PyCharm/VS Code/etc., and having done it I can say it's not terribly painful, but it isn't terribly fun either.
I think that over time the lack of tooling will be the biggest hindrance to the language's further adoption. OTOH I'd be curious to know if this has been a major factor in other unpopular languages remaining unpopular. I seem to remember a lot of complaints about tooling in the early days of Scala but the language still managed to become fairly successful once that situation improved.
I thought about it, and I think the unpopularity of Coconut might be due to the existence of the Toolz library. That gives you most of what Coconut does, minus the better lambda syntax and TCO.
You write in a dialect that Python programmers do not understand. It is a deviation from the norm, and it places expectations on others who have to read, understand, and use your code.
Python 3 supports type annotations natively, and the mypy tool is an external typechecker that you can run just like any other static analysis tool during your build.
Right, but it's misleading to say "Python has static typing", because while an implementation of static typing in Python exists, the vast majority of the Python ecosystem doesn't use it.
What counts as a static or strong type system is sort of up for debate; the terms are not well defined. Not OP, but IMO type annotations are static but not strong. I'd consider a strong type system one that enables you to encode additional invariants in the types, with sum and product types being perhaps the minimal requirement.
This argument isn't convincing. These artifacts are hints that can be ignored, not constraints to the python interpreter. It seems to me a bit like arguing that the existence of Coverity (and other static analysis tools) means C is as safe as Rust.
The types were never meant for the Python interpreter, they were meant as documentation and as input for static analysis tools. Mypy just happens to be a static typechecker that takes advantage of the type annotations.
Just because mypy is optional doesn't mean it's not static typing.
I don't get it. The author didn't address his original concerns; he just said "Elm is awesome", which may be true, but isn't a rebuttal of his previous points, which are condensed here:
>You can go through the whole development lifecycle of the app and you’ll rarely encounter a situation where you can’t find a fix online in 5 seconds. Somebody else has already worked out the kinks. My strategy was flawless.
I suspect the missing connection is this: if you encounter a conceptual/technical bug with a popular language/framework, you can find a solution online quickly.
If you write buggy code (especially bugs that don't reveal themselves until live in production) it involves much, much more pain to fix those.
That's one of the points he raised, and I see the value now. However, the author also talks about the dangers that come from using lesser-known, unstable languages (he mentions finding bugs in the compiler, segfaults, and so on), and he doesn't say how Elm solves them.
The author does emphasise how complete and well-built the official Elm tools are, which kind of mitigates this concern. But I agree, the article doesn’t really spell out its reasoning.
> does emphasise how complete and well-built the official Elm tools are
It's not a good point. It's just unrealistic to rely on the standard library alone, even if you're writing Scala or Python. Community size must be an important factor, even for Elm. It is highly unlikely that Elm is so perfect that community size simply becomes irrelevant.
I think we can all agree that this article isn't finished at all.
Good point. I suspect to some degree it's mitigated by the fact that anyone attempting to re-implement (most of) Haskell must care about correctness a great deal, and having Haskell already in place limits the number of conceptual bugs.
Anyway, this is pure speculation on my part, I have yet to dive into either language.
If that also has the consequence of significantly fewer bugs in Elm libraries, you will also encounter far fewer of the former when using third-party libraries.
This is a common reaction to languages like this, and especially to Elm.
I also use Elixir, and it has a great community and everything, but somehow Elm's is even more so.
As for all the concerns about 'unpopular' languages and lack of tooling, I feel it is quite the opposite. The Elm formatter changed how I work, and now I've started using formatters in other languages too; Elixir and JS are using them more, or maybe I just started paying more attention.
There are other smaller things that I noticed.
I wish I could work more in Elm, not less.
Also one more thing: Elm made me wish to be a way better programmer. You are surrounded by smart people and you just need to show more if you want to keep up.
When you say smarter programmer I understand what you mean. But the way I look at it, Elm allows me to relax and be a dumber programmer. I commit my smarts up front to the type design and interfaces between types and then I can relax as the project grows from there because the compiler will enforce the invariants I've encoded into the types. Pure Bliss.
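As a toy example of what that up-front type design can look like (hypothetical types, written in Haskell rather than Elm, but the same idea applies): an impossible state simply can't be constructed, so the rest of the code never has to check for it.

    -- Example types only: "logged in but nameless" is unrepresentable,
    -- so no later code ever has to check for that state.
    newtype UserId   = UserId Int
    newtype UserName = UserName String

    data Session
      = Anonymous
      | LoggedIn UserId UserName

    greeting :: Session -> String
    greeting Anonymous                    = "Welcome, guest"
    greeting (LoggedIn _ (UserName name)) = "Welcome back, " ++ name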
> Elm allows me to relax and be a dumber programmer
Yes, this! I don't feel smart enough to write programs well in JavaScript. Elm brings clarity and confidence without having to second-guess myself all the time (thanks to the compiler and fantastic error messages).
The beauty of strong static typing–you don't need to get the types right immediately. Just get an initial design out the door and iterate towards better designs as you go. The compiler helps you tremendously for refactoring, and managed deprecations let you change types gradually over time.
Elm is awesome. I just wish many more people and companies would adopt it (and ClojureScript) so it would gain popularity close to that of TypeScript. This could make the web (and the frontend job market) a better place.
I agree. I think TypeScript is a decent language, but after working with the DOM in a functional way, versus object-oriented and imperative paradigms, I don't think I'll ever go back. To paraphrase and hijack what the author stated, HTML feels like it was created for Elm (and functional programming in general). I'd add HTTP to that as well. The combination just feels right.
I'm a Java dev with just some basic experience writing functional code, mostly in Java/Kotlin/Groovy/Ceylon (all of which are primarily imperative!).
Can confirm: Elm is awesome, easy to pick up as long as you understand basic things like union types and immutability, and I found myself productive in it within a single day!
I notice Scala missing from your list - if you need the JVM or find Elm to not quite have what you need, it's a great language. It's basically the opposite end of the spectrum, design-wise - Elm is "let's create this highly opinionated, carefully curated language and try to make it perfect" and Scala is "let's throw every feature we can derive into our type system and let people work it out".
It's got its issues (mostly that it's incredibly easy to abuse powerful features), but it also has a ton of stuff I really miss in other languages.
I have a web-based party game (CaH clone) I wrote in Scala for the back-end, Elm for the front-end. It's a bit old (I'm planning a rework and update when the next version of Elm comes out), and it's definitely not the best code ever as it's a hobby project, but you might be interested.
Scala's flaw - and Elm's strength - is that it is a massive language that allows a large amount of magic to happen. Elm is by comparison tiny and extremely explicit, and error messages thrown by the compiler are almost always super helpful. But Elm can't (really) be used on the server, so it doesn't hurt to look at Scala there, though be prepared to have a hard time finding experienced engineers to hire.
I'd argue that while Scala has more powerful features, it has less "magic". Elm can't be used on the server because it has a magical create-app function that you must feed exactly the right functions into in order to make anything. AFAIK it's not a general-purpose language.
Definitely not general purpose, but it has nothing along the lines of implicits and implicit type conversions and such, which make Scala code impossible to read, if not write, without a tool like IntelliJ.
I find it best to use Scala's implicit stuff for things that would be completely invisible in another language (e.g. "this line might fail with an error" or "this line accesses the database"). That way if you're reading in a plain text editor you're no worse off than you were in, say, Python, but if you use a tool like IntelliJ (and you should!) then the GUI is enhancing your experience, telling you more about your code and reducing the need to click around library code to understand what the code you're reading is doing.
We have different definitions of 'magic'. `Html.program` is magic, `comparable` is magic. Elm has been steadily removing features and adding magic since they lost `Signals`
There is so much to love about Elm. It is a typed language, so it eliminates typing issues, like 1 + "1" = "11". Its compiler is great. The compiler catches almost everything and offers easy-to-read suggestions to fix your code when there is a problem. Elm's compiler virtually eliminates runtime errors; at least I've never had a runtime error with Elm.
I also like the debugger. It allows you to easily capture your steps as you click around your application, save those steps into a file, and send that file to other developers, which allows them to run through your steps on their own machine; seeing Elm's output at each stage. It works like a "steps to reproduce" bug report, only automated, which makes finding and fixing difficult bugs easy.
There is a lot of good documentation for Elm as well. Elm's documentation itself is good. Manning and Pragmatic Programmers both have good books on Elm (both are still early access versions though). Pragmatic Studio also has an excellent video course on Elm for about $60 (https://pragmaticstudio.com/courses/elm), if you're interested in learning it.
The recording of steps is called event sourcing, and both Vuex and Redux implement the pattern. I'm sure Elm is great, but that benefit is not unique to Elm.
Elm's creator is a visionary [1]. E.g. Redux took inspiration from his ideas, and so did Rust's compiler error messages. It was rather common on the Elm core newsgroup to see him asking for secrecy on new ideas before releasing a new version of Elm.
The negative part is that Elm's development is rather slow and not pragmatic. This is painful in the short term - especially if you come from JS land.
1 + "1" => "11" is not the sign of an "untyped" language. I would expect 1 + "1" => 50 (ASCII asm), 1 + "1" => 242 (EBCDIC asm), 1 + "1" => 2 (length-tagged string, e.g. an untyped Pascal-ish variant), or 1 + "1" => some seemingly random value (C lang) in an "untyped" language.
1 + "1" => "11" is strongly typed, dynamically typed, with a particular type conversion that favours "strings".
But, on reflection, I am probably a pedantic iconoclast.
That's just a "type system by obscurity". Extending it to provide the same sort of safety, you'd need a specific function for every combination of types: sum_two_integers(), sum_integer_and_float(), etc... Which would be both extremely ugly and a terrible waste of memory.
You seem to have missed the whole point of my comment. sum_integer_and_float() is the opposite of what I'm talking about. If you wanted to extend this system within numeric operations, you'd have e.g. divide_and_return_integer() vs divide_and_return_float() (funnily enough, this is also a very popular distinction for languages to draw); sum_and_return_integer() vs sum_and_return_float(); etc.
This usually isn't done because integers and decimals are fundamentally very similar, so for addition there is a platonic correct result when you add an integer and a float (the floating point result), and the operation returns that. Then, if you wanted an int, you can cast your result.
That doesn't work if you want to consider string concatenation to be a form of numeric addition. It isn't. There is no ideal result when adding a string to a number, and if you do one when you meant to do the other, there is no way of producing the data you wanted ("11") from the data you have (2).
Perl doesn't have much of a type system. But it solves the common problem of manipulating strings when you meant to manipulate numbers, or vice versa, by not assigning the same operators to these radically different operations, and I think that was the correct choice, even for languages with stricter type systems.
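Haskell happens to draw the same line with separate operators, so the mix-up becomes a compile error instead of a silent coercion - a small sketch:

    -- Example only: numeric addition and string concatenation are
    -- different operators, so mixing them up is a compile-time error.
    total :: Int
    total = 1 + 1        -- fine: 2

    label :: String
    label = "1" ++ "1"   -- fine: "11"

    -- broken = 1 + "1"  -- rejected: a String is not a Num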
> This usually isn't done because integers and decimals are fundamentally very similar, so for addition there is a platonic correct result when you add an integer and a float (the floating point result)
After further consideration, I believe this is incorrect. This usually isn't done because integers are closed under addition, subtraction, and multiplication, so the "promote according to argument type" C behavior doesn't lose accuracy in those cases. There obviously is a platonic correct result when you divide two integers, but it is not in general an integer, so the C behavior introduces bugs (explaining the popularity of divide_and_return_float() -- I don't care what types the arguments were, as long as I get the result I'm looking for).
My first thought: The author would probably be equally satisfied if they had used JavaScript with flowtype and react. It sounds like they're comparing JQuery + Bootstrap (and similar "old" frontend frameworks) to Elm.
I think the point still stands that unpopular frameworks/languages can still be stable and more effective than popular frameworks.
I can't speak for the author, but I made the transition from JS+React to TypeScript+React to Elm and it "changed my mind" too.
When I started with React/Redux, I didn't know much about functional programming, but I developed a certain interest... A couple years later my React stack was full of tools and libraries that allowed me to write my React apps in a more functional manner. I used TypeScript for the type system, ImmutableJS for immutable data structures, Ramda as an FP utility library and Recompose to call React itself in a functional manner. I also used pure stateless components exclusively... Then I switched to Elm and I realized that the React stack I was working with was a crippled version of Elm. I'm currently writing my first app in Elm and it feels much smoother.
Perhaps, but with Elm you get a much better experience revolving around the tooling (inspired by Golang, I believe). This would be one of my top arguments for picking Elm over a giant mess of React + infinite choices of libraries and configurations.
Not the author, but I've used React + Redux and Flowtype and played a little bit with Elm. Elm felt so much cleaner, the typing was great and it was a breeze to learn. I'm certain I'll go with Elm on my next web project.
I built this in-browser database app entirely in Elm. Since there is no existing rich component library available, I had to write everything, including a high-performance grid implementation, from scratch. The entire app took about a week and has had zero runtime bugs since launch. I have to give credit to Elm for most of that.
One of the main problems with unpopular languages (which this article completely ignores) is growing the team, and hiring in general. In other words: the future of your project.
It's not just that it's hard to find people to join you, it's that even engineers who might be considering joining might decide it's not a good career move since they are going to spend years learning a language or a platform that sees no adoption and will not serve their future career.
This piqued my interest in Elm, which led me to start reading the Elm introduction[0], and it's great! I don't know if I'll ever use it in production, but these well-written docs sparked the programming interest in me once again. Will definitely write some side project in Elm.
We use(d) Elm at the company I work at. (A start-up.) Elm is great. All of the positive rumors about it are true.
The issue we've had with Elm isn't typically discussed in these conversations: My CTO doesn't seem to see the value of it+. So we recently replaced our Elm code with JavaScript.
I wonder if anyone else finds themselves in a similar situation.
+It's a bit more nuanced. We're in a very "MVP" stage; the line of thinking is to use something everyone's more familiar with so we can move fast.
I wouldn't write off an entire company on this abbreviated description, but damn, I'd be worried about this. It does not speak highly of your CTO.
I'm not saying Elm is objectively better than JavaScript (I don't do either). But can't these things sit side by side? And, if so, how is rewriting code aligned with moving faster? At worst, keep what you have in Elm and write new code in JS. Also, there's plenty of "ugghh" with JavaScript, so I'm skeptical of anyone throwing away existing code in order to rewrite it in JavaScript.
This is all doubly true for an early-stage MVP, where I'd expect a CTO to be technically minded and enthusiastic. Sounds more like "I know JS so we'll do JS."
Perhaps my description of our CTO/situation left out too much detail. (I was trying to be succinct.)
> It does not speak highly of your CTO
Our CTO is the kind to speak highly about. Strong technical/software engineering skills. Willing to experiment with new/different tech. Business minded. He agreed whole-heartedly to move forward with Elm in the first place. This is actually our 2nd project with Elm now.
> Can’t these things sit side by side? How is re-writing code aligned with moving faster?
Yes; in fact, we’d always had a mix of Elm+JS.
Our product is not a Single Page App. (Very deliberate decision.) We’re only using JS/Elm to make small parts interactive. So there was only a relatively small amount of Elm code that was replaced.
It allows us to move faster because our designer and a couple of junior devs on the team don’t have to struggle through learning Elm.
> It allows us to move faster because our designer and a couple of junior devs on the team don’t have to struggle through learning Elm.
elm-html is my biggest criticism of Elm. Why make the templating so difficult for non-programmers to work with? It's holding back Elm adoption for sure.
If you're looking for a language for your own projects then, sure, Elm is a fine choice. When it comes to getting a job as a developer, however, it's a different story. The languages companies are willing to pay big bucks for tend to have been around for a long time. Tech, as a profession, is paradoxically very conservative. Even startups tend to go with Rails and that's been around for over a decade. Ecosystem maturity matters where money is at stake. Today I was offered a £460 per day contract to do Codeigniter for MBNA. Unfortunately it involved relocation. Not even Laravel, just plain old Codeigniter. Who's paying that to write Elm? Personally I love writing Clojure for my own projects but, again, Clojure jobs are thin on the ground even in London so I don't expect to make a career out of it.
> > First, Elm has the natural predictability of a pure functional language; when you write Elm, the compiler forces you to consider every case.
> I'm a beginner with functional languages, but isn't the type system completely orthogonal to the fact that Elm is a functional language?
Yes. The author probably would be equally satisfied with any robust typed solution (flowtype, typescript). They also say that Elm nicely interfaces with the DOM, which I believe is mitigated by JSX.
So in some sense the article is more about JQuery/Bootstrap/other legacy solutions being bad.
On robustness, you're absolutely right: Elm can't be beat. But it comes at a price: it's pretty limiting/underpowered. TypeScript is very expressive these days and you can write some very neat libs that feel dynamic but are actually fully typed, if you bother (to be fair, most people don't bother). On the other hand, Elm usually doesn't provide many ways to do something, and you may even have to cheat and offload some work to dirty-old-JS-land via a port to unblock yourself or simply deliver functionality on time.
I'm still unsure whether the freedom is worth it or whether the robustness wins at the end of the day; it may depend on what kind of app you're writing and how strong the team is (e.g. Scala has the same "issue").
With the huge exception being Lisp-based languages (Clojure, Scheme, etc.). Heck, Lisp was the first functional language and the first one that made full use of dynamic typing.
Dynamic typing (like static typing) is a spectrum, actually. You can flag method missing errors at a minimum, but in a nominally dynamically typed system, you could flag class mismatches before you ever got to a dispatch error.
"Functional language" has come to tend to mean a language with an ML-style type system, possibly because almost all serious languages these days have first-class functions and map/reduce/filter. But ultimately the term means different things to different people.
Not at all. The language determines what type systems are possible. The lambda calculus serves as an excellent base for languages that admit good type systems.
It means that the two features are independent: you can have the compiler check types in non-functional languages, and you can have functional languages that do not have static typing. Elm happens to have both static type checking and language features supporting functional programming.
If one really wanted to... one could run a browser on the server to run a page powered by Elm, and then click on the page using Selenium style web driver. I'm guessing it wouldn't scale well :)
I wonder how to become a good functional programmer. I think I'm already decent at procedural/OO.
Do I need to have a strong maths background? I didn't learn it very well at university and this worries me. Many functional language fans seem to have degrees in maths.
FP is mostly a state of mind - of course picking the right language really helps with the incentives, but you can do it even in Java/C# (in fact, if you do C#, the great LINQ library helps).
How to approximate FP in a mostly OOP language:
- use immutable data structures: there is no way around them if you want to be able to fearlessly share and "modify" data. The naive way is to copy the object before touching it; the better way is to use efficient persistent structures, e.g. Clojure's data structures from Java.
- you either have data classes (no methods, no private fields) or execution classes (no data fields, only static methods); no mixing
- if you stick to the above, you will find that returning void from a method is very difficult.
Congrats, you're doing FP: the gist of it is that it's all about keeping state explicit; if you pass some A into a function, you will return a B at some point, and that's your result. No implicit state.
Of course, we're missing the whole part about side effects, so to add to the above: if you cannot write a unit test without mocking something in your method, you're doing side effects. They should be done only at the "border" of the application, to (maybe) get you the data you need, so you can bring it and process it in the pure core.
And this is where languages like Haskell help: to understand where the side effects are (because they are included in the types) and to prevent mixing them around (which only gets you an untestable mess in the end).
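In Haskell that "border" is visible right in the type signatures. A rough sketch with made-up names: the fetch is IO, the computation is not, and the pure part needs no mocks to test.

    -- Example only. Effectful border: fetching data is visibly IO.
    fetchScores :: FilePath -> IO [Int]
    fetchScores path = map read . lines <$> readFile path

    -- Pure core: no IO in the type, so it can be unit-tested without mocks.
    average :: [Int] -> Double
    average [] = 0
    average xs = fromIntegral (sum xs) / fromIntegral (length xs)

    main :: IO ()
    main = do
      scores <- fetchScores "scores.txt"
      print (average scores)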
Math isn’t required. FP is about thinking of data structures and transformations from A to B. If you’ve ever used underscore.js, you’ve probably used FP patterns and functions. Map, filter, reduce are all part of that.
I don't think so. If anything I feel it's a bit easier as you don't need to reason about object state. It's also different, which is the main hurdle. If you use emacs you've already been exposed - Emacs Lisp is functional. If you're coming from a Web background both Elm and Elixir are great places to begin. You'll often hear stuff like "Javascript can be written in a functional style" - which is true, functional programming is a paradigm you can just go with - but seeing it really embraced by the language is really inspiring.
Pick a project. Pick a functional language. Code the project. That's it. It's not so different that, given enough willingness, you couldn't learn it yourself if you're decent at procedural/OO. You don't need any advanced math whatsoever.
> But when I joined Real Kinetic, I found out we were writing web client code in Elm. Elm? Really? The experimental language created for Haskell snobs who can’t handle stooping to the level of a regular blue-collar language like Javascript?
'Bit of an odd take, considering Evan's aversion to high-minded abstractions.
> "I don’t want to be the guy that finds a bug in the compiler."
When I'm wearing my "be productive" hat, I don't either, but how realistic is that? Unless you've memorized the bug database for your compiler, running into a known bug is just as frustrating as discovering a new one, and I'm pretty sure I've run into at least a few bugs in every compiler I've ever used. According to my comments, my current flagship program has workarounds for 5 (known) compiler bugs.
I'd love to use only stable bug-free compilers (maybe Forth?), but I'm not sure that's practical. A more reasonable solution is to only use the popular parts of languages -- though apparently I'm not so great at discerning what those are, either!
> Elm requires that you think through all the edge cases. You must consider and specify what will happen in every case.
Wouldn't that be the case with _every_ typed and compiled programming language (or at least, every typed and compiled programming language that supports pattern matching)?
* sum types and exhaustive pattern matching (incidentally, still not the default in GHC in 2018 because reasons[0])
* has only one escape hatch of "Debug.crash", which it strongly recommends not using and which it seems many Elm devs aren't even aware exists (in my avowedly shallow experience)
* and which you use as bottom by hand-rolling pattern matches
* at which point you might as well do it correctly
To the extent that you can require proper handling of everything, I would say that Elm is much stricter than Haskell or OCaml or Rust.
You can be very strict in them, but they don't enforce it to the extent Elm does. At least in my experience.
GHC will coverage check any case equivalent to one you can write in Elm. You can just write more powerful types where GHC can't tell that your case is exhaustive so it warns about missing branches that won't ever actually be taken. I seem to remember that perfect exhaustive checking for some combination of Haskell extensions is undecidable.
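Concretely: with -Wincomplete-patterns (included in -Wall, but not on by default, which is what the footnote above is complaining about) GHC flags the same kind of missing case that Elm rejects outright. A small sketch:

    {-# OPTIONS_GHC -Wincomplete-patterns #-}

    -- Example only: GHC warns that the Triangle case is never matched;
    -- Elm refuses to compile, GHC only errors if -Werror is also set.
    data Shape = Circle Double | Square Double | Triangle Double Double

    area :: Shape -> Double
    area (Circle r) = pi * r * r
    area (Square s) = s * s

    main :: IO ()
    main = print (area (Circle 1))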
> Even though Elm is a small language with a small community, that doesn’t affect the Elm experience in a noticeable way.
This conclusion was pulled out of thin air, with no justification from the rest of the article. The title is misleading, the article is really about why the author enjoys Elm over JavaScript...
The general rule of thumb that a larger active community leads to faster software development is more or less still true. There is no reason to suspect this is not the case with Elm.
It's all fun and games until you get hired to build a production-grade web stack in a dinky game-scripting language with no community, that you have written 0 lines of code in. Some people live and breathe to reinvent wheels in 19 different languages. Not my cup of tea.
> The general rule of thumb that a larger active community leads to faster software development is more or less still true. There is no reason to suspect this is not the case with Elm.
Yes, and not only because of the volume of copy-paste source code on StackOverflow. I suspect Elm is slower to get a feature out, but once it is out you spend less time fixing bugs in feature A caused by adding feature B, which silently changed some assumptions you made in the code. And so later on in the same project ... you are faster.
> It's all fun and games until you get hired to build a production-grade web stack in a dinky game-scripting language with no community, that you have written 0 lines of code in. Some people live and breathe to reinvent wheels in 19 different languages. Not my cup of tea.
From what I have seen Elm is absolutely fit for production-grade work. There is a community. The community tends to have smarter people in it, on average, simply because all the less smart people are put off. Sounds elitist? Maybe.
> There is a community. The community tends to have smarter people in it, on average, simply because all the less smart people are put off.
A community shouldn't be judged based on how smart everyone is (quite a difficult measure), a more useful metric would be how many useful community libraries there are. For example, if I can choose between 50 carousels implemented in React vs. 2 in Elm, and I'm an average developer -- what will I choose?
Modern JavaScript undoubtedly towers over Elm in terms of go-to-market time for arbitrary apps.
> you spend less time fixing bugs in feature A caused by adding feature B silently changing some assumptions you made in code.
This is a big problem with 2007 JavaScript and jQuery. React/Vue and other declarative VDOM frameworks bring the Elm philosophy to the masses.
> For example, if I can choose between 50 carousels implemented in React vs. 2 in Elm, and I'm an average developer -- what will I choose?
IMO this is an advantage of the smaller/smarter/niche (whatever you want to call them) programming communities. When they congregate around a couple libraries for common problems, those libraries tend to be very good. Speaking from experience in the Elm and Clojure ecosystems.
It’s much better in terms of developer time to evaluate 2 good libraries than 50 libraries where the quality ranges from abysmal to good.
> This is a big problem with 2007 JavaScript and jQuery. React/Vue and other declarative VDOM frameworks bring the Elm philosophy to the masses.
But ... in addition to npm/babel etc. you need a wetware constraint checker. React says "human: make sure your data is immutable - please don't forget or something might break later in production". Elm says "I guarantee immutability"
> A community shouldn't be judged based on how smart everyone is (quite a difficult measure), a more useful metric would be how many useful community libraries there are
That seems more of a metric for an ecosystem than a community.
It's presumptuous to assume "community A is smarter than community B". It's just not even the right question to ask. Better questions to ask are "what are third party libs like?", "how active are people in this community?", "how active is the SO and other forums?". These are actionable questions.
"People who use Elm are smarter" is simply not actionable information. What does it even mean? It's meaningless.
> It's presumptuous to assume "community A is smarter than community B"
You can, though. E.g. rocket scientists vs. janitors.
Note that what I am not saying (which people might take offense to) is "you use Vue not Elm, so you are dumb". Nope, of course that isn't true! I am saying that in aggregate, due to selection biases etc., Elm attracts smart people. A lot of really smart people will stick to popular frameworks and not use Elm. Some bad programmers might use Elm for some reason. Aggregates are what I am talking about.
> "what are third party libs like?", "how active are people in this community?", "how active is the SO and other forums?". These are actionable questions.
These are good questions.
Another question though is: how much do I trust another library not to have bugs? Given Elm's design vs. vanilla JS, I'd go for Elm. That confidence extends to "how much do I trust a library I knocked up over the weekend to solve a previously unsolved problem in Elm".
> It's presumptuous to assume "community A is smarter than community B".
To assume with no basis, sure; to conclude it may or may not be.
> It's just not even the right question to ask.
It's not inherently operationalized, sure, though (as seen elsewhere in this thread) "advertising jobs for platform A results in us needing to filter out fewer candidates with no real programming ability than when we advertise jobs for platform B" is pretty much an operationalization of it, and quite (to address your later point) actionable.
> The community tends to have smarter on average people in it. Simply because all the less smart people are put off. Sounds elitist? Maybe.
Have you thought about the possibility that it's actually a community of people who first and foremost _consider_ themselves as being smarter than the rest? Maybe this is what's really off-putting to equally smart, but more humble developers?
Not with Elm. It's a nice intersection of smart and down-to-earth / helpful types. And "smart" here means "conscientious, passionate developer smart", not "born with 200 IQ". I don't get any impression that Elmers "_consider_ themselves as being smarter than the rest" from talking on the subreddit, etc. Haskellers though, I think it's a different story, but YMMV.
I tried and failed to pick up Elm. I've tried and failed to pick up a few FP languages. They tend to become more and more cryptic as I progress with them. I feel dumb, but I'm happy to stick with C-like languages, like PHP, for web development.
You can adopt some practices from FP when writing PHP. For example, often instead of iterating over an array and doing some imperative sequence of modifications, you can do a sequence of maps, filters and a reduce... you replace error-prone mutable code with a more declarative approach that says what you are doing clearly.
I've played with Elm a while ago and it's been a very nice experience.
I wouldn't use it for any frontend work, just because I believe single-page apps should be avoided where possible, but if I had to write a SPA I'd reach for Elm.
The only thing that makes me worried is that Elm 0.18 has been out for quite a while now and most of the development has been carried on behind the scenes by Evan.
This is good because it gives the language time to evolve in a coherent way instead of being a collection of bolted-on features, but it also means that it's a bit hard for outsiders to track progress in the language.
After reading this article and reflecting on my own experiences and habits, my rule of thumb would be "choose the best of those with second-tier popularity." It's common that the most popular option isn't the best: it has advantages in numbers, but also disadvantages like a low S/N ratio.
Many of the pitfalls of scarcity are avoided, and you can sometimes find a well-organized, curated, cultured pocket of enlightenment.
I see that the current version is 0.18. Curious if the language has become more stable with fewer breaking changes than when I looked at it two years ago.
I feel unpopular languages are best relegated to niche use cases the language is well suited for. If you are going to build something mundane like a CRUD app, a game, or an ERP, why wouldn't you just use some mundane blue-collar language like JavaScript or its equivalent?
What you are doing isn't new or groundbreaking, so why bother bringing in extra drama and ceremony by using a language very few people use to achieve the same result? Just seems like added complexity for no reason, even if the code looks simple with the first pass.
> why wouldn't you just use some mundane blue-collar language
Because for something so mundane then it really doesn't matter what you use and you know the domain so well that you can recover from any hangups.
I certainly don't consider it self-evident that you always just use mundane solutions to mundane problems. Depends on your appetite for risk and how biz-critical the problem is.
I find this line of reasoning a bit strange. I've only dabbled a bit with some unpopular languages, but I don't think their limited popularity implies they're only suitable for niche use cases. In fact after using some languages (most recently Clojure) I find programming in other mundane languages like JS a huge step backwards.
People are probably the biggest determining factor in achieving a quality result. But I don't think that means we should forget about trying to improve our tools.
You might think Japanese is an awesome language and decide to learn it, but then what use is speaking Japanese outside of Japan? Around the world people still just use boring ol’ English, even though English is actually a pretty shitty language and full of hacks to make up for weird edge cases (read and read, goose and geese, mice and meese?, Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo)
Likewise, if you write something in Clojure, you have far fewer people and libraries and platforms that can help you accomplish whatever you are doing. If what you are doing is not something new or groundbreaking that could only be done well with Clojure, then the extra effort you have to spend to get up and running is not worth it.
I have thought about this problem some and currently see it as an issue of scaling thresholds.
If it's literally just you, there's a lot of benefit to leveraging existing work so that you can focus on the differentiating part of the project (which is probably not a language innovation). It's not just the libraries but the whole ecosystem - example code, troubleshooting help, IDE support.
If you have an engineering team of even modest size, the picture can change very swiftly towards ensuring your result is built on a solid foundation, even if it means a lot of pioneering infrastructure has to be built and a lot of late nights spent debugging core toolchain issues. Team efforts have the necessary momentum to break free and do that ground work as the overhead gets swiftly absorbed in "person-year" budgetary terms. Individuals can only really justify the same as their core direction of research, sole hobby, or speculative investment - e.g. being the "first to implement" some hot new standard could be a well incentivized career move.
That argument may be true for other languages, but Clojure is a hosted language which means you have access to all the libraries created for that mundane Java language (should you need to use them).
>I made it a hard and fast rule: if I found two technologies that could solve a problem, I would choose the one more people were using. I didn’t want to include an obscure graphics API and then discover that no one had ever called set_color() followed by resize_window() (resizing is hard) and somehow those two functions in sequence cause a segfault. I don’t want to be the guy that finds a bug in the compiler. I just need to ship the product.
Then he links to issues for Qt and Go. They're certainly not obscure. What kind of weird argument is this? If anything, the argument there is that it doesn't matter how popular something is, as even extremely popular things like Qt and Go will still have bugs.
Nowadays it's easy enough to use real Haskell in the browser with GHCJS, and FRP libraries like Reflex provide a much more complete experience than Elm's restricted form.
This is why I'll never be a web developer. "Let's drop names! I made bridges out of Wood.io, then I used Steel (TM), now I use Stone.js. What's fun, now there's data in the kiddie pool!"
It feels like watching a cult of optimization function in a culture where measurement is taboo. It's absurd beyond belief.
That's fine. I don't think the ecosystem needs any more people that get their blood pumping over some inconsequential name reuse.
To get Elm confused with the email client from this title, you'd have to not know the Elm language existed. So it seems like it's more a moment to go "oh I see, that exists" than to curse the world because you thought "Elm changed my mind about unpopular languages" somehow referred to an email client.
But I get it. It's cathartic to be angry on the internet. The angry shadow boxing just gets pretty old for the rest of us no matter what kind of developer you are.
I'm talking about the uselessness of saying "I used X, now I use Y." WHY does one change what one uses? For reasons supposedly. But of course nobody ever provides reasons. It is enough to simply drop names. Print out your resume. Because heaven forbid anybody provide some real data that shows why one platform is better than another.
I'm certainly not confused about what Elm is. That'd be easy enough to find out. But "why you stopped using it" sure as shit isn't something I can google.