
Just curious, what companies (or kinds of companies) are betting on Scala nowadays? It seems that the people wanting "better Java" all decided they like Kotlin, and the functional programming people now gravitate more towards either F# (for .NET ecosystem) or OCaml/Reason (for the more unixy world). And the academic/research/learning crowd seems to like Haskell more.

Who's still in the Scala boat? Are Google or FB or other big companies known for giving back to open source growing any Scala codebases now?




I've been working for 3 years at a growing startup that has been hiring Scala engineers at least every half year since I started.

In the last ~18 months, the number of CVs we're getting has been steadily increasing. Part of that is probably that our startup's hiring is maturing, but the kind of CV we get is also changing.

I'd say that 3 years ago, there was an 80% chance that the applicant was highly self-motivated to learn Scala in their free time, and tried to (or did) introduce it at their current workplace. Today, there is an 80% chance that the applicant either "had to" learn it in their current workplace, or learned Scala when switching jobs. (Don't get me wrong, they're still motivated, and they took the chance when it was there!)

So there is a switch from the Early Adopters to the Early Majority (where the Early Majority now has worked 1 or 2 years with Scala at their current job, and is confident enough to look for a new one).

One driving force was definitely Spark, but there are a lot of enterprise apps unrelated to ML (usually with higher traffic requirements), the sort that would most likely have been done with Java or C# 5 years ago. It seems a lot of enterprises introduce Scala when they try to break up their (Java) monolith into microservices.

So it seems that Scala has been carving out its place in backend/microservices with scalability requirements, and is eating part of Java's cake there.


> It seems that the people wanting "better Java" all decided they like Kotlin

I've been using Java and Scala exclusively for the past 10 years at two large NY banks and two fintech start-ups, and I literally do not know anyone who has ever compiled a Kotlin program.

Simply extrapolating from one's own experience is fraught with peril.


F# only has access to a subset of .NET deployment scenarios, where Scala can be used pretty much everywhere there is a JVM.

So most companies actually are more keen to bet on Scala than F#.

OCaml/Reason world lacks the wealth of Java libraries, which are an import away on Scala.
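As a small illustration (not from the parent, just the standard Java interop story): any Java library on the classpath is usable from Scala with a plain import, no bindings or wrappers required.

  // Plain JDK classes used directly from Scala.
  import java.time.{Duration, Instant}
  import java.util.concurrent.ConcurrentHashMap

  val cache = new ConcurrentHashMap[String, Instant]()
  cache.put("lastSeen", Instant.now())

  val age = Duration.between(cache.get("lastSeen"), Instant.now())
  println(s"entry age: ${age.toMillis} ms")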

Regarding Kotlin, so far its killer use case is targeting Android, where one is stuck with an ageing Java subset. Outside of Android, it remains to be seen how it will keep up with the Java improvements coming up every 6 months now.


I don't think many companies are betting on Scala anymore. F# is just as irrelevant in the grand scheme of things.

Let's face it, Kotlin hurts Scala adoption.


Kotlin and Java 8. I started using Scala in the Java 7 era and loved it. When Java was stagnating, it seemed like an alternate language that targeted the JVM was the best option. Progress with the language has really picked up a lot since then. There is a lot that Scala has that Java doesn't, but now it's not enough to make it worth it. Java is so much easier to work with on a large team.


The two languages also have preferred libraries / frameworks, though. I code in Java again and the main thing that bugs me is always being in the Spring ecosystem. There is a mindset in Java land that every technology needs to be wrapped by Spring. Scala gave me the freedom to choose alternative libraries.


Oft. I agree with you there. I never enjoyed Spring. I don't get the appeal. With Scala I loved Play Framework. I know it has Java support but I never tried it. It used so many Scala idioms and features I'm not sure how well it would translate.


Depends where you work.

I've known Java since the early days and have only used Spring for one month, as validation of an architecture proposal.


It seems like it. I recently took the liberty of updating the language rankings on https://en.wikipedia.org/wiki/Scala_(programming_language)#L... and, compared to 2016, Scala took a hit on pretty much every ranking.

Outlook largely negative, falling in rankings across the board.


Kotlin has many benefits that Scala has in terms of “being a better Java” without sacrificing the readability and usability benefits of Java itself.

Scala is a fantastic language, but it’s not one your average Java developer can pick up in a day or two.


What readability and usability benefits are you claiming Java to have?


Java is a simple language, the syntax is easy to parse and logic is generally straightforward to reason about - as a programmer you have very little ability to break expectations of how the language itself behaves (no operator overloading, etc).

Scala, on the other hand, gives you a huge toolbox, including some features that are really complicated to reason about, like implicit parameters, the creation of completely arbitrary operators, etc.

[Insert some funny pun here about Java giving you a simple tool while Scala gives you an incredibly complicated one]. They're both great languages, but they serve very different purposes and audiences - Kotlin happens to fit Java's demographic better than Scala as a result, since it doesn't have the magic and complexity to the same degree Scala does (the most confusing new constructs probably revolve around builders/lambdas with receiver types, which aren't needed by most developers not writing DSLs).
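To make that concrete, here is a minimal Scala sketch (all names invented) of the two features mentioned above, an implicit parameter and a user-defined symbolic operator:

  // 1. Implicit parameter: callers don't pass `cfg` explicitly;
  //    the compiler finds one in scope.
  case class Config(retries: Int)

  def fetch(url: String)(implicit cfg: Config): String =
    s"GET $url with ${cfg.retries} retries"

  implicit val defaultConfig: Config = Config(retries = 3)
  fetch("https://example.com")   // the compiler supplies defaultConfig

  // 2. A completely arbitrary operator: any symbolic method name is legal.
  case class Vec(x: Double, y: Double) {
    def |+|(other: Vec): Vec = Vec(x + other.x, y + other.y)
  }

  Vec(1, 2) |+| Vec(3, 4)   // Vec(4.0, 6.0)

Neither is wrong in itself, but both let library authors change how a call site reads in ways plain Java never can.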


I am almost never surprised when I am reading Java code. It's almost never about the code.

With Scala I seem to have WTF moments about the language every once in a while (e.g. the magnet pattern).
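For anyone who hasn't hit it, the magnet pattern is roughly this kind of construction, sketched here with invented names: a single method whose "overloads" are supplied by implicit conversions into a magnet type.

  import scala.language.implicitConversions

  trait CompleteMagnet {
    def apply(): String
  }

  object CompleteMagnet {
    // Each implicit conversion acts as one "overload" of complete().
    implicit def fromString(body: String): CompleteMagnet =
      new CompleteMagnet { def apply() = s"text response: $body" }

    implicit def fromStatus(status: Int): CompleteMagnet =
      new CompleteMagnet { def apply() = s"status response: $status" }
  }

  def complete(magnet: CompleteMagnet): String = magnet()

  complete("OK")   // resolved via fromString
  complete(404)    // resolved via fromStatus

It works, but nothing at the call site hints that implicit conversions are doing the overload resolution, which is exactly the kind of surprise I mean.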


That assumes that Java is the only funnel for Scala. Sure, Kotlin likely now captures more of the people trying to get out of Java, but how many people does it capture with backgrounds in Haskell, OCaml, Lisp, R, Python/Pandas, or any other language for that matter?


Aren't there so few people with backgrounds like that (Python aside) that it doesn't really matter?


No? I came to Scala from R and Clojure. The point is that Kotlin has pigeonholed itself into being a language for people who can't stand Java. Scala is partially that, but it is also vying for the attention of anybody who does object oriented programming, anybody who does functional programming, anybody that cares about type safety, etc.

The idea that Kotlin is hurting Scala adoption rests on the assumption that the only people who would ever adopt Scala are the people who are trying to get rid of Java.


Not only on the JVM: Scala can be used in the browser with Scala.js, or without a JVM with Scala Native.


Yep - you're right... F# definitely can't be used in the browser: http://fable.io/


F# is now supported within the .NET core world so it opens up the deployment scenarios beyond anywhere traditional .NET would live.


Not everyone is getting into .NET Core.

In fact, Microsoft has to push library writers to actually care about .NET Core and .NET Standard.

https://channel9.msdn.com/Shows/Visual-Studio-Toolbox/NET-St...


How is this relevant to the discussion though? F# works on either .NET Core or the regular .NET Framework.


Eh, it mostly works on .Net Core. There are still some bugs with the compiler around Generative Type Providers when running the compiler on the .Net Core runtime, forcing you to compile with a build of fsc running on Mono or the full .Net Framework.

Still, for all intents and purposes you are correct - but there's still some growing pains in the tooling that are hard to ignore.


Meh, the reason many people weren't targeting .Net Core until recently was because of the huge pain in the ass it involved. .Net Standard 1.0/1.1 was missing a LOT of API surface area that many libraries needed. .Net Standard 2.0 fixes this, and makes life easier for library authors who already support multiple runtimes by providing a sane upgrade path from the hell that is PCLs. It's all about tooling; there's no reason for library authors not to target .Net Standard in their projects now unless they really need some Windows/Full-specific assembly that isn't included.


> Meh, the reason many people weren't targeting .Net Core until recently was because of the huge pain in the ass it involved

+1000. It wasn't even library support, it was the tooling that was a huge pain in the ass. Half the time it didn't work right, builds broke unexpectedly, build file formats kept changing, cripes what a pain. I swore off it until the early .NET Standard 2.0 prereleases, when things seemed more stable, and it's been much easier to port my libraries.


I've been using .NET Core since it was released. Sure in the beginning library support was spotty at best, but that was quite a while ago. I have yet to find a popular library that doesn't support .NET Core/Standard.


And Clojure? I find it to be the best experience on the JVM.


My favorite Lisp variant.

I didn't list it because, in spite of spec, it is still a dynamic language.


> F# only has access to a subset of .NET deployment scenarios

And what subset would that be? The only place you can't use F#, IIRC, is UWP applications, which is likely not the deciding factor in choosing F# or Scala.


Anything GUI-related, and the respective tooling support in Visual Studio, which F# never got.

It is still playing catch-up with the VS 2017 tooling for VB.NET and C#.


About GUI on .NET Framework:

- Windows Forms works, but not perfectly. The designer works if you install the template for it, but not the auto-generation of code (like double-clicking a button to generate a handler); you write the code programmatically. But it is used a lot. I use it in the REPL (fsi) to generate charts and custom data visualizations.

- WPF is the same: it works, but there's no designer (though codegen is less needed)

- Xamarin supports F# ( https://docs.microsoft.com/en-us/xamarin/cross-platform/plat... ) on Forms, etc.

There is some good stuff from the community + MS, because the community has tried to adapt technologies and make them friendlier to use from F#:

- Xamarin XAML in Elmish style, really more idiomatic ( https://github.com/fsprojects/Elmish.XamarinForms ), from the F# creator himself (Don Syme, who now works in the Xamarin division too)

- WPF and Xamarin XAML can also be used with a type provider, for statically typed views at compile time ( http://fsprojects.github.io/FsXaml/ )

The only one not yet supported is UWP, because of .NET Native.

All that without even mentioning the GUI stacks outside the .NET Framework, like Electron+Fable or just Fable+Elmish/React/React Native ( https://github.com/SAFE-Stack/SAFE-Nightwatch ).


GUI development without designer support is just like time-traveling back to implementing Turbo Vision and Clipper applications on MS-DOS.

Never understood the mentality for designing UIs by coding instead of visually.

I care for what comes in the box, and is directly supported by Visual Studio and Blend.

If someone needs to lose their .NET GUI tooling productivity to embrace F#, then better wait while C# keeps getting F# most relevant features.

Even C++ has better UI tooling support on Visual Studio than F#.


It depends on what you need the GUI for. If you need it for a complex LOB app, then yes, a designer will be good.

Personally, I write C# and XAML, and I don't use the designer (VS or Blend), but edit the XAML directly.

About F# and GUIs, it depends on the use case. For example, https://fslab.org/XPlot/ to show graphs:

  // Assumes XPlot.GoogleCharts is opened and that Bolivia, Ecuador, Madagascar
  // and Average are data series (e.g. (string * int) list) defined earlier.
  let series = [ "bars"; "bars"; "bars"; "lines" ]
  let inputs = [ Bolivia; Ecuador; Madagascar; Average ]

  inputs
  |> Chart.Combo
  |> Chart.WithOptions
       (Options(title = "Coffee Production", series =
          [| for typ in series -> Series(typ) |]))
  |> Chart.WithLabels
       ["Bolivia"; "Ecuador"; "Madagascar"; "Average"]
  |> Chart.WithLegend true
  |> Chart.WithSize (600, 250)

This is used to generate a chart programmatically; inside a window with some layout, it is usually enough. And I can do testing in the REPL.

But yes, if you rely on the designer like the C# version does, F# is less nice to use. Then again, doing it programmatically allows other things, like https://github.com/fsprojects/Elmish.XamarinForms

So it depends on how much time you spend editing the view (and why), versus the gains in the logic behind the view. For me the overall tradeoff is worth it, but it depends, so you are right.


The only time I have ever felt compelled to design a UI visually is when working with iOS or macOS because the framework is so centered around Interface Builder. When I write a WPF or JavaFX view I am not using Blend/Scene Builder to drag and drop controls, but to have a mostly-accurate preview of what crap looks like without having to build and run.


Ever heard of the web?


Yes, it keeps trying to reach parity with native UI design tooling both on desktop and mobile OS.


If you're waiting for feature parity with C# in VS, you'll never get it. There are millions of C# developers and F# devs are counted in the tens of thousands (sadly) so there's a reason for that.

I don't believe most organizations considering adopting F# or Scala are considering them for GUI development, so I'm not sure why you'd rule out F# because of that.


This is kinda true, but not in the sense that you mean it.

And maybe not across the board, but my clients are very comfortable with the (very few) F# solutions we've done. OCaml clients generally use it for stability, or correctness; they certainly aren't competing in the same sphere that Kotlin, Scala, and Java really operate in. Prominent in the financial space especially, but the trend is the same everywhere.

Reason may eventually help OCaml move closer into those spaces, as people are coming to ML with more openness and are more eager to learn and get involved, so that'll be cool to see in the future.

Anyways, don't count F# out either. I'm no expert, only ever used it with other project leads and with smaller clients, but MS is doing impressive work lately, like .NET Core and stuff like that. They at least can show that making .NET more visible and flexible is part of what they're after.

Java won't be killed. Ever. But people may stop writing it eventually.


> F# only has access to a subset of .NET deployment scenarios, where Scala can be used pretty much everywhere there is a JVM.

This statement is wrong. F# can use .NET Core, and that works the same way the JVM does.


Scala is used pretty heavily in the big data world, particularly if you are working with Spark.


Not to mention Flink, Akka, and Apache Beam (via the Scio library or the Java API).


Kafka too


Kafka switched to Java 100%, AFAIK.


Python is still king there; even Databricks markets it that way. Scala is for the advanced stuff that matters.


Python is king in analysis, but for big data engineering, most of the building blocks (as mentioned elsewhere in this thread: Akka, Kafka, Flink, Spark, etc.) are written in Scala.


It doesn't matter, because PySpark is still the go-to language.

In fact, with Spark 2.3's Python UDFs, the performance gap has also narrowed. https://mindfulmachines.io/blog/2018/4/3/spark-rdds-and-data...


PySpark might be the go-to language for data scientists playing with the Spark REPL or MLlib, but for production data engineering, Scala is still king.

Besides performance and the obvious fact that not knowing Scala makes it difficult to understand the underlying Spark code, there are multiple ways in which Scala is more natural to develop in (many libraries are Scala-only, for example).


I don't think so. Python and data frames are arguably more natural to think about and reason with than Scala.

I have no doubt that Scala is more performant and that the "fat" jar mechanism makes dependency management and code shipping very easy (it's still tricky to install Python dependencies on your Spark nodes), but the pandas ecosystem is definitely more intuitive to understand.


I have the impression you are leaning towards thinking of data analytics (pandas, data frames, etc.), whereas I and some other commenters may be thinking more of data-pipelining kinds of architectures, where you can't afford wrong typing, the scale is quite large, and you are not even doing the kind of operations pandas DataFrames are useful for.


It depends on the use case. Our work primarily revolves around extending Spark with custom pipelines, models, ensembles, etc. to be deployed into our production systems (petabyte scale). Scala was really the only way to go for us.


I can understand the performance difference, but I have not generally seen a difference in building custom pipelines and ensembles... although I grant I'm not at your scale yet.

What specific kinds of pipelines did you have trouble with in PySpark?


Although we decided to start using Scala specifically because PySpark was not as performant (2.0 was not that long ago), a reasonable use case I always keep in mind is aggregation (and in general any API which is still not solid, experimental, or under work). Python bindings are always the last to become available (because all the groundwork is being done in Scala). We have a relatively large-scale process that takes advantage of custom-built aggregation methods on top of grouped Datasets, where we can pack a good deal of logic into the merge and reduce steps of the aggregation. We could replicate this in Python using reducers, but aggregating makes more sense semantically, which makes the code easier to understand. Also, the testing facilities for Spark code under Scala are a bit more advanced than under Python (they are not super-great, but they are better), even without considering that being strongly typed makes a whole class of errors impossible, right out of the compiler.
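To make the merge/reduce point concrete, here is a minimal sketch of a typed Spark Aggregator applied to a grouped Dataset; the Event type, field names and totals are invented for illustration, not our actual code:

  import org.apache.spark.sql.{Encoder, Encoders, SparkSession}
  import org.apache.spark.sql.expressions.Aggregator

  case class Event(userId: String, amount: Double)

  // The reduce and merge steps are where the custom logic lives.
  object TotalAmount extends Aggregator[Event, Double, Double] {
    def zero: Double = 0.0
    def reduce(acc: Double, e: Event): Double = acc + e.amount  // per-partition step
    def merge(a: Double, b: Double): Double = a + b             // combine partial results
    def finish(acc: Double): Double = acc
    def bufferEncoder: Encoder[Double] = Encoders.scalaDouble
    def outputEncoder: Encoder[Double] = Encoders.scalaDouble
  }

  val spark = SparkSession.builder.appName("agg-sketch").getOrCreate()
  import spark.implicits._

  val events = Seq(Event("a", 1.0), Event("a", 2.5), Event("b", 4.0)).toDS()
  events.groupByKey(_.userId)
    .agg(TotalAmount.toColumn.name("total"))
    .show()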

I very, very rarely think of using PySpark (and I have way more experience with Python than with Scala) when working with Spark. In a kitchen setting, it would be like having to prepare a cake and having to choose between a fork and a whisk. I can get it done with the fork, but I'll do a better and faster job with the whisk.


I will stay away from veering into a statically typed vs dynamically typed conversation here ;)

But I'm very excited about PySpark 2.3's UDFs bringing grouped map. It will be interesting to hear your views on that: https://databricks.com/blog/2017/10/30/introducing-vectorize...


I only checked the implementation of the "Arrow UDFs" recently, because I'm interested in the Arrow interaction (out of curiosity), so I still don't have a strong opinion. My main concern is that a lot of the PySpark work is playing around with how to interact with and speed up the system while still sitting on top of the Scala base.

I'd recommend Dask (I haven't tried it much, but from all I've seen it is top-notch) to anyone who wants Python all the way down (at least until you hit the C at the bottom) ;)


Well, we run a hundred-machine cluster on Dataproc for our stuff. Dask is still not battle-tested, cloud-ready (or available), and is generally harder to work with than PySpark.

In general, I will happily stay in the Spark world using PySpark rather than go to Dask right now.


Being able to pass data through Arrow is a big improvement, but there's also a lot of serialisation overhead you pay in Python. Also, if you want to do anything in the fancy areas (like writing your own optimisation rule for the Spark SQL optimiser), it's Scala. Even something as simple as writing a custom aggregator is impossible in Python (at least it was in 2.2; I haven't checked 2.3 or the "current" 2.4).


Scala is still primarily used for data engineering workloads due to the fact it is a JVM language. (There's Java too, but no one wants to write Java code)

PySpark is often used for data science experimentation, but is not as frequently found in production pipelines due to the serialization/deserialization overhead between Python and the JVM. In recent years this problem has become less pronounced due to the introduction of Spark DataFrames, which obviate the performance differences between PySpark and Scala Spark, but for UDFs, Scala Spark is still faster.

A newer development that may change all this is the introduction (in Spark 2.3) of Apache Arrow, an in-memory columnar data format which lets Python UDFs work with the in-memory objects without serializing/deserializing. This is very exciting, as it lets Python get closer to the performance of JVM languages.

I've played around with it on Spark 2.3 -- the Arrow interface works but is still not quite production-ready, though I expect it will only get better.

Many folks are making strategic bets on Arrow technology due to the AI/GPU craze (and an in-memory standard enables multiple parties to build GPU-based analytics [1]), so there is tremendous momentum there.

At some point I expect the relative importance of Scala on Spark will decrease with respect to Python. (even though Spark APIs are Scala native)

[1] https://www.nextplatform.com/2017/05/09/goai-keeping-databas...


Scala has an insanely productive and efficient streaming ecosystem with the likes of Kafka and Akka Streams. You can use those with other languages, but it's not nearly as nice.


Why isn't it nearly as nice? Is there something particular about Scala such that Kafka or Akka are best implemented in it? Could you give some concrete examples (e.g. compare it with Java, C, C++, Rust, Haskell)?


(Compared to Java, C or C++ there's all the usual ML-family goodness - first-class functions, algebraic data types with pattern matching, type inference).

Scala has a for/yield construct similar to Haskell's "do notation", which is the perfect way to work with async code - it strikes the right balance of avoiding an unreadable callback pyramid of doom while still keeping your async yield points visible (the difference between <- and = is about as concise as it gets, but still very visible). And having HKT and being able to express concepts like monads means that a whole library ecosystem of standard operations (e.g. traverse, cataM) can build up that works on async futures and also on other contextual/effecty types (e.g. audit logging, database transactions). That's the big advantage it has over Rust and most other competitors, and it particularly shows in things like Kafka/Akka that need to work with async a lot, but also in large programming projects generally. (What it shows up as in practice is that you can do everything in "plain old code" - all the things that need reflection/interceptors/macros/metaclasses/agents/... in other languages are just libraries in Scala - and that makes development so much easier and bugs so much rarer)
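As a small illustration of the <- vs = point (the fetch functions are invented):

  import scala.concurrent.Future
  import scala.concurrent.ExecutionContext.Implicits.global

  def fetchUser(id: Long): Future[String] = Future(s"user-$id")
  def fetchOrders(user: String): Future[List[Int]] = Future(List(1, 2, 3))

  val summary: Future[String] =
    for {
      user   <- fetchUser(42L)      // async step
      orders <- fetchOrders(user)   // async step, sequenced after the first
      count  =  orders.size         // plain value binding, no async hop
    } yield s"$user has $count orders"

No callbacks, but every point where the computation suspends is still marked by a <-.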

Haskell has all that - the trouble with Haskell is that "there's no way to get to there from here". I was able to go from writing Java on Friday to writing Scala on Monday - not great Scala, but working Scala, and I was just as productive as I was in Java. I couldn't've done that with Haskell.


I think the parent comment was talking more about using those libraries with Scala, not necessarily implementing them.

Can't speak too much about the others, but I recently looked into using Akka with Java. Akka itself had a good deal of documentation for Java users. However, other related products (like tools for monitoring Akka, etc.) were clearly treating Java as a second-class citizen. When there was documentation, it would be outdated and unmaintained. I ran into this over and over again, which weighed heavily on my decision not to use Akka. I'm not quite sure if there is anything about the language that would have made it particularly challenging (I doubt it), but it definitely seems like the community built around it is more Scala-centric. And that in itself makes it challenging to use with anything else.


I took the comment to mean that Scala is the best choice for having implemented, e.g. Kafka, and that other languages somehow aren't suited for it, and asked the question with that in mind.


I still think Scala is a better "better Java" than Kotlin, but Kotlin captures that market better because of the much lower learning overhead.

I'm much more of an ML-style programmer, but the thing keeping me away from OCaml is the lack of a multicore runtime. Not only is Scala's concurrency mature and performant, implicits (like having an ExecutionContext in scope) make it extremely easy to use. Even if OCaml released its multicore runtime today, it would take years to get to the maturity of the Scala ecosystem.
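A minimal sketch of that ExecutionContext point: the context is declared once as an implicit parameter, and call sites never thread it through by hand (the load functions are invented).

  import scala.concurrent.{ExecutionContext, Future}

  def loadProfile(id: Long)(implicit ec: ExecutionContext): Future[String] =
    Future(s"profile-$id")

  def loadAll(ids: Seq[Long])(implicit ec: ExecutionContext): Future[Seq[String]] =
    Future.traverse(ids)(id => loadProfile(id))

  // One implicit in scope serves the whole call graph below it.
  implicit val ec: ExecutionContext = ExecutionContext.global
  loadAll(Seq(1L, 2L, 3L))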

Plus, occasionally I run into areas where OOP really blows away functional programming. It's nice to not have to learn a new language for those occasional times.


OCaml fully supports OOP (the 'O' in OCaml).


> Just curious, what companies (or kinds of companies) are betting on Scala nowadays?

LinkedIn, Twitter, The Guardian, Morgan Stanley, Barclays, Zalando. Generally speaking, Scala is used a lot by companies involved with Big Data (because they use Spark).


I’m not sure if your info is dated or mine is, but all the systems development at LNKD that was being done in Scala is now done in Java. There is some legacy Scala still but as of a couple years ago they decided no new development. Perhaps they are still using Scala in a data science context, since it is hard to avoid there, these days.


It's trivial to avoid it in data science, with even modest to large (although not "Big") data in the few-TB range. My data science team's entire stack is basically Python, with a smidge of Java and Rust for infrastructure (SOA, pipelines) development.


I almost added the clarifying statement "on the JVM" but got lazy. Yes, Python is the obvious choice for data science until you hit a certain scale.

It's not even clear to me that most companies doing data science in Scala have that scale -- they're just using tools and libraries companies at that scale have open sourced. You could call it cargo culting, but I think it's more nuanced than that. I think engineers can be separated into two camps roughly: those who are passionate about the language(s) they use, and those who are simply trying to get the result they need and don't care what language they use to get it. A lot of data science engineers naturally fall into the second bucket, so using Scala because a library they want to use is written in it comes naturally, even if they could get the job done in Python (possibly with a bit more wheel reinvention).


I'm in the second camp, actually.

Python is a good choice for data science even at relatively large scale. I'd question its suitability for stable, scalable deployment in production (not to say it can't be used there, just that I wouldn't necessarily reach for it first, preferring either C++ or Rust for that).

Scala just doesn't figure into the picture at all. I consider the fact that some "Big" data tools were written in it to be a matter of trivia, not essential to the work of data science.


I think your productivity in Scala would be quite a bit higher than in Rust. I've done reasonable amounts of Rust and quite a lot of Scala, and _given the current state_ Rust is simply slower to develop in. C++, well, you know what the downsides there are if you prefer it over Python.

I think for a particular data science mindset (the category theory toting, bijection loving person) Scala actually _is_ essential to the work of data science. But these people are in a minority.

Anyway, if you're truly in the second category, then the fact that the best library for doing X is in Scala would mean you're going to write some Scala, despite the fact that it's ~accidental that it was written in that vs Python.


The control plane for Cisco Hyperflex[1]'s distributed storage uses Scala. We're hiring[2].

1- https://www.cisco.com/c/en/us/products/hyperconverged-infras...

2- https://jobs.cisco.com/jobs/SearchJobs/hyperflex?3_12_3=187


This doesn't mean they're using Scala to develop, necessarily. Some architects at the company I work for decided we needed Kafka, but there isn't a single line of Scala being written by anyone here.



Twitter created a website, "Scala School", a while ago, so I'm guessing they haven't jumped ship yet (but I could be wrong). https://twitter.github.io/scala_school/


Twitter is still very much heavily committed to Scala.


Interesting that you see it that way. I see Haskell and PureScript usage growing, and I don't know many people interested in OCaml / Reason. Not trying to say you're wrong, just commenting on the effect of our respective filter bubbles.

As for Scala usage. I'm a Scala consultant, so I'm biased, but I'm seeing a lot of adoption. Most finance companies and most media companies are using Scala.


We use Scala as the primary backend language at Elastic for the Elastic Cloud SaaS & enterprise products (along with Java, Python and Go).


Yeah, I don't think Scala 3 changes anything with regards to perception and/or adoption of Scala. That ship has pretty much sailed.

Martin Odersky seems like a really good language designer. I took a look at the Scala 3 language features, and a more radical departure from historic Scala probably would have helped Scala.


Three or four years ago the biggest criticism of Scala was that releases were breaking backwards compatibility too much. On this very thread you'll find people worrying that Scala 3 will create a Python 2/3-style endless awkward transition. At this stage it's a mature language with a big established ecosystem, it can't afford to break too radically.

(Did you have specific breaking changes in mind? The most important changes I'd want to make to the language - having a syntax for guaranteed-safe equality comparison and guaranteed-safe pattern matching - could be done in a backwards compatible way, and the only other thing I can think that I'd like would be Idris-style totality checking which could also be backwards compatible.)
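(For context, a library-level version of that guaranteed-safe equality already exists today as a typeclass; this sketch uses Cats' Eq and ===, which only compile when both sides share a type that has an Eq instance. A language-level syntax would presumably look similar.)

  import cats.Eq
  import cats.syntax.eq._

  final case class UserId(value: Long)
  object UserId {
    implicit val userIdEq: Eq[UserId] = Eq.fromUniversalEquals
  }

  UserId(1) === UserId(2)    // compiles, evaluates to false
  // UserId(1) === "oops"    // does not compile: the types don't line up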


I wouldn't have called it Scala 3. And if I had the big brain of Martin Odersky, 5 years ago I would've started thinking about an intermediate step up in programming productivity and correctness, instead of doing Dotty. I don't follow Scala that much anymore, but it seems that it's mostly a reworking of the internal consistency of Scala from an implementation perspective. At least that was the gist I got from watching Odersky give a talk about it a couple years ago.


> 5 years ago I would've started thinking about an intermediate step up in programming productivity and correctness, instead of doing Dotty.

Wait what? In the last post you were complaining it was too close to existing scala, now you're saying it's too big a step? Again, what is the change you're actually advocating?


You misunderstand. The intermediate step in programming productivity has nothing to do with any particular language. I would not have done "Scala 3", but something else that implements an "intermediate step up in programming productivity and correctness". Probably something a bit radical, but hey, development is so stagnant when it comes to the way we program, it would've been worth the risk.

Hint, the stuff that Chris Granger was trying to accomplish.


I find Scala is the thing that's actually advancing programming productivity - even when it comes specifically to IDEs/HCI/visual programming - whereas I expected Granger's effort to fail as it did. So I'm very glad Odersky's continuing to focus on Scala; if anything I was worried that his efforts on Dotty were detracting from maintenance and improvement of Scala proper.


Could you (and perhaps others) say more about this? I was excited to read that key goals were simplification, clarification, and consolidation. But my exact worry is that they didn't go far enough.

I liked the idea of Scala and took Odersky's course. But I built a few things in it and it was never not frustrating. What Bruce Eckel said resonates with me: "I’ve come to view Scala as a landscape of cliffs – you can start feeling pretty comfortable with the language and think that you have a reasonable grasp of it, then suddenly fall off a cliff that makes you realize that no, you still don’t get it."

Then I watched this Paul Phillips talk, which convinced me that the problems were deep, not superficial: https://www.youtube.com/watch?v=4jh94gowim0

My hope with Dotty, etc, was that they had learned a little humility and were pruning back enough to make it a decent developer experience. But I could well believe that it was impossible to do enough and still end up with something that could be fairly called Scala.


> a more radical departure from historic Scala probably would have helped Scala.

Maybe. But high-risk. See Perl 6.


I know quite a few finance companies use it - Morgan Stanley and Goldman Sachs come to mind.



N.B. Scala and its ecosystem are actually a lot more oriented towards FP than F# or OCaml, for that matter.

Its popularity also far surpasses F# and OCaml in jobs available, books, libraries or other metrics that count.


We use Scala heavily. If Kafka/Spark/Monoids/Semigroups/Cats/Algebird makes any sense to you and you are looking for a job, send your resume to mansur.ashraf@apple.com


Here's one datapoint: AutoScout24 (the vehicle vertical of Scout24, a Europe-based digital marketplace) fully committed to Scala a couple of years ago as part of a re-platforming (microservices on AWS). It's still the recommended default language.

Not a "big" company by global standards, but not small either, with around 120 engineers.


Twitter is now running its Scala code on GraalVM.


Woah, can you provide a reference? I'm curious about the rationale and performance implications.



Did you find any benchmarks with hard numbers regarding performance improvement?


> functional programming people now gravitate more towards either F# (for .NET ecosystem) or OCaml/Reason (for the more unixy world). And the academic/research/learning crowd seems to like Haskell more

What if you want functional programming without having to go to the MS stack, and want to stay on the JVM? You only have two real options: Clojure and Scala.


If you make heavy use of Spark there is still no better language/ecosystem than Scala.


F# and OCaml? What source do you use to say that? Scala is way more popular in the industry than these 2 combined.


Expedia is a big-time Scala shop, but new stuff is trending towards Kotlin.



