Scala founder: Language due for 'fundamental rethink' (infoworld.com)
174 points by snydeq on Sept 6, 2014 | 116 comments



> Essentially, the fusion of [functional and object-oriented programming] can have a power that neither of the two paradigms individually can have.

I tend to disagree. OOP can borrow some FP stuff (as we see with map/each/select/reject in Ruby, for instance, or lambdas in Java); or, to take it broader: imperative can borrow some from declarative.
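In Scala terms (just a sketch), that kind of borrowing looks like:

    // OO receiver syntax carrying combinators borrowed from FP
    List(1, 2, 3, 4)
      .map(_ * 2)     // Ruby's map
      .filter(_ > 4)  // Ruby's select
    // == List(6, 8)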

But "fusing" the two will cause FP's ease-of-reasoning ("just functions and data structures") to be lost. On top of that, when using a strictly typed discipline (like Scala does), the types will become really messy.

So when it comes to FP, I like it pure-and-typed (a la Haskell), or untyped (like the Lisps), but not mixed-paradigm.

When it comes to OOP/imperative I like to mix in as much of FP/declarative as makes sense (like Ruby does for instance).


I agree with this, but I always used Scala as an OO language mixed with FP features. I prefer OO, but many in the Scala community were adamantly on the FP side (7 years ago; not sure about today).

About OO: subtyping leads to semi-unification, which really complicates type inference. I'm beginning to think it might not be a good idea to combine FP's unification-based features with OO's semi-unification-based features.


> I always used scala as a OO language mixed with FP features

Good point! Must admit that I did not write a lot of Scala, but I always approached it from the FP side :)


I think there's a huge opportunity for FP+OO... but that OO will have to be rethought from first principles (which are lacking in OO).

More specifically, I think OO-style code interactions are great but that they need to be separated from ambient mutability, subtyping, subclassing, inheritance, etc. Late binding/open corecursion is really cool however.

I would like to see a language which starts with ML modules as a base and builds a totally novel semi-OO system into FP. I don't think it would, at the end of the day, look much at all like OO the way that OO appears today, but I think it could still gather many of the advantages while maintaining ease of reasoning, good type inference, etc.


> I think there's a huge opportunity for FP+OO... but that OO will have to be rethought from first principles (which are lacking in OO). More specifically, I think OO-style code interactions are great but that they need to be separated from ambient mutability, subtyping, subclassing, inheritance, etc.

This has already happened. Ideally, good object-oriented code should always prefer immutability over mutability, should prefer interface polymorphism over inheritance polymorphism, should prefer has-a relationships (the OO version of composition) over is-a relationships (subclassing), and so on. Much of this was originally figured out a very long time ago.
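A minimal Scala sketch of those preferences (the names here are purely illustrative):

    // interface polymorphism instead of inheritance polymorphism
    trait Engine { def start(): String }

    // an immutable value that has-a Engine, rather than is-a Engine subclass
    class Car(private val engine: Engine) {
      def start(): String = engine.start()
    }

    val car = new Car(new Engine { def start() = "vroom" })
    car.start()  // == "vroom"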

So the issue isn't that these standards don't exist; it's that they aren't widely understood among users of object-oriented languages. That, in turn, is because OO is so widely used by so many programmers, including a lot of people with very limited formal training in computer science, that most users of OO languages have very little exposure to any of the literature about what constitutes well-crafted object-oriented code. Certainly less so than in FP, which benefits (in this department) from the fact that it hadn't even begun to escape the ivory tower in any sort of serious way until just a few short years ago. In short, object-oriented programming is a victim of its own success.

It doesn't help that the first really widespread object-oriented language wasn't particularly object-oriented, either.


> Ideally, good object-oriented code should always prefer

Really what I'm referring to is a language which outlaws these things instead of one that leaves them up to preference. And by "outlaw" I don't mean entirely, but instead a language which requires the programmer to explicitly opt into these richer domains as they are needed.

My understanding is that while these principles are understood as valuable, OO languages are generally too permissive to enforce them.

In a similar story, unbounded recursion is generally "bad" in almost every part of a program---nobody likes mysteriously hanging programs---but only total languages have a formalism which allows you to profitably outlaw it.

So I'm convinced there are the beginnings of the theory of objects lurking around out there even if they rarely see the light of day (really) in practice. But I also recognize that modern functional semantics grew over the last 80 years. I don't think (maybe I'm wrong) that the whole package of OO has had so long to mature.


> So the issue isn't that these standards don't exist; it's that they aren't widely understood among users of object-oriented languages.

But we're talking about language design. I think enforcing constraints is much (maybe all, from a certain point of view and definition of "constraint") of what a programming language ought to do. Haskell code is principled because the language is principled, not because of conventions passed from an ivory tower.

Also do you know of any languages built around those principles you mention? It sounds like studying such a language might make OO click for me a little more.


I feel that Golang in many ways is on that path; the way it does not allow subclassing really got me thinking.

> I don't think it would, at the end of the day, look much at all like OO the way that OO appears today

Won't it be more like an Actor-oriented thing then?


"I feel that Golang in many ways is on that path, the way it does not allow no-subclassing really got me thinking."

I'm pretty decent in Haskell and fluent in Go at this point, and it really depends on what you mean by that. Is Go a certain brand of very refined OO? Yes. I've even found certain ideas from FP are surprisingly adaptable [1], the "surprise" being that it's not the usual "Look, Ma, I can map/filter/reduce!" that an OO language borrows from the FP world -- which Go is actually really hostile to. (Go does have closures and they are important, but the type system precludes a lot of the "good stuff".)

However, is Go headed in the direction of Haskell? No. It's definitely not. Trying to turn it into Haskell would be nothing but pain. Mutability is pretty fundamental in Go, and while there's a culture of isolating it to goroutines, the language itself does little to help with that.

[1]: http://www.jerf.org/iri/post/2923


> it really depends on what you mean by that. Is Go a certain brand of very refined OO? Yes.

Indeed that's what I meant.


I'm not certain. I think the interesting aspects would be in considering what's missing from open corecursion. I don't think the actor model is much better foundationally---it's a destination, not a beginning or a path.


Uhm...you mean like Moby?

http://moby.cs.uchicago.edu/


Perhaps, I'll take a look at it.


What is "open corecursion"?


Corecursion is when you define the meaning of a type by a coalgebra like

    x -> F x
for some pattern functor F. So, for instance, defining an infinite stream of values of some element type X like

    newtype Stream = Stream { runStream :: (X, Stream) }

is corecursion. The interesting part is that whatever defines the stream is forced to fix the Stream type immediately. Open (co)recursion in OO languages occurs when the recursive type is left undefined (i.e., self is passed in). This allows you to change the type of the object at every step if desired.

The similar game in a recursive function might be defining fact like so

    fact :: (Int -> Int) -> (Int -> Int)
    fact recur n = n * recur (n-1)
We "tie the knot" by passing fact to itself

    fix f = f (fix f)

    fix fact :: Int -> Int
which would allow us to add extra cases as needed

    let term recur n = if n == 0 then 1 else recur n

    > fix fact 3
    -- infinite loop
    > fix (term . fact) 3
    6
So this is a kind of "mixing" of recursive functions. Open corecursion lets you do OO-like "mixing" of corecursive functions.
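For comparison, here is a sketch of how that "mixing" shows up in an OO language like Scala, where calls through this are late-bound (open):

    trait Fact {
      // no base case here: the recursive call goes through `this`,
      // which is late-bound, so the knot is left untied
      def apply(n: Int): Int = n * this.apply(n - 1)
    }

    // mix in the base case afterwards, like (term . fact)
    trait Term extends Fact {
      override def apply(n: Int): Int =
        if (n == 0) 1 else super.apply(n)
    }

    (new Term {}).apply(3)  // == 6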


But look at Clojure. It's imperative yet quite FP, but much of the language rests on OO polymorphism. Clojure certainly isn't an OO language (in fact, it is decisively "anti-OOP"), but it does mix in an essential OO idea, and that actually helps keep the language simple.


I would not call Clojure imperative myself. To the best of my knowledge it allows imperative style just as much as declarative. And having read some Clojure code before, I thought that its community kind of prefers declarative for most things.


> Clojure certainly isn't an OO language

Multi-methods and protocols


That was my point: even though Rich Hickey pretty much touts Clojure as an anti-OOP language, Clojure still relies heavily on OO polymorphism, yet it doesn't make the language any more complicated -- in fact, it helps keep it simple.


Sorry, I misinterpreted it.

As someone who likes multi-paradigm languages, I tend to dislike that many FP guys "forget" their languages also have first-class support for OO when needed.

Somehow, I did not read your comment properly and spat out a bad comment.


In Java 7 it was pretty trivial to create a closure with an anonymous class, and if that anonymous class implemented Callable or Runnable, it was effectively a lambda with strong types.
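For comparison, the same trick rendered in Scala syntax (a sketch; Callable here is just java.util.concurrent.Callable):

    import java.util.concurrent.Callable

    val x = 41
    // anonymous class acting as a strongly typed closure over x
    val f: Callable[Int] = new Callable[Int] {
      def call(): Int = x + 1
    }
    f.call()  // == 42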


So you don't use type classes in Haskell?

Nor touch F# or OCaml?

Nor CLOS in Lisps?


A major advantage of typeclass polymorphism vs. dynamic polymorphism is that typeclasses enable parametric polymorphism, and parametric polymorphism enables reasoning that is lost if types can "pretend" to be a supertype.

For instance, in Haskell you can prove a functor instance has at most one implementation (if we disallow diverging values), uniquely given by fmap id === id. GHC can automatically derive it for you. If you have dynamic polymorphism this is lost because now you can have "nonuniform" functor instances that behave differently for different types while still satisfying the functor laws.
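Since this is a Scala thread, here is a rough sketch of that Functor signature in Scala terms (with the caveat that Scala's parametricity guarantees are weaker than GHC's, so the uniqueness argument doesn't fully carry over):

    import scala.language.higherKinds

    // fmap must work uniformly for *all* A and B; that uniformity
    // (parametricity) is what rules out inspecting the element type
    trait Functor[F[_]] {
      def fmap[A, B](fa: F[A])(f: A => B): F[B]
    }

    val listFunctor = new Functor[List] {
      def fmap[A, B](fa: List[A])(f: A => B): List[B] = fa.map(f)
    }

    // the law that pins the instance down: fmap(fa)(identity) == fa
    listFunctor.fmap(List(1, 2, 3))(identity) == List(1, 2, 3)  // true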

More generally, parametric polymorphism eliminates a large number of "behavior surprises" that can lurk in OO code. And with existential types I never find myself missing dynamic polymorphism. Similarly, in Clojure, if I find myself using dynamic behavior, it's either to interact with Java or as a hacky version of parametric behavior.


I don't understand your example. Is there an explanation for people who aren't Haskell experts somewhere?


> For instance, in Haskell you can prove a functor instance has at most one implementation

How would one do that?


I believe the technique is relational parametricity, but I don't actually know how to apply it in this case.


Correct me if I'm wrong, but isn't OO in OCaml frowned-upon, or at least not used frequently by seasoned programmers?

Not an OCaml programmer myself, but I have heard this multiple times. For example, in Real World OCaml [https://realworldocaml.org/v1/en/html/objects.html], quoting from Chapter 11:

> You might wonder when to use objects in OCaml, which has a multitude of alternative mechanisms to express the similar concepts. First-class modules are more expressive (a module can include types, while classes and objects cannot). Modules, functors, and data types also offer a wide range of ways to express program structure. In fact, many seasoned OCaml programmers rarely use classes and objects, if at all.

To be fair, the chapter goes on to explain why you'd want objects. It also states they are very different from objects as understood by other OO languages. I'd say OCaml is therefore a bad example for OO.

--

How are type classes in Haskell an OO feature?


The fact that OO is not used frequently in OCaml does not mean that it is useless. Objects in OCaml are typed structurally, not nominally, i.e. an object's type is the set of methods it provides.

Consider an example:

  let a = object method f x = 1 end;;
  val a : < f : 'a -> int > = <obj> 
We define an object a, and the interpreter tells us that the type of the object is "< f : 'a -> int >", i.e. it provides method f.

Define another object

  let b = object method f x = 2 method g x = x end;;
  val b : < f : 'a -> int; g : 'b -> 'b > = <obj>  
Define a function that calls method f of its argument:

  let f obj = obj#f ();;
  val f : < f : unit -> 'a; .. > -> 'a = <fun>
The type of the function tells us that its argument must be an object that provides method f, and maybe some other methods. Now, we can call this function with object a, and object b:

  f a;;
  - : int = 1  

  f b;;
  - : int = 2 
This structural typing (or, if I'm not mistaken, it's called row polymorphism) is an interesting feature of the language, and it may be useful in some situations. It does not harm the rest of the language, and I see it as a good feature that probably does not get a lot of attention, but it's theoretically solid and works well. Compared to more usual OO systems, it is cleaner and, imho, easier to reason about.

That said, I think the authors of "Real World OCaml" are right in encouraging the use of modules rather than objects. But they do so primarily because in many mainstream languages the main means of abstraction is objects, and they are used for everything. OCaml has functors, modules, records, ADTs, and functions, so the need for objects is very limited, and new users must get this message very clearly.

An article on objects in OCaml: http://roscidus.com/blog/blog/2013/09/28/ocaml-objects/


Awesome, thanks for the explanation!

I think I understand the appeal of structural typing: you are defining implicit anonymous interfaces as you write objects, and automatically grouping objects that share structure. But in which cases is this more useful than, say, defining type classes or using explicit interfaces in mainstream OO languages? Or is it a matter of preference?

I can see a difference with objects: in most mainstream OO languages you cannot retrofit an object to comply with the interface you want if you don't control the source code and the object only differs from what you need on a triviality such as naming. But what about type classes? Isn't this what they are there for?
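For what it's worth, Scala has (reflection-based) structural types too, so the OCaml example above can be sketched there for comparison:

    import scala.language.reflectiveCalls

    // accepts any value with a method f(): Int, checked structurally
    def callF(obj: { def f(): Int }): Int = obj.f()

    object A { def f(): Int = 1 }
    object B { def f(): Int = 2; def g(x: Int): Int = x }

    callF(A)  // == 1
    callF(B)  // == 2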


> How are type classes in Haskell an OO feature?

See https://news.ycombinator.com/item?id=8280613

There are so many ways of doing OO.


Depends on how you define OO. My own definition of object involves identity and corresponding encapsulated state, which is something that Haskell doesn't support directly (since identity isn't pure).


Sure, but given your background, I imagine you are fully aware that back in the early 90's the OO community was quite divided about what the core concepts actually are and how to implement them.

So, what OO is all about will depend on which researcher gets asked.


Oh, fully agreed. I think my main point was that OO in a language such as OCaml has very little to do with OO in a language such as Scala (or Java). I'm agreeing with the OP that the marriage of OO and FP in Scala is problematic.

Yes, we can use less currently mainstream (note: not less correct) definitions of OO, but that doesn't help the Scala=OO+FP position.


Just like languages have versions/revisions (e.g. C++11, C++14, Java 8, etc.), having versions at major changes during the evolution of OO might have helped create reference points which could facilitate discussions and enhance understanding, i.e. something like OO98, OO99, OO2005, etc.


That's impossible, because what would you write in OOxx, given the different CS views and programming models on the subject?

Just dump what is being worked on at any given point in time?

For me it is hard to grasp how youngsters see OO or FP, because I was part of the initial mainstream waves. So I got to see things from a different perspective.

The world at companies was procedural, and lots of experimentation was going on.


True. And in the end, this even applies to functional or declarative programming.


> So you don't use type classes in Haskell?

Sorry, how does this remark relate to my comment?


I assume it was intended to be 1) an assertion that type classes are OO, and 2) an implication that using them is mixing OO in your FP (which you said you don't like to do), and so 3) an allegation of inconsistency which should be clarified or examined.

There are a few places where this chain is weak.


> an assertion that type classes are OO

They never occurred to me as OO. Could someone explain this?


Sure, since I mentioned it.

Sorry if I'm assuming wrongly about your OOP experience.

Many think about OOP as introduced by Simula/Smalltalk, or as it is served up by Java/C# and the like.

However there are many other ways of doing OOP, which many of us experimented with in the old days (early 90's), while OOP was striving to become mainstream.

Type classes in Haskell, and their use, can be easily mapped to interfaces/protocols/traits/... in the OO world.

From the OOP point of view, type classes are a group of operations that can be applied to a given type and referred to by a specific name.

This allows writing polymorphic operations over types that support a specific group of functionality.
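In Scala, for instance, the mapping is almost literal -- a rough REPL-style sketch (typeclass as trait, instance as implicit value):

    // a typeclass: a named group of operations over a type
    trait Show[A] {
      def show(a: A): String
    }

    // an "instance": evidence that Int supports the Show operations
    implicit val showInt: Show[Int] = new Show[Int] {
      def show(a: Int): String = a.toString
    }

    // polymorphic over any type with a Show instance, much like
    // programming against an OO interface
    def describe[A](a: A)(implicit s: Show[A]): String = s.show(a)

    describe(42)  // == "42"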

Polymorphism and data hiding are the common concepts across all OOP proposals.

As for the remaining concepts, there are lots of languages that agree to disagree, between mainstream, academia and lost OOP battles.

And I can point to one of the many Simon Peyton Jones talks that describe this, http://yow.eventer.com/events/1004/talks/1054, starting at 00:42.


To my mind, the biggest thing missing from typeclasses that you get automatically with many OO systems is a sense of persistent identity of objects. Of course Haskell has ways to roll your own, and I'm not sure the decoupling is a bad thing by any means, but it's a difference.


Thanks, very insightful. I'm not fully convinced that type classes are OOP, but I certainly understand why --for a very broad definition of OOP-- one might think they are.


Yes.


So what are you proposing to prevent functional languages from being a complete nightmare from a modularity point of view? (See Haskell.)

Module systems aren't much more than castrated object systems, so I don't see a real benefit of picking some half-assed module system over a first-class object system.


Haskell modules may be lacking in some aspects, but I certainly don't think they are half-assed. They are conceptually very simple, and the community got quite far with them.

I also don't think that Haskell modules and objects are on the same level; for instance, Haskell's modules do not encapsulate state.


> So when it comes to FP, I like it pure-and-typed (a la Haskell), or untyped (like the Lisps), but not mixed-paradigm.

You overrate pure programming.


How did Odersky miss Yammer's citing Scala's complexity as a reason for moving away from it? Odersky said:

> In terms of complexity, I don't know whether Yammer featured that, but you definitely do hear that a lot [from] many people.

The first paragraph (after introductory greetings) of Yammer's letter which explains its action [1]:

> Scala, as a language, has some profoundly interesting ideas in it. That's one of the things which attracted me to it in the first place. But it's also a very complex language.

[1] https://gist.github.com/anonymous/1406238


Indeed. Another example from the fourth paragraph of the letter:

"Scala, as a language, has some profoundly interesting ideas in it. That's one of the things which attracted me to it in the first place. But it's also a very complex language. The number of concepts I had to explain to new members of our team for even the simplest usage of a collection was surprising: implicit parameters, builder typeclasses, "operator overloading", return type inference, etc. etc."

It seems very clear the language's complexity was a major problem for Yammer.


I wonder if that means Odersky's not really listening to criticism, which would be really bad.

EDIT: He does, he just doesn't remember where it's coming from.


I find Odersky's comments on Swift very apt. I think both Scala and Swift are aimed at being mergers between FP and existing OO systems and both of them feel a little warty for the effort.

I really applaud trying to merge the benefits of FP and OO. I think it's going to be a much larger and harder project than either Scala or Swift will benefit from, but I'm really glad that they're exploring the design space.


It's reassuring to see that the various groups working on Scala seem to have the same broad goal in mind. I also think it is important to not let the language ossify due to legacy -- which is the problem that kills many languages.


Outside of maybe Fortran and Cobol, I can't think of anything that was (a) alive enough to be killed and (b) killed by ossification.


> Each new release of Delphi attempts to keep as much backwards compatibility as possible to allow developers to continue to use existing code without incompatibility of interfaces or functionality.

https://en.wikipedia.org/wiki/Delphi_(programming_language)


I'm pretty sure Delphi died from having to compete with essentially free C# and its better integration with Microsoft's ecosystem.

And a little bit from the shifting of the Overton window away from paying for languages.

And mismanagement of Borland/Inprise.

And about a dozen other factors before "lack of (breaking) change in the language".


It's 1996. Java runs on Linux; Delphi doesn't. Java also runs where Delphi runs. Java is free. Java 1.1 with inner classes looks much like recursive blocks in Pascal (albeit without procedural types -- function pointers, to C folks).

THE END.

(sadly, as I really liked Borland Pascal)


"killed" might be a bit strong, but the overall story of "stop working on perl 5 in favor of working on perl 6" contains some aspects of the concept.


And even those have modern standards.

Object Cobol anyone?


Honestly, I'm just guessing on Fortran and COBOL. I wasn't there when they lost prominence. I just know that no other languages actually lost significant amounts of popularity for not changing (Java? It's maybe less popular, but calling it dead is not even wrong).


Java isn't dead, but it suffered greatly during the hibernation years starting with the Oracle takeover.


Funny, when I read the headline, I thought this was referring to Paul Phillips saying that "Scala is unfixable" [1] in the other thread. He's also a Scala founder and he's also saying the language needs a fundamental rethink.

Interesting how the interpretation can vary.

[1] https://news.ycombinator.com/item?id=8277626


Paul is not a "Scala founder". Martin Odersky started to create Scala, the language, by himself 10 years ago. Typesafe, the company around it, was founded 5 years ago and Paul was one of the co-founders.


With the release of Java 8 I wonder if Scala is getting squeezed from both sides. Java 8 on one and Clojure on the other.


Unlikely; Clojure is very much its own thing and won't draw those that are drawn to Scala (rich type system, (G)ADTs, performance, etc.). That branch of potential adoption is minuscule anyway (even compared to Scala, which relative to Java is itself tiny).

So, the elephant in the room is indeed Java 8. Will Scala at long last be killed? Perhaps, but it's highly unlikely: Scala lives and dies on its ecosystem (Akka, Play, Spark, Spire, Spray, etc.), which is seeing ever increasing enterprise adoption.

If anything Java 8 and the JVM improvements made and coming will aid all languages on the JVM that are different enough from Java 8+, which Scala, Ceylon, and of course, Clojure, are. The only JVM language made vulnerable by Java 8 is, IMO, Kotlin.


Clojure is becoming increasingly popular in banks in the UK (as I heard from a trusted person).


What's up with the UK? Scala is also becoming increasingly popular in banks in the UK (as I heard from trusted persons).

CLOS, Smalltalk, and Objective-C (before it was cool) were for a long time very popular on Wall Street.


Yep… I can say from personal experience that Scala and Clojure are becoming popular in the financial world.

There seem to be several drivers: the FP part appeals to the algo guys, immutability is attractive when looking at concurrency, and they can interop with the extensive existing Java codebase easier than some of the alternatives.


Supposedly, Haskell and F# are also big in UK banks. Wow.


For what it's worth, this website ranks the popularity of some keywords in UK IT job ads: http://www.itjobswatch.co.uk/ . Bank jobs are a big part of the UK's IT jobs.

There is a category "Programming Language", in which you can see Scala and Clojure have picked up steam since 2010, even if interest in Clojure seems to have waned a bit this year (it now trails COBOL and Delphi). Interestingly, the salaries advertised for Scala and Clojure jobs are also higher than for the more popular programming languages.


Relevant: Will Java 8 Kill Scala? [1] (spoiler: Betteridge's law applies)

[1] http://ahmedsoliman.com/2014/03/26/will-java-8-kill-scala/


Depends on whether the question is targeted at startups using the JVM or at the classical bureaucratic enterprise.


"Essentially, the fusion of [functional and object-oriented programming] can have a power that neither of the two paradigms individually can have."

Very much aligned with my opinion. I would add that having a strong static type system with an opt-in dynamic type system (integrated into the syntax) adds even more power (e.g. C#).


What a confused interview. If the audience is people who are interested in knowing Scala's timeline, then asking "isn't Java 8 functional now because it has lambdas" is either dumb, or playing dumb on behalf of a Java audience that doesn't care about the Scala timeline.


Or "another language that's becoming popular, Node.js"


Lol - "language", it's not even a platform.


Libraries, frameworks, and languages are actually quite similar; it wasn't long ago that when people said "Ruby" they actually meant "RoR".


I agree. It hardly sheds any light on what's really going on in the Scala community or why.


I'm really hoping that functional languages will evolve in a way that makes them more readable at a glance.


Haskell is very clean and readable. The problem, however, is that it's such a departure from imperative languages that you can't bring much of anything you already know over to it. You really do have to forget everything you've learned about programming and start from scratch. Until then, it's all alien. That being said, I think that Haskell is the cleanest looking of all the FP languages, but that's purely my opinion.


> The problem, however, is that its such a departure from imperative languages that you can't bring much of anything you already know over to it.

I don't find that really true; if you've used any structured language with static typing, there's a lot of carryover, especially if you've ever done any formal study of CS, since even when imperative languages are used, functional patterns are pretty common.

The biggest problem I see is that there isn't a lot of pragmatic guidance on the bits that are unique to Haskell -- a lot of the documentation is very abstract, without much guidance on solving specific real-world problems. The power of Haskell is that you can make very general solutions in it quite easily; the weakness of the documentation of Haskell is that virtually all of it focuses on those highly general facilities and not enough on the kind of specifics that would help people get familiar with it while solving real problems.


Haskell suffers from people who want to make things look clean and achieve it by using symbols and very short names. These things have to be memorized before you can understand anything.

Basically, people are trying to make Haskell code look like math and the truth is that math's symbolic notation is optimized for hand writing speed, /not/ reading comprehension.

Take a look at Ada code. Maybe they're taking it a bit far, but it's very readable. Spelling words out and being very very sparing with symbols is a very good thing, I think.


I don't think this is true. I think Haskell's notation is the way it is because it emphasizes the "look" of certain abstract patterns that would be obscured by the notation used in other languages (I believe this is the case for math as well... I doubt it's optimized for hand-writing speed, since I doubt that's the bottleneck in maths). This is not what many of us who come from other programming languages are used to, and therefore we identify it as "more difficult", but it's actually a lack of familiarity. Not only with Haskell itself, but with the abstractions and the way of looking at programs.

I doubt the majority of Haskell programmers would prefer a more verbose notation. I think that, like the parent post, they consider Haskell very clean and readable.

A friend of mine who is a proficient Haskeller told me he often struggles to understand new Haskell libraries. Then the library "clicks" for him and he can begin to use it. I -- who program in Java for my day job -- have trouble understanding this, because this is seldom the case for Java: you may not understand what problem a Java library is intended to solve, but using it is usually trivial (and mis-using it is trivial as well, of course!). I think it's unfair to compare both languages this way, because in Java there is a fairly low use of general abstractions, and Java's syntax (and coding practices) have evolved to reflect this. The verbose syntax that works (1) for Java is not suitable for other kinds of programming.

(1) for some values of "works".


> since I doubt that's the bottleneck in maths

I'm not a mathematician, but I heard that math is most frequently written on a whiteboard. If that's true, then optimizing for speed and space efficiency of hand-writing suddenly makes sense.

> I doubt the majority of Haskell programmers would prefer a more verbose notation

The exact same thing can be said about Perl programmers. But why stop there; take a look at the APL, J, and K languages. J, for example, ships with all standard verbs aliased to words, like "under", "behead", "curtail", etc. Of course, nobody uses those; they write "&.", "}.", "{." instead.

The difference is that in J you have a small, consistent core vocabulary (http://www.jsoftware.com/help/dictionary/vocabul.htm), and once you learn it and it "clicks", you very much are ready to do anything with the language. The Haskell way makes you go through the same process again and again with almost every library. The sad thing is that you could write Haskell in a way which wouldn't require you to internalize large numbers of new symbols, but people generally don't. Is it really because the abbreviated, symbol-based notation is so much better? I have to wonder.


> I'm not a mathematician, but I heard that math is most frequently written on a whiteboard. If that's true, then optimizing for speed and space efficiency of hand-writing suddenly makes sense.

But a lot of math is read from papers as opposed to written. Math and logic papers employ the concise notation we're talking about. You'd think mathematicians would have evolved a better notation for paper-writing if one was lacking (there being already many conventions and style guides that apply exclusively to paper writing and that are odd for everything else).

> The exact same thing can be said about Perl programmers. But why stop there; take a look at the APL, J, and K languages. J, for example, ships with all standard verbs aliased to words, like "under", "behead", "curtail", etc. Of course, nobody uses those; they write "&.", "}.", "{." instead.

Indeed, the same could be said about other languages. I was even going to write in my original post "of course, Haskell programmers are self-selecting". There are problems with the syntax of some of the languages you mentioned, probably because they require a special keyboard, which is an obvious hurdle. Others may not have prevailed because they were jarring to programmers who also had to program in mainstream languages (therefore, the familiarity argument all over again). Yet others are arguable -- Perl, for example, is joked about as a "write only" language, but how much of that is truly because of its syntax? Maybe idiomatic Perl merely leads to barely understandable programs. Maybe Perl is simply not a good language, no matter what (note: I'm not arguing this is the case, not being a Perl programmer myself). And maybe it's not true that Perl programmers find no fault in its syntax; maybe they have trouble reading their own code, which would put them in a different category than Haskell programmers!

In the end, all of this amounts to: a language's syntax is aimed at its intended audience, programmers using that language and familiar with its intended degree of abstraction and with the problems it aims to solve. Programmers from other languages will often find it weird (and in Haskell this problem is increased by the fact it seems to belong in a different category than languages such as Java/C++/C#/etc.). It is a mistake to say that, because of this, language X has a "harder syntax"; the more correct claim would be "language X has a syntax that will likely be harder to learn for programmers used to C-inspired languages". This is probably related to the Blub paradox :)


> It is a mistake to say that, because of this, language X has a "harder syntax"

Yeah, I mostly agree. But I believe there is some objective notion of readability and that syntax can be objectively more or less readable. However, that's only my belief, as we have no objective way to measure readability yet. (There's this wiki article: http://en.wikipedia.org/wiki/Readability but as far as I can tell nobody bothered to apply methods mentioned there to code)

So, as it's almost impossible to have a meaningful discussion on readability, I won't try to argue that Haskell is or isn't readable at all. However, the fact (it seems intuitive enough, but maybe it's wrong?) that it's easier to associate meaning with words than with abstract symbols, together with the frequent use of abstract symbols to name things in Haskell, makes me argue that Haskell is harder to learn than, for example, J (btw, J and K use only ASCII characters). But that, too, is open to discussion -- one can claim that names like <*> are easier to learn and more precise, because they come without any prior associations in the reader's mind.

In the end, the only thing I can say is that for me - probably because of a whole lot of factors and influences - word-based identifiers are easier to learn and use than symbol-based ones. It's entirely possible that a majority of Haskell programmers are different in this regard (and yes, they - as for all languages - are most certainly self-selecting).


Well written - for some particular definition of "good" - Haskell is quite pleasant to read and work with. But, frequently, the code I see looks like a mix of bash and Perl, only with a bit more consistent indentation.

OCaml code, on average, has fewer symbol-based infix functions and tends to be more readable (for me). But I'm not a Haskell programmer, so maybe that's a matter of getting used to it, like parens in Lisps...


Readability is in the eye of the beholder, which is to say, familiarity breeds content.


Scala can be as readable or as unreadable as you like. Which is the problem with using Scala in a corporate environment: it can become a terrible weapon in the hands of a bad developer.


You also have to decide whether to use Scala as a functional language or as a better Java. Each approach has its merits and drawbacks.


Do you think Scala could succeed with a "Scala--" fork: "Java, the good parts"?

Just a few things like type inference, implicits, and case classes would be enough to make Java devs drool, without scaring them off with FP or performance worries.
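Roughly the subset I mean, REPL-style (illustrative code only):

    // case classes: free construction, equality, copy, pattern matching
    case class Point(x: Int, y: Int)

    // local type inference: no annotations needed
    val p = Point(1, 2)
    val q = p.copy(y = 5)

    // implicits for lightweight extension methods
    implicit class IntOps(val n: Int) extends AnyVal {
      def squared: Int = n * n
    }

    q.y.squared  // == 25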


You could totally use Scala as a language without the laziness, which can be quite a pain when doing perf-sensitive work. Traits are also a huge deal.

I don't think the community wants to go in that direction, though. Scala attracted a lot of FP programmers (Haskell/ML) who wanted to work on the JVM, not Java programmers looking for a better alternative to Java on the JVM.

Such is life.


This probably depends on where you work, though. I work in the industry, and in my experience people using Scala (1) are simply looking for a better Java, and have no prior knowledge of FP or other languages which are not C-like OO. They would probably freak out if they saw a language similar to ML. On the other hand, the few people I know working in academia have very little interest in Scala.

(1) i.e. my coworkers, and people I know in other companies.



I think that has a lot to do with personal familiarity over language design. FP designers are rarely trying to make their languages feel familiar to C/Awk/Python/Lua/Ruby/Java programmers---though it certainly is an influence from time to time.


I find Clojure to be one of the most scannable languages. The s-expression syntax gives you a visual representation of the relations between different pieces of code. When you see one expression nested inside another, you know they're related. You can tell what branching and relations are there by quickly scanning the code and looking at the indentation. Having a bit more syntax than traditional Lisps helps break the code up visually as well. For example, things like bindings jump out immediately.


"The language is the code's biggest API" -- my own words.

Breaking that API will result in a new language.

Scala, like several other languages (JS, PHP, Perl), has had very little time to iterate on its syntax before it got widely used. All these languages want/need to make breaking changes to their syntax, at which point they are at risk of becoming fragmented.

Python 3 was not even that different; still, it was (and still is) a hard upgrade for the community.

I really hope Scala manages, though; it's the largest FP-on-JVM community, I guess, so good luck!


Well -- they have the Python 3 disaster to learn from... you can't break backwards compatibility for minor gains -- if you are going to break it -- break everything you must and make the new thing much better -- half measures in this regard suck.


It's really absurd when people call Python 3 a "disaster". There was nothing disastrous about it. In fact, hindsight shows us that it was actually a very good path to take.

Python 3 didn't negatively affect Python 2 or earlier users. Their code still runs fine, and is well supported by a huge number of libraries. They weren't forced into upgrading against their will at any point.

Python 3 allowed the Python developers to make some breaking changes to the language and libraries. These have, without a doubt, improved the language.

Much Python 2 code, especially well-written code, can be automatically converted to Python 3 code with little to no effort. Anyone with any sensibility who has been writing new Python 2 code within the past few years has been keeping an upgrade to Python 3 in mind. Their transition should be quite painless.

Over time, more and more existing Python libraries have supported Python 3, or been replaced with significantly better replacements, as the need arose.

The fact that we see so many libraries simultaneously supporting both Python 2 and Python 3 goes to show that the community is not "divided" or anything like that.

The only downside is that it took a few years longer than people may have initially been expecting for certain libraries or frameworks to support Python 3. But at this point in time, Python 3 is a clean, usable language with very good third-party library support. Existing users weren't forced into using the new version, yet those developing the new version weren't constrained by compatibility concerns. The end result is an improved and usable language, achieved with minimal disruption.

Perl 6 is an example of a real disaster, on the other hand. It still doesn't have a truly good implementation, even after 10+ years. Not only is Perl 6 pretty much unusable in practice today, but the uncertainty it caused stunted the growth and development of Perl 5 for quite a while. It is only recently that we've seen people finally realize that Perl 6 is a lost cause, and get back to using and evolving Perl 5. Compared to Perl 6, the Python 3 development process was perfection.


> Python 3 allowed the Python developers to make some breaking changes to the language and libraries. These have, without a doubt, improved the language.

Sure. But the improvements aren't really that great, and IMO they weren't enough to justify breaking everything. Some combination of a JIT compiler, GIL removal, and optional typing might have been.

> The end result is an improved and usable language

And this is why I view Python 3 as an instance of choosing purity over practicality. Python 2 was and is a very usable language. It's vastly better than JavaScript, which took over the world by virtue of being available everywhere and having halfway decent performance, which ended up outweighing its huge flaws as a language. I can't help but think that we'd be better off if the effort spent on the Python 3 migration had instead been directed toward speed and browser support.

> Perl 6 is an example of a real disaster, on the other hand.

Certainly can't argue with that.


It's a major release and the language does need to evolve at some point. I'd agree that Python maybe got a little too ambitious with the upgrade but it was definitely a good thing.


> Not only is Perl 6 pretty much unusable in practice today

I don't think that's true anymore, and for some subset of people, it hasn't been true for a while now. If you consider performance the big blocker, that's getting better all the time [1], and is close to being competitive with Perl 5 in some areas.

[1]: http://jnthn.net/papers/2014-yapceu-performance.pdf -- Start at slide 76 for performance graphs


"Compared to Perl 6, the Python 3 development process was perfection."

Comparing to a series of "Apocalypses" (the term used early in Perl 6 development to refer to radical breaking changes) is not great evidence that Python 3 is non-disastrous.

Do you really think Python 3 being 6 years in limbo (and counting) is a good thing? Probably not fatal, but I don't see how it's good.


Python 3 was never "in limbo".

From the very start its goals were clear. Yes, it took some time to implement them, but this was done rather efficiently and quickly.

Unless you were using one of a handful of libraries that didn't put forth the effort to be compatible with Python 3, it was very easy to adopt Python 3 early on, and to use it effectively.

I worked with a group that adopted Python 3 relatively soon after its official release. This would've been around early 2009. We developed a number of large systems using Python 3, without any major problems. Sure, we ran into bugs now and then, but we reported them and they were fixed soon enough. We helped port some libraries to Python 3.

We didn't regret the decision to go with Python 3 then, and the last I talked to people still involved with those projects, they don't regret the choice now. They're glad that their millions of lines of code are targeting Python 3, which is without doubt the future of the Python language at this point.

I don't know why people such as yourself continue to portray Python 3 as a "disaster", when all of the evidence and much of the experience with it shows the complete opposite to be true.

It was a smooth transition for Python 2 users who didn't want to or need to upgrade. It was a smooth transition for Python 3 early adopters. It's a rather smooth transition now for Python 2 users who want to use Python 3. "Disaster" just isn't the sort of term to describe a transition that goes well for all involved.


> Unless you were using one of a handful of libraries that didn't put forth the effort to be compatible with Python 3, it was very easy to adopt Python 3 early on, and to use it effectively.

You're speaking in the past tense about a present problem. Many large projects that would like to migrate to Python 3 cannot because the library support isn't there.

This is in no way meant to disparage the motivations that led to Python 3, because it's better in every way than its predecessor. Only that the transition problems are still present, and the larger the project, the more difficult the transition.

Sage (http://www.sagemath.org/), a project I'm involved in, is just one example -- a large, complex project, it relies on dozens of mathematical and other libraries, and to transition to Python 3, all the libraries would have to be available in Python 3 versions -- even one exception would prevent the transition.

It's safe to say that everyone involved would like to see a transition to Python 3. But it's not possible, and for the foreseeable future it's not even likely.


Python is a dynamic language and a compiler can't do much in helping you move from 2 to 3. You really need good test coverage.


Dynamic language notwithstanding, the transition was badly designed, in terms of being supportable by automation and providing a cost/benefit motivation for upgrade. With something like that, pretty good = disastrously inadequate.


Syntax is absolutely not the issue.


Scala's syntax is a glorious fuckup! YMMV


Scala's syntax is actually quite nice. But I guess it depends on what you are comparing it to.


You didn't read GP's post?


"In terms of complexity, I don't know whether Yammer featured that, but you definitely do hear that a lot [from] many people. Yes, absolutely -- that's precisely what we want to address in the future versions." --Odersky

I read it, I'm just asserting it! Any issue with that?


I don't think that quote is about syntax. In programming languages there are other, way more important sources of complexity besides syntax.


Are there any risks to adopting Scala as a language for a project, knowing that it will change over the next few years? It sounds like, as opposed to an evolution, they are going to change the language itself, which suggests backwards incompatibility and refactoring within a few short years if you don't want to be considered legacy.


If stable longevity for your project's implementation is important (as your question implies), then I think Scala does present additional risk (relative to stable/mainstream languages). It sounds to me, too, that they are thinking of changing fundamentals in the language/system. This could lead to something like the Python 2 vs. 3 scenario for a while. If you base your work on current Scala, you might (likely) later find yourself either doing a rewrite or being relegated to a "compatibility" library. You're wise to ask.


It depends on what you're doing. If you're really pushing forward with experimental features, sure, there's some risk. But if you're using common libraries, you probably have more to worry about staying up to date with the library interfaces than the language changes.


No, not really.

There might be a few changes, but probably nothing which can't be fixed automatically by a tool/IDE. (They are investing in that currently.)

Additionally, if your code compiles with version X without warnings, it should compile with version X+1, too. (Otherwise that would be a bug.)

Apart from that, stable versions are supported for a long time, so just like in Java you can migrate when you feel comfortable with it.


Any comments on compiler speed as a problem? I sometimes find it to be a bit slow.



