Impressions of Go (bensigelman.org)
187 points by icey on July 23, 2013 | 268 comments



Articles on Go are getting a little boring. The majority are written by people who've spent an afternoon looking at Go and decided to rehash the same 5 bullet points. In the time it takes to read their "article" I could learn the same stuff, better, by reading the official Go tutorial.

Some in depth reviews based on actually using it for an extended period of time on a large project would be nice...


I agree about the articles themselves but vote them up because I tend to learn things on the resulting threads.


Go threads aren't as fun to read without uriel scolding non-believers for their bad taste.


As a non-believer, I miss my discussions with him.


I still wonder why Go is so prone to religious fanboys, though.


The problem is that even those are boring. I've used Go for an extended period of time on a large project and there's nothing to report. It works just like what everyone else already says.


I'm a Python guy, and there's something in this, I think. One of Python's underrated virtues is how uncontroversial a lot of Python idioms are. The "only one way to do it" attitude helps there.

In my (limited, biased) experience, there are three camps in the Python community:

1. people who want principle-of-least-surprise: there's a lot to be said for boring languages (if the language is the most interesting thing about your product...). Often the systems engineering/service engineering crowd.

2. people who want it to be more functional – the Haskell/Scala/ML crowd. (I guess the overlap between camp 1 and camp 2 is Erlang...)

3. people who want access to the library ecosystem, particularly the Numpy/Scipy/Pandas/Theano stack (the scientific community, latterly data scientists). This is my tribe.

So you've got three crowds here, at least, unified by a language. Where are they going to head?

Go has nothing for the second crowd; they're all going to go to a purer FP language sooner or later. It doesn't have much for the third (no IPython, LAPACK bindings, Fortran bindings, stats libraries, machine learning, etc...); we all have an eye on Julia. But for the first: great deployment story, small language, decent standard libraries, markedly better performance than Python, much better parallelism story...


As someone else noted below, it's also like "static typing lite." It has most of the advantages of static typing with much less of the bookkeeping involved.
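
(For illustration, here's a quick sketch of what that lighter bookkeeping looks like in practice; the names are made up, not from the article:)

    package main

    import "fmt"

    func main() {
        // := infers the types, so you get compile-time checking
        // without writing the declarations out by hand.
        words := []string{"static", "typing", "lite"}
        total := 0
        for _, w := range words { // w is inferred as string
            total += len(w)
        }
        fmt.Println(total) // prints 16
    }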


ML-style languages go much further with this, while also having richer type systems that both let you write code more easily and do more compile-time safety verifications.


May I know what other languages interest you?


That's what's awesome about it! Everybody who uses it gets the same impressions of it (EDIT: and those impressions are 95% good). What other language has that going for it?


I think Peaker is right: a lot of us who are coming from Clojure/Scala/Haskell/OCaml/etc. don't bother writing blog posts saying that we're unimpressed.


I think there's a survival-bias/selection-bias here.

Those who tried Go and came from a background of more advanced languages just threw it away quickly and wrote nothing about Go.


I've seen a few Haskellers finding similarities and praising Go[1]. Go is just meant to be practical and familiar to most working engineers and all the articles are going to reflect that. Most of the people coming from "advanced languages" are looking for new paradigms to move forward. When it comes to Go, there really isn't much to talk about besides the maturation of the toolchain.

[1] http://www.starling-software.com/en/blog/my-beautiful-code/2...


Plenty of the Haskellers I know have expressed disinterest in Go, dismissing it as a poorly designed language.

The linked post seems to be a very superficial take on Go. I wonder if he'd keep his opinion of Go after learning about the lack of generics, error product types, nulls, mutability of concurrent messages, and all the other show-stopper design mistakes...


As an aside, that article says that Haskell can't guarantee that Monads are also Functors. Is that a language problem or a library problem?


That's purely a library oversight which has become mired in backwards compatibility (don't track any of that onto my carpet :P).

It would be trivial to specify that all Monads are also Functors by changing the definition of the Monad class a little bit:

    class Functor m => Monad m where


It's a library/compatibility problem, and it's (finally) being addressed now (will take years until the change takes place, need deprecation warnings for a loong time).


What? Everyone coming from python/ruby/javascript maybe, but there's also a heck of a lot of us who wanted to like go but find it too primitive and inexpressive. I tried go while I was still a confused haskell noob, and I still had to go back to haskell. I can't imagine how bad go must feel to an experienced haskell dev.


In my case at least, this article spawned a little personal research that has me now giving Go a serious try. With all of the new-ish languages and databases popping up, it can be difficult to choose which ones to try for a new project, or seek solutions for a new problem.

There is something in the initial life of a new technology that determines if the adoption rate will result in success or failure, where failure is abandonment and an end to development. If this sort of interest keeps Go going, then maybe some repetition is a good thing.


You should look on the bright side. Every time one of these "Impressions of Golang" articles hits the front page, we get pages of bike-shedding in the comments.


I sorta like them for two reasons:

1. It's interesting to see what people struggle with or like about it, having written a lot of Go. That gives me some perspective, I guess.

2. The comments usually have these little gems of tips and tricks, like one I found here.

That said, #1 is getting a bit old.


I like these articles, because I like learning from other people's real-world experiences with various tools. He spent years using C++ at Google, yet decided not to use it for his own business even though he, presumably, wasn't one of the "junior developers" who couldn't handle memory management. He used mainly Ruby on Rails at his post-Google company and had a very negative experience. He says he'll write more about the naked emperor Rails. I'd love to read that article.

And if Haskell is so much better for real business infrastructure in practice, and not just a "more interesting language" in its own theory, I'd like to read more articles about companies that use Haskell, not fewer articles about companies using Go.


I've done lots of C++, and memory management has become much easier since shared and scoped ptrs have entered the std lib.

But still, going back to it now for a new project seems highly unlikely to me, it's like moving from a feature phone to a smartphone and then back. I just can't go back now. I'm a server guy so Go really hit the spot for me, despite having its shortcomings.


This is better than the average such article.

I get the sense that many of us are just waiting for a few improvements (like generics and better GC/performance) before using Go for real work, but it seems mature enough for many uses already.


This claim: "Python/Ruby/Javascript: my experience is that large systems are difficult to maintain in these languages, as the dynamic typing makes it difficult to refactor large codebases without introducing errors in poorly tested areas of a repository"

-- is unfounded. Even though there may be some good reasons why one might use Go instead of C/C++, I find it hard to justify using Go instead of Python/Ruby/JS/Java (the only criticism of Java - that it's verbose and hard to tune - is questionable as well). I've said this before. I like Go. I contributed to Go. I've used Go and I still use it from time to time. But much like D, I don't think it has a niche.

There are magnitudes more libraries and resources available for Python/Ruby/JS/Java -- thus far, it's been more than enough to sway me into using those languages (mostly JS/Java) instead of Go.


I share the sentiment. People often mention that Go is less verbose than Java as if they were talking about Clojure or something. Go is just a little less verbose than Java (other than in a hello world example), and targets the same conceptual "level" (same order of abstractions, same "distance" from the metal, although Java can get closer to the metal than Go). So, sure, it feels a little more modern in some respects (and less modern in others), but when considering both languages carefully, I find I really need a magnifying glass to tell the two apart. Even if Go had offered everything that Java does, I still wouldn't have had a compelling reason to switch, because the differences are just too small.

But Go doesn't offer everything Java does. Like you said, Java's ecosystem dwarfs Go's. Java has dynamic linking, runtime code instrumentation, unparalleled tooling, and better performance than Go. The only advantage I see Go has over Java is a shorter startup time, which makes it a reasonable choice for writing command-line programs. As for concurrency constructs, Java is far more flexible than Go, and because goroutines and channels are easy, I've ported them to Java[1] (and Clojure).

Go sure is easy to get started with, but it would have to be 20 times better than it is to make me give up the JVM. In reality, it's just a recent, beginner-friendly Java without the awesomeness of the JVM.

(P.S. I'm not sure Java's often-mocked factory-factories aren't simply a result of the huge number of multi-million-LOC programs that have been written in Java. It's just experience, and Go sure doesn't have the necessary abstractions to make engineering large systems any easier. Other recent languages -- sure -- but not Go.)

[1] https://github.com/puniverse/quasar


Me too; that is what made me stop playing with Go and look forward to D and Rust.

Initially Go attracted me, mainly because of the Oberon-2 influences (method declarations) and being compiled by default to native code. Java and .NET have AOT compilers, but they don't tend to be used that much.

I even tried to make some initial contributions before the 1.0 release, but with time I got a Java 1.0 feeling. The language just throws too much away in the days when the enterprise is adopting Scala, Clojure, and F# into its ecosystem.

I wish the Go developers all the best, but personally I don't think the language would still be around if it weren't being done at Google.

I mean, how often was Limbo discussed here, if ever?


Yes. Rust is an entirely different beast. A truly modern C-level language is something to look for.

There's no doubt Go wouldn't have been discussed here if it hadn't been done at Google. It's a really nice language, and the tooling is nice as well (far better than other new non-corporate-sponsored languages), but it doesn't address a need like Rust, or offer a way to tackle the hardest modern software development problems like Erlang or Clojure. As a language, it's not interesting either (say, like Haskell). It's not even a modern Java (like Kotlin). It's just Java. (only severely handicapped but made a little friendlier)

Then again, all this might not matter. Go is well executed, it's easier for Python devs to adopt than Java, and it's made at Google. And Google is known for making popular Java flavors, so, if Go's particular (few) strengths appeal to some -- why the hell not? There are smart people behind it, and I'm sure we can learn from Go, too.


I don't think it's fair to say that Go doesn't address a need. I volunteer on the Rust project, and compilation time of the compiler itself is one of the biggest problems at the moment. Take a look at the turnaround times on our automated testing bots:

http://huonw.github.io/isrustfastyet/buildbot/

That's 24 minutes spent compiling each and every pull request! Granted, this is triply exacerbated by the fact that as a self-hosted compiler we have to compile three times, but I'd kill for sub-minute turnaround times for even a single stage (on a beefy dev machine we're down to maybe four minutes per stage, so 12 minutes total). We're putting a lot of focus on reducing this burden for our next release, but that's still time that could have been spent on features.

As I understand it, Go is intended to address this need for systems at Google's scale, where compilation presents enormous time overhead. I'm not a Go user so I can't comment on how well it achieves this, but one way or the other I think it's a really fascinating thing to optimize a language for.


Rust suffers from having to compile LLVM as a build dependency.

This will surely improve when Rust no longer requires it.

Go compile times are sweet, true. But Modula-2 and Pascal dialect compilers were already achieving similar compile speeds on 16-bit systems a few decades ago.

Young developers are impressed by Go's compilation times because they never experienced those systems.


Those compile times don't include building LLVM, which, on the rare occasion that a recompile is necessary, will add about five minutes on a beefy desktop and about an hour on our buildbots.


Sure, Go compiles a lot faster than C++, but unlike Rust, Go doesn't operate at the same level as C++; it operates at the same level as Java. How much faster does Go compile than Java?


I'm pretty sure the compilation time has less to do with the "level" at which it operates. From presentations on the topic, most of the time savings came from making the syntax completely unambiguous. Apparently, in C/C++ and similar languages, some of the longest parts of the compilation process are lexing/parsing/AST-building, etc.

(citations needed)


Go pretty much compiles and runs within the time the JVM needs to even start.


Which JVM?


Go sure is easy to get started with, but it would have to be 20 times better than it is to make me give up the JVM.

I like Rust because it gives you control (like C++) but does so carefully (unlike C++). In ten years, Go should be a great alternative to Java and Python. In ten years, Rust should be a solid alternative to C and C++. (That is, assuming they both succeed.)

A language like Rust, which is reasonably expressive and safe but fast and with little overhead, should become a very desirable language as hardware improvements become more marginal (adding more cores eventually brings diminishing returns), and as battery life becomes more important with mobile devices.

But the portability problem is solved really well by the JVM. I'm hoping Rust will make writing cross-platform, native code easier.


You have "ported" a green threads implementation for Java? I'd really like to know more about that.


I call them fibers. Or lightweight threads. They're like Erlang processes or goroutines. You can read more about it here: http://blog.paralleluniverse.co/post/49445260575/quasar-puls...

The library is very much in active development.


You mention Kilim (but perhaps not in a very good light)! Very cool.

To be honest, I was surprised what you could do with Kilim (and the awesome robustness of the system). Unfortunately, I don't think Kilim has been updated for ASM 4.0 -- your library looks interesting though, I will certainly take a look at it.


I have nothing but admiration for Kilim, and I considered using it, but I needed something more modular.


It's not easy, but certainly possible. I contributed to Kilim[1], a microthreading Java library built by the brilliant Sriram Srinivasan.

[1]http://www.malhar.net/sriram/kilim/


With cgo, Go can get "just as close" to the metal too. But I don't even think a cgo vs. JNI benchmark would be substantive. Most people don't use a "medium-"level programming language to write low-level code, anyway.

Like you say, Java is barely less verbose and arguably just as powerful as Go with many more times over the documentation and resources.


Oh, I would say Java is much, much more powerful than Go. Other than dynamic code loading and runtime instrumentation, even when it comes to concurrency you have your choice of schedulers and control over OS threads.


I agree, but I think fans of Go would not :P Hence my tentative "arguably."


This claim: [...] -- is unfounded.

Actually, most of that enumeration consists of shallow clichés. For instance, take the description of Java:

Java: too verbose,

Having written a fair share of Go and Java code, I have to say the difference is not all that profound. The usual boilerplate that people come up with is the construction of a BufferedReader/Writer. But Go has its share of boilerplate as well (e.g. error handling). Java currently has the advantage that IDEs can quickly generate whatever boilerplate is necessary. Given that Go is easy to parse and has a simple module system, there'll probably be fairly complete IDEs for Go as well.
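
(For concreteness, here's roughly the kind of Go error-handling boilerplate I mean; a small sketch with a made-up file name, not code from any real project:)

    package main

    import (
        "fmt"
        "io/ioutil"
        "os"
    )

    // readConfig shows the `if err != nil { return ... }` dance that repeats
    // for every call that can fail.
    func readConfig(path string) ([]byte, error) {
        f, err := os.Open(path)
        if err != nil {
            return nil, err
        }
        defer f.Close()

        data, err := ioutil.ReadAll(f)
        if err != nil {
            return nil, err
        }
        return data, nil
    }

    func main() {
        if _, err := readConfig("config.json"); err != nil {
            fmt.Println("error:", err)
        }
    }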

too many FactoryFactories

That very much depends on what libraries or frameworks you use. I have written lots of ML and NLP code in Java over the last half year or so. I can count the number of insane 'FactoryBuilderProxy'-like classes that I encountered on one hand. Wait a bit and the architecture astronauts will also be writing Go packages ;).


> Actually, most of that enumeration consists of shallow clichés.

You can apply this to most of the writing about programming languages out there. My view might be skewed on HN, but it seems like the majority of this stuff is put out by programmers trying to promote themselves.


I was about to post the exact same comment, with the above quote from OP in my clipboard.

In my experience, this claim is made by (often really good, experienced, intelligent) coders who have a lot of previous experience with static typing and aren't comfortable with the techniques and peculiar challenges of refactoring dynamically typed code -- either because of a lack of time and sheer LOC, or because of some mental block they've doubled down on ("where's my 'extract method' menu option???").

I work on a very large application/system of applications in a dynamic environment, and I've never encountered refactoring or maintenance issues that I thought would have been any easier in my previous life, 10 yrs in static typing.

The other thing to keep in mind is that -- at least when comparing something like Rails to Java -- there will be a lot less code, and that is a significant contribution to code management. That factor, plus experience and acclimation to the techniques for dynamic-type editing/replacing/finding/grepping and so on, plus solid TDD practices, means maintenance has never been an issue for me, and I've spent a decent amount of time in large codebases in both dynamic and static environments.


With a static language, you get more with less. If you didn't have any testing framework, the compiler would tell you where you messed up. So, there's that, barring "well you should have had testing code to begin with".

Grepping is ghetto, second-class. An IDE with built-in refactorings has a much higher guarantee of hitting the right artifacts, especially when there are shared-substring collisions. Again, barring "well you should have named things better".

Having spent the majority of my career with dynamic languages and now being a recent convert to static, I fail to see the allure of dynamic languages at a certain scale. You can shoot yourself in the foot in any language, but I think it is easier and safer to crawl out of a static mess than a dynamic one.

And, I would argue the majority of applications do not demand the level of dynamism that dynamic languages are capable of, making it a waste.


> Grepping is ghetto, second-class. Whereas an IDE with built-in refactorings has a much higher guarantee of hitting the right artifacts, especially with shared substring collision.

Of course that only works when the tooling exists, which, for instance, it does not for Go but does for Python. It doesn't for Haskell but does for Ruby (JetBrains has done a pretty good job there).


Well, you also don't have to rely on the IDE or grep since your incorrect code just won't compile.


This claim: "Python/Ruby/Javascript: my experience is that large systems are difficult to maintain in these languages, as the dynamic typing makes it difficult to refactor large codebases without introducing errors in poorly tested areas of a repository"

-- is unfounded.

Not necessarily unfounded, you simply don't have access to data on it.

I am not at liberty to share details, but I have in fact seen data from a large company based on many internal projects that found that initial development was faster but long-term maintenance costs were much higher for stuff written in dynamic languages like Python and Ruby than in static languages like Java and C++.

The cost difference was both large and real. As someone who mostly does dynamic languages I didn't like the conclusion, but I couldn't argue with the numbers.


> initial development was faster but long-term maintenance costs were much higher for stuff written in dynamic languages like Python and Ruby than in static languages like Java and C++.

I don't have any hard data, but this feels right. I wonder what would be the results with Clojure, which certainly isn't statically typed, but doesn't do duck-typed function dispatch (like Ruby and Python, and even Go) either.


It's a misnomer to say that Go uses duck typing, because an object must statically implement a complete interface, not just whatever subset happens to be sufficient at runtime.
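
(To illustrate with a hypothetical Shape/Rect pair, nothing from the thread: the compiler checks the complete method set at the point of assignment, not method-by-method at runtime.)

    package main

    import "fmt"

    type Shape interface {
        Area() float64
        Perimeter() float64
    }

    type Rect struct{ W, H float64 }

    // Rect satisfies Shape structurally (no "implements" clause), but it must
    // provide every method of the interface or the assignment below won't compile.
    func (r Rect) Area() float64      { return r.W * r.H }
    func (r Rect) Perimeter() float64 { return 2 * (r.W + r.H) }

    func main() {
        var s Shape = Rect{W: 3, H: 4} // checked statically, not at call time
        fmt.Println(s.Area(), s.Perimeter())
    }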


Sorry, you're right.


> I have in fact seen data from a large company based on many internal projects that found that initial development was faster but long-term maintenance costs were much higher for stuff written in dynamic languages like Python and Ruby than in static languages like Java and C++.

A big part of this is the shortsightedness of the companies themselves. There are steps you can take to ensure that you can manage types in dynamic languages. Very often, companies simply don't take these steps. It's not programmers that need babysitting by tools. It's big enterprise.


> It's not programmers that need babysitting by tools. It's big enterprise.

If one language is easier for "big enterprise" to maintain than another, then what's the problem? Obviously different languages have different strengths.


The claim 'big enterprise is better served by a statically typed language' is a different claim than 'dynamically typed languages cannot be used for large projects.'


I only make this claim for the "typical" big enterprise.


> If one language is easier for "big enterprise" to maintain than another, then what's the problem?

I never said there is a problem. There is a lot of inefficiency in big enterprise, but since they're basically sitting on a formula to print money, it's often not a problem. (Until they start to get outcompeted.)


A big part of this is the shortsightedness of the companies themselves. There are steps you can take to ensure that you can manage types in dynamic languages. Very often, companies simply don't take these steps. It's not programmers that need babysitting by tools. It's big enterprise.

I am not at liberty to discuss details, but I would be shocked if your theory was an accurate explanation of the data that I saw.


> This claim: "Python/Ruby/Javascript: my experience is that large systems are difficult to maintain in these languages, as the dynamic typing makes it difficult to refactor large codebases without introducing errors in poorly tested areas of a repository" -- is unfounded.

Why is it unfounded? The author clearly qualifies his claim with "my experience". That's my experience too. I wouldn't extrapolate it to all programmers and in all scenarios, but dynamic typing just has never worked that well for me in larger projects.

It could be because I'm not as diligent about writing unit tests, or it could be because I'm not intelligent enough to reason about large amounts of code without static types. Regardless of which it is, it's my experience and it certainly isn't "unfounded."


I don't think reasoning about types is a matter of intelligence. I happen to share your experience. It is more of a cognitive burden, exacerbated by less diligent coworkers. No amount of "convention within dynamism" can compete with first-class compile-time type safety.

If I were to babysit a codebase by myself, maybe I would not feel this way. But with regards to professional team development on a large code base, the ship has sailed on dynamic languages.


I fully agree.

Dynamic languages don't scale in the typical Fortune 500 enterprise project with three development sites and 50+ developers, to give an example of the typical project size I work on.

The main reasons tend to be:

- Lack of unit testing; yes, even in 2013 most enterprise managers would rather have that time spent on "real" coding

- Massive code size, hard to navigate with just grep/find

- Skill sets vary too much across teams, especially if developers are seen as cogs


It's weird. You mention that the claim is unfounded, and then you have all this text that follows, but none of it seems related to supporting your point.

How is the claim that [X at large is difficult to maintain] unfounded?

I happen to agree. I am (was?) a Perl/JavaScript developer. I find both to be sorely lacking at a certain size. And by extension I assume Python and Ruby to be the exact same (under the shared umbrella of lacking static type safety).


@dvt, this is the OP here.

I was hoping that those who already agree with me about dynamic languages would come to understand that Go is different in this respect. I did a lousy (i.e., nonexistent) job making a case to those who don't agree with me [yet! :)] about dynamic languages, though.

I will write a followup post later this week about the long-term maintenance problems associated with languages in the python/ruby/javascript family. I don't think they're "bad" (I was known to advocate for python in certain situations when I was at Google), but they're often inappropriate, and it is my sense that many developers haven't had the requisite large-dynamic-language-project trauma yet to understand that from firsthand experience. (The toughest part about those traumas is that they happen so late in a project's lifecycle that there's no quick way back to safety...)

So I will try to make that case in a future post. Thanks for your thoughts.


Dynamic languages work out great when code coverage is ~100% at every run, and runs are short.

Virtually all programs start out that way, so dynamic languages feel great.

As they grow, the pain creeps in very slowly. As you said, by the time the programmer realizes he's in hell, it's too late to fix it.


> by the time the programmer realizes he's in hell, it's too late to fix it.

From what I've seen, it's more that management doesn't want to take resources from fighting fires to move some gasoline. A disciplined group can even take rat's nest code and whip it into shape: but only if management is clueful enough to make that a priority. Usually, they're making decisions on a short-term basis.


Exactly. With proper discipline and true 100% test coverage, dynamic languages work well. But over time, that ideal is a challenge for most software organizations to actually live by. Or that's been my experience... it's one of those theory vs. practice situations.


Maybe you'll convince the rest of us! If you do write another blog post, try to include the recent indie nightmare that was purportedly caused by Go[1]. I think it's relevant here (since we're discussing large projects, after all).

[1]http://forums.thedailywtf.com/forums/t/27755.aspx -- it was also on HN but I'm too lazy to dig out the thread.


Ah, I hadn't seen that, thanks!

For what it's worth, "Go" as a language is not really implicated in that; it's more that the `go` command-line suite was causing trouble. I would also contend that the devs were being foolish to do what they did... Assuming everything was in a git repo and the toolchain makes proper use of submodules, to my mind this sounds like a case of developers fundamentally misunderstanding git, not Go per se.

But my "railing on Rails" (and, to a lesser extend, Node, Django, etc) will not focus on Go... it's more of a general critique about the lifecycle of large software projects written in dynamic languages.


> There are magnitudes more libraries and resources available for Python/Ruby/JS/Java

Those languages have been out longer. I think Go will get there.

EDIT: cgo dramatically expands the available libraries. I hear a lot of people say "Go doesn't have X", but if X is already implemented in C (assuming some conditions are met), then you need not reinvent the wheel. I've been using Go as a concurrent parent over C functions for my geospatial research... a really nice fit when you need to do 100 gigs worth of XYZ coordinate transforms.
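
(If anyone's curious what "concurrent parent over C functions" can look like, here's a minimal cgo sketch with a stand-in C transform instead of any real geospatial code:)

    package main

    /*
    // Stand-in for whatever C routine does the real per-point work.
    double transform(double x) { return x * 2.0; }
    */
    import "C"

    import (
        "fmt"
        "sync"
    )

    func main() {
        coords := []float64{1.0, 2.0, 3.0}
        out := make([]float64, len(coords))

        var wg sync.WaitGroup
        for i, x := range coords {
            wg.Add(1)
            go func(i int, x float64) { // Go handles the fan-out...
                defer wg.Done()
                out[i] = float64(C.transform(C.double(x))) // ...C does the math
            }(i, x)
        }
        wg.Wait()
        fmt.Println(out) // [2 4 6]
    }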


> I think Go will get there.

Possibly, but the language has very little to add over the incumbents. I think many Python programmers are attracted to Go because it's easy and so much faster than Python. But for veteran Java programmers, Go feels like a handicapped Java.

I will admit that it's probably faster to write a short Go program than a Java one. But on the JVM I'd use Clojure for short-and-sweet stuff anyway, and use Java when I need the big guns.


As a veteran Java programmer, I like that Go is a handicapped Java. Please force all of my coworkers to use delegation rather than implementation inheritance and make CSP-style concurrency easier to code than synchronized-style.
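
(For anyone who hasn't seen it, a toy sketch of that CSP style; the squaring worker is made up, not from any real codebase:)

    package main

    import "fmt"

    // worker receives jobs over a channel and sends results back,
    // instead of sharing state behind a synchronized block.
    func worker(jobs <-chan int, results chan<- int) {
        for j := range jobs {
            results <- j * j
        }
    }

    func main() {
        jobs := make(chan int)
        results := make(chan int)
        go worker(jobs, results)

        for i := 1; i <= 3; i++ {
            jobs <- i
            fmt.Println(<-results)
        }
        close(jobs)
    }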


In that case, wouldn't you also want a more thorough and thought-out solution to concurrency like Erlang and Clojure offer? And if you prefer the familiarity of a C-like syntax, Kotlin is just what you're looking for. It won't force you to use delegation, but Go also leaves the door open for a whole class of problems (especially concerning concurrency).


Communicating sequential processes is pretty well thought out. I haven't used much Erlang, but I was under the impression it uses a similar model to Go (green threads, message passing between them).

Can you explain what you mean by 'more thought out'? If you're talking about STM or pmap and friends, then I'm very, very unimpressed by them (see Amdahl's law for why). I can do shit really slowly in one thread just fine.


Then why not both CSP and immutability? You can't really have serious concurrency without giving serious thought to managing state.

Erlang is really all about managing, and isolating, state. Clojure, too, has great support for CSP (with care for state) in core.async[1] and Pulsar[2].

[1] http://clojure.com/blog/2013/06/28/clojure-core-async-channe...

[2] http://puniverse.github.io/pulsar/


Immutability's great, I pass around Callables of final variables in Java all the time to accomplish CSP and immutability.

That's not to say you can only have immutability, though; we're humans here and capable of reasoning about happens-before relationships. I wouldn't implement a parallel linear algebra library, for example, that didn't use mutable just-plain-arrays of floating-point values.


Clojure doesn't 'just' (I don't know how to do italics) have immutability. It has really good reference types for managing mutable state when you need it, as well as easy interop to just use plain arrays if you need them. Immutability is strongly encouraged, but mostly at the interface level. If a function is referentially transparent, most people don't mind if it's internally bashing on a mutable array.

It's great to have the vast majority of your stuff be immutable and have it be clear when you're using mutation. Personally, I think avoiding mutable state is more important for maintenance than dynamic vs. static typing (though I'm recently becoming more interested in static typing).


Yeah, I read the Clojure book back in like 2009 and used it a little. I thought it had a lot of neat ideas, but when it comes to just getting a job done, if it's not a problem where pure functional programming brings a lot of value (things that really benefit from code-as-data), I'd rather just write boring old procedural code. Easier to read, easier to reason about.


I strongly disagree with procedural code being easier to read, and particularly easier to reason about. I haven't run into many (any) problems in the last few years that I didn't think functional programming brought a lot of value to. Code-as-data is kind of an orthogonal issue; most functional languages aren't homoiconic.

Edit: also Clojure isn't pure.


That's fine, you're just in a tiny minority.

Code-as-data is actually a win that you can't get from procedural languages, which is why I brought it up. It makes solutions to complex configuration spaces possible that aren't even conceivable in procedural languages. Everything else is just syntax.


But for veteran Java programmers, Go seems like a handicapped Java.

Assuming that the Go toolset evolves to a point where it matches Java performance, does Go's license (and non-association with Oracle) offer any advantage?


For some, perhaps.


Would you consider Scala for any use case?


Not personally, but only because I really don't like Scala. Some would say it has many use-cases. Scala is a chameleon of a language, which some may consider a strength. I think it makes Scala a non-language: a wonderful compiler, but zero coherence and a level of complexity which few organizations will tolerate in a language.

My guess is that most Scala users simply like it as a better Java, and Kotlin does a better job at that. Kotlin is what Scala should have been (and wanted to be) before it was overcome with a desire to prove that a compiler can do all sorts of crazy stuff. It's hard for me to understand what problem Scala is trying to solve (other than the challenge of writing a compiler with lots and lots of features), but whatever it is, the language is constantly becoming harder to understand, so its mysterious goal had better be really good.

So, for me, Kotlin is the true Scala. Problem is, Kotlin is very, very young, and it's way too early to tell if it will ever take off.


  > what Scala should have been (and wanted to be) before it
  > was overcome with a desire to prove that a compiler can 
  > do all sorts of crazy stuff. It's hard for me to 
  > understand what problem Scala is trying to solve (other 
  > than the challenge of writing a compiler with lots and 
  > lots of features), but whatever it is, the language is 
  > constantly becoming harder to understand, so its 
  > mysterious goal should better be really good.
This is a bit hard to follow ... I'd love to see some examples.


I can give plenty, but I'll try to keep it short. First, what is the problem Scala is trying to solve? I know that Erlang and Clojure try to solve the problem of writing concurrent code (and fault-tolerant code in Erlang's case). Haskell tries to make writing correct code easier. Ruby and Python were made for ease and productivity, and both Ruby and Clojure are great for DSLs. Java and C are used nowadays for performance, and Java is relatively good for architecting huge software systems.

What is Scala for? If I'm hard-pressed to give an answer, I'd say, "better productivity than Java, in a statically typed language, with good performance". Now that's great, and Kotlin is all that, too.

Why the immutable data structures, then? To make concurrency better? In that case, why is mutability just as easy? And what are implicits and these new cringe-inducing macros for? DSLs? Why would a high-performance, statically typed language make it easy to write DSLs? Is it to introduce developers to the wonderful high-level abstractions of FP? Why all the OOP, then? Oh, it's to combine the two; in that case, why do they feel so strenuously glued together (classes vs. case classes, an entire collection package replicated twice, once for the mutable case and once for the immutable)?

So the language offers a powerful compiler but absolutely no guidance on how a program should be written. If only Scala had at least provided all of these features and stayed elegant; but man, it would take you weeks just to understand how a freaking Scala collection works, just because the language designers wanted to be so clever and prove that you could transform collections and still never require casts. It seems that at every turn Scala favors cleverness over clarity, and features over a cohesive philosophy. Scala chooses, over and over, to try and address more goals (most of them really unimportant), and in the process has lost the most important thing in a language: coherence.

Scala sees something nice in another language and immediately adopts it. And I gotta say, writing a compiler that compiles code that's both JavaScript and Haskell is an impressive feat of engineering. But it comes at such a high price...


This is quite an incredibly confused rant. :-)

I think it is totally OK to dislike a language for pretty much any reason, but that wall of text reads a lot like “I never actually used the language, but here are some things I read on the internet which sounded plausible to hate”, which is quite disappointing, imho.


I've had a long history with Scala, but have only written about 500 lines of code using it. My feelings towards it had this trajectory: great hope, realistic hope, caution, suspicion, confusion, disappointment, pity.

Around 2006 I was working at a pretty large Java shop, and had hoped to convince the whole organization to gradually switch to Scala. There was one thing that really bothered me at the time, which was the inclusion of XML in the language. I wasn't too fond of XML, didn't think it would last, and thought it a sign of thoughtless trend-following on the part of the language designers, but I liked pretty much everything else. I really liked traits, I liked pattern matching; I really liked lambdas. I thought the language would never win any points for elegance and grace, but at least it was powerful. In any case, the language was young, and I knew I would have to let it mature before there was a real chance of it being adopted in such a large organization, so I kept close tabs.

Shortly after, implicits were introduced (if I have my chronology straight), and I then noticed that scaladoc only made an API even harder to understand, but I thought this could be resolved. Then structural types were introduced, and a big red warning light went off in my head. By 2009-2010 it was clear that a large organization like the one I was working at, would never adopt Scala; it was too unwieldy. Then collections were revamped, and Scala became the only language in existence whose automatically-generated documentation ensured that an API could never be understood. The designers' taste, or lack thereof (taste means choice; preference) was clear. I was then introduced to Clojure and learned that an extremely powerful language can be extremely elegant at the same time, and that a language can really help you program (rather than confuse you with "constructs") by adopting a coherent philosophy. I pretty much abandoned any hope for ever liking Scala again (or recommending it for a large organization), but I swear to you that I still thought, "the Scala guys haven't adopted macros yet in spite of their lispy awesomeness; perhaps there's some hope to them yet; maybe they finally realized that mixing ice-cream, steak, and pizza in a bowl does not make a good salad". We all know how that turned out.

I think I'm a pragmatist, but leaving aside the total incoherence of Scala, it has become so inelegant, so ungraceful, that I wouldn't use it for that reason alone, especially considering that most modern (and non-modern) languages value elegance. It's as if C++ hasn't taught us anything; as if programmers need to make a binary choice between power and beauty.

In the meantime, Scaladoc has actually improved, but that's just too little too late.


PART TWO

  There was one thing that really bothered me at the time, 
  which was the inclusion of XML in the language. I wasn't 
  too fond of XML, didn't think it would last, and thought 
  it a sign of thoughtless trend-following on the part of 
  the language designers, but I liked pretty much 
  everything else.
Good news: You'll be able to delete the scala-xml.jar file. Done. No XML support in the language.

  Then structural types were introduced, and a big red 
  warning light went off in my head.
They are a simple generalization and remove arbitrary restrictions on what can and can't be a type. A win for consistency. They will become crucial if you want to interoperate with prototype-based languages (JavaScript, for instance), so I think the language designers made all the right bets back then, given the hype around JavaScript today. I don't use structural types much, but a lot of people seem to be so excited about them that they designed a whole language around that concept (Golang).

  I then noticed that scaladoc only made an API even harder 
  to understand, but I thought this could be resolved. 
  Scala became the only language in existence whose 
  automatically-generated documentation ensured that an API 
  could never be understood.
I don't understand what you mean. Could you explain?

  I think I'm a pragmatist, but leaving aside the total 
  incoherence of Scala, it has become so inelegant, so 
  ungraceful, that I wouldn't use it for that reason alone, 
  especially considering that most modern (and non-modern) 
  languages value elegance.
As someone who actually uses the language and undertakes a lot of comparisons with other languages to better understand the state of the art and existing solutions before designing APIs, I totally disagree with that. There are not many languages out there which consider consistency and elegance to be as important as Scala does. In Scala, things can and will be rejected or removed for failing to live up to these standards alone.


Your effort is very much appreciated. There, I upvoted both well thought-out answers.

I would never say that Scala's designers are stupid. Far from it. The Scala compiler is a work of brilliance. And, obviously, every feature, as you so meticulously tried to present, has a purpose; it tries to solve a problem. But your explanations, I feel, only prove my point. Many of your explanations are along the lines of "because sometimes you need that" ("Some algorithms work best...", "a good tool to solve some problems", "for some things it makes sense"...). While that's absolutely true, and every practical language, be it a programming language or a spoken language, needs versatility and irregular forms, it seems like Scala tries to take on each problem one at a time rather than spending most of the intellectual effort on defining the common case.

For example -- and this is an important philosophical point of disagreement -- you say of macros: "... instead of having to resort to such terrible things as annotation processors, bytecode rewriting and Java agents." This, imho, is the WRONG answer. Those problem areas that in Java are addressed by what you call "terrible" means are highly irregular, highly uncommon. They should be addressed by "terrible" means if your goal is a simple language. Yet Scala seems to want to address every problem with a language feature, and in this case it's a huge one.

On the other hand, when I look at Erlang and Clojure (or Ruby, though I'm less familiar with it), I see languages that were designed by people who sat down and thought long and hard about which are the top one, two or three most burning problems of software development, and then tackled those and only those. Everything else would be solved possibly "terribly" (though it would be solved). Rich Hickey thought long and hard and came to the conclusion that while OOP might be the right solution sometimes, in the end it's more trouble than it's worth, and people should not generally use it to write most programs. He may be wrong, and he may be right, but he made a decision. He thinks (I guess) that if your particular problem absolutely requires OOP, then you're better off using a different language, one that's been lovingly crafted for that purpose.

This is extremely important. A coherent language says, "for my problem domain these are the tools you should use". A general-purpose coherent language adds "... and most problems fall in this problem domain". For whatever is left, use other, better suited languages. Scala never says this. For every problem, big and small, it tries to find a solution in Scala. I mean, it's running on the JVM for god's sake, and interoperability on the JVM is particularly easy. Why not come out and say, "DSLs are great; we absolutely love them; if you want to use them, write them in, say, Groovy"?

I did not intend to ask why you would ever need this feature or that. What I asked was: why must they all be in the same language? If you had said, "look, Scala just tried to do this, but because of sheer genius it just happens to do that, too", that would have been a good answer. But you didn't. Each feature is there to solve a different problem. That's why Scala lacks coherence.

A non-coherent language says, "here are the tools you can use". It says, "in your programming endeavors you might some day encounter this byzantine problem, and guess what? We got a tool just for that!" It lays out a huge set of tools, all usable at some point or another, all serving a purpose, but doesn't say "I think you should rarely use this tool or that, so I'm leaving them out of the toolbox. When you need them, buy them separately" (Worst of all, it gives you a bulldozer when all you need is a hammer. That's why it's unwieldy)

These are two competing philosophies, but for modern software development, the latter is the wrong choice and the former is the right one. Software systems are getting bigger and more complicated as it is, while programmers aren't getting smarter. Some challenges are much more important than others. Scala chooses to be Jack of all trades and master of none[1] in the very discipline that needs the opposite approach.

[1] What are the one or two things Scala is better at than any other language? For Clojure it's managing state; for Erlang it's fault tolerance. Both are at the very top in some other aspects as well. But what does Scala do better than anyone else? (And is that thing important? You might say it's best at marrying OOP and FP -- though even if that's the case, I'd say being best doesn't mean being good enough -- but I don't think anyone would say that combining these two paradigms is what the software industry needs most. Or, you might say, typed OOP. But typed OOP is, again, a compiler feature, not a solution to a burning problem.)


I think we have a major philosophical difference and are talking a bit past each other.

Here are the two reasons why I am using Scala:

1. Confidence

Scala gives me the confidence that I can build software the way I imagine, I can focus on the user of my code, not on making the compiler happy.

While there are plenty of languages which make easy and medium problems nice to solve (Clojure and Erlang certainly belong to this group), Scala is one of the few languages which keeps supporting me regardless of whether my problem is easy, medium or incredibly hard.

In my experience, the work on making hard problems easier has had a huge trickle-down effect which in turn improved Scala's problem-solving capabilities for simple and medium issues.

I think the language is better for that and certainly ahead of Erlang and Clojure here.

While most hard problems are not common, they are often fundamental. Not being able to solve some issue in the best possible way can have a huge negative impact on the whole application and library design. That's why, for instance, making it easier to create macros wasn't the first problem to concentrate on. Instead, developers made sure that users of macros had the best possible experience and focused on having one unified API for reflection, the macros and the compiler, hugely simplifying semantics while re-using battle-proven code. Macros make it possible to pull more functionality out of the language and out of the compiler and into libraries. For instance, C# 5 introduced huge language changes by adding async/await to the language. In Scala, no language change is necessary: adding support for async/await would be just a library.

Unlike macros in most other languages, and inferior approaches like those in Java, Scala macros are type-checked as regular Scala code at the definition site, as is the macro expansion at the call site, removing huge numbers of tricky issues all at once.

Great care is taken to make and keep Scala a highly regular language with only the minimal amount of hardcoding necessary to make things work. Unlike Clojure, it doesn't have special-cased syntax for collections and a few “blessed collection types” shipping with the language. Unlike Erlang, Actors are not built into the language.

In both cases, Scala avoids irregularity by enabling users to build libraries which can be improved and be replaced without much trouble.

  Yet Scala seems to want to address every problem with
  a language feature, and in this case it's a huge one.
This, for instance, is something I would call blatantly wrong. You are conflating a language like C# or C++, which adds tons of features to address every fashionable problem, with Scala, which keeps its feature count low and orthogonal but manages to solve a lot of those problems by just being a better-designed, more expressive language.

  On the other hand, when I look at Erlang and Clojure 
  (or Ruby, though I'm less familiar with it), I see 
  languages that were designed by people who sat down 
  and thought long and hard about which are the top one, 
  two or three most burning problems of software 
  development, and then tackled those and only those.
Well, that's nice, but I think it is even better that some people decided to bite the bullet and improve a lot of things instead of just building yet-another-language which improves on parts which the creator found subjectively important and regresses on dozens of others.

Is it hard to build a language with these intentions? Sure! Is that a reason not to do it? Absolutely not. I think one part where Scala has basically proven tons of people wrong is OO/FP. People have been saying for decades that OO and FP are fundamentally opposed to each other. Scala just went ahead and proved them wrong, showing that just because some earlier approaches like OCaml or F# are not that good doesn't mean it is impossible. Also, people have been claiming that there will always be an impedance mismatch between languages and databases. Scala went ahead and showed that this doesn't have to be the case.

I want the best OO functionality combined with the best FP functionality. I want to be able to use higher-order abstractions combined with the best performance and efficiency. I want libraries written in the best possible way, not in the way the language decided it was convenient. I want to use the right tools for the right job without having to migrate from one language to another.

  He may be wrong, and he may be right, but he made a decision.
It's 2013. Let's stop forcing people to make pointless decisions. I just won't choose between things if I can have both, combined into a consistent library.

Clojure and Erlang just don't deliver here, and the claim that Clojure is the best language for managing state is highly debatable, too.

  I think you should rarely use this tool or that, so I'm 
  leaving them out of the toolbox. When you need them, buy 
  them separately
This is by the way exactly what Scala says. The language ships with the tools to enable people to build libraries. By default, everything is left out.

Don't get me wrong, a language should be as easy as possible — but not easier.

2. Community

It is pretty nonsensical to ask "What are the one or two things Scala is better at than any other language?". There are plenty of things it does better, because "good enough" is just not good enough for people in the Scala community.

In general the Scala community is highly critical of every aspect and tends to push things to the current state of the art or beyond it if they feel something can be solved in a better way. This has led to a huge increase in consistency and quality throughout the ecosystem, so that having a few good parts and a lot of mediocre parts is just not acceptable to most Scala developers anymore. They demand the best tools one can possibly build.

Anyway, I think your use of "coherent" is getting clearer, but imho it makes less and less sense. You are basically asking for a silver bullet and are unhappy that Scala tells you that for many problems, there isn't one. I think this is one of the core advantages of the community: it doesn't try to sell you some "ultimate solution", avoids ideological bullshit and treats people as grown-ups.

For instance, Scala's Akka team (those who work on concurrency-related libraries) gave an interesting talk recently where they demonstrated something like 9 different approaches/techniques for tackling concurrency, all of them with different benefits and drawbacks, with the main conclusion of "pick your poison".

I think this is one of the core distinctions between Scala's diverse community and other, more Anglo-Saxon-centric communities: people who have grown up in the US just love to swallow shallow marketing nonsense and respond extremely well to claims about "one true way" or "silver bullets".

If somebody came to the Scala world with that approach, people would tell him/her that he/she is either lacking experience or has poor judgement, or probably both, and show him/her why he/she is wrong.

The way people carefully evaluate different approaches and document their pros and cons instead of following the next hype is exactly why I'm using Scala.

Scala's strength is shipping efficient, reliable and fault-tolerant software at a rapid pace.


Well, best of luck with Scala, then. I am aware that there are people out there who like Scala; some of them even like it for the reasons you mention, and some of those even seem to find it elegant (BTW, I watched a talk[1] by Martin Odersky in which he tries to explain why he thinks all those Scala features should be crammed into a single language; even he didn't seem half as convinced as you are :)). It's good to have choices in the JVM ecosystem.

[1] https://www.youtube.com/watch?v=iPitDNUNyR0


I still can't see what you mean by "all those Scala features [...] crammed into a single language"; is there anything more specific?


I'm curious what it is about Scala that lets you solve hard problems that you can't in Clojure. Is it static typing and/or OO support? I'm also curious what you don't like about having syntax literals for vectors/maps/sets.


For instance, Scala's support for composition and modularization of library fragments, which allows you to separate even heavily interwoven concerns into tidy parts and put them together whenever and wherever you like (or exchange some parts completely).

Static typing is certainly a factor, too. Scala not only allows me to design APIs which are hard to abuse or misuse, it makes it possible to encode many things I care about into types so that "wrong" code won't even compile.

With macros, there is now a whole new breed of libraries which add support for types to tasks which were easy to get wrong before, for instance:

- the whole type provider business, where one points to some data source (like a database) and tells the compiler to figure out the right types on its own

- compile-time checked and SQL-injection-safe string interpolation like sql"""SELECT * FROM PERSONS WHERE $predicate"""

- sqlτyped (github.com/jonifreeman/sqltyped) which can compute types from SQL strings

- macros which transpile Scala code to JavaScript, inline (jscala.org)

- macros which can synthesize typeclass instances on-the-fly, like used in Scala's new serialization library (speakerdeck.com/heathermiller/on-pickles-and-spores-improving-support-for-distributed-programming-in-scala)

- Scala's async library (github.com/scala/async)

Regarding collection literals ... it certainly isn't that important in languages like Clojure where performance is not of great importance, because everyone just picks the ones which come with the language and hopes they won't be too bad. Implementing new collection types is just not common there (as in many other untyped languages like PHP, Ruby, Python, JavaScript, etc.).

In Scala, blessing a few chosen collection types with special rules and syntax just won't fly. Developers demand first-class support for all collection types including the ones they define themselves.

Reserving some special rights which no one except the language creators is able to use just gives them an unfair advantage. All implementations should compete on the same ground so that the best one can win, not the one which benefits from special-cased, hard-coded syntax rules.


Let's just say that I have a bit more, up-to-date experience with the language and I value its consistency, coherence and elegance.

One thing I really like is that Scala pushes for more general, generic solutions, instead of ad-hoc additions and hacks: implicits instead of extension methods, traits (instead of abstract classes + defender methods), objects (instead of “static” members), types (without arbitrary rules about what is allowed as a type and what not), pattern matching via apply/unapply, for comprehensions via map/flatMap/withFilter, methods instead of methods and properties.

That tons of languages (Java, Kotlin, Ceylon, ...) are copying Scala's design decisions (often badly, but nevertheless) is another sign that Scala got a lot right.

Ok ... whatever, since I have already written so much, I can just answer your claims one by one (I hope that the time I spend on this will at least be slightly appreciated):

PART ONE (Hacker News complains that it is too long)

  what is the problem Scala is trying to solve
Being a modern, typed language which gives people the right tools to solve today's and tomorrow's engineering requirements.

  I know that Erlang and Clojure try to solve the problem 
  of writing concurrent code (and fault-tolerant code in 
  Erlang's case).
Scala fixes some issues of Erlang's design and improves on it in a few substantial areas (which can't be fixed in Erlang itself anymore due to backward compatibility). It has better performance and better monitoring support.

Additionally, it offers better and more diverse tools to tackle concurrency than Clojure.

  Haskell tries to make writing correct code easier.
While Scala does not enforce purity by default (there is an effect system plugin for that) it gets you a long way towards Haskell's “if it compiles it is most likely correct” guarantees.

  Ruby and Python were made for ease and productivity
Apart from the “batteries included” approach (Scala prefers a minimal standard library instead¹), my experience is that it can easily match or beat Ruby's or Python's productivity. ¹ It also provides better tools to fetch additional dependencies than the languages mentioned above.

  both Ruby and Clojure are great for DSLs
Well, people say that about Scala, too. I don't see the big deal about DSLs, I just try to design and implement the best API a library can possibly have and Scala gives me the right tools to make that happen.

  Java and C are used nowadays for performance
Scala can match and beat Java's performance (looping seems to be faster than in Java, but I never understood why, optimization, specialization, macros, ...).

  Java is relatively good for architecting huge software 
  systems
Scala's better OO and module support improves on that.

  Now that's great, and Kotlin is all that, too.
Kotlin is a train-wreck. They promised a lot of things, but failed to deliver on pretty much everything. Sadly, those parts which were not just direct copies of Scala's design show the lack of experience in language design.

I think it is pretty ironic how their beloved talking points about “why not Scala?” have been reduced to almost nothing as they have continued to learn why Scala did things in a certain way. Just compare one of their first presentations with one of their latest ones.

They should really stop talking and start shipping if they want to be taken seriously, because as a paying JetBrains customer I'm getting really tired of their vaporware and FUD.

  Why the immutable data-structures, then? To make 
  concurrency better?
Partially. It makes reasoning about the program much easier in general and allows safe, structural sharing of data.

  In that case, why is mutability just as easy?
Because Scala is not Haskell. Scala gives you tools to get your job done, it doesn't require you to adopt some ideology or religion. Sometimes, a mutable algorithm/data structure fits a requirement exactly and Scala won't annoy you for picking it.

  And what are implicits
Generally speaking, implicits are a generic way to guide the compiler towards closing a gap. What's such a gap? It can make existing types implement new interfaces (think arrays, java.lang.String, ...), it wires up type class instances with methods which require them, it can make incompatible things compatible (e. g. types which come from different third-party Java libraries).

Have a look at how String is made to support Scala's collection API. Have a look how the `to` method in Scala's collection library can work with arbitrary collection types (which don't even need to be known to the standard library).

They wouldn't be necessary in a perfect world, but Scala is a pragmatic language and its designers acknowledge that we are not living in a perfect world. The cost/benefit ratio of implicits compared to things like extension methods is orders of magnitude better.

  and these new cringe-inducing macros for?
They provide a general way to make APIs more safe and implementations more efficient. They can be used to report more specialized errors right at compile time, they can be used to make sure that your closures don't close over things you don't want, they can be used to implement LINQ to query databases while using the bog-standard collection API, they can be used to implement F#'s type providers. This can all be done with full type-checking and refactoring support from the IDE/compiler instead of having to resort to such terrible things as annotation processors, bytecode rewriting and Java agents.

They are a huge improvement over Java's approach and Oracle is now copying parts of it.

  DSLs? Why would a high-performance, statically typed 
  language make it easy to write DSLs?
Why not? Just because it is a DSL doesn't mean it has to suck on the performance/safety front.

  Why all the OOP, then?
Because OO is a good tool to solve some problems, just like FP is a tool to solve some other requirements.

  Oh, it's to combine the two; in that case why do they 
  feel so strenuously glued together
I think you have to be more precise here. Even people coming from OCaml or F# concede that Scala has done an incredibly good job at combining OO and FP, so I'm happy to hear what issues you have found.

  (classes vs. case classes,
Well, for some things it makes sense to have the additional methods of a case class, for some use-cases it doesn't.

  an entire collection package replicated twice, once for 
  the mutable case and once for the immutable).
Pick the best tool for your job. Some algorithms work best with immutable data-structures, some with mutable. Scala spells out explicitly which guarantees are made and people can safely rely on it. Experience has shown that Java's approach had good intentions but just didn't work. Even the designers of Java agree with that these days. Scala has learned from those mistakes and doesn't repeat them (unlike Kotlin).

  So the language offers a powerful compiler but absolutely 
  no guidance on how a program should be written.
That has not been my experience. There is some local immutable-OO-with-FP-with-typeclasses optimum, and people, regardless of where they come from, are almost magically converging towards it.

  If at least Scala had somehow provided all of these 
  features and stayed elegant,
I think it does.

  but man, it would take you weeks just to understand how a 
  freaking Scala collection works, just because the 
  language designers wanted to be so clever and prove that 
  you could transform collections and still never require 
  casts.
Well, it's a bit more than that, right? Anyway, I think everyone in the Scala space is open towards a better solution, but frankly even after years no other language has come up with an approach which comes even close to collection API's ease of use.

  It seems that at every turn Scala favors cleverness over 
  clarity
In my experience, readability and clarity are considered more important these days. Cleverness is deemed to be OK if it is used to improve the lives of people using that piece of API. It's just like mutability: It's ok as long as you keep it localized, confined and don't unnecessarily expose your users to it.

  features over a cohesive philosophy
I think I disagree with that. Consistency is still one of the most important requirements, and I haven't seen many features make it in over the last few versions.

Anyway, Scala has far fewer features than Java 8, C#, F# and many other “competitors” in that space, so I think we are fine here.

  Scala chooses, over and over, to try and address more 
  goals (most are really unimportant), and in the process 
  has lost the most important thing in a language: 
  coherence.
As mentioned, this has not been my experience, but I'd love to see an example.

  Scala sees something nice in another language and 
  immediately adopts it.
No, absolutely not.

  And I gotta say, writing a compiler that compiles code
  that's both javascript and Haskell is an impressive feat 
  of engineering. But it comes at such high a price...
Huh? That doesn't make sense.


We might as well debate religion.

JVM... you can keep it.


I promise not to argue, then, but I'm seriously interested to know why would anyone not like the JVM? (this isn't the first time I've heard this view, but I never asked before) I've been writing software for some 20 years now, and have never seen a more impressive environment (in terms of performance, pluggability, monitoring). Is it the startup time? The classpath? The monolithic JRE? (the last two annoy me, too, but they should be resolved in Java 9. I hope)


I guess the JVM is just too sophisticated for my taste.

I think Mies van der Rohe, if he were a coder, would like Go for all the same reasons that Java programmers don't like it.

Like I said, this is religion.

If Java works for you, phenomenal.

I'm not a big fan of languages that become corporate standards, or languages that are now controlled by companies like Oracle.

Perhaps this is just rebellion without cause... but I'm pretty happy worshiping my gods.


Fair enough.

I think Mies van der Rohe, if he were a coder, would like Go

Really? Not Clojure? :)


He would rather design chairs than use Clojure.


I guess my extended argument is -- okay, so Go will get there eventually (I'll concede that) -- but then what? IMO, Go will still not be attractive enough to warrant a major migration. cgo is interesting, but we already have a Java analog - the JNI.


> Go will still not be attractive enough to warrant a major migration

One of pron's concessions above is that Go is much easier to learn. That could be its killer feature: a new generation of developers and founders may choose to use Go for their projects because of the lower barrier to entry.

A common theme with these write-ups is that the developer felt comfortable with Go after about a week. That's pretty incredible.

I had the same experience. Coming from a scientific computing background, I wanted to get my feet wet with an application-level or system-level language. I chose Go, and I felt like I was off and running in about a week. I've since looked at Java, and I'd rather not get involved with it unless absolutely necessary.

If that turns out to be the experience of many others, it won't be a matter of migration. It'll be a matter of growth from the ground level.


That is a good point, and it just might be that way, but I'm not so sure for several reasons.

First, languages like Python and Ruby (and I think Clojure, too) are as easy to learn as Go, and most would say are more productive.

And if you want a system level language, then you're probably not a beginner. Yes, Go is easier to learn than Java, but certainly not by much (it is, granted, much easier to get a small program running in Go). On the other hand, Java (or Kotlin) are more versatile.

Someone once said that easy always wins, and this might be the case with a system level language, too, but maybe when people specifically look for power then easy is not their top priority? I don't know.


> Those languages have been out longer. I think Go will get there.

Maybe so, but that doesn't help today.


What exactly is Go missing for you?


I think go does have a niche, and it is "python/ruby devs". Yes, its niche is a set of people, not a particular role. Because go is so simple for a typeless developer to learn, and has essentially no downsides and a massive upside (performance) compared to python and ruby, I see a lot of ruby and python devs upgrading to go.

I also think you are being unfair to D. D is basically just what go initially claimed to be: a new systems language. D is a fantastic replacement for C++, go doesn't really enter the world C++ lives in.


> Because go is so simple for a typeless developer to learn, and has essentially no downsides and a massive upside (performance) compared to python and ruby

I think that's a brilliant observation! It's all of the duck typing goodness with far less of the static typing overhead. It's static types "lite" for duck typing lovers.


Dynamic typing has advantages and disadvantages. E.g. static typing violates the DRY ("Don't repeat yourself") principle concerning code reuse, whereas compile-time type checking reduces the effort for testing (but for very simple cases only).


>Static typing violates the DRY ("Don't repeat yourself") principle concerning code reuse

How did you come to that conclusion? I can't see any way that static typing "violates" DRY.


I think it's because you're declaring the type, which is obvious from context, when you could save characters and have the runtime system figure it out for you.


You are conflating explicit type declaration with static typing. While often found together, they are different things.


You're right. Further discussion in sibling thread.


That isn't an issue of static typing, it is an issue of explicit typing. I program in a statically typed language. When I don't feel something warrants a type signature, I simply don't write one. Type signatures are compiler enforced documentation.


Sometimes type declarations are required for disambiguation in statically typed languages, no? The 'auto' keyword in C++ doesn't always work.

Anyway, I believe this is the source of the DRY claim.


In Haskell, for example, you only need type declarations for disambiguation when the information is actually missing from the program. That is, if the program were dynamic, you'd also need to write that down.

For example, the type of:

  show . read
is ambiguous, because "read" parses a String into some "Readable" type, and "show" converts a "Showable" type into a String. Which type? It could be anything, so the compiler complains it is an ambiguous type.

You could say something like:

  (show :: MyType -> String) . read
To resolve the ambiguity.

In a dynamic language, a function like "read" is not possible at all, since it uses static types to determine which type to parse into. So in Python, this would look like:

  lambda x: MyType.parse(x).show()
Same information, same DRY violation/non-violation.

Note that in some other cases, where the type can be determined, I can write:

  [1, 2, read "3", 4]
Whereas in a dynamic language I'd need:

  [1, 2, Int.parse("3"), 4]
So it is actually dynamic typing that violates DRY here.


In your example with the arrays, what were you trying to illustrate? I don't understand.


The notation [x, y, z] denotes a homogeneously-typed list.

So I used it as an example where type inference can figure out the type of an expression from its use, rather than from its intrinsic value.

Dynamic types only work based on the intrinsic value, so whenever a statically-typed language can figure out the correct types from context, a dynamically-typed language is going to have to require redundant type hints.

So [1, read "2", 3] is a list of ints, so the read call there is known by type inference to return an int, so the read parser chosen is the parser for ints. In Python, even if you had some value that is required to be an integer, and you wanted to parse a string to that value, you'd need to say: Int.read("1"), which is redundant.


That you don't need a type signature, or any explicit type information at all. Because the compiler already knows it has to be a list of Ints, you can just call read and it knows it has to be converting that string to an int. In a unityped language you still need to supply the information about what type to convert to.


There are different degrees of type inference. Go's `:=` operator and C++'s `auto` keyword use the simplest kind, where the right-hand side must evaluate to a concrete type. More powerful than that are systems that can infer the type of all variables in a function scope depending on how they are used throughout the function (such as Scala and Rust, though Scala's is waaaay more advanced than Rust's afaict), but require functions themselves to always be explicitly typed. More powerful still are languages with whole-program type inference such as ML, which can infer even function signatures.

(Please note that my grasp of approaches to type inference is very rough; corrections to the above are welcome!)
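To make the first category concrete, here is a minimal Go sketch (a toy example, not from the article) of the `:=` style of deduction: the type always comes from the right-hand expression and never flows backwards from later uses.

    package main

    import (
        "fmt"
        "math"
    )

    func main() {
        var a float64 = 2 // fully explicit declaration
        b := math.Sqrt(a) // := deduces b's type (float64) from the expression
        fmt.Println(a, b)
        // Unlike ML-style inference, the right-hand side must already have a
        // concrete type; the compiler never infers a type from how b is used later.
    }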


"Java: too verbose, too many FactoryFactories, painful to tune"

Painful to tune... Maybe. But I don't understand why so many people say patterns like FactoryFactoryFactories are inherent to Java!

I can write Java code with NO factories at all. Writing shitty and inflexible code is really easy. When you understand why a factory is used in a particular piece of code, you start to appreciate it!

It's also very possible to write C++ or Python code with too many overengineered components.

Also, I like a language to be verbose, I don't think it is an issue. As long as the code is clear and easy to read... And it's often the case with Java programs. To my eyes, Java is way more readable than, let's say, Perl, Scala or Clojure. I couldn't care less about the number of lines required to achieve the same result!

I'm getting tired of this Java bashing... The only thing that really sucks about Java is Oracle!


In fact, since go does not have explicit constructors, it encourages the factory pattern all over the place. You'll find static factories everywhere in the go library.

http://golang.org/doc/effective_go.html#composite_literals
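As a small illustration of that convention (the type and function names below are made up, not from the Go library): since there are no constructors, the idiomatic "factory" is an exported NewXxx function returning a ready-to-use value.

    package main

    import "fmt"

    // Buffer is a hypothetical type; NewBuffer plays the role a constructor
    // would play in other languages.
    type Buffer struct {
        data []byte
    }

    func NewBuffer(size int) *Buffer {
        return &Buffer{data: make([]byte, 0, size)}
    }

    func main() {
        b := NewBuffer(64)
        fmt.Println(cap(b.data)) // 64
    }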


In modern languages, you don't need to write a class when all you need is a first-class function.


Modern languages?! This has been possible since FORTRAN.


And LISP, but not in Java... :(


Even Java bashers would concede that Java is a brilliant language, certainly for its time. And the JVM is the most impressive programming environment to this day.

But the language has become a little dated. Oh, you don't need to go as far as Scala to fix it (to me Scala feels like a deranged, haphazard combination of javascript and Haskell), but something like Kotlin could really be a "better Java" (it also happens to be a better Scala, but that's a different point).


I guess when you talk about a language, you usually also end up talking about the platform, libraries, community and the whole environment. It is hard to escape from this when talking about Java.

For those that are not in the Java land - the whole Java ecosystem is heading in an interesting direction.

Not just the platform (with all the other jvm languages), but the java language itself. Using Java 8 and some of the new features, suddenly standard Java code starts looking more and more modern. Add to this some interesting developments in the framework land, like NIO with Netty, Gradle combined with maven repositories for dependency management, even new developments in entrenched stuff like Spring or JavaEE...

The community and the ecosystem is slowly evolving outside the whole enterprise monstrosities. It will be interesting to observe the platform in the next couple of years.


I thought it was a pretty poor language, obviously written by people who'd either never heard of functional languages or never "got" the point of them. More here:

https://rwmj.wordpress.com/2013/07/03/golang-bindings-for-li...

There's still room out there for the C replacement language. Something with ML/OCaml-level of expressiveness but with a replaceable garbage collector might be the sweet spot.


Go was neither designed to replace nor compete with existing functional languages.

Even if it were demonstrably true that functional programming languages are a "better" (leaving this loosely-defined for now) choice than Go in 100% of the circumstances in which one would choose to use Go, your argument still wouldn't hold water, as it neglects other important characteristics of a programming language that are neither quantifiable nor useful in the context of comparing programming languages, such as perceived simplicity and the ability to hold the entire programming language in one's head.

What's the value in denouncing a programming language because it isn't designed with your favorite programming paradigm in mind? That's just dick.

Also, are you really going to call-out Rob Pike and Russ Cox on whether or not they truly understand Functional Programming like you do?

Really? Really?


I didn't say Go was designed to replace or compete with functional languages. I said it would be good if Go's designers understood functional languages, but whatever their history or how internet-famous they are, they don't.


Go was designed to write systems-y back-end services. Functional languages are a terrible mismatch for this. 'No side effects' is great for certain types of coding, and terrible for systems integration where side effects are the whole point of what you're doing.

Could you please explain what you mean when you say they don't understand functional languages as well as you? The lack of eval() and code-as-data in Go? Something else?


I'd like to use map/fold/filter idioms instead of for loops, because this encourages me to write reusable functions instead of embedded blocks of code. To do this in Go, I need to write type-specific functions for every combination of passed function, source collection, and return-collection type. The alternatives I see in Go are to lose type checking or to fumble around with introspection and casts.
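For illustration, a rough sketch of the two options being described (function names are made up): a separate helper per element type, or one helper over interface{} that gives up compile-time checking.

    package sketch

    // mapInts is type-safe but repetitive: a []string or []float64 version
    // would be a near-identical copy of this function.
    func mapInts(xs []int, f func(int) int) []int {
        out := make([]int, len(xs))
        for i, x := range xs {
            out[i] = f(x)
        }
        return out
    }

    // mapAny is reusable, but callers must assert results back to their real
    // type, and the compiler can no longer catch mistakes.
    func mapAny(xs []interface{}, f func(interface{}) interface{}) []interface{} {
        out := make([]interface{}, len(xs))
        for i, x := range xs {
            out[i] = f(x)
        }
        return out
    }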


That sounds like it's actually a complaint about strong typing -- if everything's the same type, you can do that now by passing around first class functions.


There are plenty of strongly, statically typed languages which do not have this problem because they have parametric polymorphism, i.e. generics.


Reread the comment. It's a complaint about the lack of generics.


Aka polymorphic function parameters. If I manage to abstract a bit of functionality into a function, I don't care what the type is as long as the operations within the function work on it. Haskell uses typeclasses to provide this guarantee for specific types. Go seems to try to do something like this with interfaces, but it doesn't seem to apply to the map/fold/filter style of programming.


Agreed, I think navel-gazing about functional programming for a systems language is misguided at best.

Go seems to be pegged as a complement/replacement for C/C++/D(maybe no experience)/misc assembly. I don't see how functional languages will help at all for these tasks. You could argue maybe that security would be improved to a degree, but I really am having a hard time at imagining a low level driver being written in purely functional code. I know of only forth being used for some of those tasks in things like boot proms but that is a bit different in that it wasn't really an entire os built on top of things.

I find this somewhat ironic as I'm currently looking to replace a number of ruby/perl/shell scripts at work with go this winter. Not for any reason of maintainability, but because of the horrible task it is to deal with gems/cpan/"extras". And rather than program in pain with C, I was thinking go would be a decent compromise, as I can make RPMs rather easily for deployment with a makefile. Go seems to be perfect for these glue-level things that escape the basic confines of most scripting.

But I don't know yet, this weekend is my "learn go and find a test framework like rspec in it" task. It looks similar enough to C that I could get up to speed ish in a day.


'go test' is pretty straightforward. Not so much a framework as a tool -- I think you'll find that most of the language carries the opinion 'Frameworks are a straightjacket, libraries are a tool'.


Gotcha, i've only bought the pragmatic programmers ebook so far. Past that i've really not done much.

I take it the go vein is more akin to: everything a framework(insert noun here) does for you it does to you?

I wholeheartedly agree, note i've used ruby for about 10 years aka: before rails, and there are times I want to strangle rails with a garrote for some of its choices in how it does things.

Ranting done, do you/anyone have any specific codebases I should look to for a good example of go programming methodology?

The scripts that will be replaced tend to be fairly heavy with I/O and other stupid process manipulation, i.e. detect that some subprocess sigtermed etc... pretty boring stuff normally, but I use threading rather heavily in ruby at the moment. So doing concurrency a bit more easily would be nice, especially since most of the things are just I/O derivatives of pack/unpack calls on things across many many devices. I've tested out celluloid with ruby and it's great, but again that gets back to the gem/dependency management horror.

Guessing I just need to work with it and I'll find things out. Should hop into a go irc channel if there is one on freenode.


#golang-nuts on freenode. The effective go document is pretty good for basics, especially concurrency.


Functional languages are not about having no side effects.


It's a major characteristic, often cited as a positive. I included a reference to eval(), as well. Please enlighten me as to what they're really about, then?


It is a major characteristic and it is positive. I work with backend services in clojure, we have a huge codebase and many programmers working in parallel on the project (6 right now). The code is straightforward, short and elegant, with immutable functions and a few macros. It's the first time in my 10 years programming career that I can say that about the codebase I'm working in. I agree that it's hard to understand the benefit if you don't know FP to some degree and don't have much experience working with it (Paul Graham's Blub paradox).


FP has many characteristics—at its core it's about programming with functions as first class values—but it was never about having no side effects, just about controlling side effects.


Go has first-class functions and closures.

Every pure FP approach to side-effects I've seen has been a mess, stuff like with-open-file. It's fine if that's the minority of your code and you're getting enough out of being able to pass code around as lists of nested statements to make that worth it. For go's stated use case, it's not worth it.


WITH-OPEN-FILE is not about controlling side effects at all; it's about making it hard to forget to close the file. Go's analogue is "defer".
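A minimal sketch of that analogue (readAll is a made-up helper): the deferred Close runs on every return path, which is the bookkeeping WITH-OPEN-FILE automates.

    package sketch

    import (
        "io/ioutil"
        "os"
    )

    // readAll opens a file and is guaranteed to close it, whether it returns
    // early with an error, returns normally, or panics.
    func readAll(path string) ([]byte, error) {
        f, err := os.Open(path)
        if err != nil {
            return nil, err
        }
        defer f.Close()
        return ioutil.ReadAll(f)
    }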

In any case, I don't even know what we're arguing about anymore. My point is simply that FP is not about having no side effects, and saying "side effects are important" does not argue against the value of other FP innovations, such as HM-style type inference, option types, pattern matching, first-class functions, algebraic data types, etc.


When you say having no side effects is not a FP characteristic, then I say Hindley-Milner type inference, option types, pattern matching and algebraic data types aren't either. Many functional programming languages aren't even statically typed.


I'd go a step further and say that controlling side effects, as a consequence of pure-functional design, is a bigger part of FP than any given type system. But hey, what do I know, I just did the SICP exercises a decade ago and forgot about it afterwards.


We're in agreement that functional programming encompasses a range of characteristics and is hard to define.


You're saying that the type system has something to do with functional programming.

I'm saying that if Scheme, LISP and Haskell are all functional languages, obviously the type system is besides the point -- they're radically different in that aspect. Functional programming is about functions.


I'd say Scheme and Haskell are functional languages, as they emphasize programming in a functional style: Composing functions to process values. "Value-oriented programming".

I never understood why anyone classified Lisp (typically CL) as a functional language, as it isn't any more functional than Python, Ruby or Perl.


>Functional languages are a terrible mismatch for this

How? That is precisely what I use haskell for, and it has been by far the least terrible programming experience of my life.

>'No side effects' is great for certain types of coding

That comment shows a very superficial understanding of functional programming. Side effects are never the whole point of what you are doing. You are assuming "no side effects" means you are limited in some way. That is incorrect.


You seem to be under the impression that the only reason why someone would design or prefer an imperative language is that he doesn't understand functional languages.


Do Rob Pike or Russ Cox know any functional languages?

I am not in either's fan club, and I find it easy to believe that they indeed do not understand functional languages.


>Also, are you really going to call-out Rob Pike and Russ Cox on whether or not they truly understand Functional Programming like you do?

Appeal to authority really only works if the people you mention are authorities. On what basis would any reasonable person conclude that either of the people you mentioned are authorities on functional programming? Have either of them even been willing to respond to the constantly referenced parametric polymorphism paper from the 1970s yet?


> Appeal to authority really only works if the people you mention are authorities.

Excellent point. I agree.

However, my entire argument wasn't based on an appeal to authority. I included it as a snarky invitation to "lay it on the table".


I kind of think Rob Pike knows about functional programming.

Go was written at Google to solve very specific problems Google had. Looming large was the need for a C family language that implemented concurrency, fast compiles, and so on. The C family requirement is quite simple: most google engineers are young and don't know functional programming, have experience with Google's existing C/C++ code base, and they will be using Go to replace and extend existing systems already in C/C++. A new functional paradigm, nice as it is, would just get in their way. I'm not enamored by many features of the language, but I'm really hard pressed to say that those features do not solve Google's problems.


The first Go program was a LISP S-expression parser: http://blog.golang.org/first-go-program

Stephen Wolfram writes about the birth of Mathematica: "Macsyma was written in LISP, and lots of people said LISP was the only possibility. But a young physics graduate student named Rob Pike convinced me that C was the “language of the future”, and the right choice." http://blog.wolfram.com/2013/06/06/there-was-a-time-before-m...


Yeah, now we have to live the "present of buffer overflows".


> Something with ML/OCaml-level of expressiveness but with a replaceable garbage collector might be the sweet spot.

That's Rust: http://pcwalton.github.io/blog/2013/06/02/removing-garbage-c...


Or D if you want C++-level of expressiveness. The garbage collector is replaceable.

http://dlang.org/

The difference between ML and C++ is a tradeoff between efficiency and safety in my eyes. D comes from the C/C++ level of efficiency and improves safety. ML comes from a safe point and tries to be efficient.


Kind of a shame that the Rust crew doesn't seem to "get it" the way the Go team does. The Rust stdlib is TERRIBLE. Even basic stuff like file IO isn't implemented sanely. A shame as the language itself is pretty good, other than the idiotic semi-colon decision.


It's a work in progress. We have not hit 1.0 yet.

Regarding the semicolon rules, you won't like this observation, but in my 3 years of writing Rust the semicolon rule has literally never led to a single bug.


I get the feeling that your knowledge of the rules is not necessarily representative of typical post-1.0 developers. :-)

However, I don't recall it being a problem in my experimentation with Rust either.


In my case, it's not a problem with understanding them, it's a burning desire that they not exist.


You realize that it's a matter of aesthetics at best? You could set your editor of choice to substitute character ; with <pink elephant> when displaying code and back when saving it to disk. Have you tried this approach? (It's not an original idea - there is a plugin for Emacs which does exactly this for CamelCase - it displays parts of camel cased words as separate words. Helpful if you have looong camel cased name and are irritated because <back by word> and <forward by word> doesn't take you where you should be)


Not just aesthetics, it's also a matter of ergonomics. ; is not the most comfortable thing to type, requiring either wrist rotation or an awkwardly bent pinky.


That's odd, I consider ';' as one of the easiest things to type. What kind of keyboard are you on?

I'm a C++ developer and go so far as to make it harder. I map ; to <esc> in vim, and enter the actual char with a second ; (from normal mode).


On the US keyboard it probably is. On Finnish keyboard it's where < is on the US one, i.e. Shift-comma.


Yeah, I always felt a tiny bit sorry for non-US developers, especially using languages like Perl that use $ all over the place.


Dvorak does it better -- underscore is on the home row.


> idiotic semi-colon decision

It's surprisingly logical and convenient. Every time I `return null` (or just `null`) in CoffeeScript I wonder if Rust's way isn't better.


The language itself is still evolving, I'm sure the stdlib is currently not a priority for them and that it will improve.

However, I hate semicolons, so I agree with you there. You should check out the nimrod programming language.


The problem is that historically, changing the stdlib in a significant way rarely happens once a language has moved out of the lab, due to the sheer amount of breakage that would ensue.


That is not a worry for Rust. At this stage we make breaking changes to the standard library nearly every day. Sometimes on an hourly basis.

Revamping I/O is very high on the list. Brian Anderson et al are working on it as we speak.


Nimrod is also a good option: http://nimrod-code.org


Yes, what a dummy Rob Pike is, to have jettisoned all the functional goodness of C.


Your blog post has some assertions that are either overbroad ("No type inference.") or just unsupported ("The whole design of GOROOT/GOPATH is completely broken".)

Everyone is welcome to their opinion, but it'd be nice if they actually supported it with some explanation if they want to bandy it about in public.


Well, you can't call 'var' type inference (edit: ':=' ... it's still laughable compared to real type inference).

But I'll explain the GOROOT stuff in a bit more detail ...

Basically the way it's set up prevents golang modules from being packaged up for Linux distros. There are two main problems. Firstly the 'go' utility tries to recompile installed modules, which isn't going to work as those modules are in files and directories owned by root and 'go' is not running as root. By very carefully setting up timestamps it's usually possible to avoid this, but it's fragile.

Secondly if you do install a golang package, you can't compile an alternative copy in GOPATH, because the one in the (root-owned) GOROOT overrides it. Which is dumb and backwards.


There are a myriad of possibilities that await you on the Archwiki. [1]

Also, you should not be installing third party packages in `GOROOT`. If you do, then yes, you're going to be in some trouble.

[1] - https://wiki.archlinux.org/index.php/Go_Package_Guidelines


That web page summarises how we package golang libraries so far, and it's (a) fragile and (b) doesn't let people install their own development copies of libraries, as I said in the gp post.


> doesn't let people install their own development copies of libraries

Then maintain two different and distinct `GOPATH` directories. One for development libraries and one for other stuff.


> Well you can't call 'var' type inference

'var' is not the type inference, ':=' is the type inference.


It's not HM-style type inference, which is presumably what the parent comment was referring to.


>I thought it was a pretty poor language, obviously written by people who'd either never heard of functional languages or never "got" the point of them.

Or "got it" and didn't find it all that good, for the types of problems they work on.


> No type inference. Because obviously it’s 2013 and I want to write types everywhere.

The `:=` does a fair bit for the programmer in cutting back on writing types. For me, the most needed place for type inference would be for writing anonymous closures (like what Rust has). In most other places, I am quite content with writing the types, particularly at the top level.

> Like what’s the point of all the odd rules around := vs =

The former is short-hand for variable declaration with assignment and type deduction, while the latter is just regular assignment. The primary benefit here is type deduction, which partially relieves the lack of type inference.

> and what types you can and can’t assign and pass to functions

Huh?

> And why do you have to declare imports, when the compiler could work them out for you (it’ll even moan if you import something which is not used!)

The compiler absolutely cannot work them out for you. If you have a package `github.com/PeepA/wat` and a package `github.com/PeepB/wat`, how will the Go compiler know which `wat` package to import?

The Go compiler merely moans if an import (or its alias) hasn't been used in the source file that it was imported in. I like this feature, even if it is a mild bother while debugging.
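The standard library itself has such a collision: math/rand and crypto/rand share the base name "rand", so when both are needed one of them has to be aliased by hand; the compiler can't pick for you. A minimal sketch:

    package main

    import (
        crand "crypto/rand" // aliased: both packages' base name is "rand"
        "fmt"
        "math/rand"
    )

    func main() {
        fmt.Println(rand.Intn(10)) // math/rand: pseudo-random int
        buf := make([]byte, 4)
        if _, err := crand.Read(buf); err != nil { // crypto/rand: secure bytes
            panic(err)
        }
        fmt.Println(buf)
    }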

> Hello, world is about 8 lines of code. Camel case! Java rang, wants its boilerplate back.

It's two lines. [1]

> No breakthrough on error handling.

Breakthrough? I'm not sure what you were expecting, but I think the error handling is pretty sane and at least far better than error handling conventions established in C. It could be adjusted if sum types were added to the language, but that has its own trade offs.

If you desperately want exception-style error handling, then you can use panic/defer/recover, but it's frowned upon to overuse it.

> The whole design of GOROOT/GOPATH is completely broken. It’s actually worse than Java’s broken CLASSPATH crap which is some kind of achievement, I guess.

I have the exact opposite opinion. Did you know that you probably shouldn't be setting `GOROOT`? [2] After Go is installed, you just need to set `GOPATH` and add `$GOPATH/bin` to your `PATH`. That's it.

> It’s not even enforced error checking, so bad programmers will still be able to write bad code.

That is true, but Go's compiler forces you to address errors returned by functions that also return another value. You either need to explicitly ignore it or use the variable the error is stored in. (Lest you get an "unused variable" compiler error.) This doesn't cover all cases---like completely ignoring the return value(s) of a function---but don't throw the baby out with the bath water!
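A quick sketch of those three cases, using strconv.Atoi as the error-returning function:

    package main

    import (
        "fmt"
        "strconv"
    )

    func main() {
        n, err := strconv.Atoi("42")
        if err != nil { // using err satisfies the compiler
            fmt.Println("bad input:", err)
            return
        }
        fmt.Println(n)
        m, _ := strconv.Atoi("7") // ignoring the error must be spelled out with _
        fmt.Println(m)
        strconv.Atoi("oops") // but dropping every return value compiles silently
    }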

[1] - http://play.golang.org/p/ihEoJ0yL9I

[2] - http://dave.cheney.net/2013/06/14/you-dont-need-to-set-goroo...


> That is true, but Go's compiler forces you to address errors returned by functions that also return another value.

Barely and badly: you can ignore the error and it's by far the simplest course of action. Contrary to Haskell where it is easier to propagate it, or Erlang where it is just as easy to turn the error into a fault if you don't want to handle it.

> don't throw the baby out with the bath water!

There is no baby in that bath water.


My post was tempered, even and acknowledged the shortcomings fairly. Your response indicates that you missed that.


No, my response indicates that I do not think your post was "tempered, even and acknowledged the shortcomings fairly" (and the one I picked is merely the worst, your hello world is as disingenuous as Java's with all newlines removed — a single line)

But hey, I guess I'll give you credit where credit is due: you do know Go's error handling is only an improvement compared to C's, although you still err in declaring it "far better".


Because I did not expound upon the intricacies of error checking with sum types? Responding to inaccuracies about Go does not imply I need to exhaustively describe all alternatives in order to be even-handed. Namely, I was responding to the OP's claim that the Go compiler did not "enforce" error checking. But it certainly does to some extent. I acknowledged that such enforcement was not exhaustive and that errors could be ignored by the user. I even acknowledged insidious cases where the return values are never captured at all.

> (and the one I picked is merely the worst, your hello world is as disingenuous as Java's with all newlines removed — a single line)

I ran `gofmt` on that code before sharing it, so it conforms to Go's "One True Style". More importantly, my "Hello World" is quite readable, unlike a single line Java "Hello World". This last point in particular addresses the OP's central point: that Go is just as verbose as Java because of the length of "Hello World".


> Because I did not expound upon the intricacies of error checking with sum types?

No, it's got nothing to do with sum types. Erlang does not use sum types, yet for all the similarity of Go's error handling to Erlang's, Erlang is vastly superior.

> I acknowledged that such enforcement was not exhaustive and that errors could be ignored by the user.

And that's insufficient, errors can almost always be silently ignored by the user (even in languages with sum types which turn errors into faults, you should be able to silence the fault). The issue in Go is that ignoring errors is the simplest thing you can do. And not only that, not ignoring them is a significant step up in complexity and amount of code.


I'd really appreciate if you stopped quoting me out of context. I never said sum types were the only alternative. The second sentence of my GP was "Responding to inaccuracies about Go does not imply I need to exhaustively describe all alternatives in order to be even-handed." I referenced sum types specifically because it is often what people suggest should have been included in the language.

> The issue in Go is that ignoring errors is the simplest thing you can do. And not only that, not ignoring them is a significant step up in complexity and amount of code.

I agree that error handling requires more code, but I disagree that it increases complexity. Most error handling cases I've ever written are just passing them up to the caller:

    if err != nil {
        return nil, err
    }
That is not added complexity IMO---particularly since it is an extraordinarily strong idiom---so I suppose we are at an impasse.


> That is not added complexity IMO---particularly since it is an extraordinarily strong idiom---so I suppose we are at an impasse.

Yes, if 10 tokens, 3 lines and a conditional are not added complexity to you, I suppose we are.


Not in this case, no.


Nice response. I would add

> It’s not even enforced error checking, so bad programmers will still be able to write bad code.

Bad programmers will always be able to write bad code. You can't structure a programming language around preventing bad code without completely hamstringing the language.


A false dichotomy. You are correct that you can't outright prevent the writing of bad code in any language without crippling it; nonetheless, the design of a language can be contrived such that writing good code is the path of least resistance. It is not a binary choice between "safe but crippled" and "useful/flexible but unsafe". There is an entire continuum of choices to be made between the two extremes. The design of Go's error handling mechanism, while not the worst approach imaginable, unfortunately makes the programmer exert extra effort in order to do the "right thing".


> the design of a language can be contrived such that writing good code is the path of least resistance.

You can only do this for sufficiently motivated and well-informed programmers.

> The design of Go's error handling mechanism, while not the worst approach imaginable, unfortunately makes the programmer exert extra effort in order to do the "right thing".

Most error handling mechanisms I've seen make doing one of the worst things imaginable (silently throwing away the error) the easiest thing to do. I'm not so sure it's any easier or harder to do the right thing in Go, in the particular context of a highly concurrent system.

EDIT: But I haven't investigated Erlang's mechanism yet. I've heard it's quite good.


> The whole design of GOROOT/GOPATH is completely broken.

I too thought this, until I took a couple of minutes to actually understand how it's supposed to be used. It's a very opinionated setup, but it seems well thought out.


Perhaps you don't "get" Go?


I know it's not a popular view, but being pretty much in love with Go, what I find most difficult about it is the lack of any real IDE. And by real IDE I basically mean: a rich editor, stable code completion, jump-to-declaration etc; and tight debugger integration.

Unless I'm really unaware of an IDE that has all of them, all the IDEs that support Go fall short:

1. IntelliJ plugin - excellent editor, a lot of the intellij goodies; but I've managed to break the code completion, and it's tricky to configure with respect to GOROOT and GOPATH. And most importantly, no GDB integration whatsoever.

2. GoClipse - reasonable (although buggy) debugger integration, very good code completion, no jump to declaration which is a pain, plus a wonky build system - the build can fail and you see no message, and the program just runs from the previous build.

3. Sublime with GoSublime and SublimeGDB: the build system is buggy, code completion is great (managed to break that too on complex projects), no jump to errors, don't remember re jump to source. I had to do a lot of voodoo to get SublimeGDB to work half decently, and even that was not very robust because SublimeGDB is buggy on its own.

4. LiteIDE - has everything in theory, but just feels a bit clunky and hard to configure. I didn't like it as an editor and haven't spent much time with it. But it's constantly improving so I hope it will get there some day.

The rest are Windows only so out of the question, less complete/mature than those 4, or vim with plugins. I don't like vim, I like graphical IDEs, shoot me. :)

Again, I really really love Go, I hope it will allow me never to write C++ again in my life. This is not a "Go is not ready yet" gripe. It's just what I find most difficult about it.


Although it is not as shiny as modern IDEs it is worth making the effort to use emacs - and I say this as a vi user :-)

Whenever a new language comes out it is usually supported by emacs and vim (go has the support files in the source tree). Learning emacs is a small but frustrating investment in time that pays off over the years. Of your points, it has a rich editor and tight debugger integration.

Jump to declaration is a pain but possible using exuberant etags [1]. You may even prefer using ack [2]. Emacs does code completion but I've never been bothered to get it working.

[1] https://groups.google.com/forum/#!searchin/golang-nuts/emacs

[2] http://www.emacswiki.org/emacs/Ack


vim + a bit of tweaking, i.e. a completion plugin [1] (probably there is an emacs mode too - which has all of the above)

[1]: https://github.com/nsf/gocode


I tried the recommended vim plugin. It was okay, nowhere near as powerful as intellij's plugin - and still didn't solve my gdb issue.


Just curious, but what issues did you have with GoSublime builds? It's using the `go` tool under the covers.


A few are just UX quirks - ctrl+b tries to build from the current package, not the project root, leaving you tabbing to main.go for each recompile, which is annoying; the cursor "runs away" from the prompt, not affecting anything. But mostly, the output is buffered.

I wrote a toy MapReduce framework that yielded tons of output. It crashed Sublime because it buffered the entire output into the console instead of printing it into some sort of FIFO buffer like IntelliJ or Eclipse do. Again, running from a terminal is ok, but the whole point of an IDE is to save you from fiddling with lots of windows and terminals.


Imagine if the Go team listened to all those who want generics in Go. It would be more consistent and closer to a perfect language.

As it is, it's a PITA 20% of the time.

(Sure, some say they use it and don't feel a need of generics. Most do however, and it's a constant in every review, even by people using it for core infrastructure).


What kinds of things have you written in Go, where the language has been a PITA to you?


Data structures, usually. If you want to write a linked hash map, or a dictionary where the keys are autogenerated from the values, or any kind of structure not built into the language, you are forced to either roll a new structure for each set of types or cast everything to and from (empty) interfaces.

While it's still possible with the casting, it doesn't feel as nice as the rest of the language does.

The other case seems to be any kind of generic pattern with channels, it's possible to build a channel-pair which acts as a single channel with a buffer only bounded by memory, but you can only do it for a single type.

All of these are avoidable or maybe even inadvisable, but it still feels like a hole in the language when you go to do something which is common, useful and trivial in C++/java/C# and you have to use ugly hacks or deal with heavily reduced functionality, especially when the rest of the language is so nice.
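For what it's worth, here is a rough sketch of that channel-pair idea for a single element type (int); any other element type means duplicating the whole thing, which is exactly the pain point being described:

    package sketch

    // unboundedInts returns a send side and a receive side connected by a
    // buffer that grows without bound (limited only by available memory).
    func unboundedInts() (chan<- int, <-chan int) {
        in := make(chan int)
        out := make(chan int)
        go func() {
            var buf []int
            recv := in // set to nil once the sender closes in
            for recv != nil || len(buf) > 0 {
                var send chan int
                var next int
                if len(buf) > 0 {
                    send = out // only offer a send when something is buffered
                    next = buf[0]
                }
                select {
                case v, ok := <-recv:
                    if !ok {
                        recv = nil // stop receiving, keep draining the buffer
                        continue
                    }
                    buf = append(buf, v)
                case send <- next: // send is nil when buf is empty: never chosen
                    buf = buf[1:]
                }
            }
            close(out)
        }()
        return in, out
    }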


A log scanner that does some analytics, an intra-company web service, ported a few Python programs to evaluate the performance, etc.

Copy pasting data structure code to fix for different types feels (and is) bad. Anything involving non DRY code I consider a failure. (Oh, and working with numeric types).


"Well I would write things in Go, but I keep hearing about this generics thing so I'm just going to complain about it instead"


Yes, because nobody both uses a language AND has a legitimate complaint against it.

Well, I for one have both. As do lots of people. And in fact, it's quite common. Anyone that has used a language should be able to tell you several pain points about it. Except if he is in his idiotic "oh, this is so much better than the blub that I used to use, it's perfect" stage.

Have you even read TFA? The guy evaluated the language, did a project with it, and still wants generics.


Read the FAQ: http://golang.org/doc/faq#generics

Generics are convenient but they come at a cost in complexity in the type system and run-time. We haven't yet found a design that gives value proportionate to the complexity, although we continue to think about it. Meanwhile, Go's built-in maps and slices, plus the ability to use the empty interface to construct containers (with explicit unboxing) mean in many cases it is possible to write code that does what generics would enable, if less smoothly.


So instead of some--very limited, as languages like ML show--complexity in the type system, go punts these difficulties to the poor programmer who is then forced to constantly dynamically cast values, simultaneously adding clearly superfluous noise to the code and subverting any compile-time safety the type system offers.

Wonderful state of affairs.


This would make sense if it was written in the seventies.


Wow, they still mention that non-sense in the FAQ?

Let me guess, they also still link to that “OMG Generics are so hard” article where they conveniently manage to ignore the comments which point out the elephant in the room?


> C++: too much rope, hard to maintain, painful to introduce at a company with no prior C++ footprint, frightens junior devs who no longer absorb relevant memory management idioms in school

So, are we now in an era when there are lots of people in the job market who couldn't write a doubly-linked list to save their life?

When I was in my 20's, I noted that there were lots of programmers around who basically treated compilers as "magic" and hadn't the tiniest inkling about how they worked. Now that I'm in my 40's, I've noted "so what" attitudes in languages that require manual memory management. (iOS and Objective-C. ARC goes a long way, but it still doesn't take care of everything for you.)


I am one of those programmers: I have no real idea how a compiler goes about its business, my university does not offer an accessible compiler class (only for grad students and very irregularly), and I have no idea where to start besides trying to force my way through the purple dragon book.

What are things I should know about compilers that I probably don't? What is suggested reading on compilers?


I am not the person you are responding to, but I don't think the point was to understand how compilers parse code, but to know how the compiler implements your code in assembly. I.e. knowing things like the difference between the stack and the heap, how to do a function call in assembly while passing parameters, how memory actually gets allocated to you from the operating system, what happens when a page fault occurs, how the compiler lays out data in memory (this has huge implications in your code as to whether you generate a ton of cache misses or not). None of that really has much to do with understanding compilers, but understanding modern microprocessors and assembly code.

But, of course, you need to pick what makes sense to you. If all you ever do is write MySQL queries for low performance requirements apps, probably all of that knowledge will prove of little use to you. But to actually debug something when it goes all the way to the OS, to deal with hardware, to write a device driver, you really can't get far without knowing it. If you are programming in C/C++ or otherwise doing anything performant (3D graphics, CUDA GPU programming, and the like) you'd better know all that stuff intimately if you strive to do more than "program by magic" (when something breaks, randomly change code until it seems to work without you understanding why).

When you know this, you can basically go into any programming assignment and get it done. Interface this phone to this hardware? Done. Fix this nasty blue screen crash? Okay, no problem. Modify the linux kernel for some local need? I'll get right to it! Without it you are kind of restrained to working on top of the infrastructure that others have built. Not that there is anything wrong with that - it's your career, and your life, you might as well try to have fun while doing it.


> I am not the person you are responding to, but I don't think the point was to understand how compilers parse code, but to know how the compiler implements your code in assembly.

You said it a lot more concisely than I did!


I am not an expert in compilers, but in my undergrad compiler's class one of the things I remember the most are parsers (such as LALR). I remember having a great time implementing different types of parsers:

http://en.wikipedia.org/wiki/Category:Parsing_algorithms

I think the dragon book is the absolute reference for learning compilers:

http://www.amazon.com/Compilers-Principles-Techniques-Alfred...


As xtracto mentioned, parsing and enough automata theory to understand what context-free grammars are and where they fall short should be part of your basic toolkit, but how compilers generate code is even more useful to C programmers. (One study done decades ago found that the average C statement resulted in 2.7 CISC assembly language instructions.)

The Dragon Book is not the best teaching tool out there. Maybe try The Elements of Computing Systems Chapters 10 & 11? You might need to peek at the earlier chapters to get a grounding in the particular assembly language they use.


Best summary of the language I've ever seen (plus it's completely consistent with what I'd assumed so far about the language, both pros and cons).

About the "no generic / you have to cast ,recast / the make issue" , could anyone here with a bit of experience gives an example of what that would look like in real code ?


About the "no generic / you have to cast ,recast / the make issue" , could anyone here with a bit of experience gives an example of what that would look like in real code ?

Sure. Say you have a Set type (which is custom, since Go doesn't provide a `container/set` type).

    type Set struct {
        set map[interface{}]bool
    }
All of the methods (`Add()`, `Contains()`, etc.) are going to be declared with `interface{}` parameters, e.g.,

    func (set *Set) Add(x interface{}) bool
You would also want a method to get a slice of all members, e.g.,

    func (set *Set) Members() []interface{}
If you want to use this as a set of strings, you would add without a cast:

    mySet.Add("foo")
However, when pulling it out you would need to cast it to a string. So if you were converting `[]interface{}` to `[]string`, it would look like this:

    values := make([]string, mySet.Len())
    for i, member := range mySet.Members() {
        values[i] = member.(string)
    }
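Putting those pieces together, a minimal sketch of such a Set (hypothetical code, not a standard package) might look like:

    package main

    import "fmt"

    type Set struct {
        set map[interface{}]bool
    }

    func NewSet() *Set {
        return &Set{set: make(map[interface{}]bool)}
    }

    // Add reports whether x was newly added.
    func (s *Set) Add(x interface{}) bool {
        if s.set[x] {
            return false
        }
        s.set[x] = true
        return true
    }

    func (s *Set) Contains(x interface{}) bool { return s.set[x] }

    func (s *Set) Len() int { return len(s.set) }

    func (s *Set) Members() []interface{} {
        members := make([]interface{}, 0, len(s.set))
        for k := range s.set {
            members = append(members, k)
        }
        return members
    }

    func main() {
        mySet := NewSet()
        mySet.Add("foo")
        mySet.Add("bar")

        // The type assertion back to string is the price of interface{}.
        values := make([]string, mySet.Len())
        for i, member := range mySet.Members() {
            values[i] = member.(string)
        }
        fmt.Println(values)
    }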


But since you have to use casts you are giving up some advantages of static typing, right?


Yes, but the `container/list` package in the standard library uses the same approach.

I wrote a Cartesian product library that took a different tack--it splits the index and iterator cursor logic (integer fiddling) into one struct and the actual type into another, so it's easy to build type-checked objects.

In full disclosure, though, that was mainly because the alternative (using `interface{}`) was so utterly painful.
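To give a rough idea of that split (a hypothetical sketch of the general approach, not the library in question): one struct does nothing but walk index tuples, and a thin, fully typed wrapper maps those indices back to concrete values, so user-facing code never sees interface{}.

    package main

    import "fmt"

    // cursor walks every index tuple of a Cartesian product with the
    // given dimension sizes; it knows nothing about element types.
    type cursor struct {
        sizes []int
        idx   []int
        done  bool
    }

    func newCursor(sizes ...int) *cursor {
        return &cursor{sizes: sizes, idx: make([]int, len(sizes))}
    }

    // next returns a copy of the current tuple and advances like an odometer.
    func (c *cursor) next() ([]int, bool) {
        if c.done {
            return nil, false
        }
        out := append([]int(nil), c.idx...)
        for i := len(c.idx) - 1; i >= 0; i-- {
            c.idx[i]++
            if c.idx[i] < c.sizes[i] {
                return out, true
            }
            c.idx[i] = 0
        }
        c.done = true
        return out, true
    }

    // stringPairs is the thin, fully type-checked layer on top.
    type stringPairs struct {
        a, b []string
        cur  *cursor
    }

    func (p *stringPairs) next() (string, string, bool) {
        idx, ok := p.cur.next()
        if !ok {
            return "", "", false
        }
        return p.a[idx[0]], p.b[idx[1]], true
    }

    func main() {
        p := &stringPairs{a: []string{"x", "y"}, b: []string{"1", "2"}, cur: newCursor(2, 2)}
        for a, b, ok := p.next(); ok; a, b, ok = p.next() {
            fmt.Println(a, b)
        }
    }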


That sounds interesting and useful. Is that code publicly available anywhere?


No, unfortunately. It is proprietary.


You pretty much can't write your own container without significant pain. That said, with the built-in types you have a map, list, set (a map of bools, which is actually very nice since maps return the zero value - false - for missing entries), and a concurrent blocking or non-blocking queue (chan). Stacks get a bit kludgy, but can be done without too much pain. Apart from those, I've really never had to roll my own collection class. Someday I may, but it is surely not something you need to do very often.
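For instance, a rough sketch of those built-in equivalents (hypothetical example code):

    package main

    import "fmt"

    func main() {
        // Set: a map of bools; a missing key yields the zero value, false.
        seen := map[string]bool{}
        seen["a"] = true
        fmt.Println(seen["a"], seen["b"]) // true false

        // Stack: a slice, pushing with append and popping by reslicing.
        stack := []int{}
        stack = append(stack, 1, 2, 3)
        top := stack[len(stack)-1]
        stack = stack[:len(stack)-1]
        fmt.Println(top, stack) // 3 [1 2]

        // Queue: a buffered channel is a concurrent FIFO for free.
        queue := make(chan int, 8)
        queue <- 1
        queue <- 2
        fmt.Println(<-queue, <-queue) // 1 2
    }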


On the contrary, I found it significantly easier to write a custom container (a trie in my case) in Golang than in Ruby (which let's stipulate is pin-compatible with Python). In particular, Golang provides fine-grained control over memory layout, and is even expressive enough to implement allocators, both of which are practically impossible in Ruby.

It's true that Golang grants a very useful power to its builtin maps, which makes it rankle that you have to cast in and out of interface{} to make your own general-purpose container. But apart from the fact that interface{} is a stupid name, I don't find it much more painful to write general purpose containers in Golang than I found it to write templated containers in C++.
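To sketch what that fine-grained control looks like in practice (a hypothetical, simplified trie, assuming lowercase ASCII keys): the child pointers live in a fixed-size array inside the node itself, so each node is a single contiguous allocation rather than a separate heap-allocated table per node.

    package main

    import "fmt"

    // node keeps its 26-way fan-out inline, so each node is a single
    // contiguous allocation (assumes lowercase ASCII keys).
    type node struct {
        children [26]*node
        terminal bool
    }

    type Trie struct {
        root node
    }

    func (t *Trie) Insert(word string) {
        n := &t.root
        for i := 0; i < len(word); i++ {
            c := word[i] - 'a'
            if n.children[c] == nil {
                n.children[c] = &node{}
            }
            n = n.children[c]
        }
        n.terminal = true
    }

    func (t *Trie) Contains(word string) bool {
        n := &t.root
        for i := 0; i < len(word); i++ {
            c := word[i] - 'a'
            if n.children[c] == nil {
                return false
            }
            n = n.children[c]
        }
        return n.terminal
    }

    func main() {
        var t Trie
        t.Insert("go")
        fmt.Println(t.Contains("go"), t.Contains("ruby")) // true false
    }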


Go does not provide fine-grained control over memory layout when you use interface{}.


You should actually be using a `struct{}` as the value, as it takes no space (as opposed to a bool).

http://play.golang.org/p/KK1ymLzd_e


Cool, I didn't know that. I'll change my Set implementation :)

BTW, you should remember that if you really want a Python-like set, map[interface{}]struct{} is not enough - you still have to implement intersection, union, and diff yourself. Not that it's rocket science, but I do think this warrants at least a package in the standard library, if not a first-class type.
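For example, those operations are only a few loops each (a hypothetical sketch):

    package main

    import "fmt"

    type set map[interface{}]struct{}

    func union(a, b set) set {
        out := set{}
        for k := range a {
            out[k] = struct{}{}
        }
        for k := range b {
            out[k] = struct{}{}
        }
        return out
    }

    func intersection(a, b set) set {
        out := set{}
        for k := range a {
            if _, ok := b[k]; ok {
                out[k] = struct{}{}
            }
        }
        return out
    }

    func difference(a, b set) set {
        out := set{}
        for k := range a {
            if _, ok := b[k]; !ok {
                out[k] = struct{}{}
            }
        }
        return out
    }

    func main() {
        a := set{"x": {}, "y": {}}
        b := set{"y": {}, "z": {}}
        fmt.Println(len(union(a, b)), len(intersection(a, b)), len(difference(a, b))) // 3 1 1
    }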


Agreed. There is an awesome project that provides this: https://github.com/deckarep/golang-set


As you might expect, it produces code that is somewhere in between C and Python, where custom containers are also somewhat out of idiom.


There's nothing "out of idiom" about custom containers in Python (or are numpy arrays or pandas dataframes "out of idiom" now? I could swear that wasn't the case yesterday), and pretty much anything available to built-in containers is also available to custom ones (the exceptions being literal syntax, and true immutability unless the container is written in C).

In fact there are a few domains where they are rather common (e.g. multidict implementations in web frameworks - pretty much every web framework has one for query and form data - and optimized or analysis-focused structures in scientific Python).


Good rundown of the pros and cons of the Go language. There is some information about why Go and not, say, C++/Java/Ruby, etc. The statement "I see Rails as the emperor with no clothes on" seems to have been made in haste, since Rails is awesome for basic forms-based apps.


A lot of friends of mine with a C/C++/PHP background struggle with Rails at the start, primarily because of the magic of convention over configuration (they aren't used to it). It takes a while to understand, but once you do, it's a breeze.

Disclaimer: I am a Rails developer. My views are therefore biased.


In many ways the problems people have with RoR are the same problems people have with ORMs. They abstract away a lot of the clutter, but then, just as you become totally dependent on the abstraction, you find you need something that you can't get in the abstracted layer.

When that thing is a substantial increase in speed, sometimes you are totally f*ed, since to get it you need to toss away a lot of the detritus. With an ORM you can gradually phase out parts of it, but with a web framework you generally can't architect away the box into which your app is placed.

I'm unaware of any RoR app that has successfully scaled, except by removing Ruby (Twitter). Github is probably the largest existing Rails app (at least that I use), and there are Unicorns aplenty (the new fail whale) and weird caching issues that seem to arise regularly.


> I'm unaware of any RoR app that has successfully scaled

Ruby is slower and more resource intensive than some alternatives, but clearly it's possible to scale to large numbers of users and developers with RoR:

http://www.groupon.com

https://www.shopify.com

http://www.yellowpages.com

That's not to say that choosing an alternative or rewriting a particular service or website in something simpler/faster/cheaper isn't sometimes a better option, but I don't think you can claim convincingly that Rails is impossible to scale.


Apparently Shopify has quite a bit of Go on the backend. I don't know any specifics about the rationale, so I can't tell if this means anything WRT Rails.


I'm sure they all have multiple services running on different tech. Rails is clearly not the one solution for everything, and any website growing in scale is going to hit hurdles and end up rewriting some of their logic, whatever the language/framework used.


Github is unique in that the core logic is actually inside git itself - the real heavy lifting is handled by the git C code. RoR/Github is just a pretty cover on top of that, which Rails does very nicely, and that's why it manages to scale in that case.

As always, right tool for the job.


I think that, as with any automagic framework, the edge cases become exponentially more difficult to implement. I'm very comfortable with MV* frameworks on client and server, and still not big on Rails.

It's also likely that TFA's author was frustrated with performance under load, where Ruby (and Python) generally do not shine.


I see it's become trendy to poke fun at Rails and the hurdles it puts in your way on the road to "scale". Not only have many companies managed to scale with Rails, but that criticism also completely overshadows how easy Rails makes it to get started in the initial phase of a project.

That being said, Go really seems like an awesome language, I've dabbled some and I can't wait to get back to it.


For me, no templates + no exceptions = DOA.


Go has exceptions (panic/recover). They're pretty bad and you shouldn't use them (just as you generally shouldn't use exceptions in Erlang), and some in the Go community downright refuse to accept that they exist, but they're there.
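For reference, a minimal sketch of the mechanism (recover only does anything inside a deferred function):

    package main

    import "fmt"

    func mightPanic() (err error) {
        // recover only works inside a deferred function, in the same
        // goroutine that panicked; here it converts the panic to an error.
        defer func() {
            if r := recover(); r != nil {
                err = fmt.Errorf("recovered: %v", r)
            }
        }()
        panic("something went wrong")
    }

    func main() {
        fmt.Println(mightPanic()) // recovered: something went wrong
    }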


Exceptions I can easily live without; Haskell's monads provide a much better mechanism for error handling (Maybe, Either, IO, etc).


You have generics^2 if you're using Haskell, so it's no wonder they are able to replace exceptions ;)


Templates are a sad hack in place of true hygienic compiler macros.


Macros are the sad hack. Templates can express compile time assertions.


I believe you misunderstand me. I'm not talking about preprocessor macros, but TRUE compiler macros a la Lisp or Clojure, which can bring the full expressiveness of the language to bear on the task at hand and generate program code as a return value.

Consider the following example, from Clojure, which implements a C-style for-loop

    (defmacro for-loop [[sym init check change :as params] & steps]
      (cond
        (not (vector? params)) 
          (throw (Error. "Binding form must be a vector for for-loop"))
        (not= 4 (count params)) 
          (throw (Error. "Binding form must have exactly 4 arguments in for-loop"))
        :default
          `(loop [~sym ~init value# nil]
             (if ~check
               (let [new-value# (do ~@steps)]
                 (recur ~change new-value#))
               value#))))
Syntax primer: the code runs at compile time; forms (nested parenthesized groups) prefixed with ` are the output; ~var means to substitute the _value_ of var from the macro invocation; and var# means create a new anonymous variable var, guaranteed not to clobber or shadow any existing variable.

For instance, given our macro defined above, say we make the following call:

    (for-loop [i 0, (< i 10), (inc i)] (println i))
The code generated by the compiler for that invocation is:

    (loop* [i 0 value__95__auto__ nil]
      (if (< i 10)
        (clojure.core/let [new-value__96__auto__ (do (println i))]
          (recur (inc i) new-value__96__auto__))
        value__95__auto__))
You can see how:

A: all the error checking is done at compile time and comes at no runtime cost;

B: the `-quoted code in the original macro gets "filled in";

C: the arguments to the macro are not evaluated by the macro call or the substitution until they are actually used.

Point C is quite powerful since it means that macros can be used to create arbitrary control structures that are fully as powerful and general as anything in the core or stdlib - and in fact many of those are implemented as macros.


      (not (vector? params))
I see that you are indeed able to access some amount of type information in your macro system.

Well done!


I thought I made it clear. The macro system is not some subset, it is the full language with all the capabilities that brings with it, including type functions, reflection, etc. You can even do I/O in macros, whatever you want. The full language is available.


Forgive me. The term 'hygienic macro' has historically been associated with Lisp and Scheme, which were not known for the availability of type information at compile time, because they are dynamically typed and often did not have a compile time.


Fair enough. Clojure really is a wonderful language. It's sort of a "Javascript: The Good Parts" for Lisp - they kept (and expanded on) that wonderful expressiveness, while doing away with 50 years of dogma that held on to stuff like gen-syms, car, cdr, and singly-linked lists as the one-true-datatype.


It's the only language right now that makes me wish I weren't so allergic to the JVM.


It's really not so bad. At least for simple to medium-sized projects you can basically ignore the JVM. The leiningen build tool means you never have to touch maven/ivy/ant/etc.


Maven. [shudder] We must speak of this evil no more in public.


"The standard library is missing x"

What you are looking for is actually a productive community with a good module manager. This is what Golang is missing.

(Golang fans: don't get me wrong, I use and love Go.)


How is "deploying modern Java applications, which is a minor nightmare IMO" true?

Dropping a .war file in a directory is hard? Or am I missing something about 'modern' Java apps?


I agree with the point about statically-linked binaries. However, I wish there were a compile-time facility for including the contents of a file as a read-only byte array in the binary. Then web application servers with HTML templates, and even images and other assets, could be truly self-contained.
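In the meantime, one workaround is a small code generator run before the build (a hypothetical sketch, not a built-in facility or any particular existing tool):

    package main

    // Hypothetical "embed" generator: run it before building, then compile
    // the generated file into your server so the asset ships in the binary.
    import (
        "fmt"
        "io/ioutil"
        "os"
    )

    func main() {
        if len(os.Args) < 2 {
            fmt.Fprintln(os.Stderr, "usage: embedgen <file>")
            os.Exit(1)
        }
        data, err := ioutil.ReadFile(os.Args[1])
        if err != nil {
            panic(err)
        }
        out, err := os.Create("asset_generated.go")
        if err != nil {
            panic(err)
        }
        defer out.Close()
        // %q writes the bytes as an escaped Go string literal.
        fmt.Fprintf(out, "package main\n\n// Generated from %s; do not edit.\nvar asset = []byte(%q)\n", os.Args[1], data)
    }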


I agree with most of this post, but I actually found online resources (especially golang-nuts and various SO pages) to be very helpful with debugging.


The more I learn about other langs, the more I appreciate the language design of Ruby, even though Ruby is slow as shit.


OH: "Any tips for getting started with Go?" Answer: "Yes, try Python."


I'm getting sick of all these Go goroutines... err threads.


You mean Modula-2 co-routines. :)



