Man, Go gets a lot of hate on here. It's certainly not the most flexible language. If I want flexibility + speed, I tend to choose Nim for my projects. But for practical projects that I want other people to be able to pick up quickly, I usually opt for Go. I'm building a whole product manufacturing rendering system for my company, and the first-class parallelism and concurrency made it super pleasant.
I will say that the error propagation is a pain a lot of the time, but I can appreciate being forced to handle errors everywhere they pop up, explicitly.
So much of language opinion is based on people's progression of languages. My progression (of serious professional usage) looked like this:
Java -> Python -> C++ -> Rust -> Go
I have to say, given this progression going to Rust from C++ was wonderful, and going to Go from Rust was disappointing. I run into serious language issues almost daily. The one I ran into yesterday was that defer's function arguments are evaluated immediately (even if the underlying type is a reference!).
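To make the gotcha concrete, here's a minimal sketch (my own illustration, not the commenter's actual code): arguments to a deferred call are snapshotted when the `defer` statement executes, while a deferred closure reads the variable when it actually runs.

```go
package main

import "fmt"

// deferDemo contrasts what a deferred call's argument captured
// with what a deferred closure later observed.
func deferDemo() (captured, observed int) {
	x := 1
	// The argument x is evaluated right here, so this snapshots
	// x == 1 even though x changes before the defer runs.
	defer func(v int) { captured = v }(x)
	// A closure, by contrast, reads x when it actually executes,
	// after x has been set to 2.
	defer func() { observed = x }()
	x = 2
	return // named results are then filled in by the defers
}

func main() {
	c, o := deferDemo()
	fmt.Println("captured:", c, "observed:", o) // captured: 1 observed: 2
}
```

The usual workaround is exactly that closure form: wrap the call in `defer func() { ... }()` so evaluation is delayed until the deferred function runs.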
I'm curious how one ends up with such an ahistorical sequence. I'd expect it to be more aligned with actual PL history. Mainstream PLs have had a fairly logical progression, with each generation solving well-understood problems from the previous one and building on top of the previous generation's abstractions.
Turbo Pascal for education, C as the professional lingua franca in the mid-90s (manual memory management). C++ was all the rage in the late 90s (OOP, STL). Java got hot around 2003 (GC, canonical concurrency library and memory model). Scala grew in popularity around 2010-2012 (FP for the masses, much less verbosity, mainstream ADTs and pattern matching). Kotlin was later cobbled together to have Scala's syntactic sugar without the Haskell-on-the-JVM complexity.
And then they came up with golang which completely broke with any intellectual tradition and went back to before the Java heyday.
Rust feels like a Scala with pointers so the "C++ => Rust" transition looks analogous to the "Java => Scala" one.
Go is definitely of the “worse is better” philosophy. You can basically predict what someone will think of Go if you know how they feel about that design philosophy.
I remember that famous rant about how Go's stdlib file API assumes Unix, and doesn't handle Windows very well.
If you are against “worse is better” like the author, that’s a show stopping design flaw.
If you are for it, you would slap in a Windows if statement and add a unit test when your product crosses that bridge.
The problem is that most of the time, errors are not to be handled but only bubbled up. I've also seen it in Java with checked exceptions: the more explicit error handling is, the more developers feel they should somehow try to do _something_ with the error when the correct thing to do would actually be to fail in the most straightforward manner. The resulting code is often much heavier than necessary because of this and the stacktraces also get polluted by overwrapping.
The problem with the opposite approach is that, since everything gets invisibly bubbled up to the top, you can't tell which errors actually need to be handled. You only find out from runtime failures, and that's no good if you care about reliability.
You are right, I wouldn't want totally invisible bubbling either. I like Rust's ? notation, although it's not perfect, and I'm sure another language could do better, maybe with more structured error classes.
That's just another way of dismissing the problem as a "skill issue" and is not helpful. While the problem can be prevented by having strong coding standards, many teams do not have the luxury of wisdom and thus recreate this exception tar pit again and again.