The problem is solving for all the types in one large expression. This is largely a consequence of overloading (especially operator overloads) and literals. Splitting things up into separate variables works because Swift type-checks each statement separately, so moving a sub-expression into its own variable makes Swift resolve its type on its own instead of as part of a larger expression.
For example, if you have the line
print(1 + (2 as UInt))
this compiles, because Swift infers the type of `1` to be UInt as well. But if you split it up
let a = 1
let b = 2 as UInt
print(a + b)
you get a type error, because you cannot add Int to UInt. This demonstrates how the declaration `let a = 1` forces Swift to resolve the type right there, and the default type for integer literals is Int.
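If you do want UInt arithmetic, one way to make the split-up version compile (a minimal sketch of the same example, using nothing beyond standard Swift) is to pin the literal's type at its declaration:

let a: UInt = 1   // annotation makes the literal resolve to UInt instead of defaulting to Int
let b = 2 as UInt
print(a + b)      // prints 3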
Oh, that's disappointing. I had assumed that Swift did "full" type inference by backtracking from usage to assignment and then checking to see if the value assigned fit the constraints of the usage, rather than just checking a single statement at a time like C++'s `auto`. I think I first encountered type inference like this in an OCaml course in college, but at this point I'm most used to it from Rust (for example: https://play.rust-lang.org/?version=stable&mode=debug&editio...).
> The Swift language contains a number of features not part of the Hindley-Milner type system, including constrained polymorphic types and function overloading, which complicate the presentation and implementation somewhat. On the other hand, Swift limits the scope of type inference to a single expression or statement, for purely practical reasons: we expect that we can provide better performance and vastly better diagnostics when the problem is limited in scope.
However, Hindley-Milner systems are generally near-linear in practice, whereas Swift's type system experiences combinatorial complexity explosions in the presence of overloads, operators, and literals.
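As a rough sketch of where the blow-up comes from (this small expression itself still type-checks fine; it just shows the shape of the search):

// Every integer literal here could in principle be Int, Double, Float, UInt8, ...
// (anything ExpressibleByIntegerLiteral), and each `+` and `*` has an overload per
// numeric type, so the solver explores a product of candidate assignments before the
// Double literals force everything to Double. Expressions of this shape, only a
// handful of terms longer, are the kind commonly reported to fail with "the compiler
// is unable to type-check this expression in reasonable time".
let total = 1 + 2 * 3 + 4.5 + 6 * 7 + 8.9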
Yeah, it makes sense that those features make it harder. I can't say it's the _wrong_ choice, but it's enough to remove any remaining interest I might have had, even if Swift did reach the level of Linux support I would otherwise want. I honestly prefer not having function overloading, and I certainly wouldn't give up features I actually want to use just so overloading can be supported in a language.