I can't help but feel like we continue to just go backwards; Swift doesn't look like they spent much time really thinking about what they wanted to solve and how to solve it. I thought this was an interesting read: http://owensd.io/2014/09/24/swift-experiences.html
I know saying this is a bit terse, but a native Clojure compiler for iOS would have been the best thing Apple could have done -- let someone else with far better language skills who has already done a lot of thinking about values, identity, state and time solve it and then just use the work.
Would the problems outlined there be any better in Clojure?
Type inference: at least in Swift you have types; the author dismisses "click on it in Xcode", but that's still better than what you get with Clojure.
nils in the Objective-C bridging APIs: you're going to have to solve that somehow, even in Clojure. I don't know exactly how Clojure invokes Java code, but you have two choices, neither of them good. Either you treat every "FFI" call as returning Option, in which case all your code that calls libraries is littered with Options and you'd better have a syntax for easily "casting" out of Option (like the one the article complains about), or you allow nils to cross the divide into your nice language, and then anything that calls the "FFI" has to worry about them.
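Since Clojure runs on the JVM, the same fork shows up in plain Java interop. Here's a minimal sketch of the two choices at a boundary call; `lookupTitle` is a hypothetical stand-in for any bridged library method that may return null:

```java
import java.util.Optional;

public class FfiNullDemo {
    // Stand-in for a bridged library call that may return null
    // (hypothetical name; any nullable API plays this role).
    static String lookupTitle(int id) {
        return id == 1 ? "Swift Experiences" : null;
    }

    // Choice 1: wrap every boundary call in Optional. Safe, but
    // Optionals now spread through all code touching the library,
    // and you want cheap unwrapping syntax.
    static Optional<String> lookupTitleSafe(int id) {
        return Optional.ofNullable(lookupTitle(id));
    }

    // Choice 2: let nulls cross the boundary and defend at each
    // call site. Less ceremony, but every caller must remember.
    static int titleLength(int id) {
        String t = lookupTitle(id);
        return t == null ? 0 : t.length();
    }

    public static void main(String[] args) {
        System.out.println(lookupTitleSafe(1).map(String::length).orElse(0));
        System.out.println(lookupTitleSafe(2).map(String::length).orElse(0));
        System.out.println(titleLength(2));
    }
}
```

Neither choice makes the nulls go away; it only moves where you pay for them, which is exactly the trade-off Swift's bridged optionals expose.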
I don't have a lot of Swift experience, but it looks like a decent incremental improvement, applying some of the lessons of modern language design in a conservative Algol-style syntax that many programmers are happy to use. (I'd've been happier to see Scala support, but I can see why Apple would want their own language). Clojure isn't that (and from Apple's PoV, anyone dedicated enough to use Clojure is probably also dedicated enough to use a non-first-party language).
Remember that Clojure is dynamically typed. `nil` is its own type, but in a dynamically typed language it can still be used anywhere.
I have a slight preference for statically typed languages, but FWIW I find I make a lot fewer type errors in Clojure (and other lisps) than in, say, JavaScript.
And for users who can't stand dynamic typing, there's core.typed.
I don't think Clojure would have been a sane choice for Apple, though. It's too memory-hungry for mobile, and developers would avoid it because of how different it is from Objective-C.
Clojure has some very nifty features for interacting with class libraries. It has a number of macros for easy access to instance and static methods, for calling a series of methods on an object, and so on. There's the `proxy` macro, which lets you create a subclass of any class you want, conforming to any number of interfaces, while providing sane defaults for methods you don't need or want to implement. There's also `reify`, which lets you easily create classes that implement some interfaces without any overhead. And there are features which correspond to interfaces but are somewhat more flexible (`defprotocol`, `extend-type`, `reify`), and then you get multiple dispatch and more.
In short, I found Clojure and ClojureScript very easy to fit into any OO framework. It's sometimes hard to convince yourself to use those features, because you know an idiomatic Clojure solution would be better, but other than that I saw no problems at all.
But I'm not a heavy Clojure user, so it could be that I somehow lucked out and didn't encounter the problems that more experienced users sometimes face.
Don't really see it being a problem; I don't have a lot of experience with UIKit but Clojure interacts fine with Java UI toolkits like Swing. I've also dabbled with the OpenGL Clojure wrappers and found them easy to use.
> really thinking about what they wanted to solve and how to solve it.
What they wanted to solve was how to introduce a new more modern language that worked seamlessly with existing Objective-C libraries. People suggesting alternatives tend to forget the ObjC part of that.
Considering that Apple created Dylan, I'd argue they have all the "language skills" needed. But Dylan was a failure, partly because of politics, but also because most programmers are conservative (I'd say dumb code monkeys, but that isn't exactly true) and don't like their "new" languages and tools to be "too new".
Clojure is a really great language, solving real problems in a nice way. But in the language popularity charts I just checked, it's well outside the top 20, close to Forth(!) on one side and Erlang on the other. Clojure (and Erlang, and Forth of course, but also Haskell, OCaml, F#, Smalltalk, Io, and many more) is just too new, too unfamiliar, too intimidating for our conservative programmers to consider using. And for Apple that was probably the reason to roll out a somewhat "normal" language instead of something really good.
For mainstream programmers they are, in the sense that they have features those programmers have never seen before. You know, as in "it's something new for me".
The fact that some of those languages objectively date from the '50s and '70s (Lisp from '58 and ML from '73, IIRC), which makes them ancient by today's standards, doesn't matter. It's really sad. I devoted a couple of years to learning about and trying to use such languages (see here[1] if you want), but I'm in a very small minority; most young programmers never use anything other than the 1-3 core languages they learned. The number of known languages increases with years of experience, but it's still biased towards currently mainstream languages. Which are mostly crap.
Anyway, that's how it is: most programmers are very conservative in their choice of tools they use and feel no need to look for alternative tools to use.
I don't want to spend too much time discussing this, but I read the Swift guide[1] and even played with it in a Playground, and I'm 97% sure that every single Swift feature is borrowed from currently mainstream languages.
There's a difference between "oh, it's like feature X in C#" and "well, didn't Common Lisp implement Y 30 years ago?". Feature X will be perceived as nothing new, merely catching up; Y will be seen as new, dangerous, cryptic, and best avoided. I think Swift's designers intentionally packed their language with Xes and added almost no Ys, exactly because they didn't want it to seem "too new, esoteric, unproven" and so on and so forth. It's a sane business decision, by the way; I just happen to dislike it.
I'm not one that gets excited about enforcing that every item in your collection is of type String. Why? Because it does not actually matter. If an Int gets in my array, it's because I screwed up and likely had very poor testing around the scenario to begin with.
I wish I could paste in the Citizen Kane slow-clap gif.
Me as well. Is it in a congratulatory way, or more "OK, now shut up, take your consolation prize, and let the people who know what they're talking about speak"?
Somehow it seems that currently everything happening in the USA is "Dead on Arrival" - including Swift.
This time due to shipping an utterly broken compiler for an unfinished language, without any way for us to contribute to it. Because if we could contribute, Swift might yet become something people love to use. But somehow I really don't believe that will ever happen.