People who don't use a feature (whether it's introspection, dynamic message passing, classes, FP, actors, GC, type systems, etc.) saying they won't miss it tells us something about the person, but not necessarily anything about the feature.
Especially if, as was the thrust of many of the articles, they actually are using those features, and quite heavily at that. They just don't see them directly.
Kind of reminds me of the old slogan from the anti-nuclear movement in Germany: "Atomkraft, Nein Danke! Strom kommt aus der Steckdose". (We don't need no stinkin' nuclear power, electricity comes out of the wall socket)
>Kind of reminds me of the old slogan from the anti-nuclear movement in Germany: "Atomkraft, Nein Danke! Strom kommt aus der Steckdose". (We don't need no stinkin' nuclear power, electricity comes out of the wall socket)
Sounds like a hoax slogan (possibly from anti-anti-nuclearists). The one I remember was "Atomkraft, Nein Danke!" by itself.
Personally I don't understand why strong reflection support has been left out of Swift.
I can understand that there is probably a problem related to ARC and garbage collection. But missing this set of features means that there are a lot of things that simply cannot be implemented in Swift and thus have to remain in Objective-C (Core Data, other ORMs, IoC frameworks, etc.).
That means the iOS/OS X platform will continue to be split across two programming languages and object models for the foreseeable future.
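To make this concrete, here's a minimal sketch (the Entity class is made up) of the kind of runtime introspection that ORMs like Core Data depend on, and that Swift by itself didn't expose:

    #import <Foundation/Foundation.h>
    #import <objc/runtime.h>

    // Hypothetical model class standing in for an ORM entity.
    @interface Entity : NSObject
    @property (copy) NSString *name;
    @property (assign) NSInteger age;
    @end
    @implementation Entity
    @end

    int main(void) {
        @autoreleasepool {
            unsigned int count = 0;
            // Enumerate the class's properties at runtime, with no
            // compile-time knowledge of Entity required.
            objc_property_t *props = class_copyPropertyList([Entity class], &count);
            for (unsigned int i = 0; i < count; i++) {
                printf("%s\n", property_getName(props[i]));
            }
            free(props);
        }
        return 0;
    }

Mapping properties to database columns, wiring up IoC containers and so on all hinge on exactly this kind of discovery at runtime.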
While there are almost certainly other considerations as well, ARC definitely makes Objective-C much less dynamic.
The Objective-C compiler gets significantly more strict with ARC turned on: for example, sending unknown messages is an error instead of a warning. Various runtime functions that used to be quite safe also become dangerous. For example, it used to be OK to send performSelector:-style messages to methods that don't return an object (i.e. void) and simply ignore the result. That now becomes a crash, as ARC tries to retain the return value. So in order to be compatible with ARC, the "cheap and cheerful" reflection that ObjC has isn't really sufficient.
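A minimal sketch of that failure mode (Worker and refresh are made-up names):

    #import <Foundation/Foundation.h>

    @interface Worker : NSObject
    - (void)refresh;  // returns void, not an object
    @end
    @implementation Worker
    - (void)refresh { NSLog(@"refreshing"); }
    @end

    int main(void) {
        @autoreleasepool {
            Worker *w = [[Worker alloc] init];
            SEL sel = NSSelectorFromString(@"refresh");
            // Pre-ARC this was fine: the meaningless "return value" was
            // simply ignored. Under ARC the compiler can't know the
            // selector's memory semantics (hence the "may cause a leak"
            // warning), and retaining the garbage return of a void
            // method can crash.
            [w performSelector:sel];
        }
        return 0;
    }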
ARC is predicated on the compiler having essentially a statically checkable call-chain.
Yes, and that they aren't giving enough discussion to the universe of situations, and whether, given that universe, the trade-off of including or excluding the feature is worth it. There may be a better initial framing for discussing language policy than focusing on your own situation. At the very least, if you argue broadly, then you should give a clue about why you think your case is a sufficient proxy for the general case.
"The solution to bad metasystems is not to ban metasystems, it is to design better metasystems that allow these things in a more disciplined and more flexible way."
Yea, I think KVO could be done better. As I recall, Ember properties were pretty neat, and had a nice way of specifying dependencies. A lot less awkward than a bunch of + (NSSet *)keyPathsForValuesAffectingValueForKey:(NSString *)key methods.
I'd also want more strongly typed keys, fewer strings everywhere. I do use NSStringFromSelector as much as I can, but strings are unavoidable in IB.
The handling of one-to-many dependencies could be made a lot more intuitive as well.
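For reference, the pattern I'm complaining about looks roughly like this (Person is a made-up class; keyPathsForValuesAffectingFullName is the per-key convenience variant of the method above):

    #import <Foundation/Foundation.h>

    @interface Person : NSObject
    @property (copy) NSString *firstName;
    @property (copy) NSString *lastName;
    @property (readonly) NSString *fullName;
    @end

    @implementation Person
    - (NSString *)fullName {
        return [NSString stringWithFormat:@"%@ %@",
                self.firstName, self.lastName];
    }
    + (NSSet *)keyPathsForValuesAffectingFullName {
        // A typo in these string keys fails silently at runtime --
        // exactly the "strings everywhere" problem.
        return [NSSet setWithObjects:@"firstName", @"lastName", nil];
    }
    @end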
EDIT: Just the compile times alone drive me batshit. That doesn't mean that there aren't nice aspects, certainly some of the syntax has been simplified by unifying two disparate syntaxes...but they could have done so much better, even there.
My heart skipped a beat today when refreshing my RSS feeds. It was like a reunion with your first love. The best of new excitement & old familiarity: Wil Shipley is writing again!
He's saying that the same team that developed LLVM, clang, and ARC (which most people think are good things for the Apple ecosystem) is also bringing you Swift. So you should feel good that the language is being developed by a team with a good track record that will probably find a solution for this particular issue.
You are misrepresenting his argument about frameworks. Re-read the first paragraph of his post:
"Apple's shared frameworks are very useful for sharing executable code and its associated resources among multiple applications, but were historically not designed to be created by authors of consumer applications. I discourage developers from creating frameworks as a method of sharing code, because they encourage code bloat, increase launch times, complicate the developer's fix/compile/link/debug cycle, and require extra effort in setting up correct and useful developer and deployment builds."
and the last one:
"Creating shared frameworks is a lot of hassle for third-party developers of consumer software, introduces instability into the development process, and encourages slower and larger applications. Code sharing is better accomplished through creating new directories for shared code in subversion and judiciously including only the files and resources needed by any application in its Xcode project."
And none of his claims about the problems with frameworks are true. Frameworks are trivial to create and use. In fact, with most of my apps, the app itself is just a simple wrapper around a bunch of frameworks.
Which, by the way, is also how Xcode is set up. Have you looked at how large the actual binary is? On my system, it's 35K.
    -rwxr-xr-x  1 root  wheel  35K May  1 22:12 Xcode*
Holy compression, Batman! No, actually, they just moved all of the actual code out into frameworks. Just like the clang and llvm teams put all their code into libraries, which gives us tools like the static analyser and Xcode-integrated syntax tools.
So even if I don't have plans to share the code yet, I still just put it in a framework target in the same project as the app itself, which takes around 30 seconds and requires no extra effort afterwards.
And the idea that only Apple can create software that is worthy of reuse...well fanboys be fanboys, but it actually isn't true.
No, let's skip technical details and your opinion about frameworks as code sharing primitive and get back to your words. You literally claimed that while Wil made money "leaning heavily on some of the most well-designed frameworks in the industry", he said "frameworks are bullshit".
This isn't true: he said that people shouldn't use a specific way of sharing code in OS X, not that "frameworks are bullshit". He even said that "Apple's shared frameworks are very useful for sharing executable code and its associated resources among multiple applications". He also praises Cocoa, the framework, and many other frameworks created by Apple.
So, yes, you did misrepresent Wil's opinion, and you should take your words back.
"The absolute worst thing ever, completely beyond bad, lower than horrible, and more crappy than explosive diarrhoea"
So when I wrote "bullshit", my only crime was that I was being euphemistic. Guilty as charged. And as I wrote, he makes this farcical distinction between Apple and the rest of the world.
So, no, I did not misrepresent Wil's opinion, apart from softening it up. So you should take your words back.
Calling names ("fanboy"), doubting what someone has or hasn't done, telling them what they don't understand, drawing personal conclusions from reading extraneous things they've written and importing that into an argument as ammunition.
None of those things is necessarily a personal attack by itself but they're personal, uncivil, and especially bad in combination.
"However, many of these dynamic features are definitely hacks, with various issues, some of which I talk about in The Siren Call of KVO and (Cocoa) Bindings."
"Note that Objective-C's metasystem...is a bad design."
You have a funny definition of "fanboy", but suit yourself. And if you've read my blog and think I write about type theory, then you are utterly confused.
That article just documents that the safety benefits of static typing are, empirically, rather minuscule, especially compared to the claims for the benefits, which are vastly overblown.
That said: I still like to be able to statically type my programs. I just don't expect it to yield a significant safety benefit (documentation benefit is more important).
> That article just documents that the safety benefits of static typing are, empirically, rather minuscule, especially compared to the claims for the benefits, which are vastly overblown.
Right, which is why I said that you fundamentally misunderstand type theory.
Even if I accept your claim that I misunderstand type theory...you do understand the difference between theory and practice, right? And that I didn't write about type theory, but about empirical effects that are independent of the theory.
Let me explain it to you: if I claim that I have a new car that goes 200mph, and you measure the speed and it only does 20mph max, then it really isn't relevant whether you understand the theory behind your engine or not, it's simply not as fast as claimed.
Or do you mean that anyone who doesn't accept claims about benefits at face value "doesn't understand"? That sounds more like a religion than anything having to do with science and engineering. Which is, sadly, my experience with this particular cult.
As for the article you find "laughable": show me the research that actually validates the claim of significant safety benefits and we can talk.
Which typed languages have you tried? In which have you written anything bigger?
> As for the article you find "laughable": show me the research that actually validates the claim of significant safety benefits and we can talk.
I have yet to see one research paper in this field that was worthwhile. This sort of thing is extremely hard to measure across different people. I speak from my experience and from what others have told me.
You still don't seem to understand that this is completely irrelevant to the point I am making. You don't have to be able to build an engine with my magic engine technology to be able to measure the speed of the car and see whether it goes as fast as I claim.
However, since you asked so nicely:
- Our algorithms class at university was taught with statically typed FPs. Mostly Hope, some Miranda IIRC. (We were the last generation of students to be spared the institute's own Opal). Haskell didn't exist yet.
- I also took the advanced FP courses.
- And am a great fan of FP. Backus's FP, to disambiguate
- It quickly became clear that FPs were no panacea; they just had different problems from other languages
- We also quickly surmised that this whole FP thing was a religious cult and that you were required to take all the claims that were made on faith
- Also used Pascal and Modula
- Did a major system in Java, probably one of my best pieces of work to that date[1]
- Also remember that Objective-C acquired static typing during my time with it (before that it was all "id"). I was hugely confident (kind of like you now) beforehand that this would be a major boost to my programs' correctness and my productivity, and I was very surprised when that turned out not to be the case
But again, all this is largely irrelevant to the point I am making.
> I speak from my experience and from what others have told me.
Really?! Not only do you ignore the evidence there is, you also, of course, have absolutely none yourself. And with that nothing, you make claims that anyone who disagrees with your personal opinion (backed by anecdote) is a complete idiot.
Well, at least I don't have to revise my 1989 opinion (based on the evidence at the time) that this is a cult. Boy is it ever a cult.