When I recently interviewed at Apple I was told my position would primarily be coding in Java, but that the team would be open to other languages as the project progressed and needs changed. I asked if they would consider Swift, and was then told that Swift wasn't production ready in their eyes.
(I didn't get the job)
I guess every team is different, but this attitude to dogfooding isn't great.
I interviewed at IBM for a position making Enterprise template apps for iPad working directly with Apple, and the main reason they said they passed on me was while I had a good amount of Objective-C experience, I had only dabbled with Swift on my spare time and had difficulty answering a couple of the questions they grilled me on in Swift.
They specifically said I should have had more professional Swift experience when they rejected me. Of course all the companies I worked for were being more conservative and hadn't yet made the leap to Swift, so getting that professional experience was pretty much impossible. But I guess if anyone can afford to be picky, it's IBM on a project working directly with Apple.
So yeah, IBM is definitely gung-ho about Swift. Apple itself, possibly less so.
IBM advertises jobs, yet when qualified people show up, they are rejected. Either it's a way to look like they are reaching out to the public to satisfy the government so they can then go hire outside the US, or it's a way to let insiders get those positions after the applying public is rejected, all IMO. When I was in there, 99.999% of the people I met came in through acquisitions, not by applying for a job and getting hired.
I don't remember most of the questions. I do remember being asked basic things about optionals, and about how Swift handles exceptions (Swift 2 had been announced about a week before the interview and changed error handling entirely to try-catch blocks; I answered that, but then they were like "okay, how about in Swift 1", and I blanked, even though I had put error handling into the toy app I was building).
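For anyone who doesn't remember the shift being described: Swift 1 surfaced errors mostly through optionals and NSError out-parameters, while Swift 2 introduced throws/do/try/catch. A minimal sketch of the newer style, in roughly current syntax (the names are made up for illustration):

    import Foundation

    enum ParseError: Error {        // spelled ErrorType back in Swift 2
        case empty
    }

    func firstLine(of text: String) throws -> String {
        guard let line = text.components(separatedBy: "\n").first, !line.isEmpty else {
            throw ParseError.empty
        }
        return line
    }

    do {
        let line = try firstLine(of: "hello\nworld")
        print(line)                          // "hello"
    } catch ParseError.empty {
        print("nothing to parse")
    } catch {
        print("unexpected error: \(error)")
    }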
Another was something like "In Swift, what are higher-order functions?", a term I had somehow only heard a couple of times before and didn't have fresh in my head, despite using higher-order functions in plenty of programming languages (C#, Python, Ruby, even Objective-C). That one was a case of knowing the concept but not the term.
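For the curious, the term just means functions that take or return other functions; a quick illustration in Swift (these specific examples are mine, not the interview questions):

    let numbers = [1, 2, 3, 4, 5]

    // map, filter and reduce are the classic built-in higher-order functions.
    let doubled = numbers.map { $0 * 2 }          // [2, 4, 6, 8, 10]
    let evens   = numbers.filter { $0 % 2 == 0 }  // [2, 4]
    let sum     = numbers.reduce(0, +)            // 15

    // Writing your own: takes a function, returns a new function.
    func twice(_ f: @escaping (Int) -> Int) -> (Int) -> Int {
        return { f(f($0)) }
    }
    let addTwo = twice { $0 + 1 }
    print(doubled, evens, sum, addTwo(5))         // ... 7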
I believe those two questions were the main ones that led to the rejection, because I stumbled and didn't come across as confident there, but I was otherwise pretty calm and answered the other questions well. I was in and out of there in 45 minutes: I talked to one person who quizzed me on the names of UI objects on a given screen, and then had the pop-quiz-style interview with two other people.
I also stupidly didn't show them the app I had partially built in Swift, even though I brought an iPad with me. I think I even mentioned it, and they're like "You have it here?" and I said "It's not finished..." and they just started asking me more questions. It was a working puzzle game, but some of the logic for clearing matches was wonky at the time and was visibly broken so I was hesitant to show it.
Apple isn't telling people to go use Swift on the server, IBM is. Apple is telling people to write iOS apps in Swift, which they are also doing themselves.
>Apple isn't telling people to go use Swift on the server, IBM is.
I'm aware. This is the (humorous) point I'm attempting to make. IBM is adopting technologies faster than Apple, while Apple started its life out as the anti-IBM, and indeed is still fulfilling that role. The (humorous) point being that their roles have, in a way, swapped.
Yes, it is interesting how the companies have evolved.
IBM lost the consumer space and is dominant in F100, while Apple won the consumer space and abandoned the enterprise.
With both IBM & Apple behind Swift the language is being positioned for success in both enterprise and consumer markets. It will be interesting to see how things evolve.
On the contrary - they've usually raced headfirst into abandoning old technologies for the hot new thing (early to have only CD and no floppy, early to get rid of CD drive altogether, early to go to USB-C, early to go to 802.11ac, etc.) They've used their weight to create a one-company network effect for up-and-coming standards and technology.
Early? All of those changes were made after using the old technologies for a long time. I wouldn't consider it "raced headfirst into abandoning"; more like "everyone wanted to abandon those old technologies but couldn't survive being the first to, so Apple made the move and everyone soon followed".
I don't think you really contradicted anything the GP said, and in some cases you supported it. Apple raced headfirst into abandoning old technologies, which everyone wanted to do but couldn't easily, and was able to pull it off because of their market position and status. Others soon followed.
So,"early" not as in before expected, but during the beginning of that time (i.e. more akin to "first").
Their "weight" extends beyond their market share. As an easy example: There are many people that don't own an iPhone 7, but who also know that the iPhone 7 tossed the headphone jack.
The last time I looked, the market share leader for desktop/laptop computers held just over 20%. Apple is a top-5 computer supplier in the U.S., top-10 worldwide.
Glad you're aware of the context, but Apple jumping into new tech is responsible for their massive dominance of consumer computing, which, you'll be pleased to be informed, is mainly done on phones and tablets.
Wow, you got me there. Apple ditched the floppy because they knew they could leverage the iphone 5 years later.
And because they have a phone, they could totally ditch the cdrom, because, like, who wants to use a CD on their phone, right?
And the ethernet jack! Man, I felt stupid having to plug my phone into the wall for the internets.
But that headphone jack. Their phone totally blazed that trail, getting rid of that. Wait, what? The Macbook dropped it in 2015?
Surely though, most people that have an iphone use a mac right? That's the leverage point! Well no, only about 10% of iphone users had a mac in 2015. Even if it was 1-1, what difference would it make? There's no leverage to be had. iphones work with windows last I checked, and the growing trend is not bothering to "sync" your phone with a computer at all.
If the share was calculated by individual PC models and not by company, even in the PC business, there would be a bunch of Apple stuff ranking quite high.
So they can scale on many aspects much better than their competition.
(plus they still have margins)
Swift 3 was just released, like 2 weeks ago. Unless you interviewed very recently, I don't see the problem with that statement. If they had said Swift 2 wasn't production ready, then I'd be concerned.
People always demand a stable ABI, but I've seen very few use cases in enterprise software delivered to the cloud where this is even necessary.
Most people rebuild their software entirely prior to release, and deliver the entire package. Having a stable ABI doesn't do much to help this scenario.
Linux doesn't have a stable ABI, hasn't hurt its success much.
> Linux doesn't have a stable ABI, hasn't hurt its success much.
Linux does have a stable ABI. And GNU's libc tries to have one, but it breaks often enough to be a concern (even with symbol versioning). You're conflating GNU and Linux here.
More context would have been useful here, such as if this was a server-side product (even though you probably understandably can't disclose.)
Swift is great for client-side work, but I wouldn't use it on the server. Each tool good for a certain job. Our current stack is Swift on the client, Elixir on the server, and I haven't been happier.
Working on a big existing codebase is a different scenario too, since adding in Swift either means rewriting the whole thing, or convincing two very different languages to work together. And I imagine that's magnified for Apple's store code, which appears to have the unfortunate combination of receiving a metric buttload of traffic and being held together with baling twine and chewing gum.
I'd speculate that "not production ready" is at least partly rationalization or cover for "we really can't afford to screw around with this codebase, even though this is the company's Big New Thing."
It's probably a reasonably small codebase with well specified requirements, it's a standalone app already, and it's not likely to cause underlying OS performance or stability problems if it lost a little bit of performance migrating to Swift. I'm guessing it could be an experiment to see what parts of the OS are reasonably replaced with Swift?
People have also complained for several yearly releases now that OSX isn't improving as an operating system. It's also been around in various incarnations for like 18 or more years and has likely accumulated some deficiencies that needed addressing. Maybe things need some rewriting now to be positioned to become better in the future and a Swift migration is a good excuse to do both. Enough people are also complaining Apple isn't dogfooding Swift enough internally, so there you go, macOS is starting to dogfood it.
I'd also guess Apple probably has a reasonably hard time hiring for Objective-C positions working on OS-level code (e.g. they've probably already hired all of the existing, interested and qualified people willing to work in the Bay Area), and the up-and-coming labor pool coming from iOS development is likely to be more interested in, or previously experienced with, Swift. Modernizing your codebase where possible now seems like a reasonable way to hedge against future hiring needs. Apple's been around long enough that they likely have people retiring out of some critical positions.
> (I believe the Dock "app" also encompasses Mission Control, Spaces and all that sort of desktop interaction wizardry.)
It does. I think it also handles Cmd-Tab. I remember seeing a bug now and then in these things (task switching and desktop spaces etc.) which gets fixed by relaunching the Dock process.
That's interesting to hear, because the rewrite apparently managed to re-implement the bug, introduced in OS X 10.8, where the Dock just randomly switches monitors for no apparent reason, every 12-24 hours, when you have 2 monitors hooked up to a Mac.
(You can work around this bug by changing some of the defaults in the Mission Control pane of System Preferences... I think the "Displays have separate Spaces" setting.)
I don't think you can draw any conclusions about Apple's attitude toward dogfooding. I'm sure Swift is being heavily dogfooded there, they were just telling you that it hadn't reached the point where it could be used in the project they had in mind for you (especially if it's back end, which seems to be the case since your project was in Java).
I worked at Apple in engineering for several years, until leaving 5 months ago for a very small business which interested me more. Swift was being used extensively internally for a long time, including by me.
I interviewed with two different teams at Apple recently and they gave me the same answer. Most teams are using Objective-C and not even touching Swift. Which is pretty strange given they are shouting out of their lungs to other developers on adopting Swift.
I am not saying that this is necessarily what happens at Apple (I know nothing about it), but mind you that in big corporations there is sometimes rivalry between departments. Technologies are sometimes both tools and collateral damage.
Half on-prem SAP and maybe Azure are the only cloud solutions my clients will come close to touching, so forgive my ignorance, but which debacle is that? (Genuinely curious from ignorance, not being flippant.)
Overall I like Swift and it's certainly a massive quality of life improvement for Mac/iOS devs.
But I don't see as much upside for enterprise development. Server side tooling and deployment is still TBD, reference counting adds non-negligible cognitive overhead vs GC in an environment where the downsides of GC matter much less. They will have to build a complete ecosystem of server-side libraries from scratch. And unlike on iOS/OSX, there are already a lot of good and much more mature alternatives.
I wouldn't be unhappy if Swift became a real contender on the server but it seems unlikely to me that it will.
Meanwhile, developers are flocking to contribute to the most popular server-side Swift frameworks, like Perfect, Vapor and Kitura. I have followed the development closely, and for such a new language the enthusiasm around these projects is surprisingly big. I think the potential of server-side Swift is underestimated. Of course there are still several features lacking at this point, but that is to be expected of such a new language and will only improve. But yeah, large enterprise deployments are probably too early.
Can you elaborate on the cognitive overhead vs a GC language like golang or java? (With regards to memory)
To be sure, there are things you think about, but generally memory performance is less of a mental burden.
Also, Swift 4 is supposed to support a Rust-like memory model. Not sure if it's opt-in or how it will work, but my impression is that its memory model is about to support some new use cases, to make it appropriate for more memory-critical tasks.
Speaking only for myself, avoiding or breaking retain cycles is a noticeable problem in Swift code that simply isn't there in languages with a proper garbage collector. Among newer iOS programmers, it's a big point of confusion to figure out when a callback closure should use [weak self] versus capturing self strongly, for example. I've seen a lot of people struggle to understand, completely fail, and end up dogmatically using [weak self] everywhere whether it's needed or not (and sometimes where it's actually harmful).
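To make the confusion concrete, here is a minimal sketch (class and property names are made up) of the classic cycle and the [weak self] fix:

    class ImageLoader {
        var onComplete: (() -> Void)?          // the loader holds the callback strongly
        func load() { /* kick off work, call onComplete later */ }
    }

    class ProfileViewController {
        let loader = ImageLoader()

        func start() {
            // Capturing self strongly here would create a cycle:
            // self -> loader -> onComplete -> self, and nothing ever deallocates.
            loader.onComplete = { [weak self] in
                guard let strongSelf = self else { return }   // self may already be gone
                strongSelf.refreshUI()
            }
            loader.load()
        }

        func refreshUI() { /* update views */ }
    }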
Technically true, but very misleading in Rust's case. In Rust you rarely have to break retain cycles in the first place, simply because you don't use reference counting much. Things like "weak self" are not a problem, because closures aren't reference counted unless you explicitly make them so (and in fact I don't think I've ever seen a reference counted closure in practice).
The compiler can provide better error messages and guide the user, whereas with library types, at least on C++'s case it requires help from external tooling.
With Rust, while you can make use of the type system, it is harder for the compiler to provide such guidance, unless the types are somehow blessed.
What sort of guidance does Swift provide here? And nothing prevents Rust or C++ compilers from becoming aware of these types and providing specialized diagnostics for them.
My Swift experience is constrained to the GNU/Linux version, so I don't know if Xcode is already doing something in this direction.
Sure one can have blessed library types that the compiler knows about, but then from language design point of view it creates a decision between those types and others written by developers, but with similar semantics.
Since you said it's easier than these other languages, I thought you were referring to how the languages work now.
There's nothing wrong with special treatment for stuff in the standard library. C and C++ already do this with many things (e.g. many compilers understand calls like memcpy and will emit optimized code based on the standard library semantics) and Swift gives special treatment to built-in types like Optional.
The distinction here is fairly blurred. The Swift implementation of weak pointers is mostly in the runtime library already, it's just that the compiler hides it a bit so you don't write the names of the calls directly.
Understood. If you have any ideas for cool diagnostics or other guides, I'd be interested in hearing. I couldn't think of anything offhand, but that's probably just my lack of imagination. I would love to see some static analysis help for finding retain cycles and recommending how to break them, but I'm not sure if that's practical.
I have never seen unnecessary reference counting be a problem in Rust. That's because not manipulating the reference count requires an explicit action—.clone()—and borrowing is so expressive that you rarely need to manipulate the counts.
Rust's borrow checker actually helps with this a lot- you can hand out references to the Rc-ed data without modifying the count, and the compiler will enforce that the count is not decremented until the references are dead. Combined with the fact that plain references are more idiomatic than Rc/Arc, this leads to pretty similar results.
Reference counting has upsides and downsides. If you look at the lengths that some Java-based projects like Spark have to go to in order to tame the GC, then you have to question whether tracing GC is the right choice in the age of hundreds of GBs of memory.
On the other hand reference counting is extremely slow. So slow that it puts a lot of restrictions on API design and that is definitely a mental burden. Swift's own collection interface changed massively between 2.x and 3 and I believe the main reason for the redesign was that the old design required too much reference counting. (I'm not completely sure this is an accurate reflection of their thinking though)
> On the other hand reference counting is extremely slow. So slow that it puts a lot of restrictions on API design and that is definitely a mental burden.
This is very wrong on both points. Reference counting has a lot less mental burden compared to GC, because it's predictable. You can rely on destructors to actually work and be useful. For example, you don't have to keep track of whether you closed your file descriptor or not, but in GC'd languages you have to do that manually, sometimes even with the same reference counting, but hand-written and explicit. Reference counting is also very fast, with the exception of decades-old concurrent implementations that are irrelevant today.
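A small sketch of the destructor point, using Foundation's FileHandle (the wrapper class and path here are illustrative, not anyone's production code):

    import Foundation

    final class LogFile {
        private let handle: FileHandle

        init?(path: String) {
            FileManager.default.createFile(atPath: path, contents: nil)   // ensure the file exists
            guard let h = FileHandle(forWritingAtPath: path) else { return nil }
            handle = h
        }

        func write(_ line: String) {
            if let data = (line + "\n").data(using: .utf8) {
                handle.write(data)
            }
        }

        deinit {
            // Runs as soon as the last strong reference is released,
            // not at some unspecified future GC pass.
            handle.closeFile()
        }
    }

    if let log = LogFile(path: "/tmp/demo.log") {
        log.write("hello")
    }   // `log` goes out of scope here, so deinit closes the file immediately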
I don't disagree on the benefits of RAII at all. I love it. It's a much cleaner and more general resource management idea than a tracing GC.
But incrementing/decrementing an atomic value in a tight loop apparently creates a performance problem that is severe enough for the Swift team to redesign their API. That wasn't decades ago:
"Code that handles indices has to perform atomic reference counting, which has significant overhead and can prevent the optimizer from making other improvements."
This is a typical Apple product to me - they obsessed over their idea of correctness and ended up with something that is difficult to use and completely alien. Meanwhile every other language offers convenience without even stopping you from doing anything else that you might need to do with Unicode...
To me, the Strings API is one of the best things about Swift. Strings are complicated and as Mike Ash says in that very article, "Swift doesn't sugar-coat it, but instead shows you the reality of what's going on. This can be difficult, but it's no more difficult than it needs to be."
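To show what "reality" means here: the same four-character string has different lengths depending on which view you ask for, and Swift makes you pick one. A small sketch (Swift 3 spells the character view .characters; newer versions just use .count):

    let cafe = "cafe\u{301}"          // "café" written as 'e' plus a combining acute accent

    print(cafe.characters.count)      // 4 - user-visible Characters (newer Swift: cafe.count)
    print(cafe.unicodeScalars.count)  // 5 - Unicode code points
    print(cafe.utf16.count)           // 5 - UTF-16 code units
    print(cafe.utf8.count)            // 6 - UTF-8 bytes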
I don't accept your non-acceptance of assembly as a valid analogy except I'll actually tell you the reason why - you didn't give any reasoning whatsoever.
This is a perfectly valid analogy. Visible characters in a strings are to Unicode code points as high-level functions in high-level code are to assembly commands.
Furthermore, even without using an analogy I'm quite comfortable in saying, again, that I want Strings sugarcoated because 99.9% of the time I don't give a fuck about some bullshit ivory tower corrected view of things that makes the common scenario more difficult than it needs to be.
Hilarious that IBM still thinks of the enterprise as the arbiter of quality. Consumer is now setting the de facto standards, including on things like information security.
IBM's days of having a voice in that conversation are probably more limited than they realize.
Would you care to give some examples? My opinion is that the enterprise is a somewhat separate ecosystem. Thus "enterprise-ready" is not a mark of quality; it's a mark of belonging to the enterprise ecosystem.
I'm wondering what makes swift ready for the enterprise ecosystem. It barely seems to meet the requirements of the "enthusiastic consumer" outside of Apple OS.
To be ready for the enterprise ecosystem basically means that you can sue someone if things go wrong, and you can buy support packages for enormous sums of money. IBM is saying that you can 1) buy bluemix swift from them, and 2) sue them if anything goes wrong - but since they have a great legal team, they probably got ironclad agreements ready to go.
It's not about having someone to sue it's about attracting blame.
Blame assignment isn't really about who made good decisions, or who worked hard, or any of that, it's a combination of how bad it went and who made uncommon choices:
common choice + failure = external blame (who could have seen it coming?)
uncommon choice + failure = it's your fault for doing it weird
common choice + success = great job!
uncommon choice + success = I don't trust you, but you might be a genius
Since hierarchical relationships function using blame ("responsibility"), making predictable decisions becomes more important than making good decisions at some depth of hierarchy.
This is probably the most important comment in this thread. Being able to delegate blame out to $common-choice on project failures is the difference between that CIO being able to say "hey, not my fault, I bought the Oracle platinum integration package!" to satisfy the board rather than "sorry guys, I really thought my node.js idea would work, it's all the rage with the kids! Haven't you ever heard of Slack?!"
(If this sounds strange to you, just replace "node.js" with "Rails 2" and you'll see why ASP.NET and Java's still king in any company with a market cap > 100MM && age > 20 years. It's about continuity of support, and Windows 2016 servers can still run ASP Classic and .NET 2 WebForms apps just fine.)
"Responsibility" == risk mitigation for that purchasing authority.
I thought stability of the ecosystem (i.e. stable standard library) was also a "must" for adoption by the enterprise. Never thought that a language in flux with "hosting" support from IBM is enough. But now it seems the enterprise world is eager to try the coolest thing and rewrite/refactor it after the next release as long as someone will host/run it.
Now that you enlightened me I can see the stock markets, hospitals, banks and all enterprise consumers embracing Swift and eager to upgrade once Swift 4 is released.
IBM's Bluemix et al. was a direct result of the purchase of Rackspace, which was a strategic buy to hedge against the growth market of Amazon and its bazillion EC2-esque offerings. (Likely in response to seeing Amazon's success, almost strategically similar to why Azure and Google's App Service or whatever exist.)
They're still making money hand over fist in maintenance keeping the mainframes up (your state's DMV won't move off MVS/zOS as long as support for their old JCL/REXX stuff is still available, especially those 'cheap' z13s which come in at under 100k) but that's an entirely different tier of customer.
Maybe you mean SoftLayer. As far as I know Rackspace was recently bought out by someone else. I understand IBM wants to be cool, but it doesn't make sense to promote an immature technology as enterprise ready...
Really? You do know the difference between a consumer product and carrier-grade equipment. Apple abandoned the enterprise years ago, and arguably anything larger than small boutique businesses.
I think the pitch is that despite static typing, it offers a lot of multi-paradigm high-level features while remaining elegant with fairly minimal syntax compared to other native static languages like Rust or C++. It automatically manages memory but without a stop-the-world garbage collector and goes a long way to opening a lot of flexible design options to any developer: it has great support for generic programming, object orientation, and functional programming.
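A rough sketch of that mix of protocols, generics, value types and closures in a few lines (the types and names here are invented purely for illustration):

    protocol Priced {
        var price: Double { get }
    }

    struct Book: Priced {                 // a value type conforming to a protocol
        let title: String
        let price: Double
    }

    // Generic, functional-style helper constrained by the protocol.
    func totalPrice<T: Priced>(of items: [T], where include: (T) -> Bool) -> Double {
        return items.filter(include).reduce(0) { $0 + $1.price }
    }

    let books = [Book(title: "A", price: 10), Book(title: "B", price: 25)]
    let cheapTotal = totalPrice(of: books) { $0.price < 20 }   // 10.0
    print(cheapTotal)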
The downsides? Well, I find some of its syntax choices a bit odd and inconsistent; I think the pursuit of elegance could have gone much further. And, while I celebrate its pervasive use of closures/lambdas and functional idioms in many contexts, it lacks true functional data structures that would be necessary to really write in a proper functional style with confidence that the runtime wouldn't suffer.
Also, the implementation of its high-level features is somewhat magical and hidden. You can't depend on performance when using them until you really study them. So much so, that WWDC had a big talk about how to choose language features depending on the type of performance you need, and none of it was intuitive.
Additionally, at least on the Apple platforms, its Objective-C heritage can be a bit annoying. In particular, you can't mix and match closures with "selectors", which is a real shame and missed opportunity for flexibility and elegance.
One other downside is Xcode. No refactoring support, and lots of errors that give you little help. For instance, if you're trying to implement a protocol which is itself a few layers of nested protocols, Xcode will just tell you it doesn't conform and not what method you're missing, so you have to dig through and check every signature of every method carefully.
The swift 2 to 3 migration was absolutely insane, I'd estimate Xcode did maybe 25% of the work automatically and left me with hundreds of different errors to convert manually (and often they were red herrings where the actual error was not where Xcode said it was). And you have to rebuild all dependency frameworks on swift 3 as well since the binaries are incompatible. If you use Swift make sure you have 2-3 whole developer days per year to devote to just migrating to the next version in a medium sized project.
I would agree with this. The biggest issue with Swift right now is Xcode. Multiple times each day, syntax highlighting and code completion will break, even though the code is perfectly valid. The lack of refactor support after three Xcode releases and major versions of Swift is inexcusable.
Using storyboards is super frustrating due to the load times. It seems no one at Apple tests with more than one storyboard in a project.
I love going back to Ruby dev because VS Code is just such a joy to work in compared with Xcode. Xcode really struggles on my top of the line MBP with 16GB RAM.
I still love the language compared to ObjC but the tooling is abysmal. Every year we hold out hope that Xcode will magically get better, but it's the iTunes of IDEs at this point.
Well I don't know any experienced developer that would put more than three populated scenes in a Storyboard. I would only add more if it was something like a navigation setup (navcontroller, menu, etc) where I would simply nest a lot of empty screens and link to the real screens with storyboard references.
You can always use AppCode if you don't like Xcode.
AppCode has also become increasingly buggy over the last year or so, and is now almost as unusable as Xcode (albeit in different ways).
On my computer, recent versions of AppCode constantly use at least 100% CPU, even when it's in the background doing nothing. I got this a few times in earlier versions as well, but an "Invalidate caches and restart" usually fixed the problem. No such luck with recent versions. It will always use at least 100% when it's running.
I often get 30 seconds of beachball when I switch back to AppCode from some other app, despite having 16GB of RAM and not running any other remotely heavy app. This will happen even if I just leave AppCode for half a minute in order to check something on a web page. It's gotten so bad that I've developed the habit of checking web pages on my iPad instead of on my Mac, in order to avoid temporarily leaving AppCode.
There is a particularly infuriating bug that will sometimes cause AppCode to freeze when I type the opening parenthesis of a function invocation. When this happens, killing, restarting and typing again won't work, AppCode will freeze at the same spot. The only workaround I have found when this happens is to type my function invocation in some other editor and copy and paste to AppCode (often after waiting the usual 30 seconds for AppCode to stop beachballing).
It is not just me – all my coworkers suffer from the same problems. AppCode is by far the most annoying app I use on a daily basis. I hate it nearly – but not quite – as much as Xcode.
This is my point - I use multiple storyboards to break up the app and use storyboard references where appropriate. But loading up individual storyboards, waiting for them to render (which takes far longer now with @IBDesignable), etc is such a drain in productivity.
Swift is amazing, Xcode is crap, and I've been using Xcode and its NeXT version for almost 20 years. I work in Xcode every day and every day I want to hurl it. Apparently Apple either doesn't care or is too overwhelmed with building 4 OSs every year. To be effective for building server side apps we need a new dev environment.
> You can't depend on performance when using them until you really study them. So much so, that WWDC had a big talk about how to choose language features depending on the type of performance you need, and none of it was intuitive.
In which language does optimal performance come for free without really studying it? You have to pay attention to your case and context and use features accordingly, no matter what language you're using.
I have to stop you there. You shouldn't compare anything to C/C++ or Rust after that. You are better off comparing Swift with Go (i.e. both have automatic memory management, nice syntax, static typing, a dynamic feel, etc.) or Java (automatic memory management, well established in the enterprise world).
If Apple was to design a language to respond to Go or Java it would be Swift and that's it. So the pitch is that you have a nice language which encourages productivity and you can share the knowledge/code between server and client.
Well, technically Swift's memory management is the same as what C++ offers via smart pointers, the only difference is that in C++ it is opt-in (and still very popular). Swift is quite unlike Go or Java's memory management; it is not a GC in the sense that they use it.
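To make the comparison concrete, a tiny sketch (illustrative names): a Swift class behaves much like a shared_ptr-managed object, while a struct involves no reference counting at all:

    final class Node {                  // reference type: ARC-managed, shared_ptr-like
        var value: Int
        init(value: Int) { self.value = value }
        deinit { print("Node \(value) freed") }
    }

    struct Point {                      // value type: copied, never reference counted
        var x: Int
        var y: Int
    }

    var a: Node? = Node(value: 1)
    let b = a                           // retain: reference count is now 2
    a = nil                             // release: the object stays alive through `b`

    let p1 = Point(x: 0, y: 0)
    var p2 = p1                         // an independent copy
    p2.x = 10                           // p1.x is still 0
    print(p1.x, p2.x, b?.value ?? -1)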
Swift memory management isn't nearly as expressive as C++ when it comes to references (i.e. unmanaged pointers). You can use in/out/inout parameters on functions and that's about it.
I agree with the parent that Swift memory management is basically like Go or Java: everything is GC'd (reference counting being a form of GC).
Swift does have a runtime though. In its current form I can't see it replacing C or C++. It may provide more deterministic performance than Go but it's not C. If it were to be designed as C replacement I would expect a roadmap for embedded environments, praise about its mapping to the hardware, how it trashes or at least matches C/C++ on its application domain(i.e. low level/ hardware interface or performance) etc but it's clear that is not the case and its focus is on "apps" first, servers second and maybe a bit of low level.
But it has good Objective-C interop for working with old app code and libraries (which comes from the C support, but with all the special stuff needed to talk to Obj-C done for you).
Swift is statically typed (Elixir isn't) with a good type system (Go doesn't have one) and doesn't run in a VM (unlike Scala or Kotlin). I don't know how good its concurrency is, or on how many platforms it is available, though. It also looks like the language itself is still not that stable, which is in my opinion a no-go.
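As a small illustration of the kind of type-system support being referred to (optionals plus enums with associated values and exhaustive switches); the names here are made up:

    enum FetchResult {
        case success(body: String)
        case failure(code: Int)
    }

    func describe(_ result: FetchResult) -> String {
        // The compiler enforces that every case is handled.
        switch result {
        case .success(let body):
            return "got: \(body)"
        case .failure(let code):
            return "failed with status \(code)"
        }
    }

    let maybeCode: Int? = nil                            // optionals make absence explicit
    print(describe(.failure(code: maybeCode ?? 500)))    // "failed with status 500"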
> Scala and Kotlin both have projects for AOT compilation to native code
Not sure about Kotlin (stable version of the language was only recently released; more likely a stable JS target is higher on the priority list), but Scala Native isn't production ready, will likely be at least a year from now; same goes for Scala Meta, and Dotty's even further out, minimum 2 years away.
Swift has a head start here, and a huge captive audience in legions of iOS developers -- unlikely that Scala will be able to compete on the native front directly. More likely Scala.js + React Native may draw some from the JS developer pool that want to take advantage of static types and seamless client-server interop that can be had via Scala/Scala.js today.
Sure, was just pointing out that no such native ports exist for Scala or Kotlin that are anywhere near production ready.
In fact, it appears that a Kotlin Native project exists in the sense that a blog post exists that announced a native target is something they're looking into implementing down the road -- hardly a project ;-) Scala Native's been under heavy development for over a year now[1], and even then it's a long way off.
Yep, and as a result of this, the JDK ecosystem is as healthy as ever. N.B. to those who have only written web or phone apps for a living: GC latency isn't 'bad' in most cases; unpredictable latency is what's bad. Having a deterministic upper bound is what is important in the enterprise, on your aviation component regulating cabin pressure, on your SWIFT bank transfer.
(N.B. around the time that paper was published, I was in HFT. We used Azul C4 and did just fine, crushing it even with management taking their typical 30/3 fees. Though to be fair, this was when the easy pickin's were still there to be plucked. Don't underestimate how good good-ol' free Hotspot is now; we're not talking 1996 Jakarta days with Swing.)
One of the reasons I always argue for GC is that I used Native Oberon, and also collect papers related to Xerox PARC and DEC research.
Modern hardware would be a dream to any of those researchers doing systems programming in GC enabled languages.
I consider the biggest flaws of Java to be not adopting AOT from the beginning and the lack of value types. Now the JDK ecosystem needs to wait until Java 10, probably around 2020 or later, to get them.
It is backed by Apple and will become the de facto language for writing iOS (macOS too?) applications, potentially attracting a healthy pool of developers and thus reducing costs which in the end is all that matters in non high integrity software.
For IBM, part of the lure probably is "the enemy of my enemy is my friend".
It's not running on a VM that Oracle or Microsoft control, and not a language Google controls.
Yes, it is Apple's language, but IBM apparently thinks it will become more popular than one they would control on their own, and picking Apple's language then is, for them, the least risky, as Apple isn't a company focusing on servers.
Other than Swift being a great language this is certainly at the top of the list. If not on a management level then on a developer POV level. Plus in some way they associate Apple with themselves, they become (in some weird way) the Apple of enterprise.
It allows enterprises to use their current iOS and macOS developers to do Web Application and server side applications. For developers it gives them additional ecosystems to use their skills on without having to learn a new language from scratch.
1. It doesn't have a GC. While the JVM's GC is decent, it can definitely cause latency spikes.
2. It has value types. Go has this, but Scala and Kotlin mean running on the JVM which means leaving memory locality behind. In fact, Swift allows you to select whether you want an object that contains pointers to things or a struct. Go gives you structs and Scala and Kotlin give you pointer-based objects.
If all you care about is syntax, Swift might not be that interesting to you. However, you get options in terms of how the program runs that the other languages you've mentioned don't give you. You can't decide that you want to create an object in Scala or Kotlin that is contiguous in memory. You can't decide you'd rather not have GC pauses in Go.
In many ways, Swift brought together the kind of niceties that have been available before. Generics aren't something new, value types aren't new, objects aren't new, first-class functions aren't new, ARC isn't new, non-nullable variables aren't new, closures aren't new, type inference isn't new. . . But none of those languages have all of those. Go took a hard stance against generics (I'm not going to go into why). The JVM doesn't really do value types. GC has been the standard for most new languages for a while.
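A sketch of the layout difference from point 2 above (the types are invented for illustration): an array of structs is one contiguous buffer, while an array of class instances is an array of references to separate heap allocations:

    struct PixelStruct {                // value type: stored inline in the array
        var r: UInt8, g: UInt8, b: UInt8, a: UInt8
    }

    final class PixelClass {            // reference type: the array stores references
        var r: UInt8 = 0, g: UInt8 = 0, b: UInt8 = 0, a: UInt8 = 0
    }

    // One contiguous, cache-friendly buffer of small elements.
    let inline = [PixelStruct](repeating: PixelStruct(r: 0, g: 0, b: 0, a: 255),
                               count: 1_000)

    // One heap allocation per element, plus ARC traffic on every copy of a reference.
    let boxed = (0..<1_000).map { _ in PixelClass() }

    print(MemoryLayout<PixelStruct>.stride, inline.count, boxed.count)   // 4 1000 1000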
> While the JVM's GC is decent, it can definitely cause latency spikes.
Which one?
Java is like C and C++: there are plenty of JDKs to choose from, each with its own set of implementation behaviors, for the different types of customer pocket sizes.
I don't think it's fair to call Swift an OOP language. It is certainly multi-paradigm, but provides a lot of facilities for functional programming as well. If you are not working directly with Cocoa, you can do a lot of interesting stuff with structs and protocols.
It appeals to me as a server-side language, and I've never done any iOS work. The reasons it appeals:
- Static types
- Good type system
- Lots of good support for immutability
- Apple is heavily invested in Swift, and very responsive to the community
It has some downsides. I'd evaluate Swift against Clojure and Elixir (which both have good, though quite different, concurrency stories), and maybe Haskell (but that would be tough from a buy-in perspective). But evaluating it against Python (which my team currently uses), it's hard to see a downside; we chose Python for Django and DRF, but we're increasingly not in love with that choice.
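On the immutability point above, a tiny sketch of how `let` and value semantics play together (the type is invented for illustration):

    struct Order {
        let id: Int
        var items: [String]
    }

    let original = Order(id: 42, items: ["widget"])
    var draft = original              // an independent copy, not a shared reference
    draft.items.append("gadget")      // cannot affect `original`

    // original.items.append("oops") // would not compile: `original` is a `let`
    print(original.items, draft.items)   // ["widget"] ["widget", "gadget"]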
This reminds me of the big announcement of IBM partnering with Apple for iPad in the enterprise. I wonder how that worked out?
I am a little sceptical of IBM's business plan, all in on Watson, etc., but I have used their Bluemix hosting services and it is fairly nice. Bluemix has a great free tier; you can run one or two small web services or apps for free. Supporting Swift on the server is a good idea.
There are plenty of enterprises silently chugging away with Cognos as their BI platform. 'IBM MobileFirst' (i.e. some guys basically make a 'custom' dashboard for your TM1 instance that C-levels can use on their iPhone) probably turned a profit by doing what they do best (i.e. severely overcharging companies for functionality, in exchange for the purchasing authority having the ability to say "hey, I bought IBM"). Cognos is still silently huge (TIBCO and Informatica levels of huge), and considering how the corporate world went from BlackBerry to iPhone, it's a safe bet that it made a profit.
Side note: there was an article about the imminent demise of Oracle, but residual support contracts and the need for CIOs to keep renewing those contracts for a few hundred k a quarter is a cheap buy as insurance, in the same way you go to PricewaterhouseCoopers and not JoeSchmo to get your books audited. (If/when PwC messes up, the board will overlook your failure.)
I cannot take this seriously. Even minor versions of Swift have breaking language changes and deprecations. That tells me precisely that it is too early for the enterprise.
This should be greatly reduced now that Swift 3 is out. A big part of Swift 3 was getting all of the breaking changes over with so that they can maintain source compatibility from here. There will likely be some small breaking changes over the next year or so as they nail down bits they missed, but it'll be a lot less chaotic than what we saw with Swifts 1 and 2. I realize that "greatly reduced" may not be good enough for you, though. Perhaps next year even that will be done.
I love all these zippy marketing keywords: "digital transformation" and "digital experiences". IBM is trying hard to get developer attention. They have a hard battle ahead of them to compete against AWS, Google Cloud and Azure. That's what this really all comes down to, pushing their hosting services.
I don't see how your second sentence follows from the first one? It's true that Swift is heavily type safe but that makes it appealing to me. I actually really enjoy writing Swift.
Would you rather it were dynamic so you could write less code or something?
I should have specified that it wasn't an attempt to put down the language, as there are many, MANY caveats... only wanted to point out how writing it almost makes me feel unclean somehow hah
C++11 with review-enforced RAII. Boost if you need special stuff involving odd data structures. gtest/gmock. Depending on what you're doing, add some Qt5, OpenCV, OpenMP, RapidXML/RapidJSON, any C library ever written - and there are BSD/LGPL libraries for pretty much anything. Make plugins with Lua. Run tests in Valgrind.
I'm not sure I would choose something over the latest version, primarily because of how little a time investment using it can be... but then again it all comes down to what the task at hand is doesn't it ;P
Swift can't do realtime. It has automatic memory management. How do you replace C or C++ with Swift for that? What they mean is that Swift replaces Objective-C on user facing apps(on Apple devices) but I doubt you will see the critical parts of the OS (kernel, drivers, etc) written in Swift.
It's almost like saying javascript will replace C/C++. I want a nice language like Swift (or Go) to replace C but there is none. Rust was the best/latest failed attempt.
You can essentially write C in Swift by using UnsafePointer and such. If you avoid using reference types then you won't get any automatic memory management. I don't think it's quite ready to replace C as a kernel or driver language, but it's not that far off.
Is there any open source library that does that? I know in theory many things are possible but it would be great to actually test it/see it in action. I never thought of Swift as a language with zero cost abstractions and deterministic performance. If it works I think it really gives it a leg up.
Or peruse the standard library documentation for UnsafePointer, UnsafeRawPointer, UnsafeMutablePointer, and UnsafeMutableRawPointer. These are built-in types that the compiler knows about, and code that uses them should compile down to essentially the same stuff as C pointer code would.
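A rough sketch of what that looks like in practice (assumes recent pointer APIs; Swift 3 spelled a couple of these calls slightly differently):

    let count = 4
    let buffer = UnsafeMutablePointer<Int32>.allocate(capacity: count)
    buffer.initialize(repeating: 0, count: count)

    for i in 0..<count {
        buffer[i] = Int32(i * i)       // plain pointer subscripting, no ARC involved
    }

    let sum = (0..<count).reduce(Int32(0)) { $0 + buffer[$1] }

    buffer.deinitialize(count: count)
    buffer.deallocate()
    print(sum)                          // 0 + 1 + 4 + 9 = 14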
Can you predict what your C compiler is going to do with your code, taking into account UB and compiler-specific implementation behaviors not specified by ANSI C?
More than with Swift. I routinely see 100x - 1000x and more performance difference between -O0 and -O in Swift. Considering that the optimiser doesn't give warnings or errors if it can't apply optimisations, that's out of bounds for me for a systems programming language. YMMV.
The whole UB idiocy is a different matter, though related because it's perpetrated by roughly the same group of people, for similar nonsensical and non-validated reasons. See my post http://blog.metaobject.com/2014/04/cc-osmartass.html
> More than with Swift. I routinely see 100x - 1000x and more performance difference between -O0 and -O in Swift. Considering that the optimiser doesn't give warnings or errors if it can't apply optimisations, that's out of bounds for me for a systems programming language. YMMV.
Those are improvements required in the toolchain of a young language, which Apple, IBM and others will certainly make.
In '95 I remember going to an excellent Smalltalk course. A year later that lecturer was working for IBM on Java. IBM '96 also bought and then shifted an excellent Software Configuration Management System by OTI towards Java (aka. Eclipse). IBM contributed a huge part of Java networking code too.
Java seems to be leaving a gap for various technical reasons. Java also may be seen by some as having strategic challenges under the Oracle stewardship. Go, Swift, Node and others are currently trying to exploit it.
Java took off not the least due to IBM stepping into the SUN court. Now IBM steps into the Apple court...
That's a good point, but do you think that things have a chance of playing out similarly this time around? I get the feeling that web tech was more of an open question at that point, so people would look to a group like IBM to set the standard. I'm not so sure that that's the case anymore (and I make my living building within the IBM ecosystem, so I wish it were so).
What exactly do you need in a server-side language these days? Most of the interaction is done on the client, in javascript. Your web service should just be responsible for handling basic request/response, CRUD operations and delegating to other more specialized systems when the need arises, which can be written in the most appropriate language given the problem domain and tools/libraries available. More often than not what dictates language choice is supporting libraries that you'd rather not rewrite from scratch. How good are they, how long have they been around? etc...
For that, it's often just easiest to use anything non-blocking to serve as the main handler that communicates with your other services. For me, I use node, but there are other choices if you don't like javascript.
Most of the time it's not worthwhile to switch to a language just because IBM deems it suitable. I've picked up more than a few freelance gigs where IBM have botched the job and grossly overcharged for something that could be done simply. My clients appreciate my honesty, openness, and simplicity of work.
What they really mean when they say "Swift is ready for the enterprise" is that THEY are ready --- they have developed all the sales materials, support orgs, and hired the "correct" developers (in this case Swift ones) to run their gambit on as big a fish as they can.
Swift is good for enterprises that standardized on Macs and iOS devices, but there aren't many of those (yet).
I'm pretty sure Swift isn't ready on Windows, which is a shame because most "enterprises" standardized on Microsoft Windows and Active Directory.
Go (golang) has better support for a wider variety of platforms, but it has serious bugs on non-Linux platforms (as of Go 1.7.1) that make it unsuitable for the enterprise. Basically, whatever customized Linux distro is used at Google will likely get the most time from the superstars on the Go team. Also, calling C functions and callbacks from C can cause all sorts of issues, and the "cgo isn't go" mantra scares me when considering Windows desktops, etc. as a target platform.
C++ is making a comeback and I like C++14, but backwards compatibility and other factors make it too complex compared to other languages. But at least it supports all the major platforms and there's no worry it'll drop support for one due to vendor politics.
Rust looks promising from a cross-platform perspective like C++, but Rust is still way too new for the enterprise just like Swift. Perhaps in five years, it'll stand a chance in the enterprise (as much as C++) and without the complex baggage from inheriting decades of backward compatibility.
Who knows what'll happen by the time Swift or Rust is considered enterprise ready and truly cross-platform -- maybe in 5 years there will be a cross-platform, natively compiled F# that doesn't heavily favor one vendor's platform, or nim might skyrocket out of obscurity -- who knows, but it's fun to imagine and do some coding to make the future brighter.
Enterprises value stability over all else. Given the rather large changes to Swift 3, I don't see Swift as being "Enterprise Ready" just yet. That would require someone committing to maintain and patch Swift 3 for at least the next five, but preferably ten years.
Well, that's just not true IMHO. They need the ABI stability or I wouldn't do anything for enterprise. Also, Foundation getting open sourced is a huge part of this to avoid using third party libs for a huge amount of things (HTTP requests for example).
I see a bunch of stuff in their github (https://github.com/ibm-swift), but not the language implementation itself. Does anyone know what its license is?
Edit: My mistake. I read "IBM’s official introduction of IBM Bluemix Runtime for Swift" as IBM writing their own Swift implementation like they did for Java a while back.
I think IBM might be moving Java/WebSphere technical staff to Swift. Considering new versions of Java/JavaEE are getting delayed, IBM would prefer to use Swift increasingly instead of Java.
Indeed. IBM is also investing a lot in Go by porting it to their hardware and OS primarily for blockchain stuff.
I think it is more that Swift has a more interesting client/server/mobile story than Java. Given that, I do not see why IBM would not cut investments in Java.
Runs fine on the upcoming (almost-released) FreeBSD 11.0, Ubuntu & other Linuxes as well as Android. Not sure about Windows support, but people are working on that.
I wouldn't call it a "port", but basically the Ubuntu binaries are universal enough that you can unpack the .deb, symlink a few things to match Ubuntu naming and run the binaries on Arch.
I have serious reservations about IBM's involvement in Swift. I spoke with someone very high up in the Sun organization once who said IBM was the worst thing to happen to the JDK. Which is saying something since IBM wrote most of the JDK.
IBM seems to make a mess of everything they touch. I really wish Apple had taken a "Thanks but no thanks" stance when approached by IBM.
Swift 3 is the version with large source changes. It required a lot of refactoring. Fortunately, many of the fixes were handled automatically by Xcode. I'm keeping a list of changes:
Any breaking changes going forward should be much smaller. Of course, if you're in the camp that wants zero breaking changes, then Swift probably isn't for you.
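For a feel of the kind of mechanical renaming involved, here are a couple of typical (illustrative, not exhaustive) before/after pairs the migrator had to deal with:

    import Foundation

    var names = ["beta", "alpha"]

    // Swift 2 spellings, for comparison:
    //   names.sortInPlace()
    //   let url = NSURL(string: "https://example.com")
    //   dispatch_async(dispatch_get_main_queue()) { print(names) }

    // Swift 3 spellings:
    names.sort()
    let url = URL(string: "https://example.com")
    DispatchQueue.main.async { print(names, url as Any) }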
Do you really think Swift 3 is ready for enterprise usage? They say the core lib is not ready yet for production usage[0]. It's not about how many breaking changes I'm willing to take, but in the enterprise world there is little incentive to rewrite large code bases. Even in a start-up environment, the more code you write the less excited you are about breaking changes. Also, Xcode doesn't run on Linux or Windows, and last time I checked macOS was not the OS of choice in the enterprise world, so Xcode is not an option. The next claim I'm expecting is that Swift is ready for the InSight Mars lander mission.
I haven't used Swift on the server yet, so I can't honestly say. I'll give it a try soon.
However, it's probably best for devs who are already using it for apps (code reuse, experience, etc). For other dev, I might try Elixir, for example, for web dev. It is gaining a lot of momentum. A critical mass in adoption means better support, quicker answers to questions, etc. We all can't pay IBM to worry about the details.
I see... though it (Swift 3) was just released. I didn't know the enterprise world had changed so much (i.e. adopting a technology immediately after release).
The core/std library `is not yet ready for production use, and in fact are still in the earliest stages of development.`
I also heard Swift 4 is likely to have breaking changes. (ABI) stability will start only on Swift 4 too.
But what do I know...? Swift is great and everyone should abandon whatever obsolete language/tools they have in favour of this shining thing because it has a nicer syntax.
I have access to books for Swift 1.0/2.0, as someone gave them to me for free. How much of them is still useful, given the breaking changes that came with 3.0?
If you don't already know Swift you'll be wasting your time. Swift 3 looks and behaves very differently from Swift 1. Fortunately, the Apple Swift book is free, up to date, and written well.
Given the way the "enterprise" works I would guess that their clients caught on to the high level of BS and low level of value of previous "technologies". In true enterprise fashion they have now found a new shiny marketing opportunity and only recently put the finishing touches on the sales presentations ;)