Selection, yes. Natural? Do you somehow imply that technology that "wins" mass adoption is "better"? Or that it gains mass adoption based on rational analysis of its "value"? There are quite a few unnatural forces at play here...
>Selection, yes. Natural? Do you somehow imply that technology that "wins" mass adoption is "better"?
No, just that it's more fit.
Which is the exact same thing natural selection in nature implies. An animal that spreads is not "better" -- it's just more fit for its environment.
For an OS, "more fit" can mean: faster, lighter on memory and with finer control over it, usable in more situations and on more hardware, cheaper to run, able to leverage existing libraries, etc. It doesn't have to be "better" as in "less prone to crash", "safer" etc.
The parent mentioned SUN experimenting with a Java OS. What a joke that would be, given that SUN's experiments with a similarly demanding application (the HotJava web browser) ended in utter failure, with a slow-as-molasses outcome.
Sure, it would run better with the resources we have now. But a C OS like Linux also runs much better with the resources we have now -- so the gap remains.
It's not like we have exhausted the need for speed in an OS (on the contrary). It's also not like, apart from the MS Windows of old, we have many problems with the core OS crashing or having security issues.
In fact, I haven't seen a kernel panic on OS X for like 3-4 years. And my Linux machines don't seem to have any kernel problems either -- except sometimes with the graphics drivers.
So, no, I don't think we're at the point where a GC OS would make sense for actual use.
ARC, the example the parent gives, fine as it might be, doesn't cover all cases in Objective-C. Tons of performance-critical, lower-level stuff still happens in C-land and needs manual resource management; it's just transparent at the Cocoa level.
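For instance, a minimal sketch (macOS/iOS): ARC tracks the Objective-C side, but a Core Foundation object created in C-land still has to be released by hand -- ARC simply doesn't see it:

    // ARC manages Objective-C objects, but this CF object, created
    // through the plain C API, must be released manually (or handed
    // over to ARC with a __bridge_transfer cast).
    #include <CoreFoundation/CoreFoundation.h>

    void cf_example(void) {
        CFStringRef s = CFStringCreateWithCString(
            kCFAllocatorDefault, "hello", kCFStringEncodingUTF8);
        CFShow(s);     // use the string
        CFRelease(s);  // manual: forget this and you leak, even under ARC
    }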
No, it just means that so far Apple and Microsoft were not that interested in doing that. I don't count the commercial UNIX vendors, except Sun, because those mainly care about C and C++.
But this is going to change slowly anyway.
Sun did experiment with Java in the Solaris kernel; they just didn't get that far, due to what happened to them.
Apple first played with GC in Objective-C, and now you have ARC on both Mac OS X and iOS.
Microsoft played with Singularity, only allowed .NET applications on Windows Phone 7.x, and Windows (Phone) 8 uses C++/CX, which sits on a reference-counting runtime (WinRT). Meanwhile C is considered legacy by Microsoft (the official position).
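For flavor, WinRT's reference counting is essentially COM's AddRef/Release pattern, which C++/CX's hat pointers (^) automate. A portable sketch -- the names here are illustrative, the real thing lives in IUnknown:

    // Illustrative sketch of COM-style intrusive reference counting,
    // the pattern WinRT builds on. Not the real Windows headers.
    #include <atomic>

    class RefCounted {
        std::atomic<unsigned long> refs_{1};   // starts owned by the creator
    public:
        unsigned long AddRef()  { return ++refs_; }
        unsigned long Release() {
            unsigned long left = --refs_;
            if (left == 0) delete this;        // last owner frees the object
            return left;
        }
    protected:
        virtual ~RefCounted() = default;       // destroy only via Release()
    };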
C++11 has reference-counting libraries (std::shared_ptr and friends) and a GC API for compiler implementers.
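Concretely: std::shared_ptr on the reference-counting side, plus the minimal (and rarely implemented) GC support hooks in <memory>. A sketch:

    // C++11: reference counting via std::shared_ptr, plus the GC
    // support API aimed at compiler implementers -- most compilers
    // just report "relaxed", i.e. no GC in practice.
    #include <iostream>
    #include <memory>

    int main() {
        auto p = std::make_shared<int>(42);    // refcount == 1
        auto q = p;                            // refcount == 2, no copy of the int
        std::cout << *q << " uses: " << p.use_count() << "\n";

        // The C++11 GC hook; a no-op on typical implementations.
        std::cout << (std::get_pointer_safety() == std::pointer_safety::strict
                          ? "strict GC semantics\n"
                          : "no GC in practice\n");
    }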
Android has Dalvik, and "native" applications are actually shared objects loaded into Dalvik. Contrary to what some people think, the NDK does not allow full application development in C and C++; only a few restricted system APIs are exposed directly.
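I.e. an NDK "app" boils down to something like this -- a shared object with JNI entry points, loaded into the Dalvik process and called from Java (the class and method names below are hypothetical):

    // Sketch of what NDK native code actually is: a shared object (.so)
    // exposing JNI entry points, loaded into the Dalvik VM with
    // System.loadLibrary() from a hypothetical com.example.Native class.
    #include <jni.h>

    extern "C" JNIEXPORT jint JNICALL
    Java_com_example_Native_sum(JNIEnv* env, jobject /*thiz*/,
                                jint a, jint b) {
        return a + b;   // invoked from Java, not a standalone C program
    }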
On Sailfish and BlackBerry, Qt makes use of C++ with reference counting, plus JavaScript.
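The Qt side looks roughly like this -- explicit reference counting via QSharedPointer (Qt's value types like QString also refcount internally, through implicit sharing, transparently to the caller):

    // Sketch: explicit reference counting in Qt via QSharedPointer.
    #include <QDebug>
    #include <QSharedPointer>
    #include <QString>

    int main() {
        QSharedPointer<QString> s(new QString("hello"));
        QSharedPointer<QString> t = s;   // refcount bumped, no deep copy
        qDebug() << *t;                  // freed when both handles go away
    }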
FirefoxOS and ChromeOS are browser-based anyway.
If there is natural selection in this, it might just be that manual memory management gets restricted to the same niche as Assembly: interacting with hardware, or special cases of code optimization.
So a systems programming language using either GC or reference counting, with the ability to do manual memory management inside unsafe regions, might eventually be the future.
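To make the idea concrete, you can approximate that split in today's C++ by convention -- refcounted handles by default, raw management confined to a clearly marked region. (The "unsafe" marker here is just a comment; the point is that a real language would enforce it.)

    // Sketch of "safe by default, manual inside unsafe regions",
    // approximated in C++: reference counting everywhere, raw memory
    // only in one fenced-off hot path.
    #include <cstdlib>
    #include <memory>

    void safe_path() {
        auto buf = std::make_shared<int>(7);   // managed: freed automatically
        // ... pass buf around freely, no manual bookkeeping ...
    }

    void hot_path() {
        // UNSAFE REGION: manual management, justified by measurement only.
        int* raw = static_cast<int*>(std::malloc(1024 * sizeof(int)));
        if (!raw) return;
        // ... hardware poking / tight loop ...
        std::free(raw);                        // the manual obligation
    }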
Sadly, no one in the industry has decided to invest in one so far.