Is Linux a repeatable phenomenon? Is there a college student today whose hobby OS can snowball into such a dominant technology? If Linux had started in 2011, I wonder what design decisions Linus would have made differently.
The Innovator's Dilemma suggests that a disruptive technology replaces a dominant standard by working bottom-up: it specializes in a corner of the market that is too small or unprofitable to interest the dominant player, then adds "cheap but good enough" features.
The economics of Linux might bend some of the Innovator's Dilemma assumptions, but it does seem like Linux is losing its focus as it tries to support servers, desktops, and embedded devices all at once. Perhaps a smaller, less capable kernel could capture some super-low-end devices (like cheap mobile devices or home automation).
I guess Linux will be hard to repeat, exactly because it is so successful. There's simply no similar itch left to scratch for hobbyists. There are still people writing general operating system kernels from scratch, but they are not driven by the lack of a decent free general operating system.
Linux is weak in real-time features. That could be an angle for a new operating system to get in.
There's a saying that Unix stopped serious research into operating systems. Whenever somebody writes a new system nowadays, one of the first things they do is port the Unix infrastructure to it.