> In addition, for several years now, Microsoft's WSL developers have been working on mapping Linux API calls to Windows and vice-versa. A lot of the work needed for Windows apps to run without modification on Linux has already been done.
This argument makes little sense: that "vice versa" is wholly unsubstantiated and without it the rest collapses. Making Linux binaries work on NT (which already has a subsystem concept; Interix proved that this could be done) in a way where they interact little with the Windows desktop, and making not just NT binaries but Win32 binaries work smoothly on Linux, are almost entirely unrelated problems.
It is true that MS owns the code necessary to make this happen, that WINE is an existence proof that it can be done shockingly well even without that code, and that there is both code and expertise at MS for stuffing Linux concepts and NT concepts in the same kernel. But that's about it.
On the other hand, Microsoft has been trying for years to get developers off Win32 and onto .NET. Windows 10's S mode was an experiment where they didn't support Win32 at all. Much of .NET is already open source. One could argue they are ready to make a .NET shell around a Linux kernel and ship MS Linux.
But ... why would they? The only reasoning I can see is if they could stop maintaining the Windows kernel, like how the move to Chromium means they have one less thing to maintain. But they have to keep maintaining it for all those enterprise Windows deployments which are tied hard to Win32 apps. So I think there's nothing here but clickbait. And yes, I clicked.
WinRT (or whatever it’s called now) is not .NET. It’s based upon COM and has full support for native code. .NET is designed with native COM support, so its use is seamless there.
Just that a database server is a very specific thing. Its own memory handling, its own file system handling, its own scheduling... The core of these systems doesn't give a shit about the operating system.
It's even more true than that. The complexity of kernel calls is much higher when a custom memory manager and scheduler are involved, I'd say. Getting Office running is probably rather simple in comparison.
MSSQL is basically a small OS; it uses very few system calls, around 50 if I remember the Microsoft presentation correctly. That’s not to say that porting was easy, but it relies very little on the underlying OS. Porting Office is most likely much more work, just to get the GUI working.
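To make the "small OS" point concrete, here's a minimal sketch (in C, purely illustrative; not SQL Server's actual PAL, and all names are made up) of the shape such a host-abstraction layer takes. The engine codes against a tiny table of primitives, and only that table gets reimplemented per platform:

    /* Hypothetical host-abstraction table; the engine never calls the OS
       directly, so a port means rewriting only these functions. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    typedef struct host_ops {
        void *(*alloc)(size_t n);      /* backed by mmap or VirtualAlloc */
        void  (*release)(void *p);
        void  (*log)(const char *msg); /* syslog, event log, console, ... */
    } host_ops;

    /* Portable fallback implementation; a real port supplies one of these
       per platform and leaves the engine untouched. */
    static void *h_alloc(size_t n)    { return malloc(n); }
    static void  h_release(void *p)   { free(p); }
    static void  h_log(const char *m) { fprintf(stderr, "engine: %s\n", m); }

    static const host_ops host = { h_alloc, h_release, h_log };

    int main(void) {
        char *page = host.alloc(64);
        strcpy(page, "buffer pool page");
        host.log(page);
        host.release(page);
        return 0;
    }

With ~50 entry points in a table like this, it's believable that the port cost lives almost entirely in the table, not in the engine.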
I believe TrueType fonts are still rendered in kernel mode. (They have a policy option for blocking untrusted fonts, because of course that parser has bugs.)
Although the rendering may still be done in kernel mode the parsing is now done in a usermode sandbox[1] which significantly reduces risks from handling untrusted fonts.
No argument that a desktop environment and a database server are different things. It'd be hard to judge which is a more complex task.
Even still, read the story about how they built an OS abstraction layer on top of Windows and Linux and then ported their product to it. That was an impressive feat and a perfect example of Microsoft doing something (maybe only this one thing) right.
Unlike Office on Mac which was a piece of shit for very many years and a very different product than Office on Windows.
Oh yeah, that's a pretty good point. There's no GUI involved, though...
But another good example is Internet Explorer for UNIX, which used a third-party WINE-like product from the mid-'90s. https://en.m.wikipedia.org/wiki/Internet_Explorer_for_UNIX But even then, that was with source access to IE.
I definitely believe MS has the skills to make a binary-level implementation of Win32 on Linux happen. I just don't think it will be easy.
Yeah the article is just plain wrong about WSL being relevant here. WSL is running Linux binaries on an NT kernel. This article wants to run Win32 binaries on a Linux kernel. Totally different problems. I like WSL, but it's not part of the solution here.
(Also WSL's IO performance problems suggest this kind of emulation approach has significant costs.)
I totally agree with your conclusion, but I wanted to comment on the IO performance remark. If you take an IO-heavy application and compile it natively on both platforms, you will find that IO performance on Linux is dramatically faster than Windows [1] (and OSX for that matter). For certain workloads, I could see a Windows application running faster under emulation on Linux than natively.
[1] This is maybe not true for big databases, since those are optimized to basically bypass all of the Windows filesystem.
Is there any particular reason why Linux IO is dramatically faster? NTFS is a polished FS, FAT32 is extremely simple (maybe unreliable, but that's not the point), and both should be performant enough. Also there's the new ReFS. Maybe there's some stupid bug, like Windows Defender checking the file for viruses on every write() call, which could be easily fixed?
It's the general overhead and speed of syscalls for reading and writing files. It's very noticeable in software like git, which works by reading and writing hundreds of small chunk files to disk. This is a well-known fact, and there's no way the team responsible for Windows filesystems is unaware. I would be shocked if it were possible to fix with a small or non-breaking change.
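You can see the effect with a crude micro-benchmark sketch of that git-like workload (POSIX version shown; run the equivalent CreateFile/WriteFile loop on Windows to compare; file count and sizes are arbitrary):

    /* Create and write many small files, then report elapsed time.
       Run it in a scratch directory. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <time.h>
    #include <unistd.h>

    int main(void) {
        char name[64], payload[256];
        memset(payload, 'x', sizeof(payload));
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int i = 0; i < 10000; i++) {
            snprintf(name, sizeof(name), "obj_%05d", i);
            int fd = open(name, O_CREAT | O_WRONLY | O_TRUNC, 0644);
            if (fd < 0) { perror("open"); return 1; }
            if (write(fd, payload, sizeof(payload)) < 0) perror("write");
            close(fd);
        }
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double s = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("10000 small files in %.2fs\n", s);
        return 0;
    }

The point isn't the absolute number; it's that the same loop pays a visibly higher per-call tax on Windows.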
Isn't it a programming model thing? There's no Linux version of IOCP, and my impression is Windows apps that do use IOCP get good performance—but apps using cross-platform APIs or ported from UNIX tend not to have an IOCP-shaped IO layer and just use blocking calls, which are less optimized on Windows.
Maybe I'm confusing this with subprocess creation speed (which also hurt git back when lots of tools were shell or Perl scripts) or network performance (many years ago, my boss and I found that Apache on Linux on VMware Workstation on Windows with bridged networking performed better than Apache directly on Windows, presumably because it's using select() or something)?
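For reference, the IOCP pattern being discussed looks roughly like this (a minimal Windows-only sketch; "example.dat" is just a placeholder, and a real server would run the wait loop on a pool of worker threads):

    #include <windows.h>
    #include <stdio.h>

    int main(void) {
        HANDLE file = CreateFileA("example.dat", GENERIC_READ, FILE_SHARE_READ,
                                  NULL, OPEN_EXISTING, FILE_FLAG_OVERLAPPED, NULL);
        if (file == INVALID_HANDLE_VALUE) return 1;

        /* Create a completion port and associate the handle with it. */
        HANDLE port = CreateIoCompletionPort(file, NULL, 1 /* key */, 0);

        /* Kick off an asynchronous read; completion is queued to the port. */
        static char buf[4096];
        OVERLAPPED ov = {0};
        ReadFile(file, buf, sizeof(buf), NULL, &ov);

        /* A worker thread would normally loop here, servicing many handles. */
        DWORD bytes; ULONG_PTR key; OVERLAPPED *pov;
        if (GetQueuedCompletionStatus(port, &bytes, &key, &pov, INFINITE))
            printf("read %lu bytes asynchronously\n", (unsigned long)bytes);

        CloseHandle(port);
        CloseHandle(file);
        return 0;
    }

Blocking read()-style loops ported from UNIX never touch any of this machinery, which is consistent with them hitting the slow path.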
I read previously on HN that it has something to do with NTFS's MFT structures... particularly when a large number of small files is involved. I don't think FAT32 suffers from the same problem, but then again, it isn't widely used on desktops or servers any more.
It's certainly also ACL overkill. They look up ACL inheritance by default on every single file call, unless you turn it off in your directory. That's uncached and recursive up to the root, extremely slow.
Thanks for that info! It seems particularly noticeable in WSL when compiling large packages; all the dependency calculations are super slow compared to native Linux. I'd naively thought it had something to do with Windows Defender overhead but your explanation makes more sense.
NT has the capability to run different API "personalities". This was pretty major. Win32 was only one of the APIs NT supported. IBM also did the same with Workplace OS, but it failed.
NT could have been a lot of things, if the competitive landscape was different.
From Wikipedia:
A main design goal of NT was hardware and software portability. Various versions of NT family operating systems have been released for a variety of processor architectures, initially IA-32, MIPS, and DEC Alpha, with PowerPC, Itanium, x86-64 and ARM supported in later releases. The idea was to have a common code base with a custom Hardware Abstraction Layer (HAL) for each platform. However, support for MIPS, Alpha, and PowerPC was later dropped in Windows 2000. Broad software compatibility was achieved with support for several API "personalities", including Windows API, POSIX,[11] and OS/2 APIs[12] – the latter two were phased out starting with Windows XP.[13] Partial MS-DOS compatibility was achieved via an integrated DOS Virtual Machine – although this feature is being phased out in the x86-64 architecture.[14] NT supported per-object (file, function, and role) access control lists allowing a rich set of security permissions to be applied to systems and services.
Yup, that's what I meant by "subsystems" (which is NT's own term; not sure why Wikipedia calls them "personalities," and Linux's personality(2) syscall is much more limited).
In any case, that support is in the wrong direction. The article proposes dropping NT and moving to Linux, which doesn't have subsystem support at all. Even if it did, the existing WSL code isn't helpful.
The reason the term "personalities" was used is that this is what the industry called this capability back in the day. It was a huuuge turf war, and the capabilities enabled incredible things. It didn't really pan out, however.
Here's a dopey idea: What if Microsoft open-sourced the Windows NT kernel? (While retaining some proprietary drivers, etc..)
If handled well, a community would blossom around "NT," with multiple "unofficial" distros (maybe one with full POSIX compatibility), and lots and lots of happy developers. If things went really well, they might even achieve the resources necessary to create Windows Phone 2.0.
Microsoft could still make money from consumer/enterprise support plans (à la AppleCare), and via commissions on sales in the Windows app store. And an open-source Windows could lead to growth as a cloud OS, driving revenue for MS Azure.
Open sourcing the Windows kernel would be the ultimate culmination of Microsoft's turnaround, and IMHO a fair penance for evil deeds past. Nothing could do more to invigorate the open-source community. I would be so delighted I might even start to use Bing! It's never going to happen, but it is a very pleasant dream. Far better than "One Kernel to Rule them All!"
Amazon and Google are likely to make cloud-ready NT distros, cutting off Microsoft's last cash lifeline on rival clouds. So it would have significant costs.
On the other hand, Flash once had API dominance and lost it as they (fortunately!) ceded the ground to HTML5. MS is now extremely aware that Win32 isn't the future - they even cut the OS into pieces and reorg'd NT under us (Azure).
.NET Core is the way forward for the company. I could totally see us releasing "NT Core" without the Win32 userland, and WSL, Modern and .NET Core as the official personalities.
We could even release the shell that way. But it would pull a bunch of developers away from new scenarios to put onto an ever-shrinking desktop market.
Most likely we'll just see all new products become cross-platform and the execs will wait until a new "iPhone moment" comes along to get ahead with consumer OS.
> But it would pull a bunch of developers away from new scenarios to put onto an ever-shrinking desktop market. Most likely we'll just see all new products become cross-platform and the execs will wait until a new "iPhone moment" comes along to get ahead with consumer OS.
I'm still fascinated by that. The iPad Pro and the Chromebook have taken away a lot of Windows' traditional market. But ... you look at how people are using them, and they're using them as laptops. Why couldn't Windows take that place?
The logical answer is: because of all the legacy. Windows is too easy to break (even Microsoft can't upgrade it without breaking it), and too hard to use, and it needs major investment to fix those problems. So, why stop investing in Windows when it's the lack of investment that makes it unable to compete? I'm still struggling to understand that one.
It seems Microsoft's management has concluded they can't and won't compete after Windows 8 flopped, and they'll just stretch out the decline of Windows as long as they can and hope to catch the next wave. I'm sure Google and Apple love them for doing that, but I still don't quite understand it.
What's your logic behind that? Current phones with 10 GB of RAM and 8 cores already surpass average computers. It's only a matter of the right user interface.
It's not about the hardware (who am I kidding there, of course it is... ARM SoCs, blech). It's about the software side of the equation. Maybe Purism can pull this off, but Android definitely can't (and not for lack of trying, see Samsung DeX), and iOS is going the other way, with macOS getting more iOS-like over time.
At some point we're going to have to converge our operating systems so there's no differentiation between desktop and mobile, and they're usable by both developers and phone users.
This is commonly suggested, and I don't think it would even be a big deal for them from a business revenue standpoint. The biggest issue is that it is a 20-year-old code base with 20-year-old bugs, and open sourcing the whole thing would immediately declare open season on Windows boxes from a security standpoint.
You'll see Microsoft open source new things as they go, and probably smaller components as they rework them, but I doubt you'll ever see a truly fully open source Windows, just because of the amount of work involved.
Security researchers don't strictly need sources. An experienced reverse-engineer could read asm like you would read C. So for some people Windows has been open sourced in some way for a long time. They don't apply obfuscation techniques. I don't think that there will be a lot of bad bugs.
A very good point. But if that were the only impediment, I suspect Microsoft could come up with a rollout plan to mitigate (though not completely eliminate) this problem. Releasing a "preview" of the kernel source to trusted security researchers and academics might be a start.
Believe it or not, trusted security researchers and academics already can request parts of the Windows source code, under non disclosure agreements, of course.
But it's one of the largest codebases in the world, AFAIK, and it's immensely complicated. It's hard to quantify just how much work would need to be done to verify it was even marginally safe for release. And there are tons of licensing-related issues as well, as far as where Microsoft may have gotten some of the code inside Windows.
It's not impossible, but it's a big gamble, and Microsoft is not a company that gambles big. It's not really in their culture to do what you're suggesting. I would be thrilled if they did, but I'd be jaw-droppingly shocked if they did.
I think that underestimates how truly difficult it would be. How many machines are out there in the world still running NT4? Identifying bugs is all well and good but ensuring the fixes go where they're needed... that would be a whole lot more work.
Even shared source would be great. You can see the code, you can audit it for security or privacy issues, disable parts you don't like and distribute modified versions to people who already have a licensed copy of Windows.
That won't dry up Microsoft's revenue stream, and is a great middle ground between open and closed source.
That's been true since the early 2000s. Microsoft has been sharing the Windows source code for over a decade; I don't understand why some claim it's not available and entirely closed down.
It was 2002 when I first read some of the Windows code via official Microsoft channels (Shared Source license) to understand some of the quirky behaviors.
Can you give me a link where I can read the entire Windows 10 source code, make changes I want, and create a build, and redistribute it to people who already have a license?
I only came up with https://www.microsoft.com/en-us/sharedsource/ which has specific categories like enterprise, government security, OEMs, etc. I'm looking for something available to everyone.
There would be nothing to stop anyone else launching their own app stores, and other cloud vendors would immediately be better able to compete with Azure, so both those streams of revenue would be dramatically weakened, not strengthened.
Enterprise support contracts are mainly about OS upgrades and patches. That would probably stay the same for MS; they would dominate in the same way Red Hat dominates the Linux enterprise. I don’t think this would be strengthened in any way, though. There would be pros and cons.
I suspect not legally possible. There is a lot of code in the NT kernel from various other companies that at least at one point was under NDA, and a lot probably still is.
It was a free download for decades before being open sourced. From what I heard, it increased support contract sales quite a bit, letting companies feel that that weren't necessarily tied to Sun for Solaris, but knowing that Sun was the best outfit to support it.
> I bet no one that downloaded it has spent a dime to Sun.
You would lose that bet hard. They made their money on hardware sales and support contracts.
A family member of mine was the head of Federal Sales for Sun at the time.
> It was a free download for decades before being open sourced.
And it was always the support contracts even more than the hardware sales that kept them going. Starting to nickel and dime for software licenses too wasn't an option on the table for them.
I don't think you need an open source kernel for POSIX compatibility. AFAIR NT allows different OS "personalities" to coexist, and you could write a POSIX-compatible one as a kernel module.
But I don't think that has more interest than a Linux compatible subsystem, which they already have.
A desktop is more than just drawing pixels; the frameworks also count, and so far the GNOME and KDE APIs still fall short as full-stack frameworks across distributions.
I'd like to use linux, and install it from time to time, but am always eventually worn down by problems, some trivial & some less so, that could mostly be solved, but at the cost of research & fiddling time I'm not interested in spending.
The issues largely fall into two categories - missing software, and missing or undercooked hardware support. If Microsoft did signal to the market an increased long-term support for Linux with a concomitant warning about Windows' longevity, I suspect these issues would be mitigated. Bring it on.
I'm in the same boat and have the same thoughts. I'm essentially trapped on OSX, and have been for a decade - sure, with enough pain and effort I might be able to kinda-sorta get by on Linux, but in practice the barrier is just too high. And Windows has never really even been an option until fairly recently.
I'd love to see real competition in the OS space for people who need both a decent app ecosystem and open source developer tools.
> I'm essentially trapped on OSX, and have been for a decade
Me too until this May, when it was past time for my aging MB Pro to retire. I really couldn't justify a new one given I don't even like them any more (the Touchbar was the coup de grace).
Hence a Dell XPS 15, running Windows 10. I won't pretend the latter is anywhere near as good as OSX (less stable, less consistent UI, generally lower quality apps), but it manages all the hardware well & gets the job done. WSL makes it livable-with.
I'd prefer linux though for its window manager choices, single-rooted filesystem, lack of nags & faster file io (among other things).
> I'd love to see real competition in the OS space for people who need both a decent app ecosystem and open source developer tools.
I've never seriously considered working primarily in a VM. For performance reasons, I suppose, but given how far the VM scene has come, that may well be an out of date prejudice. Maybe Linux under Windows (it would have to be that way round) would work. Worth considering, yes.
You'll get snapshots with a VM. That's awesome. While you could set them up with Linux (not easy, AFAIK, but doable with some tinkering), for Windows it's a game changer. You could install anything and just roll back. Don't like that Windows update? Roll back.
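For example, with VirtualBox from the command line (the VM name "Win10" is whatever you called yours; any hypervisor with snapshots works the same way):

    VBoxManage snapshot "Win10" take "before-update"
    # ...install the update, see whether you like it...
    VBoxManage snapshot "Win10" restore "before-update"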
I don't think there's a developer edition of the 9570 model (there's no linux driver for the fingerprint scanner for example), and in any case we rarely get the Dell developer editions here in Australia.
I am certainly saying that a fresh install of Ubuntu (and a reasonable attempt to get over the initial issues eg. using a respin available on github) has too many problems for me to either live with or spend time investigating. Others may have more tolerance for fiddling around with OSs, but I've done my dash with that kind of stuff.
I think it's a bit contradictory to choose an insecure OS because a more secure one doesn't support a nice-to-have piece of security hardware. That aside, I'd be interested in the specifics of the "too many problems" that you anticipate on supported hardware (which includes most configurations of the XPS and Precision).
> I think it's a bit contradictory to choose an insecure OS because a more secure one doesn't support a nice-to-have piece of security hardware.
Maybe. But that has nothing to do with what I wrote.
> I'd be interested in the specifics
Would you? Or would you like to gather factoids in pursuit of your belief that everyone must make the choices you make?
> that you anticipate on supported hardware
Not "anticipate". Experienced, after installation (more than once). It was more trouble than it was worth to me.
> on supported hardware
As I say, I don't think there's a developer edition of the 9570. There isn't an extant linux driver for the fingerprint scanner, which surely there would be if Dell preinstalled linux on this model.
I don't see obvious evidence of 'curiosity' in your replies - they seem more like someone who roughly throws all statements about linux into assumed "linux rocks" vs "linux sucks" buckets. If I've misinterpreted you, I'm sorry for my part in that. You've clearly misinterpreted what I've written, and to the extent that's due to my lack of clarity, I'm also sorry about that. But in any case I'm a linux admirer - I doubt I've had a work day in the last decade that hasn't involved using linux on servers.
As I thought I had made clear, the issues I found with my laptop Ubuntu installations are probably largely fixable with reading & research. There's a fair bit of information about them in (for example) the Gentoo and Arch linux doco sites. But I'm not looking for answers because that's not how I choose to spend my time. There's enough else to deal with in life, and I am no longer an OS hobbyist who does this stuff for fun.
Have you guys tried Linux on the desktop recently?
There's a host of folks in my org running Ubuntu on Dell XPS laptops. Issues do arise, but they're pretty rare. I made the switch from Mac almost two years ago and it's been great.
Yep, on the XPS this year (the last time a new Ubuntu install a couple of weeks ago). Many problems. Some fixable, but a poorer use of my time than running or sailing or any number of three-dimensional activities I choose over screen flatland when not working.
I'm not going to pretend there aren't issues, but subjectively it feels to me like there are few. The UI is decent and usable.
For me the dealing with the few issues that arise is a better use of my time than wrestling with things like Docker on Mac, homebrew, ancient python, and weird VPN issues that seem to pop up with every upgrade.
No doubt every case is different, so there's no contradiction here. If it's efficient for you, great. Ubuntu wasn't worth it for me. The UI is fine, but there's too much that just doesn't work, or works poorly, and I'm not going to do the research involved to overcome these things.
I don't really believe Microsoft is going to do a Linux as a consumer-grade replacement for Windows, but if it did, I suspect the issues I have would go away. Hardware vendors would write drivers, missing software pieces would be filled in, etc.
[Edit: I committed the cardinal sin - or is it a common convention? - of commenting on an article before reading it. A mistake in this case - I realise now the article is just silly speculative hand-waving]
I still cannot to this day get pip working correctly on my Mac -- it is eternally damned to never run Python correctly, and if it wasn't for RVM and shell magic, the same would be true for Ruby.
I have an evaluation Dev Edition XPS 13 to try out.
So far I’ve tried it out with Ubuntu 18.04, 18.10, Fedora and Intel Clear Linux. There’s not much setup involved in getting any of these installed and working.
The only issues I've seen have been that the trackpad’s a bit jumpy, and I still don’t have a HiDPI and 1440 mix working nicely together under GNOME.
Currently I’m trying out Arch which is great as a learning tool.
I still have an old Mac for day to day work but I’ll be permanently switching to Linux soon I think.
The question is are they just "changing OS" issues or are they specific to Ubuntu/Linux distros? Changing from Win7/8 to Win10 was a massive headache (I don't use it daily, just admin for friends and family).
For Ubuntu on the XPS 15 there are a couple of specific issues. GNOME has a weird UI bug that causes input to hang. This manifests under Wayland. The fix requires an entire re-architecture of GDB, which isn't going to happen until GNOME 4.x.
Works fine on X11, which is why I think Ubuntu shifted to Wayland and then back to X11, but at this point, X11 is no closer to dying.
Nvidia hardware doesn't like to behave well in Linux. One has to get just the right driver, and the correct kernel options to prevent hangs and poor standby behavior. However, I'm not sure this is terribly different than my experience with Nvidia drivers on Windows.
Sigh. No offense but it's comments like this that remind me why no, I haven't tried desktop linux recently. No, I am not going to mess around trying KDE Plasma or SomethingElse Tumbleweed or OpenWuffe SnufflePuss XL or any other of the billion permutations of half-baked shit which might or might not work. I am mid-career, I have money, I have little time and even less interest in fucking around with any of that just to get a working computer that meets my needs.
I want a big company, who I trust will follow through, to take my money and solve my problems with The Desktop Linux™. I need office apps, I need music apps, I need image processing, I need some sort of integrated cloud offering, and I'd like a unix-like back end. I'll pay money for this. I will not pay in time. Apple does this for me currently; MS is possibly the only other company which could take on such a challenge, hence the interest in the idea.
It's sort of ironic to me that one of the main reasons I like linux on the server - which I use exclusively - is because I perceive it as "just working" in a way that MS servers never do. The situation is entirely inverted on the desktop - OSX, and to a lesser extent Windows, "just work" and the solutions to all of my problems are, at most, an install and possibly a few dollars away. Linux is very far from that currently, and until it gets a whole lot closer - and it will probably take a big company to actually do it - that door is closed to me, and people like me, for the foreseeable future.
I see from the obtuse responses you've received, there's really little point in summoning the effort to type anything besides either 'linux sux' or 'linux rocks'.
> It's sort of ironic to me that one of the main reasons I like linux on the server - which I use exclusively - is because I perceive it as "just working" in a way that MS servers never do. The situation is entirely inverted on the desktop
If "just works" is what you're going for, then install a rock-solid distro on the desktop - Debian GNU/Linux, CentOS and OpenSUSE Leap are the ones which are really in the running, IME; Ubuntu LTS is a distant possibility if commercial support or third-party applications are a priority for you - and, just as importantly, accept its limitations. No, it's not going to support the office, music or image-processing apps that you're used to. But there's plenty that it can do, that many, many users will be absolutely fine with. (The fact that so many people are fine with something as incredibly basic as ChromeOS is proof positive of this. A Linux desktop gives you plenty more than that, and you can easily get it to run on extremely cheap hardware that would not manage to run any other modern OS.)
As a mid-career DINC in a 2% income bracket married to a 1%'er, I too, have money but not much time. That is why I will never not use a Linux desktop. Everything just works on my XPS 13 arch + i3. I also own the previous gen mbpro, I think I used that last year may. Just the idea of me wasting time with osx or windows is sneeringly ludicrous.
> and it will probably take a big company to actually do it
The lack of a company controlling Linux is exactly why it's Linux. I am surprised that someone who is mid-career, has money, and little time doesn't know that.
I always saw Canonical as trying to do this, considering what pains they used to go through to not use the term 'Linux' in their marketing. They were playing things as though Ubuntu was their own platform (and in many ways, given the different ways that Ubuntu used to break compatibility with plain Debian, it was) — their own software centre, their own desktop, their own services.
Many of those seem to have gone by the wayside: Unity deprecated in favour of GNOME 3, Ubuntu One dead. It makes me wonder if Canonical found out that it just can't be done, that pissing off the people involved in the projects upon which Ubuntu depends by sacrificing some of the sense of community in exchange for the It Just Works™ magic towards which they have been working is ultimately a fool's game in the current ecosystem.
I find that a shame. I thought Ubuntu had the potential to be a sort of macOS for Linux: built upon a free base, contributing heavily to other projects from which it takes, but also adding Ubuntu-specific stuff (still open source, of course) that disrupts the Linux desktop ecosystem to make the current other players (KDE, GNOME, Xfce, etc.) feel like amateur hour. At one time, it felt like Unity could've been Ubuntu's macOS desktop and I could swear they were heavily promoting some Python API for desktop apps that might've been to Ubuntu as Cocoa is to macOS. That also seems to have changed.
> you know that everything that unity was doing can be done in KDE […]
Please re-read what I said.
I said that I was hopeful that Ubuntu would create something that would make KDE and GNOME seem like amateur hour. I never said Unity was the thing for which I was hoping, merely that it was Ubuntu's attempt (which, to my mind, was a failure), and I never said KDE or GNOME *are* amateur - although, you forced my hand with the next bit.
> […] if you take 20 minutes to configure it
This is exactly what I'm talking about as "amateur", though.
Try thinking like a regular user, who just wants everything to work properly out of the box, rather than a technical user who might be happy wasting 20 minutes on the fruitless endeavour of getting KDE to work Just Right This Time And I Swear I Won't Spend Another Couple of Hours Reconfiguring It Again Later When I'm Bored™, because that very way of thinking is exactly why the Year of Linux Desktop will never come.
At some point, my threshold for irritation to run a Linux desktop dropped low enough that I was willing to make the jump. My threshold for irritation for Mac also went up.
I was only willing to do so if I could install a stock standard distro and get to work without having deal with a bunch of configuration hassle. That's pretty much true for Ubuntu. However, there are a couple of issues that one may need to address that I didn't mind so much, but I readily admit, I'm not an average user.
I switched when Unity was still a thing and was surprised to learn that it was pretty good, with only a little refinement needed. I was pretty disappointed when Canonical dropped Unity and settled on GNOME, and here we are two years later at precisely the same place, with the aforementioned refinements still needed.
Like you, I was hopeful that Canonical would produce a generally usable Linux desktop, but that still remains to be seen.
You hit the nail on the head, I agree completely. But:
> Try thinking like a regular user, who just wants everything to work properly out of the box, rather than a technical user
I'm by any measure a technical user, with over a decade in software dev and a decade before that in network admin and corporate IT, and I absolutely insist on things working properly out of the box. Anything else is Work™ and I'm simply not going to do it for personal devices. Sure, I'll spend hours tweaking a server to get it behaving exactly the way I want. I am almost OCD about reliable, robust, version controlled environments. Just the other day I spent literally hours massaging some stupid nginx config file so it did exactly the right thing, all of the time, in exactly the right way. And all was right with the world.
My personal stuff though? I have zero, and I mean zero tolerance for any of that shit. My personal stuff better work perfectly first time, every time, or I will be taking it back for a refund or throwing it in the fucking river (regretfully this is not a metaphor). At work I am all linux and open source. At home I am the biggest Apple whore you ever met. I just want things to work.
I am also disappointed in Canonical, who I thought might do as you suggested and create "MacOS for Linux". I think such a thing is sorely needed. They haven't succeeded, though - perhaps MS could. I'd like to see that if they did it, and I'd give it a solid chance.
Of course I know that. I'm explaining why the lack of somebody capable taking responsibility for providing a "curated" linux with a world-class UI and backing it up with deep and broad app and services ecosystem means that it won't ever be acceptable to the vast majority of desktop users, including me.
Canonical has totally failed to deliver this. Maybe someone else could, maybe not. We're just speculating here.
I cannot stand Windows for 5 minutes. To me it's just a terrible, unresponsive, incoherent and unusable mess. Linux Desktop is a jewel in comparison.
I'm kind of starting to realize that small nuances in Linux are not forgiven, but they are easily overlooked in Windows.
People are just used to Windows, and its problems are overlooked.
Anyways, Desktop Linux has always been a mostly volunteer project. There are billion-dollar industries around the server, but most of the drive on the desktop is pure volunteer effort.
Considering that fact, I think it's doing amazing.
I agree, Linux desktop is much better than Windows. But I prefer Linux without a desktop environment (this is not the same as a window manager; I do use a window manager, but not a desktop environment).
Having had a decent go at all three major OSes (windows, osx and Linux) in the past few years, I've found all of them to have a whole bunch of annoyances that stop me being able to do what I want, how I want it.
Absolutely. I've been f/t on each for varying amounts of time over the years, and have found each wanting. There really is no good desktop OS in 2018. I have found OS X the best for my purposes, but it exclusively runs on hardware that doesn't interest me, and only gets worse over time.
Definitely. It's more a matter of choosing the one that sucks the least for your needs than anything else. And since they all seem to be drifting farther and farther away from what I actually want in a personal computer I've started entertaining the idea of making my own (not a new kernel, just most of the stuff on top of it, a new "operating environment"). But it's a ton of work and I'm not really qualified to do most of it, plus I have other projects. It would be great if there was a community with a similar conception I could work with, but I haven't found one yet.
Not quite. Those are the kind of people who would write their own kernel for fun. The kind of community I'm looking for would be focused on building a practical personal computing platform.
> The issues largely fall into two categories - missing software, and missing or undercooked hardware support.
I've been trying every Fedora release for about two years, with a view to switching (back) from macOS. Everybody has their priorities, but Fedora is now there on software and features for my uses. If I switch back now, I'll lose the official Google Drive client, but that's it.
In the past, I've run Linux desktops without hardware issues by buying slightly older ex-corporate laptops with Intel CPUs and graphics. Today, there seem to be a bunch of vendors offering Linux preloaded, so I don't expect this to be a big problem.
Yep. My problem is always the same - lack of documentation and transparency for the desktop side of things.
Example - I installed the latest Ubuntu on my laptop. A few days later, I needed to share my internet connection from wifi to ethernet. Ok, no problem, I'll just google it. The answer says it's as easy as anything: you just open the connection manager, tick two boxes, and done. Except... on the latest version of Ubuntu, the connection manager looks nothing like the one shown in the screenshots, and doesn't have that option. Ok, so I start digging (and it's not easy, because almost every result on google shows me something that doesn't exist anymore). Finally, I find a command to open the old network manager, because of course it still exists - and voila, it works straight away.

Except that I then wanted to set up some filtering where only certain ports would be allowed through... no issue, I'll just use iptables. Fine. But since the last time I used Linux, someone had the moronic idea to change the interface names from easily understandable ones (eth0 for ethernet, wlan0 for wifi) so they are now all something like eb239xsd83d, because of course that's easier to remember and type. And there is no way to instinctively tell what is ethernet and what is wifi anymore. Lovely.
Like, this is all extremely minor and easy to fix with some googling, but it feels like every damn time I want to do anything on Linux, I have to use the terminal. Ughhhh.
I'm sure you don't care, but interface names were redone to be stable: if you have two ethernet cards, or you replace one with another, you don't end up in a situation where `eth0` today is `eth1` tomorrow just because of random timing during init.
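If you'd rather have the old names back, predictable naming can be turned off on the kernel command line; on a GRUB-based Debian/Ubuntu-style system that looks like:

    # /etc/default/grub
    GRUB_CMDLINE_LINUX="net.ifnames=0 biosdevname=0"
    # then: sudo update-grub && reboot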
I suggested on Mini-Microsoft about 10 years ago that Microsoft do an 'Apple'. Apple took a free OS from BSD and put their own GUI over the top of it. I suggested that MSFT do the same sort of thing, that being to put their Windows GUI on top of the Linux OS instead of using the Linux X11 GUIs.
This had the advantage that the underlying OS base would no longer need maintaining by Microsoft, and all their coding efforts could be allocated to the Windows GUI, thus permitting Windows versions to be released far more often than every 5 years as was the case back then.
Being full of Softies, the crowd on Mini-Microsoft dismissed my suggestion out of hand, it being sacrilege of course to even think of marrying the 'upstart' Linux with the 'sacred' Windows code.
Naturally, today I am smiling to myself that MSFT could have taken my idea and run with it, but were too hidebound to do so and have lost 10 years in the process.
Apple only needed to support their own hardware. Microsoft would need to make Linux work wherever NT works, and would need to figure out a solution for third-party closed-source NT drivers.
Besides, in a sense, Microsoft did this when they moved the desktop product line to NT. It was a similar compatibility break, and a similar improvement in stability and switch to a robust, multiuser, preemptively-multitasking kernel, as the move from classic Mac OS to XNU; it's just that the kernel Microsoft already had in house was NT, and the one Apple/NeXT had was XNU.
A shortcut (perhaps the wrong word here) would be to migrate the Xbox to Xbox libraries on top of a Linux core. This would minimize the hardware for them to start with.
After a couple years, they could make Windows server run on a Linux core, and then another couple years have Windows run on Linux.
The whole progression could be pretty smooth, covering several years and taking a milestone-driven approach.
With the Surface, I'm sure they could use that to their advantage and have more control over hardware.
Perhaps the best impetus to do this would be if they ever really wanted to migrate to ARM and reduce their reliance on the Intel platform.
The hardware piece is important, but definitely controllable. Just like you mentioned NT, but this time with a smaller set of vendors to work with.
There would be harmonization with the rest of the computing world (everything else is effectively a Unix variant), as well as lower dev costs (kernel dev cost would now be shared with everyone else).
Nadella is a smart guy and has moved MSFT in the right direction. If this truly makes sense, then he’ll probably do it.
For the longest time I assumed MS would buy Ubuntu.
I'm not a .NET dev. I'm currently running Pop!_OS on my XPS 13, and VS Code is my primary editor for all things. If MS released a desktop that looked this good and came batteries-included for their tool chains and services, I'd hop in a heartbeat.
My only guess is that doing so would lower consumer confidence in the corporate space? I know Chromebooks are making a push there, and the psychological factor of "it's not Windows" does seem to be a competitive advantage for MS (for now).
That would have been the end of me using computers, right then and there. An operating system that updates or upgrades more than about once per decade is moving entirely too fast; 5 years is about the fastest rate that even approaches being reasonable. The fact that Windows has always bent over backwards to provide backward compatibility would be totally invalidated by shifting toward Linux. I run a bunch of DOS programs in Win 7 still, without needing to go through DOSBox, and that's not counting my collection of 3.1 and 9x programs which I actively use.
So yes, giving up entirely on the good things that Windows does without respect to the GUI and accepting the practices of Linux/Unix devs who have little or no respect for the past would be nothing short of a catastrophic mistake. Your idea would likely have killed computing forever, in my view, especially enterprise or professional-use computing, and within the first two rapid releases that broke necessary functions. It's the so-called 'Softies' who keep the world running and effective for normal people who don't want to have to figure out how to compile a program just to turn on their computer and do the things they need.
And it would have killed home use as well - we don't need network workstations at home with limited control, we need computers that we have full control over, all the time.
> And it would have killed home use as well - we don't need network workstations at home with limited control, we need computers that we have full control over, all the time.
And then you have iOS, OSX, ChromeOS, & Android, which all upgrade annually. I'm sure you'll be happy with Windows 10, though, considering it's the last official Windows release. Really it's not, though; it's just rolling releases in the background.
I'm afraid you have to realize you're in the tail end of the tech adoption cycle[1], either in the late majority or bordering on laggard.
I am quite proudly a lag-behind when it comes to tech within the last decade. The rest of you can go on ahead and suffer all the horrible bugs that would have been found in QA testing, if the testers had been given adequate time and resources. I'll happily arrive after you're done and use the finished product.
Also, I've never updated Android OS on a device - every phone and/or tablet I have, has the version of Android it shipped with; whatever that might be. The same goes for iOS, mostly. My old iPad, I knew better than to update the OS. The newer one, I've made the big mistake of upgrading to iOS 12. I'm sorely tempted at the moment to make a list of programs on the device and reset it to get it back to decency.
My feeling is, people are allowing the tech companies to release buggy software, they're still buying it, they're still paying for hardware using it and supporting bug-ridden versions. We need to shift the mainstream - and the business world - back to demanding software that works and keeps on working reliably for the life of the business.
What's a security update? I don't expect them to be responsible for my security - that's my burden as the user to make sure that I'm using software and files I personally trust.
That seems a pretty illogical position; it relies on developers being perfect at security. It doesn't matter how much you trust someone, you must realise that people are fallible; that, in context, means that hardware and software will have [security] bugs/feature oversights.
If/When you discover a bug, and the developer has a fix out as a security update - e.g. Heartbleed - then would you really not update, because you trusted that developer before? Surely the fact that they've issued a patch that they suggest applying means, if you trust them, that you should apply the patch?
Linux (the kernel) is an extreme outlier in this case because Linus actually cares deeply about compatibility. You don't even need to recompile statically linked old Linux software to have it work in most cases.
But otherwise UNIX's compatibility story is pretty damned atrocious because part of its culture is relying on everybody to recompile everything.
That sounds weird to me. I used to do Solaris sysadmin work from (roughly) the early 2000s. That also had a strong backwards compatibility attitude, so you should (and generally could) run old Solaris binaries on much newer releases of the OS.
Which Unixes had this "pretty damned atrocious" backwards compatibility approach?
Having to "recompile" it means that it's not unmodified, though. Unmodified would mean that it runs like Windows programs where the same binary files work now as did then.
Yeah, that's how I took it too. eg being able to run old binaries on newer versions of an OS.
My experience across Solaris and Linux has led me to believe it's been a strong *nix thing too, not just a Windows approach. I used to use FreeBSD a lot too, but that was years ago and I don't have clear memories of its compatibility approach. :/
Which *nixes are you thinking of, which don't offer strong backwards compatibility for compiled binaries?
There are no current OSes which upgrade once per decade or 5 years, sadly. Sure Win 7 and 8.1 are there, but on their way out. Everything else is updated much more frequently. I suppose MacOS is the only remotely reasonable alternative once Win 7 and 8.1 are past EOL.
There are plenty of Linux distros that are supported for upwards of 10 years. Debian and CentOS come to mind but Ubuntu 18.04 was also recently announced to have support until 2028.
While Linux is faster and more scalable, using BSD like Apple did has many advantages - especially when there are thousands of closed-source Windows drivers which will never be released as open source. The other big hurdle is the sad state of Linux graphical facilities in the kernel. For Apple, this was never a problem due to limited hardware choices.
Well, if I were them I'd implement my own stable driver ABI, so companies wouldn't be forced to open-source their drivers (like it really makes a difference anyway with all the firmware blobs) and you wouldn't have to recompile every single driver for every single kernel revision. I'd also move more stuff into user space like they did in Windows, which is why a video or sound driver crash usually just wigs out the screen for a second instead of crashing the whole system.
Given that, maybe it would also be possible to implement a compatibility layer so some amount of current Windows drivers could be used anyway. I don't know if that's feasible, but if it is they're in a position to pull it off.
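For illustration, a stable driver ABI usually amounts to something like this hand-wavy C sketch: a versioned table of entry points that the kernel promises not to reshuffle, so a binary driver built against v1 keeps loading on later kernels. All names here are invented; Linux has deliberately never offered such a thing:

    #include <stdint.h>

    #define DRV_ABI_VERSION 1

    struct driver_v1 {
        uint32_t abi_version;            /* must equal DRV_ABI_VERSION */
        const char *name;
        int  (*probe)(void *device);     /* device handle stays opaque */
        void (*remove)(void *device);
        long (*ioctl)(void *device, unsigned cmd, unsigned long arg);
        /* new entry points may only be appended, never reordered */
    };

    /* A binary-only driver would export exactly one of these: */
    static int  null_probe(void *d)  { (void)d; return 0; }
    static void null_remove(void *d) { (void)d; }
    static long null_ioctl(void *d, unsigned c, unsigned long a) {
        (void)d; (void)c; (void)a; return -1;
    }

    const struct driver_v1 example_driver = {
        DRV_ABI_VERSION, "example", null_probe, null_remove, null_ioctl,
    };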
There is a project called NDISWrapper which attempts to do that. It used to be used quite a bit for WiFi drivers before Linux had decent WiFi drivers for most chipsets.
Not likely to happen - not very long ago, MS likened GPL to a virus. Since then, they have changed a lot, but fundamentally, they need to protect vendors interests, IP, profitability - not appease some open source proponents.
Never too late for them to make a go for it, even today.
But, to avoid alienating their huge existing developer community, they would first do well to get behind Mono [0] and maybe even, make it official (edit: Mono is "Sponsored by Microsoft", so already quite official indeed). In fact, as of 2014, there have been very promising signs in that direction. [1]
Mono isn't sponsored by Microsoft; they bought it. They also publish an insane number of things completely open source, developing them in the open. .NET Core being a huge one, but ASP.NET, VSCode, ChakraCore, TypeScript, and so many others. There are 2181 repositories in the MS Github account.
Among all the other reasons why not, I think GPL would pose a hurdle. (And I wonder if that influenced Apple's choice of BSD vs GNU - though I guess GNU didn't have a popular OS back in '95.)
Not derived at all. The design principles and operation are similar due to a few of the same people working on both. Would you say Linux is derived from Unix? I guess many would, but personally: I am working on a new trading system; would it be derived from my previous work? I would say this one is brand new, but shares common concepts due to the requirement of being a trading system. Doing things similarly because they work well in practice is not derivation to me.
I agree with the skepticism here regarding the ease of getting Windows software working on Linux. The Wine and CrossOver folks are doing the Lord's work but even they have their limits.
Linux software tends not to rely on much. Linux makes it really hard to rely on much. The internals change often, and the kernel is not nice to people who try to rely on implementation details, at least those you can reach from usermode. You can hardly rely on libc when you are on Linux, and many try not to, to be more portable.
Windows software, on the other hand, is wild. Just look at the myriad of techniques used by anti-debugging and anti-reverse-engineering tools. A Linux binary wouldn't dream of reimplementing the runtime linker itself, but that's exactly what many packers do on Windows, to obscure the import address table and make patching/debugging harder. Did you know you can write into another process's address space with WriteProcessMemory? Why do we even have that lever!?
That's only considering usermode. But apps are just as eager to rely on kernel mode implementation details too, in the past it was even common to patch the SSDT to modify syscall behavior. Anti-cheat in video games can still do evil things even on Windows 10; nProtect GameGuard's kernel module seems to hide its usermode processes somehow. I'm pretty sure Linux kernel modules can't easily do that.
I'd love better Windows compatibility on Linux. Heck, I'm excited by what Valve is doing with Proton too. But in the end, I think much of the Windows software library is just too deeply ingrained in the Windows legacy.
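(To make that WriteProcessMemory "lever" concrete, this is roughly all it takes to patch another process, given sufficient access. Minimal Windows-only sketch; the PID and address are made up:)

    #include <windows.h>
    #include <stdio.h>

    int main(void) {
        /* Open a target process for writing (1234 is a made-up PID). */
        HANDLE h = OpenProcess(PROCESS_VM_WRITE | PROCESS_VM_OPERATION,
                               FALSE, 1234);
        if (!h) { printf("OpenProcess failed: %lu\n", GetLastError()); return 1; }

        unsigned char patch[] = { 0x90, 0x90 };  /* two NOPs */
        SIZE_T written = 0;
        void *addr = (void *)0x401000;           /* example address */
        if (WriteProcessMemory(h, addr, patch, sizeof(patch), &written))
            printf("patched %zu bytes\n", (size_t)written);

        CloseHandle(h);
        return 0;
    }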
Wine is a lot different from virtualization. It’s likely that if Microsoft did release a Windows-compatible Linux (or some microkernel thing), they would virtualize the SxS assemblies and kernel for each application to help with security and compatibility.
They gave a demo called MinWin years ago that more or less did the same thing.
That would be a very neat approach. I had not heard of MinWin until today and now I'm pretty curious exactly what it entails and what the implications of it are.
It’s actually difficult to piece together what MinWin was exactly. Some sources say it used virtualization, other source say it was an extremely limited NT kernel. I suspect it was a little of both.
> I'm pretty sure Linux kernel modules can't easily do that.
They can, and rootkits in the past (unsurprisingly) have. But that's just technical ability; users willing to run things like that is a whole different matter, especially when we can just namespace and/or sandbox apps.
I'm not so surprised that rootkits can exist on Linux, but I was under the impression the Linux kernel made it much harder to break out of the framework provided and patch over whatever you wanted. Sure, it's all Ring 0, but there's kASLR and you have limited visibility, maybe not even a map of symbols. Is it still possible to do this kind of attack today?
Everything running in kernel space is fully trusted (talking about code compiled and linked against the kernel, not JITted code running in a VM). The symbol map may or may not be built and/or exported during the build (both default to true on Debian/Ubuntu setups).
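You can check what a given box exposes without any trickery; with kptr_restrict set, unprivileged reads of the symbol map come back with zeroed addresses:

    cat /proc/sys/kernel/kptr_restrict
    sudo grep sys_call_table /proc/kallsyms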
The Windows NT kernel has better hardware support and Microsoft is showing that they can run a full Linux userland on top of it. What do they gain from shipping Linux for the desktop?
WSL is a slow joke compared to native GNU/Linux. Also, you missed that WSL doesn't support GUI apps.
However the inverse, Wine, works very well, and sometimes it's faster than native Windows.
Just a passing note: if WSL can run programs with GUI libs, those programs can likely draw to a Windows-native graphical server. (I've briefly used this technique with coLinux, the userspace Linux-for-Windows.)
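Roughly like this, assuming an X server such as VcXsrv or Xming is listening on the Windows side:

    export DISPLAY=localhost:0
    xeyes &   # or any other X client installed inside WSL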
When one of these companies realizes they can create a polished Linux distribution, using the absurd amount of funds they have, the future of desktop computing is theirs.
And the funny thing is it is probably less work than maintaining their current solution.
Also, I don’t think it has to be an either-or situation. There is nothing stopping MS from supporting NT systems while putting future development into Linux systems, similar to how they’re doing .NET and .NET Core. Over time they can port their office suite and other programs and drivers to the Linux system, and the users will come with them.
And at this point they’ve already started to drive developers away from NT because of their Linux offerings. Let’s be honest, most developers would pick Unix in a heartbeat over NT (to develop in and deploy to) and now they have that option.
Also if anyone with power at Microsoft is reading this and this is a path the company ever takes: don’t make a ChromeOS, make an Ubuntu.
While desktop platform control is important and probably has a positive ROI, the article argues that desktop OSs are already "good enough", and that Microsoft can probably find more profitable alternatives for their engineering effort.
This is the same motive behind the switch to Chromium, that browsers are "good enough" and that the marginal gains of platform independence aren't worth the heavy investment.
I get a little uncomfortable when people talk about "engineering effort" like it's just some sort of power stream you can repoint, because to me it betrays a lack of understanding of what programmers do and the challenges they face. (I'm not accusing you personally of that; it's just what pops into mind when people talk about refocusing engineers. I always hear inexperienced project managers talk about having engineers "swarm" a problem, which is a dead giveaway that they're extremely naive.) You can't just take a bunch of kernel devs and all of a sudden have them work on SharePoint or web frontends or something. Not just from an experience standpoint, but a lot of them wouldn't want to do it. Really, "refocusing" tends to involve layoffs and rehiring.
I honestly don't think Microsoft cares about the Windows Desktop platform anymore. Certainly their actions regarding Windows 10 don't suggest that they do.
My bet is that if they release a Linux distro, it will be Microsoft's take on ChromeOS, tied to Azure and Office 365 and all that, with any other stuff shunted off somewhere in a container or VM. Oh, you can install your own software from other sources at will, you'll just have to compile it yourself or rely on a network of repos and maintainers. It'll be the death of everything I actually like about computing and I'm not looking forward to it.
It doesn't. Replacing the kernel doesn't, but deprecating the entire OS and switching to GNU/Linux + Wine (which means MS would start contributing to Wine) does. If this happens we could get a heavily improved Wine, plus vendors like Adobe releasing Linux-native versions of their apps.
I still don't get it. There's nothing wrong with the kernel; the crap is the stuff they've put on top of the kernel, just like it is in Linux. You're giving up the best part of the OS and replacing the worst part with a compatibility layer. Why not keep the NT kernel and use a compatibility layer? Since MS has all the code and it wouldn't have to be open source, that'd be much easier.
I could not possibly disagree more strongly. Linux's kernel has some problems but it is mostly ok. Userspace is where everything has gone horribly horribly wrong in Linux.
E.g. what? As for the kernel, I personally dislike the Linux kernel-driver separation model (I believe these should be made more separate, like in Windows), but other people (kernel hackers, probably) have mentioned that the VMS genes in the NT kernel make it interesting in more ways. But what's wrong with the GNU/Linux userspace? I can't name a single problem with it.
Arguably Windows was much more of a disaster 10+ and 20+ years ago, in the XP and 95 eras. I have trouble seeing Microsoft give up something they've invested 30-ish years in without a really obvious reason.
At least it didn't spend hours applying mandatory updates only to stop working completely, back in the XP and 95 eras. When I was using Windows XP I ran Windows Update once every few months and everything worked perfectly.
No. They just plan to use developer tools (a Chromium-based browser + Visual Studio) to push their one-push publish-to-cloud.
Embrace (Chromium), extend (dev-tool features that only work with Visual Studio but are awesome), extinguish (the tools now only work with Visual Studio when serving from Azure, because they are moving faster than open standards).
I don't think this is happening; Visual Studio is basically a legacy product at this point. Still developed and maintained, but not a major focus for MS. They want to push Azure adoption as much as possible, so I don't think we'll see the kind of IDE lock-in you're talking about.
They could even use WSL to put Linux on top of the NT kernel. It'd be called Microsoft GNU/Windows.
Or they could ditch the GNU part and go for a non-GNU libc and userland like the article suggests. That wouldn't save nearly as much money as the article speculates because the Windows userland is humongous - NT is a small part of the whole, much like the Linux kernel is a vanishingly small part of the distributed effort of building desktop Linux distros.
I regularly develop Haskell, Rust, Node, etc. on WSL. It's dramatically improved since the early days, and that was already pretty magical.
I find it funny you mention "stock Linux binaries": the distributions available on top of WSL are stock Linux distributions. Ubuntu, Debian, SUSE, etc. There aren't any non-stock Linux binaries involved.
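You can even verify this from inside WSL. Here's a quick sketch of my own (assuming a WSL distro such as Ubuntu is installed): the image identifies itself exactly as the stock release does, because it is the stock release.

```c
/* A sketch of my own, not Microsoft tooling: read the standard
 * distro-identification file from inside a WSL distribution. */
#include <stdio.h>

int main(void) {
    /* /etc/os-release is the conventional distro ID file on Linux */
    FILE *f = fopen("/etc/os-release", "r");
    if (!f) {
        perror("fopen");
        return 1;
    }
    char line[256];
    while (fgets(line, sizeof line, f))
        fputs(line, stdout);   /* e.g. NAME="Ubuntu", VERSION_ID="18.04" */
    fclose(f);
    return 0;
}
```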
It is surprising how well it works. I tried a massive compiled Prolog/Tk application that has been developed since the '90s. It worked perfectly without much effort.
If two of the biggest Linux distributors get owned by giants whose intention is to save themselves rather than the market, it's going to be a bad ride.
What other commercially supported option do we have for those concerned?
Back when Windows 8 was causing all sorts of problems I did my annual top ten list of what's coming in the following year. One of my predictions was that Microsoft would move Windows to using Linux as the back end.
Overall Windows would become much more stable, they might be able to get some Mac users to switch, and they could put more people on getting the UI right. Like a lot of my bolder predictions on each year's list, I got it way wrong.
In my defense, I didn't know anyone at Microsoft at the time to run my idea by. I still think it makes some sense, though; in a way they ended up giving Windows a *nix command line as a gift to Mac developers. I'm happy with Windows 10; for me it's just good enough.
Why would you think they would do this? NT is the architecturally more modern kernel. People should read up on the history of the NT kernel [1]. And if they should do this for non-technical reasons, I think it is more likely that they go for some permissively-licensed kernel (e.g. FreeBSD) to avoid the GPL.
In the end, Microsoft did exactly the opposite: make it possible to run Linux binaries on Windows. Windows Subsystem for Linux is a testament to the strengths of the NT kernel. NT supports multiple 'personalities' and Win32 is just one personality, the Linux ABI is another personality. One could say that the Linux ABI is almost as native to the core NT kernel as the Win32 ABI is.
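To make the "personality" point concrete, here's a minimal sketch of my own (illustrative, not Microsoft code): a program that speaks only the Linux syscall ABI. Built once on Linux, the same unmodified ELF binary also runs under WSL1, where the Linux personality services its syscalls on top of NT.

```c
/* Minimal illustration (my own sketch, not Microsoft code): this program
 * uses only the Linux syscall ABI. Build it once on Linux with
 * `gcc demo.c -o demo`; the same unmodified binary also runs under WSL1,
 * where the pico provider (lxcore.sys) services the Linux syscalls. */
#include <stdio.h>
#include <sys/utsname.h>
#include <unistd.h>

int main(void) {
    struct utsname u;
    if (uname(&u) != 0) {   /* Linux uname(2) */
        perror("uname");
        return 1;
    }
    /* Under WSL1 the release string typically ends in "-Microsoft",
     * so the NT-backed personality is visible from user space. */
    printf("%s %s, pid %d\n", u.sysname, u.release, (int)getpid());
    return 0;
}
```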
At any rate, I think that the Windows desktop is getting less and less important to them. So rather than going through the massive work of porting Window on top of the Linux kernel, I think it is more likely that they put Windows in maintenance mode and make/fork something along the lines of ChromeOS or Android for client systems.
(Disclaimer: I haven't seriously used Windows since Windows 3.1. It just bothers me when people misrepresent Windows or the NT kernel.)
I've thought the same thing for a long time (10+ years, so I must be wrong). It makes sense from a business perspective as well. I still think it will happen, just not sure when. The addition of the Ubuntu shell was a pretty interesting move.
Well, Steam [1] is probably a good measure of "desktop", as it's usually home users who install it on their personal computers?
So, by that measure Linux has ¼ of the users of MacOS.
The significance of that for you is not my call though.
Steam's top seller in 2017 was PUBG at $600 million, so assuming an equal spread of revenue, that 0.8% represents $4.8M you're leaving on the table. With less competition on Linux, there's possibly higher revenue to be gained there for AAA titles? It might be worth risking a bit of dicking around with Proton/WINE to get your game working for a few extra mega-dollars?
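For what it's worth, the arithmetic holds under those assumptions (revenue spread evenly across platforms); whether the inputs are right is another matter.

```c
/* Back-of-envelope check of the figures above; the inputs are this
 * thread's assumptions, not audited data. */
#include <stdio.h>

int main(void) {
    double pubg_revenue = 600e6;  /* assumed 2017 PUBG revenue, USD */
    double linux_share  = 0.008;  /* assumed Linux share of Steam   */
    printf("Linux slice: $%.1fM\n", pubg_revenue * linux_share / 1e6);
    /* prints: Linux slice: $4.8M */
    return 0;
}
```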
NetMarketShare.com gives similar ratios for Linux : Mac on desktop, giving Linux 2% of desktop/laptop installs.
But given the cost, I would assume it is not. Take your PUBG example: it was known for bad optimization and vulnerability to hacking from the beginning. The developer could optimize for Windows and benefit 96% or more of their customers, while it's hard to justify doing the same for the 0.8% user base when the spend is probably about the same. So no, I don't think that $4.8M is easy money.
And for smaller, or even moderately successful, titles a Linux version makes even less sense.
Microsoft has always been a business, not IT. In fact, underlying the whole IT revolution, they were the key to turning it into a business, to the horror of the whole industry. Software as a business wasn't even a model for IBM; we had SHARE/GUIDE back then, and even now you can run an IBM OS on a PC.
Hence the question: what is the business model? It has nothing to do with technology. It is as the Godfather taught: nothing personal, just business.
The article mentions Azure; there's also Xbox. On the PC side, I'd guess the old business model where they do the OS and OEMs do the hardware (vs. the iPhone and Mac mode of integrated business, all the way down to making Apple's own CPUs), a rental model...
Hence the question is not whether they can, but how.
It is all business to Microsoft and Google. It was a bit different on Apple's side in the past...
That is why we should be worried. The old embrace, extend, and extinguish is the logical path for every business and empire.
Anyway, they are moving, picking things up here and there, like GitHub. But would they, as a business, worry about the Java and Android model?
If Microsoft had an OS based on Linux, with a good UI and MS support, that would rock! Today it's not about the OS but about cloud services and the software users actually use: email, calendar, something more valuable... Anyway, Linux is about free configuration, about having an API for anything you wish, and that matters for a user (or heavy user). Apple has already done this in some way, I think: you can't change Apple's binary code, but you have APIs to change the behavior of macOS in the real world. It's a complicated story anyway.
If it happens, it would be the most jaw dropping event of the decade.
But MS has room to keep their own ecosystem: losing it would mean less diversity, as we saw when they dropped Edge's engine; not to mention the world would come to a halt without a Windows that can run apps from the last 20 years.
Having a UNIX kernel at the core would be really nice and would make it easy for developers to port apps between OSes, but how would the world deal with all the irreplaceable apps on Windows?
I think that such a product, even if developed internally, would be stopped by the business team. The reason being that it would be better than Windows in many aspects.
I'm a long-time Linux user, but I got an ultraportable lately and kept Windows on a small partition (mostly for BIOS updates).
Linux was installed without any issues; everything except the fingerprint reader works. The interesting thing is that it doesn't just work, it works better. The most pronounced difference is the touchpad. On Windows it frequently gets stuck (and it's not palm rejection, because I never had that issue on Linux). The three-finger click that means paste or _open in new tab_ on Linux opens Cortana on Windows. On Linux the battery lasts longer. On Windows, the fan starts spinning for no reason while the process manager says _no process is running_. Very annoying. Windows constantly nags me because I used my Skype account, which, according to them, does not have an associated email. Applications go into full screen without any indication of how to close them. I have to search for software on the internet, via my browser. The other day I wanted to start Windows for a Lync meeting and it decided it had to install updates, restarting a couple of times before allowing me to continue with my life.
Perhaps the most infuriating thing was the Candy Crush tiles that greeted me when I first booted the computer.
I'm not saying you should use Linux, but you should ask for better Windows.
This [0] is a link to a portion of 2018 Bryan Lunduke's presentation where he gives a refresher on Microsoft + Linux, and goes on to discuss Microsoft's "Embrace, Extend, Extinguish" tactics and how they're relevant here.
Does this even make sense unless Office gets ported to Linux? Given that Office 365 is turning into Microsoft's main source of desktop revenue, porting Office to Linux is what would allow Microsoft to build an entire Linux desktop experience that still generates revenue for the company.
They could. However, if I remember the GPL correctly, they would need to open-source any changes they made to the kernel itself, though the rest of what they ship alongside it would count as mere aggregation and could stay proprietary.
But also, I think it would be a daring and interesting business move. I'd probably start using Windows more if they did, and usage is the ultimate prize these days.
Business-wise, I think yes: over 10-15 years it would be a good plan and a great switcheroo. The reality, though, is that there is so much Windows technical debt, so many Microsoft employees looking for a promotion, not to mention unknown future technological advances and trends, that it will never happen. It may be possible... but who cares (see above ^)?
If I were Microsoft I'd rewrite Windows on top of Linux, but keep the API for apps the same, before it's too late.
Mac is used by most tech companies because of the terminal and the tools it can run. But Linux can run all of them, so a Windows built on Linux would have the best of both worlds.
They could and they should. Linux is an amazing OS and to have a great UI on top of it that millions of people are comfortable with would be tremendous.
Think about it. The year of Linux on the desktop brought to you by Microsoft.
System76 borrowed heavily from elementary OS to create Pop!_OS, which is a dream. Now that I'm saying it, I hope they just swallow a little pride and become a major contributor to Pop!_OS.
Every devops and net admin I know would wiggle with joy to be issued a Dell XPS 15 + Pop!_OS, tuned with PowerShell and the other MS toolchains fully supported.
People say this, but I've tried many times to add themes, make tweaks, etc. Can we give the 'distro' makers a little credit? I couldn't make Pop!_OS if you gave me 10 years... I know, because I've been trying for 15.