Having a package manager is one thing; having a good curated repo of packages is another. I doubt MS will have the balls to ban all their "partners" (most of the AV vendors, Oracle, ...) whose entire business model is based on crapware or bait & switch from the repo.
Currently it seems that both the apt and Chocolatey versions of Java actually come without crapware. I have a feeling that if this kind of install becomes the default then the crapware will be bundled there too.
Microsoft Security Essentials is actively trying to put them out of business. I had the impression that it was the OEMs who liked bundling terrible software.
I was looking at tablet PCs and they have some marketing term for "verified by Microsoft not to contain preinstalled crapware". Android doesn't have that.
Another benefit of the Nexus program is that it's trivial to achieve root, so you can easily get to the point of removing even the small amount of bundled crapware.
Microsoft Security Essentials hasn't been testing as well as competitors' AVs for a few years now. Also, MSE has been deprecated and replaced by Windows Defender, even though it seems that only the name has changed.
Their detection rates have been pretty low lately. http://www.av-test.org/en/ I've been using Bitdefender at home. It's kinda annoying but seems to work fine.
As someone who works in the information security space, I've seen this touted over and over again, and repeated ad nauseam in the media. However, the only place I see these terrible results is reports from AV Test specifically. And it bothers me, because I can't see why they would rate it a flat zero, when it certainly did not score a zero in their testing.
It feels to me like AV Test is being dishonest with their scoring. I don't know if they have any stake in who buys what product, and I certainly don't want to claim they are biased since I have no proof of that, but given the way MSE/Defender has been vilified in the media following AV Test's results, I do have to say that rating the product as a zero when it did not score zero feels incredibly dishonest to me.
Zero should be the security you have with no AV, and MSE rates 75%-80% in their testing.
It doesn't get a zero exactly; it says it's a "baseline". Sorry if I linked to a questionable source, but it's not the only place I've heard this from. AV Comparatives does the same (PDF) http://www.av-comparatives.org/wp-content/uploads/2014/10/av... Under the chart on page 3, it notes that MSE scored 80% (which is lower than the others) but that it's not "competitive".
For Protection Score, it's rated 0.0/6.0. For something that's 80% accurate, included with every computer by default (and turned on by default) and constantly updated, it's being done a massive disservice. Yeah there are better free AVs out there, but given the choice between nothing and MSE/Defender, I'd really rather everyone use MSE. There's a long way between 80% and 100%, but an even longer way between 80% and 0%. Especially when the AV is no longer even the most important part of PC security. That starts at the browser (every browser) and continues into UAC (on every OS) and things like SmartScreen verifying the reputation of executables (not just scanning for malware).
I wasn't meaning to call you out on your source, just the message that this test seems to give off ("MSE is worse than nothing").
OneGet comes with Chocolatey as a repository pre-installed for now - not sure if it will when Windows 10 releases, but I'm sure the repository won't be hard to install.
Sorry, penguin lovers — if you thought that 2015, in the heinous wake of Windows 8, would finally be the year of desktop Linux, you were sadly mistaken.
If you’ve ever ventured into the dark and mysterious land of Linutopia, where Ubutologists and Debianites reign, ...
Wow what a snarky article! I guess Extremetech does not like Linux at all...
To be honest, it was kinda funny. I don't see why we can't find it humorous that year after year someone says that the next year will be the year of Linux.
I love Linux - been using it for several years and wouldn't trade it for anything (be it OS X or Win with package manager).
A package manager is certainly an awesome thing, but not something that would make me (or probably anyone else using Linux) switch over.
Yesterday I had to update my rarely used Windows partition and install some software for work.
Virus/Firewall updates, Windows updates, Java updates, repeated forced reboots (four times in total) and playing hunt and peck with all these 'suspicious behaviour' popups, remembering to untick the 'install dodgy toolbar' checkboxes.
I'm starting to feel that a decent package manager is the main reason I continue to use Linux.
The forced reboots thing in Windows is nuts. I installed from Windows 8 media a while back, and to get from that to an up-to-date installation of Windows 8.1 requires a seemingly endless loop of "Check for updates. Install. Reboot. Check for more updates that didn't show up last time!"
It's like nobody has told MS about cumulative updates. You can install OS 10.9.0 and go straight through to 10.9.5 with a single update.
Maybe this comes from their "enterprisey" system of releasing everything as individual little patches so that IT can decide what to install and what to skip, but it's a nightmare for normal users.
All Linux distros require a restart for kernel upgrades if you want to use that new kernel version. Or are you saying that Ubuntu actually forces the user to restart after a new kernel is installed?
You're still using third-party software for that? There's part of your problem! Getting rid of those would also kill off the "suspicious behavior" popups.
I'm an equal-opportunities OS user. I have two Linux boxes (Slackware!), two Windows boxes, and an OS X laptop within arm's reach of my desk. I was just poking fun at the Year of Desktop Linux thing -- it's a bit of a meme/running joke in tech blogger circles.
Given that more Android devices are sold than Windows, iOS and OS X combined, I would say it is a mere observation of fact that for the past few years, every year has been the year of Linux. Linux on the desktop, on the other hand...
If we combine BSD and Linux, it's actually not so bad.
* Globally, Android pretty much owns the mobile market. Android is a Linux.
* Most servers run some flavour of Linux or BSD.
* Although Windows still dominates the desktop market, a healthy share of the laptop market belongs to OS X, which is (partially) based on BSD.
This isn't exactly how the Open Source crowd envisioned "Linux on the desktop", but if the goal was to marginalize Windows in favour of open-source-based *nixen, much of that goal has been accomplished.
Of course neither Android nor OS X are Free (as in freedom) and OS X is only partially a BSD, but you can't have everything.
That table is global market share. The source article for the 2013 data[1] shows Apple in third place for 2013Q4 market share within the United States:
    Firm   4Q13 US market share
    HP     26.5%
    Dell   22.8%
    Apple  13.7%
So while Linux users can feel happy that Android is Linux under the covers, really, the real reason it is so popular is that it's so different from Linux. For an average user, thankfully it doesn't have any traits of Linux. Like, when I install Android on my machine, I don't have to go and manually edit the xorg.conf file so that the machine knows my monitor can take 1280 x 800 resolution and not just 640 by 480, after being clueless for a long time ... or that even in today's time I have to go and make an application 'executable' from the command prompt after downloading it from the internet. There are some solid reasons Linux "as is" is not successful even today.
I do get your point, generally, but your specific example is kind of dated - I have been using GNU/Linux on my desktops for about 14 years now, and I do remember how much editing xorg.conf / xfree86.conf sucked. But it has been years since I have had to do that. The last time I checked was in Ubuntu 10.04, where xorg.conf consists only of a comment that says that xorg can detect the hardware and configure itself, making manual configuration unnecessary nearly all of the time.
Yeah. The xorg.conf file was a little old example. Though I keep coming back to Linux every few years to check, and the 'making executable an executable' problem is still there from my very recent experience (like 3 to 4 months back). Right at the time I was being impressed by how easy it was to install Ubuntu from my Windows partition, I couldn't believe when one of the professional products we use in our office had these official installation instructions for Linux: "1) Download the file, 2) go to the command prompt and make it executable by typing <whatever>" ... Now I understand it might be that the product makers haven't updated themselves, but it's definitely a turn-off for a lot of people.
Yes, when I and the vendor both know it's supposed to be an executable. What a smart question. Also, an application is only as dumb as the OS allows it to be.
Here's how I installed Chrome/TeamViewer/VirtualBox/SublimeText3/etc:
I went to the website, downloaded a .deb file, double-clicked the downloaded file, clicked the big install button in the Ubuntu Software Center window that it spawned, entered my password, waited a few seconds aaaaand it's installed. I can now run the application by clicking the big Ubuntu button on my sidebar (or pressing the Windows/symbol key) and typing in the first few letters of its name. I can optionally drag the icon to my sidebar for quicker access.
Here's how I installed the last couple of applications that didn't have a .deb file (mostly development builds of games and developer-targeted applications):
I went to the website, downloaded the .tar.gz/.tar.bz2/.zip file for my platform, double-clicked the downloaded archive to extract it to a new folder in my home folder, opened the folder and located and double-clicked the executable.
Perforce, one of the best-known source control systems, tells us to install its software like this (hint: making the executables 'executable'; on Windows it's a direct executable): http://www.perforce.com/perforce/doc.current/manuals/p4sag/c... I remember because I did it very recently.
Thousands of other software packages do the same. Examples (compare the installation instructions for Windows vs. Linux/UNIX on all these pages below):
To be fair, that's because you're buying a phone/tablet with limited, non-replaceable hardware and the software has been customized by the manufacturer to work with that hardware.
Android's base source is released under Apache 2. Is that not free as in freedom? Cyanogen Mod (among many other distros of Android) would not exist if it wasn't.
Will happen when Microsoft's ability to strong-arm hardware manufacturers using restrictive OEM licenses is diminished.
MS actually seems to prefer a scorched earth policy (destroying the laptop/desktop markets entirely) to allowing Linux to gain anything more than a token foothold.
We can see this in the way they scuppered 'desktop Linux' by co-opting and ultimately destroying the netbook market in 2007 (by enforcing OEM licensing terms that crippled netbook specs - prohibiting, for instance, giving them more than 2GB of RAM).
We can see it again by the way they pushed UEFI / secureboot on manufacturers - a clear attempt at creating Windows lock-in that largely worked.
2007 would have been the year of the desktop were it not for this anti-competitive behavior by MS.
Often installing your own os or rooting your phone requires utilizing a security hole discovered by someone else. This arguably violates the DRM anti-circumvention bits of the DMCA.
As most android devices are handheld, "the desktop on the hand" is now a thing in my mind. It makes sense - so much communication and scheduling happens on the handheld device.
Android just uses another type of input than classical keyboards and mice. But you can also use keyboards and mice here. There are x86 implementations of Android, and you can root into the Linux system, and you can use Linux features within your Android device. Terminals for instance:
That's bollocks; you're just changing definitions to suit your needs.
Android is a mobile phone OS and as such has a marginal market share on the desktop. Windows still has over 90% market share on the desktop (http://www.netmarketshare.com/).
Desktop computing doesn't really matter that much anymore. I'm also coming down harshly on Linux here - they tried to be a better desktop OS rather than skate to where the puck was moving to.
The vast majority of actual 'personal computing' is phones and tablets. PCs are content creation devices.
Aside from Android obviously not being the traditional stack - no glibc, no KDE or GNOME - AOSP isn't very useful compared to semi-proprietary Android. We could also say that CMU Mach / FreeBSD won the desktop because iOS and OS X use them.
>Desktop computing doesn't really matter that much anymore.
For whom? Companies might make more money selling mobile devices compared to desktop/laptop PCs but that doesn't equate to the inane statement oft repeated that "desktop computing doesn't really matter that much anymore".
Desktop (including laptops, of course) is where 99% of the world's actual office, accounting, design, writing, programming, editing, etc. computing-based work is happening.
> Desktop (including laptops, of course) is where 99% of the world's actual office, accounting, design, writing, programming, editing, etc. computing-based work is happening.
Agreed. Try typing out a dissertation on a tablet (no external keyboard or mouse to compensate) which has an approaching deadline.
Is an email "content"? Is a blog post? A HN/Reddit/whatever comment?
I don't think the old stereotype of "consumers" as some sort of purely passive livestock is especially useful any more. Clay Shirky's old but very readable "Here Comes Everybody" makes a similar case at much greater and more articulate length.
> Desktop computing doesn't really matter that much anymore.
BWAHAHAHAHA!! Thank you. I needed that laugh.
Laptops and desktops will always be more powerful than mobile devices, therefore there will be users who want the option of having said extra power. Even if you could, say, run AAA games on a mobile device, the interface would be awful for most of them (eg FPSes).
So, what exactly is your definition of a desktop system? If we're going to have this discussion we should get the agreement or disagreement over that definition out of the way. To me, it is a desktop computer. I'm inclined to include laptops in that definition as well. Basically, it's the kind of computer I'd use to get my computer work done at a desk, with as little regard to what exactly that work entails as possible.
I haven't used Windows 8, but it still seems to have a greater market share on desktop systems than Android. Judging by the above link, Windows 8 alone has a greater share of the desktop market than the category of systems based on Linux in its entirety.
So, what if I have a keyboard on my tablet that is running Android?
I think the point here is that "desktop" as a location has no real meaning - it's the use case that defines it, really. By desktop system one really means a day-to-day system used for word processing, web browsing and other standard domestic/work uses (photo editing, spreadsheets, games and such). That's the stuff a desktop system has always done, and now you can do that on a phone, phablet, tablet, laptop, desktop or whatever. The hardware form is just about convenience; it [no longer] describes a logical difference or significant processing ability [eg my phone has about as much processing power as my desktop (ignoring the GPU); some phones definitely have more].
If you're using your phablet paired with a keyboard for the same uses as others use their [traditionally defined] desktop for then what use is the distinction?
Android can be used on the desktop, but IMO it's not tailored for that and so shouldn't be described as a desktop OS.
The whole "year of Linux on the desktop" thing is about market-share as much as anything; as other have said many people have been using it as a desktop OS for plenty of years.
> The hardware form is just about convenience it [no longer] describes a logical difference or significant processing ability
What exactly is a logical difference in this case, and how is it relevant to the discussion? Is my decision illogical if I consider the ergonomic implications of using a particular hardware/software platform?
Aside from ergonomics, I'm not sure what is so controversial about assuming that the different user interfaces will (and quite obviously do) cater to different use cases. What I can realistically do with a keyboard and a 24" monitor is certainly different from what I can do with my mobile phone. What I can realistically do with Debian is certainly different from what I can do with iOS or Android. These differences alone convince me that the distinction between desktop computer and phone/tablet/phablet is still meaningful, regardless of the literal meaning of "desktop computer". Manufacturers, retailers and consumers generally understand this, no matter how much anyone pretends that a tablet is equivalent to a desktop computer.
> "year of Linux on the desktop"
People have been using [insert any relatively obscure OS] as a desktop OS for plenty of years. The fact remains that Linux doesn't hold a significant share of the desktop market, which is what the idea of "year of Linux on the desktop" has always been about.
I wasn't suggesting your choice was illogical; I was expressing that the ability of a phablet, say, to process data and run computing operations has a logical equivalence (in the processing sense) to a "desktop"; and then mentioning that whilst the processing power may generally differ, there is substantial overlap (eg phones having more processing power and RAM than some desktops, etc.). The "logical" part was to contrast the physical differences.
You can use a smartphone with a bluetooth keyboard to write business docs with wordprocessing apps or construct spreadsheets (you could use the on-screen keyboard if you're a masochist). Different physical interfaces target different use cases as we've both noted but the distinction is more about marketing than ability of a system.
Going back to where this all came from I doubt this discussion is fruitful.
I'll give you points for tenacity, but Android was never intended to be used on desktop computers. The link you give is just a hobby project, nothing more.
And since we're being pedantic here, let's talk about definitions. A mobile operating system is one that has been designed primarily for devices intended to be carried on one's person, such as a phone or tablet. A desktop operating system is one that has been designed primarily for devices that are large enough to be stationary during use (e.g., workstations). Both form factors have unique requirements and capabilities, and need vastly different user interfaces.
Further, you seem to be arguing that the kernel (and only the kernel) is what defines an operating system. If that's the case, then Android can only properly be described as a fork of Linux, not Linux itself.
To play devil's advocate, I can't think of a single mainstream distro that doesn't fork and patch the kernel as well.
And one of Android's original use models was a non touch screen connected to a keyboard (mainly for Blackberry style smartphones before they were sure that slates would take off, but still...).
Define "desktop" and we'll tell you. You can run Android on a base system that is connected to a standard desktop monitor, printer, mouse and keyboard.
Yep, that's what I do. I even have an Android X86 running in VirtualBox in my Linux box - and several apps installed. No problem with keyboard and mouse, emulated full HD display, audio... even Google voice recognition works!
Of course I use it mostly to play around and do tests.
> Because on desktop Windows still has a ~90% market share.
This is only true for _PC_ desktops but not for the overall desktop situation.
Consider Windows 10. It derives from the mobile Win 8 desktop which is made more suitable for PCs so that people can use office and spreadsheets and other classic software in the traditional Win 7 way. But you can do many of these things right now on Android devices. For instance, gaming and office:
Both "static" and mobile desktops are continually merging together. Soon we will have mobile desktops which can turn into PC desktops instantly by just plugging the device into a docking station which is connected with a large display, keyboard, and mouse.
So when we define "desktop" as the sum of all traditional desktop applications then Android has really overthrown all other OS. The current office versions of Android are yet just not as convenient as PC desktop versions. Soon it will make no difference if you write your letters on a PC or on a mobile device with docking station. Of course this doesn't count for productivity software (software engineering, CAD etc.) which still requires a lot of PC horse power.
I guess this means that you either think that those 85% are running Android on their desktop computers, or that you have a different idea of what a "desktop system" is than anyone else. Android isn't a desktop system except for a select few masochists.
Personally, I want my desktop system to be a real powerhouse, and despite being built on top of GNU/Linux, Android doesn't seem like a particularly open environment, and it obviously wasn't developed with desktop users in mind.
I guess with "desktop system" you actually mean PC desktops. In my definition a desktop system is a system on a desktop. In the past decades this meant PCs but in the last couple of years many users chose tablets as their favorite desktops. They just replaced their PC desktops by mobile desktops for gaming, surfing and emails.
In the PC desktop world, Windows is still leading with Win7 and XP, followed by OSX, followed by Linux and others. Linux won't get leadership here, but who cares? Linux is a very good PC desktop system anyway, absolutely competitive with Win 7.
OK, since your definition seems to remove every aspect of "desktop system" that makes it a meaningful distinction, we'll just have to agree to disagree here. As for Linux on the PC desktop, competitive in what sense? I prefer a good Linux-based OS over Windows any day, but in terms of market share, it's just a ridiculous thing to say.
It's not built on top of GNU/Linux, but on top of Linux. If you count using GCC and binutils as GNU part then we should start calling things Intel/Linux if someone uses ICC and other funny things.
I think we should differentiate between desktop and other uses.
If it is just (any flavour/use) of Linux vs (any flavour/use) of Windows, then I think the vast majority of people use Linux more than Windows. Routers, TV-decoders (and most of the streaming servers they use for on-demand content), NAS for home use, phones, etc all use Linux.
On a side note: Linux users should be happy that Windows is getting a package manager. It creates more opportunities to get Open Source software into Windows.
Nice. Windows, which claims to be the default desktop OS, copies a basic feature from the Linux "non-desktops": a command-line package manager :-)
Another irony is that software installation has been much easier and safer on prominent Linux systems like Debian, Suse and Ubuntu than on Windows for the last couple of years.
The final irony is that I (and others) have actually been using Linux as a desktop system for more than twenty years. Since Windows 8, Linux has even been the better desktop :-)
>Windows which claims to be the default desktop OS copies a basic feature from the Linux "non-desktops" which is a command line package manager :-) //
There's nothing contradictory about that. Good features can come from the worst of software or the best. It's actually a positive change IMO that Microsoft would adopt a good feature in the face of crowing like "they stole that from Linux".
Of course the devil's in the detail; usually when I see something MS are doing and appreciate it, they manage to royally confound it - here, letting Oracle and their ilk install the crapware they currently trick inexperienced users into having is probably going to be part of that.
I find that snarkiness very reassuring! Some crazy straw man argument, calling us silly names, pretending that open source linux is more mysterious than the mad and secretive Ballmer temple of cash worship... Shows that the die hards are finding their truths to be a web of self deceptions.
It honestly doesn't even matter anymore... Dell sells Ubuntu machines, Steam OS exists, and between Chrome OS and Android a ton of Linux machines (even if they're not really 'GNU/Linux') are sold every year.
> and who knows, that might just trigger some kind of revolution in Windows app management
Free trials of un-tar? Installing apps that require subscriptions to the cloud? Dev libraries that require enterprise support packages?
Good package management on Linux is owed largely to the tireless voices such as Stallman who understand the core issues here. Yes, tying together install scripts and maintaining repositories requires a lot of work, and good for Microsoft. But the reason It Just Works is because the software is free, from top to bottom, including the OS. And Mac will have the same problem here as Windows. For now, I'm guessing this is just a command-line interface to app stores.
> Good package management on Linux is owed largely to the tireless voices such as Stallman
I thought it was due to the people designing the package managers. The people who run the repos Stallman-style end up shipping Iceweasel and Chromium and telling long boring stories to someone who just wanted Firefox and Chrome.
It was not Debian's decision to rebrand Firefox. Mozilla forced them to.
Debian wanted to be able to ship security patches whenever they pleased, Mozilla required that they shipped Firefox stock, even if that meant waiting for Mozilla to approve security patches.
Chrome is not open source. Chrome tracks you. Chromium is open source and does not track you. Other than the slight branding differences, a user would not know the difference.
> Chrome is not open source. Chrome tracks you. Chromium is open source and does not track you.
I'm pretty sure neither Chrome nor Chromium 'track you'. When you sign in to your Google account, they keep track of your searches to provide things like auto-complete, remember your bookmarks, and provide hints, location services, etc...
There's a huge difference between tracking you personally, and reading some information, keeping it in a database and providing you a better service later... (not going to lie, yesterday my boss sent me an email for a meeting, and GMail then automatically made a reminder for me and pushed it to my phone via Google Now - pretty damn awesome if you ask me).
The only differences I've seen between Chromium and Chrome is that Chromium lacks the Pepper plugin by default (which provides Flash), and the Hangouts binary blob. All the 'tracking' if you really want to call it that (I personally wouldn't), happens through Google services, not Chrome per se...
We owe a lot to those who work on package managers and the packages, but he has a point in that, if it all works so well, it has a lot to do with the software being free.
Important things like good and efficient dependency management and library reuse are far easier to achieve when all the relevant sources are available to packagers.
Iceweasel and Chromium are visually Firefox and Chrome with different icons, and you'll find them if you search for their "real" names in the package manager. Not so hard, right? You can find out why they exist if you are interested; it's not required.
I know that you know it, I was talking about your hypothetical user.
Except for the color of the icon and the lack of support by default for AAC, H.264 and MP3 (I have support for those 3 in Chromium in Arch), do you find anything relevant that will make some non-expert notice they are using Chromium and not Chrome?
I assume you haven't used Chromium. It's not an issue, as I said they don't come by default as in Chrome, but those codecs are easily installable through the package manager.
I hadn't even noticed Chromium doesn't have support by default for those codecs as I have used HTML5 audio and video without an issue. Really, for a non-expert Chromium is just Chrome with a blue icon.
Pretty much. While it's cool coming from Microsoft I'm willing to bet it's going to be an inferior implementation one way or another compared to Linux distributions.
Why? PowerShell innovated a lot over bash - try to kill all processes started in the last hour with bash, then do the same with posh (hint: a regex is a terrible, terrible way to deal with time). MS could do something good with package management.
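For the curious, a minimal sketch of the posh version (stock cmdlets only; you may need elevation to read StartTime for processes you don't own):

    # Kill everything started in the last hour. StartTime is a real
    # DateTime, so no text mangling or regexes are needed.
    Get-Process |
        Where-Object { $_.StartTime -gt (Get-Date).AddHours(-1) } |
        Stop-Process -WhatIf    # drop -WhatIf when you actually mean it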
Do you have a positive experience with CoApp? Mine has been overwhelmingly negative: it does not appear to work, does not appear to be maintained and does not appear to be supported.
I'm afraid to say that I haven't had a positive experience either, but I was just interested in CoApp out of curiosity: I tried to see if it was doable to use it to package pure Python (or other interpreters) libraries.
Obviously it was not ready yet, but I wasn't really let down: I seldom use Windows, so I don't have any real need for it, and my perception was that with some more months/years it would get there.
But CoApp and OneGet are worthwhile endeavors, so I'm cautiously optimistic that they'll eventually build something that can make software installation on Windows less painful
PS: uh, I just realized that you're a Microsoft developer as well (just like Garrett Serack)... is it so difficult to get hold of him even inside the same company? (I don't know... maybe you're actually on different sides of the ocean, and this would make the matter quite a bit more complicated)
I'm also very optimistic about OneGet. I do think it's a worthwhile endeavor (and I think CoApp is worthwhile.) My frustration comes when Microsoft builds a technology and gives it a web page and open sources it and... promptly ignores it.
I hope my criticisms of CoApp / Native Nuget did not transfer to a criticism of OneGet - my suspicion is that CoApp is abandoned simply because Garrett doesn't have time to work on both CoApp and OneGet. (It's also possible that OneGet deprecates CoApp.)
Certainly I understand all these problems - as you note, I am a Microsoftie as well, so I'm very familiar with having too much to do and not enough time to do it - and I've abandoned a few projects myself. But I'm not happy about that.
As for your question about being at Microsoft: I'm not across the ocean but I do happen to be on the other side of the country. I suspect I could have gotten a reply if I sent an email from my microsoft.com account (even if that reply was "sorry, don't have time") but I sort of hate throwing that around since it feels unfair to the rest of the community trying to use it.
> It's also possible that OneGet deprecates CoApp.
That sounds like it's the case:
"CoApp's features are going into OneGet, WiX, NuGet and Chocolatey. I had the opportunity to take this approach to get it in-box OS, I figured that was worth it."
Thanks, that actually looks really good. No-fuss copy and paste, auto resizing windows, transparency, etc. All they needed to do was clone how terminals such as Ubuntu's does it ... and it looks like that's what they did. :) I've been waiting for this for quite some time.
You mean that right click on the window chrome then select 'mark' then select text then right click window again and then choose 'copy' isn't a good user interface?
No, it's crap, so the only people that do that are beginners, command line tourists, or people using one of the very small number of console programs that support the mouse. Regular console users use the QuickEdit mode: left click to enter select mode, drag box - or double click to select a space-or-line-end-delimited region. Then press return to copy or esc to cancel. Then right click to paste. Squint hard enough, and it's almost like using xterm!
(Bonus feature: you can halt unexpected TTY spew by clicking in the window, giving you a chance to examine it.)
QuickEdit still doesn't fix the box select thing (at least not pre-Windows 10) but it's a massive improvement nonetheless.
Windows has an app store already. I don't know if it works with the command-line tool, but if not, I'm sure they will be tied together eventually. Ubuntu has an app store as well.
Hmm, I'm not so sure I appreciate the Chocolatey support. I've found the packages really spotty as to where they are going to install, what options they are going to use, and whether they bother to provide an uninstaller. I think a large part of that is how Windows manages installation and libraries, though. In an ideal world maybe everything would provide MSIs, but they alone are not a silver bullet.
I hope Microsoft take the Updates side seriously - having many separate auto-updaters on the system all taking different views on when they should run is a nightmare. If they created an integrated third-party application update system I would love them for it.
But those are installer problems, not Chocolatey problems. Anyone who starts on the road to automated installations on Windows realises that it's basically the Wild West. Especially if you need to install XP or Vista era software.
The article says that it's the same format as Chocolatey (an existing Windows package management tool), but actually what's in Windows 10 is a fork of NuGet, and Chocolatey is also a compatible fork of NuGet. Microsoft already owns and maintains NuGet.
Not trying to be pedantic, but Microsoft doesn't own/maintain NuGet. OuterCurve does and while MS supports OuterCurve with cash/time, it is its own entity and MS does a good job keeping them separate.
Dependency management is more than 'defining and getting dependencies'. What if package X relies on version 1 of package A, and package Y relies on version 2 of package A? Will they work together, each in an isolated sandbox, or will you have dependency conflicts?
Package management is a really difficult problem to solve, one which Linux package managers have only barely been able to crack. In addition to this, each programming language usually has its own package manager, and these vary a lot in quality.
Linux repositories conflate package management and (library/ABI) dependency management.
Windows already has locally deployed assemblies and central side-by-side (SxS) assemblies that can accommodate multiple versions of the same assembly, complete with major/minor versioning, redirection etc. Windows Installer supports reference counting and automatic uninstalling for the centrally deployed assemblies.
This new OneGet package manager is for solving the availability and automatic updating of packages. But Windows is not going to need separate repositories for each incremental update of Windows and the ABI.
No - this is not applicable to Windows, since there is no widespread practice of system-installed libraries. Windows packages are far more similar to something like 0-install (http://0install.net/).
If there are two software packages which depend on different versions of a third-party package, then the usual practice is that either the third-party DLLs are included in the package or both versions are specifically installed.
I think this is the way that OS X works as well - it is only Linux that works with centralized dependencies, which is why it needs a very sophisticated dependency management solution.
It looks like they are basing OneGet off of NuGet - their package manager for Visual Studio. It does handle dependencies and is actually quite good compared to other package managers.
I'm coming to this massive comment party late, but I wanted to drop my thoughts about this and explain why I'm personally very excited by OneGet.
I'm a very experienced Windows Server Admin (15 years). I'm also a fairly experienced Linux Server Admin (7 years, on and off).
I'm currently an SCCM guru for a ~4000 user organisation.
I'm sure that if OneGet is supported by MS to the level which the developer explained in the reddit thread it'll be a boon to power users managing their own systems and to desktop support people.
But I can imagine that it'll be the SCCM teams who can leverage the most out of it.
The thought of being able to deploy and manage software across desktops and servers in a similar way to apt or yum makes me feel something close to utter joy.
I've had to build some very complex task sequences to install software on corporate machines. The worst example I can give is MS's own Dynamics CRM application. I was seriously proud of the batch files, registry inserts, DLL-hell avoidance, dependency solving, mother-of-all automated install process needed to get Dynamics installed and hooked into Outlook in a magical way that hid every bit of that complexity from the end user.
It was only after slogging through developing all of the above that I found the incredible PowerShell App Deployment Toolkit [1], which people smarter than me had developed to basically handle everything I'd just slogged through.
From what I see, OneGet has the potential of allowing us to easily push installs and updates via SCCM without having to rely on 3rd party tools, batch files, MSI rebuilding and general hackery.
Even if it doesn't gain wide support from software vendors, just enabling me to rip everything out of an msi and repackage it into a private OneGet repo sounds superior to having to rely on complex SCCM task sequences.
I have tried PowerShell a few times, but I can never come to appreciate it.
Yes, it's technically superior to its counterparts on Linux by piping objects instead of mangling text, but the syntax of PowerShell is just too convoluted.
I have concluded that when working at the command line, you want to write commands, not program code.
That's why bash is usually straightforward and powerful for the 95% of commands you run. It's the edge cases where you need to do horrible awk piping and general guesswork.
PowerShell feels the other way around. Easy for the 5% edge cases, but pain for the 95% common stuff.
Yes, you can create aliases in PowerShell for the most common tasks, but then you don't really learn the powerful mechanics that you need to know.
And it does not help that you need to learn one more scripting language. Microsoft should have gone for JavaScript.
Maybe I'm missing something in the greatness of PowerShell?
is something that doesn't really exist anymore for management tools on Windows, except those that have been around for a long time. Even the GUIs on Windows Server nowadays just drive a PowerShell runspace behind the scenes and you can script just as easily as click around.
Install-Package is a PowerShell cmdlet and thus gets sane and consistent argument parsing and discovery for free. Something like instpkg32.exe simply won't exist for a new feature here.
As for your complaints, I can gladly answer them and other questions, but it's probably a bit OT here.
You do realize that PowerShell is case-insensitive (you don't need to hit Shift unless you need it for something like parentheses) and there are aliases to ease typing when you're just using the shell? (I wouldn't recommend using aliases in scripts, though.)
Besides, if this is about command parameters, all that's needed is the dash and enough letters so the parameter name is unambiguous; for many common commands that's no longer than the single-letter arguments to Unix tools. And they are case-insensitive, too. No need for Shift here either.
But perhaps you do enjoy the haphazard mix of /parameters, -parameters, --parameters, /p, -p with varying ways of specifying arguments to those parameters, like /x:foo, /xfoo, /x foo, /x=foo that can be found all over the place in the default Windows command-line tools (because that's what the argument was about here). In that case, yes, PowerShell is probably a huge step backwards.
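A quick illustration (gps is a stock alias for Get-Process):

    # All three lines are equivalent: names are case-insensitive, and a
    # parameter needs only enough letters to be unambiguous.
    Get-Process -Name explorer
    get-process -na explorer
    gps -n explorer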
Windows 10 command prompt will indeed let you resize the window. It will even re-flow the text already written, i.e. a long string written that flowed across several lines will also flow back into one line, once the window gets wide enough.
Open a cmd window, left-click the icon at the left of the cmd title bar, and adjust the window size under Defaults and Properties. This includes height, width, and buffer size.
Unlike every other window, where you can change the horizontal size by dragging.
It's a relic and I'm glad they're fixing it, but as with the browser they're only delivering changes because third-party software innovated them and it's making them look bad.
>but as with the browser they're only delivering changes because third-party software innovated them and it's making them look bad.
How is that a bad thing? Firefox was kinda stagnant before Chrome came around and took the performance throne, I don't see anyone criticizing Mozilla for that.
Firefox isn't in the Debian repos, unfortunately. But Iceweasel is installed by default on the desktop, so there's that. Unfortunately, Iceweasel is chronically behind mainline Firefox and suffers from many horrifying bugs that make it almost not worth the effort to try to use, which is why the grandparent comment goes through so much effort (Well, outside of the obviously well planted troll.)
Also, Linux Mint is a terrible way to get Firefox for Debian. You're better off installing it from the "Ubuntuzilla" repo, which contains just Mozilla components built for Debian-based OSes: http://sourceforge.net/projects/ubuntuzilla/ (Ignore the hellhole that Sourceforge has become).
Honestly, I wish Mozilla would just create a Debian repo (and one with Nightlies would be nice), but I understand the scorched earth there...
This is the problem I have with a lot of software. We're supposed to use Red Hat, for example, but we don't subscribe to the RHEL repos. So then I have to go add the EPEL repos or find whatever repo the actual piece of software I want is in. Like Google Chrome - I can never remember where to get it each time.
For those downmodding the parent post: the author is being sarcastic, but the commands are entirely accurate - due to Firefox's trademark, Debian does not ship Firefox or include it in their repositories. Adding a new repo, and the keys required to sign that repo are entirely necessary to install Firefox on Debian.
It's adding a debian-compatible repo (from Linux Mint Debian Edition) in order to replace Iceweasel (firefox without the branding) with Firefox (firefox with the branding). Firefox's branding is incompatible with Debian's licensing requirements, and they amicably split the branding years ago as a result.
The example was intentionally selected to be as obfuscatory as possible, in order to be a troll. Most of those commands are about adding a repository rather than installing software, anyway (and the four lines involving gpg are usually done in a single line)
Is there a reason why we have fewer of those problems with Mac OS X? I can see that the Windows audience often isn't tech-savvy enough to understand that Ask isn't part of Windows, and that's probably also because Chrome/Firefox/Safari are more diligent about managing extensions than IE. But there's a very serious constant problem of spyware/adware in Windows that I don't feel exists in Mac OS X and Linux.
Because the markets are too small. Windows is what, 90%+ market share? Why spend 2x or 3x the effort (as a malware writer) when the absolute highest upshot is a single-digit percentage?
Well you named it. Marketshare is still over 90% on the desktop and most non-tech-savvy users are likely to use Windows as well.
I can't think of any other reason.
Fantastic. It'd be great if it would integrate with the Store. I'm guessing that even if it supports dependency resolution, most packages will just be monolithic blobs like currently. That does have its advantages, so it'll be interesting to see if people switch to specifying dependencies instead of bundling them.
It will be interesting to see whether this will be similar to Linux, with the possibility of community-driven repositories alongside official repos, or whether it's more of an admin tool to do Windows updates / Windows Store installs.
It says it uses the same package format as Chocolatey, and you can even add their repos. I don't know if there are any caveats to this, but it's a very, very good sign. I am excited about this.
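If it works the way the preview suggests, wiring up the Chocolatey feed should look roughly like this - a sketch only, since the cmdlet and parameter names have shifted between preview builds:

    # Assumed spelling: register the Chocolatey feed, then search/install.
    Register-PackageSource -Name chocolatey -ProviderName Chocolatey `
        -Location https://chocolatey.org/api/v2
    Find-Package -Name vlc -Source chocolatey
    Install-Package -Name vlc -Source chocolatey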
As simple as it seems, this single failing has been my primary reason for outright dismissing Windows as a reasonable contender for server usage, no matter what else Microsoft might do to make it attractive for a variety of server tasks. A system that cannot be updated easily without human intervention is not a system that deserves a place in any data center.
It's been nearly a decade since I managed any Windows servers. I'm more than willing to believe there are features of modern Windows systems that I don't know about. Perhaps you can fill me in on what I've been missing...I assume you're implying there is some equivalent to yum or apt-get on Windows that allows easily updating system and third-party packages from the command line without human intervention?
And, do most vendors make their software available through this mechanism? That would be a miraculous improvement over the couple dozen different update services that run on a Windows box (Java updater, Adobe updater, Apple updater, Oracle updater for VirtualBox, etc.), all of which have wildly variable reliability and are mostly impossible to script or automate. If those are gone from the Windows management experience, that'd be great.
I haven't admin'd Windows boxes for a few years, but apparently PowerShell has finally given Windows admins a proper command line that they can actually work with - no longer are GUI tools required to admin a Windows box.
Interestingly, the last time I looked at Windows servers (2011?) there was a whitepaper from Microsoft lionising Windows headless servers - without the GUI, there were 70% fewer security bugs, giving a smaller attack surface.
I've thankfully never seen a Windows Server with Flash, Reader, iTunes or VirtualBox installed; that sounds like a security nightmare waiting to happen, regardless of updates.
I also see lots of insecure and outdated applications running on Linux servers, especially Wordpress, Drupal, phpMyAdmin, cPanel etc. Heartbleed and Shellshock themselves are still not patched on a good percentage of servers.
I run a few Windows servers maintained via Ansible. There is a built-in protocol called WinRM that is functionally equivalent to SSH for the purposes of remotely controlling a box. Ansible uses WinRM to provision. Beyond that, chocolatey.org provides a complete package management solution as long as you are looking to install popular free (as in beer) software. Also, you need to learn PowerShell to do anything useful. It's painful, but I have to sit back in awe at the extent of the APIs available to you from the PowerShell prompt, each one hand-crafted by a Microsoft employee. Pretty much anything you need to do from an OS configuration perspective can be done in PowerShell pretty easily.
Once you get over these couple of things it's not terrible. (Though I of course vastly prefer Linux.)
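To give a flavour of the WinRM side, here's a one-off remote command via PowerShell remoting (web01 is a made-up hostname, and the target needs remoting enabled):

    # Roughly the moral equivalent of ssh'ing in and running a command.
    Invoke-Command -ComputerName web01 -ScriptBlock {
        Restart-Service W3SVC
    }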
I think one can hardly exaggerate what an improvement over cmd.exe PowerShell is. Even if I really hated the outcome, these days PowerShell is just too damn useful to ignore as a Windows admin.
One thing I am seriously disappointed with is the documentation, or lack thereof. PowerShell's Get-Help/man cmdlet is admittedly nice, but compared to the kind of documentation you get with Perl or Python, I am left underwhelmed.
Well I take it you have never maintained a 10k+ installation of Windows machines then? How do you imagine this is done? An army of sysadmins going from PC to PC and manually clicking through installation dialogues?
I agree. I use OSX and Linux at home, but I am a Windows admin at work. And while I strongly prefer Unix-like systems to Windows, I welcome every effort to improve it and make my job easier.
Chocolatey is great... although I wish it had a different "more professional"-sounding name.
When I was using it in anger a while ago I also found the quality of some of the packages to be a bit random; many of them just seemed to be random developer X's favourite aggregation of other packages. In which case I'm happy to see that they are adopting package moderation.
Someone at work told me Windows 10 is also going to have virtual desktops.
Haven't verified.
I did play with a release preview of 10. It was actually OK in a brief 5-minute survey. I won't ever use it for day-to-day stuff (I'm a Linux user), but it's nice to see them at least appearing to make an effort. I think most of us in the field are at some level negatively impacted when Microsoft engages in evil/stupid behavior, so it's nice when they come out with something good, which they do from time to time.
Those are PowerShell cmdlets - and follow PowerShell verb-noun naming convention. There is a finite set of "approved" verbs (though you can use your own) representing the "actions" you can perform against a resource (the noun part).
From within a PowerShell console I can "add" a lot of things. Try typing
gcm -verb add
from a PowerShell console (gcm is an alias for Get-Command, and the command above will list all commands where "Add" is the verb).
Descriptive names are good (to some limit). You can maybe add something else (a remote repo? something else?), or maybe it isn't unthinkable that Add-X will eventually be a feature. You can always make an alias like "inst", so it's better to keep the underlying command long and descriptive.
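For example (Set-Alias is stock PowerShell; "inst" is just a name I made up):

    # Long, discoverable name for scripts; short alias for interactive use.
    Set-Alias inst Install-Package
    inst firefox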
I am not sure why even in 2014 we have to pit Linux against Windows. We have surely grown out of that sort of debate.
Microsoft successfully achieved its objective of putting a computer in every home with the help of Windows. We should thank them for that. Linux, on the other hand, has grown in leaps and bounds. Android, after all, is a Linux kernel fork. I think as technology lovers we can surely love both and look at them as technologies complementing each other rather than competing.
> I am not sure why even in 2014 we have to pit Linux against Windows. We have surely grown out of that sort of debate.
I think it's unfair to frame any linux/windows debate as fundamentally childish. There are big differences between them, both philosophical and practical, in ways that have real impacts on the users.
I never said there are no differences, but those differences are more like between a business suit and a bikini rather than the Republican Party vs. the Democratic Party.
Two different OSes which have different views of the world, catering to different needs in different ways, rather than something where their success is a zero-sum game between them.
Sure, but there's no reason for a Windows enhancement to be framed as a Linux vs. Windows thing, except as antagonistic clickbait. As somebody who uses both Windows and Linux, I'm not sad that Windows is becoming more usable. I'm happy - it's one less thing to complain about using Windows.
Good step; now the rest of Windows... I mean, the Program Files directories containing the binaries are not even in %PATH%. I assume improvements will be made here too, but I wonder.
One problem with putting every subdirectory of Program Files in the path is that most of them contain 'setup.exe' and 'uninstall.exe'. Type "uninstall" at the command line to uninstall a random program!
Or we could use a real binaries path (e.g. %programfiles%\bin) and symlink (NTFS supports symlinks, I think) the binaries from there. Then include that in %PATH%.
There are many ways to do this but until Windows and its developers decide on one, it'll remain a big mess.
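A sketch of the symlink idea, assuming admin rights and a PowerShell version where New-Item can create symlinks (the bin directory and the VLC path are made-up examples):

    # One well-known bin directory; only it would need to be on %PATH%.
    $bin = Join-Path $env:ProgramFiles 'bin'
    New-Item -ItemType Directory -Path $bin -Force | Out-Null
    New-Item -ItemType SymbolicLink -Path (Join-Path $bin 'vlc.exe') `
        -Target "$env:ProgramFiles\VideoLAN\VLC\vlc.exe"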
I liked how the idea's champion dealt with Microsoft bureaucracy:
> So, back in August I started looking at what I was going to accomplish over the next year or so, and I thought it would be a good idea to try and see if I could get some of the CoApp package management ideas put into Windows itself (hey, it'd be kinda nice to be able to do apt-get style-stuff and have that built into the OS)
> I had proposed some of this at the beginning of the product cycle for Windows Blue (Server 2012 R2/Windows 8.1) but it was a little too late in the planning cycle, and I gave too-grand of a vision.
> I finally came to full understanding of some advice my pappy once told me: "The secret to success is to find someone else to care what you care about, and make it their problem." ... I looked at him like I understood what he meant, but he could tell that I was just paying lip service. He then said "Try it this way: Set the building on fire, take someone else's stuff into the building with you, and then cry for help"
> While Windows and Mac users have to run graphical installers — you know, where you hit Next a few times and try to avoid installing bundled crapware — Linux users can just open up a command line and type sudo apt-get install vlc.
Right. As if VLC on Windows comes with crapware.
As if apt-get [0] never asked you cryptic questions [1].
[0] or yaourt or any other package manager
[1] and it's more about the package than the package manager
Technically, MSI has supported GUI-less command line installation from the beginning, with relatively good dependency management and error reporting. I've invoked MSIexec from install/provisioning scripts fairly often as part of continuous integration toolchains and it works great. Some of my applications also know how to pull down dependencies (like the windows SDK, or the MS CLR) off microsoft's servers via HTTPS and then invoke the installer automatically so features that depend on them can Just Work.
If the package manager knows how to leverage this (no doubt it will), that will allow GUI-less installation of a huge percentage of Windows apps (MSI is relatively popular now despite its quirks).
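For reference, this is the kind of invocation the parent means; the flags are standard msiexec, while app.msi and the product GUID are placeholders:

    # Silent install with verbose log and no reboot, then silent uninstall.
    msiexec /i app.msi /qn /norestart /l*v install.log
    msiexec /x '{PRODUCT-GUID}' /qn /norestart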
> Worse yet, the Windows Store is now integrated with the system search feature. Search for an application using the Start screen search or search charm and these garbage apps from the Windows Store will appear. For example, whenever I use the system search feature to launch Firefox, I see a link to install "Firefox Training Lite" from the Windows Store.
This is insane.
But it's hardly VLC's fault. You could say Windows is distributing malware.
There is a surprising number of malicious repackages of common FOSS utilities like VLC and 7zip that I've seen non-expert Windows users install. It's actually not all that uncommon.
I believe the biggest problem for its success is that most companies don't want to give up the control a dedicated installer gives them (e.g. installing AskBar that comes with Java, as some comments pointed out).
In Linux, dedicated installers are the exception, not the rule. I'm curious how the adoption on Windows will be, especially for commercial/non-free software, of which there is a lot on Windows.
I'm also curious how they will present the packages to the average user. Worst case, the user will have to look in three different places to uninstall software (namely the Control Panel, the package manager, and the directories of software that isn't registered anywhere).
Somehow, folks have been led to think that the command line is a scary place, just for geeks, but it's so easy to lead someone through a slightly complex manual installation by just giving them a few commands to copy and paste into the terminal.
And as a somewhat casual user myself, I even find it easier to follow a command line installation than to wade through several pages of "open this window, click on this, click on that," especially when the instructions and the actual installer don't exactly agree due to revision divergence.
I've personally struggled with it to fix corrupt updates installed by Windows Update, and while it doesn't meet the bar set by mainstream Linux distro package managers, it is a package manager nonetheless.
Plus, it seems like Puppet is working on a plugin to make use of OneGet. It'll make work much more interesting in the future; I'm actually looking forward to this happening.
I've always thought verb-noun made more sense (since functionality in PowerShell isn't bundled/accessed as a complete program, it has that flexibility).
    Install-Package firefox
would be hugely preferable, though. Smart defaults are good things!
I think the article misses the point. It doesn't matter if the package manager is CLI or GUI-based, what matters is good UI and quality (+ price) of application packages.
Mobile stores (Google Play, Apple's App Store) are a good example of this... they host many apps & are easy to use. Who cares if you use keyboard, mouse or touch to get them.
For the purposes of this reply, I'm talking about packages as distinct from Windows itself.
A reboot is required if something you are trying to install is already in use. Windows locks executable files and DLLs while they are being used, so it's not possible for an installer to overwrite them. When an installer detects this it places the new file in a temp store, and Windows empties this store on startup.
Thus if you are updating say Java, and the Java binaries are in use, then a reboot will be required. On the other hand if the binaries are not in use, then they won't.
So the need to reboot will vary enormously from one user to the next, based on their habits (do you close the program before updating it?) and also the kinds of programs they have running when they do an update.
Aside: some installers can detect that the program is running and terminate it as part of the upgrade process, thus explicitly avoiding a reboot. That's why say Firefox and Chrome never need a reboot. But that's easier to do with a program, and less easy with a runtime like say Java.
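For anyone wondering what that "temp store" mechanically is: installers stage the new copy somewhere writable and call MoveFileEx with MOVEFILE_DELAY_UNTIL_REBOOT, which records the pending swap for Windows to perform early on the next boot. A small C sketch (the paths are made up):

    /* Sketch of scheduling replacement of an in-use file. The staged
       copy is swapped in by Windows early on the next boot (the request
       is recorded under PendingFileRenameOperations in the registry);
       requires admin rights. Paths are made-up examples. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        BOOL ok = MoveFileExA(
            "C:\\Temp\\jvm.dll.new",            /* staged new version */
            "C:\\Program Files\\Java\\jvm.dll", /* locked target */
            MOVEFILE_DELAY_UNTIL_REBOOT | MOVEFILE_REPLACE_EXISTING);

        if (ok)
            printf("replacement scheduled for next boot\n");
        else
            printf("MoveFileEx failed: %lu\n", GetLastError());
        return ok ? 0 : 1;
    }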
Restart Manager along with Windows Installer will manage stopping and restarting running applications/services to make sure that executables, DLLs, configuration files, etc. are updated atomically and transactionally.
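For reference, that API is callable from any installer, not just Windows Installer. A bare-bones C sketch of the sequence (the file path is a placeholder and error handling is mostly omitted):

    /* Bare-bones Restart Manager flow: register the files you intend
       to replace, gracefully stop whoever holds them, update, restart.
       Link against Rstrtmgr.lib; the path is a placeholder. */
    #include <windows.h>
    #include <restartmanager.h>

    int main(void)
    {
        DWORD session;
        WCHAR key[CCH_RM_SESSION_KEY + 1] = {0};
        LPCWSTR files[] = { L"C:\\Program Files\\MyApp\\app.dll" };

        if (RmStartSession(&session, 0, key) != ERROR_SUCCESS)
            return 1;

        /* Tell the session which resources we intend to update. */
        RmRegisterResources(session, 1, files, 0, NULL, 0, NULL);

        /* Gracefully stop every process/service holding them... */
        RmShutdown(session, RmForceShutdown, NULL);

        /* ... the installer swaps the files in here ... */

        /* ... then restart the applications that support it. */
        RmRestart(session, 0, NULL);

        RmEndSession(session);
        return 0;
    }

This is the mechanism behind installers that say "the following applications must be closed" and then reopen them afterwards.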
> Windows locks executable files and DLLs while they are being used, so it's not possible for an installer to overwrite them
Exactly, and Linux doesn't do that, hence my comment. The Windows model is flawed--why do you need to lock a binary on disk when a copy of it is running in main memory? Linux just lets the installer overwrite the files on disk, so there's no need to restart the whole OS, just the program whose files were updated.
I assume Windows 10 will not change this behavior? A command-line package manager on Windows would be cool, but its utility will be limited if you still need to restart the whole OS just to upgrade a program that's currently running.
Again, not a problem on Linux because Linux keeps numbered versions of .so files.
So it's not that Windows has to restart after replacing a file that is in use. It's just that it would rather not deal with the complexity that results if it doesn't.
In other words, "we didn't want to bother with versioning DLLs."
It's more complicated than that. Windows doesn't have true inodes but rather uses the name alone to identify files. So you can't unlink any open file. (Try to delete a directory when you have a command prompt open in it.)
Windows did hack on a form of DLL versioning in the form of isolated assemblies. It's ugly and complicated and Microsoft still resorts to suggesting that you avoid DLL hell by bundling local copies of all your shared libraries. Which kind of makes me wonder why they're even called "shared" at all. May as well just statically link.
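For contrast, here's the upgrade idiom Linux package managers rely on, sketched in C with made-up paths: the name is atomically repointed at the new file, while any running process keeps the old inode alive until it exits.

    /* Sketch of an in-place upgrade on Linux: write the new library to
       a temporary name on the same filesystem, then rename() it over
       the old one. rename() atomically swaps what the *name* points
       at; processes that already opened/mapped the old file keep using
       the old inode. No locks, no reboot. Paths are made up. */
    #include <stdio.h>

    int main(void)
    {
        /* In reality the package manager unpacks the new file here. */
        FILE *f = fopen("/usr/lib/libfoo.so.2.tmp", "wb");
        if (!f) { perror("fopen"); return 1; }
        fputs("new library contents would go here", f);
        fclose(f);

        /* Atomic swap: the old inode stays alive for running processes. */
        if (rename("/usr/lib/libfoo.so.2.tmp", "/usr/lib/libfoo.so.2") != 0) {
            perror("rename");
            return 1;
        }
        return 0;
    }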
Popular installer packages include the "you should restart your computer now" dialog purely as boilerplate; you almost never actually need to do so. This is yet another sad case where the "power users" realized something was a non-issue years and years ago, but the platform holders couldn't properly communicate the fact to the public.
"Every time" seems exaggerated (given that even something as major as Visual Studio installs without requiring a restart). It would be like saying Debian always needs a restart after a bunch of updates. Sometimes it does, other times it doesn't.
Tip: if you want even less restarting on Windows after installing (and less registry rot), always look for portable versions. If you can't find them, just try to get one yourself by extracting from the installer. You'd be surprised how many installers this works for. Some installers you can just open in 7Zip/UniExtract and the like and extract; others can be extracted using msiexec /a PathToMSIFile /qb TARGETDIR=DirectoryToExtractTo
Yeah, I guess it's not every time anymore, to the extent that developers tend to just package copies of all their app's dependencies with the installer, rather than try to update a shared DLL that may be locked by another program.
Which version of VS do you have that installs without requiring a restart? Because I have VS2013 Ultimate, and each time I install it on a new machine I have to restart (maybe two times if I also have to activate Hyper-V).
2013 Pro; IIRC it was the same for 2012 Pro, the rest is too long ago to remember, but I'm pretty sure it hasn't always been like that :]. You got me thinking now though: I don't think I ever installed on a fresh machine without any previous installations of VS, where the prerequisites would possibly require a restart.
I think there is a huge missed opportunity here. I mean, it's good that Microsoft is finally adding functionality that went mainstream in the late 90s. But why not just skip the flawed approach most of the current Linux distributions use? In my opinion they should have been looking at a more declarative approach, such as the one the Nix package manager uses.
I will agree though that this oversight is not as bad as TFSVC, where they were building a centralized version control system while the whole world was switching to decentralized VCSs.
As much as I wish it were so, I do not think that is the case at all. What worries them, if anything, is more people using Macs, and more people ditching their PC/laptop in favor of shiny smartphones and tablets.
I'm not a Windows user so maybe I'm way off, but wouldn't adding Cortana go much farther in getting people to upgrade?
How's the Kinect support in Windows? Quite honestly, Microsoft is only competing against itself. How many XP users are still out there? A couple hundred million? Give them a reason to get excited.
It seems like you haven't paid much attention to the leaks and rumors out there... Cortana is 100% going to be in Windows 10 from the looks of it. Also, this package manager is designed to get the people MS needs excited the most - hardcore power users like us.
This isn't aimed at consumers. This is aimed squarely at businesses. Cortana won't excite them as much as further automation which decreases workload and by extension costs.
Dashes and uppercase... what terrible names. Is the Windows shell case-sensitive (I seem to recall it is, but it may have changed)? We've been designing command line interfaces for over half a century now; this is just lazy. Shells are not magically exempt from UI best practices just because they're not graphical.
It's not case sensitive in terms of execution, this is just how they are stylized: "Verb-Noun" is the standard format for PowerShell commands (or "cmdlets", to use their terminology).
You can also easily set up aliases, and there are many unixy aliases built in. For example, 'ls', 'cat', and others will work. I don't even know the cmdlets they're aliasing.
We were looking to replace our ancient Exchange server with an open-source equivalent, but nothing has feature parity. Nothing has the widespread support Exchange does: the ability to fetch on all mobiles, widespread desktop protocol support (not just IMAP but, you know, that Exchangey binary protocol), a web interface, etc. etc. etc.
It just is a massive continued success, although given my migration from 2003 to 2010 and the subsequent pain of moving to 2010 SP2, I didn't find it successful or painless - I don't miss maintaining it.
I think a lot of folks on here look at the world through very narrow glasses where they write Python web-apps under Linux, or work in SOHO environments that rely on external email providers; they don't come across the all-pervasive Windows desktop culture. They never need to touch Active Directory, let alone understand what it is or why they'd need it, and never have to touch an Exchange server or roll-out applications across an array of desktop machines.
It is a pity because some end up making short-sighted narrow-FOV comments about Microsoft and Windows etc. that are unnecessary. Just because we don't use a system, we shouldn't assume that nobody else does.
Haha yes, but they also have their RPC over HTTP method if you don't want to use the binary protocol, plus the ActiveSync method, plus the SMTP and POP3 interfaces, plus the widely utilised Web layer that was rewritten after 2003 - I think most of the Exchange clients on mobiles use this.
The ability to reset mobile devices from within Exchange when a device gets pinched, and to enforce keylocks on mobile devices, is a great feature that I can't find anywhere else at such a low OS level.
.NET is massive. Just look for job listings. C# gets more money over here in the UK than C++, with there being far far far more jobs using C# and .NET.
Also, just look at how many developer jobs there are in Windows-land versus Linux-land: you hardly ever see Linux development jobs apart from embedded development. Windows is huge.
Good point! But the job descriptions will never specifically ask for strong Linux skills - they want you to code and upload, and the languages are not tied exclusively to Linux (although I don't know anyone running a real production server with PHP under Windows, so you're right!)
I never said their niche was small right now. I said that MSFT is not guaranteed a place at the Big Person table in every market forever. For example, when's the last time you saw someone using a MSFT mobile device?
.NET & Windows are used in the enterprise space, a lot. If mainframes are still around in enterprises, as a decent market (still worth billions of dollars), I imagine that .NET & Windows will still be around and probably still worth tens if not hundreds of billions of dollars, 20-30-40 years from now.
There's no foreseeable threat to Microsoft in the enterprise; that market is roughly split in three between (mostly) cross-platform C++, (mostly) cross-platform Java, and Windows-only .NET (I don't think any enterprise really uses Mono in production). And both C++ and Java predate .NET, so enterprises must really love .NET to have moved over to it.
If Windows 10 was built on top of Linux I'd get a lot more excited.
I have to use Windows for the myriad of engineering tools that are not available under any other OS, for example SolidWorks, Altium Designer, various embedded toolsets, etc. And while I've been using PCs (and Macs and Linux) since they came on the scene, I hate, hate, hate the DOS, or technically DOS-like, underbelly of the beast.
I know it is a ridiculous idea. It would break everything, including their profitable corporate platforms.
Yes, there are ways to mitigate this but it'd be nice if all computing platforms got behind a common standard. Utopia. I know.
The APIs may be annoying (HWND, NULL, NULL, NULL, NULL, NULL, DWORD_PTR dwcsprcadrswtfbbq, NULL, ...) but that backwards compatibility you're suggesting they do away with is pretty much the best thing about Windows. I got so fucking sick of the desktop Linux people constantly breaking everything on foppery and whim that I went back to Windows 7+mingw, and while Stallman would be ashamed of me, I have to say that it's incredibly comforting to use a platform that mostly figured itself out over a decade ago, that won't change under my feet every other month, and even when it does (Windows 8) I can keep using the version I liked best for years and years because they actually took binary compatibility seriously throughout the entire system. There are still people on XP for christ's sake. I'll take that over overzealous redhatters who think they know what's best for me any day.
Being built on Linux as GP suggested doesn't mean it's just Linux -- presumably you'd be implementing Windows APIs on top of Linux. Think WINE, but without the disadvantage of having to be done by third parties.
Though a more likely way, were Microsoft to want to bridge that gap, would be to have a solid first-party POSIX environment on top of Windows, rather than the other way around.
Why? What's the point? NT isn't bad at all, it's just different from POSIX (hell, I can't believe the irony of this statement, but I'm glad that Microsoft is still around to resist the POSIX monoculture). Neither Microsoft nor the Free Software™ world would gain much from killing the NT kernel, but the latter camp would gain so much more from learning the lessons that made Windows so successful 20 goddamned years after the fact. It's so sad hearing Linus passionately say "you do not break binary compatibility in the kernel" then watching people on the outside flagrantly disregard this concept time and time again.
The point is portability. People now write all of their server software for Linux. More people would bother porting it to Windows if doing so was easier.
Example: There is no Windows equivalent to epoll/kqueue. Neither of them is even POSIX. But if you bring this up to most Windows people they tell you to use IO completion ports, which are totally different and require the core of the program to be redesigned.
> But if you bring this up to most Windows people they tell you to use IO completion ports, which are totally different and require the core of the program to be redesigned
IO completion ports and epoll both require the program's flow logic to be designed for them. If a program has been designed for synchronous IO, a redesign is required to take advantage of any asynchronous IO pattern. There's no fairy dust that lets you magically sprinkle "asynchronous" onto an otherwise synchronous program.
The proactor pattern (Windows overlapped IO) is - at least theoretically - more scalable than the reactor pattern (Linux epoll).
Under the reactor pattern (Linux epoll) the process registers its interest in performing an IO operation. The OS notifies the process through a callback/event when the IO resource is ready for the operation, and the process must then perform the actual operation itself. Even then there is no guarantee that all of the IO will complete - the OS will inform you how many bytes were actually read/written, and it is your responsibility to wait for the next "ready" event before trying again.
Under the proactor pattern (Windows overlapped IO) the process asks the OS to perform the IO operation directly. The operation is started by the OS while the call returns immediately. When the operation has been carried out, the OS notifies the process through a callback/event. There is no complexity in managing partial transfers - the transfer is the responsibility of the OS and you'll receive notice when it's completed. An IO completion port is effectively a thread pool dedicated to IO ops, with queuing built in.
The reactor pattern (Linux epoll) requires a context switch between the IO resource becoming ready for the operation and the actual operation (higher latency). If transfers complete partially - e.g. large transfers - you'll incur extra context switches for each remaining chunk (lower throughput).
The proactor pattern (Windows) allows the OS to complete the operation directly without a preceding context switch (lower latency), only notifying the process when done, thus avoiding unnecessary context switches even for large buffers (higher throughput).
Both approaches require the program to be deliberately designed for asynchronous IO. However, the Windows API was always designed with overlapped (asynchronous) IO in mind - IO completion ports were in NT from the start. The problem was always (for both platforms) how to coax developers into actually leveraging the asynchronous APIs as opposed to the simpler-to-understand synchronous APIs.
The new Windows Runtime API takes it a step further and requires virtually all IO to be asynchronous (there simply are no synchronous versions any more). That is coupled with programming language innovations like async/await (C#/VB.NET) which makes it very easy indeed to take advantage of this.
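To make the two shapes concrete, here's a single C sketch containing both receive loops (which half compiles depends on the platform). It assumes an already-connected socket, strips all error handling and Winsock init, and the function names are made up:

    /* Sketch contrasting the two models; assumes `sock` is already a
       connected TCP socket. Error handling and WSAStartup omitted. */
    #ifdef _WIN32
    /* Proactor / overlapped IO: hand the OS a buffer up front, then
       wait to be told the read has already happened. The buffer stays
       committed to the socket for the whole wait. */
    #include <winsock2.h>

    void iocp_receive_loop(SOCKET sock)
    {
        HANDLE iocp = CreateIoCompletionPort(INVALID_HANDLE_VALUE, NULL, 0, 0);
        CreateIoCompletionPort((HANDLE)sock, iocp, (ULONG_PTR)sock, 0);

        static char buf[4096];
        WSABUF wb = { sizeof buf, buf };
        WSAOVERLAPPED ov = {0};
        DWORD flags = 0;
        WSARecv(sock, &wb, 1, NULL, &flags, &ov, NULL); /* kick off the read */

        DWORD bytes; ULONG_PTR key; LPOVERLAPPED pov;
        for (;;) {
            /* Blocks until a read has *completed*: data is already in buf. */
            GetQueuedCompletionStatus(iocp, &bytes, &key, &pov, INFINITE);
            /* process `bytes` bytes, then post the next WSARecv */
        }
    }
    #else
    /* Reactor / epoll: wait to be told a socket is *readable*, then do
       the read yourself. One scratch buffer serves every socket. */
    #include <sys/epoll.h>
    #include <unistd.h>

    void epoll_receive_loop(int sock)
    {
        int ep = epoll_create1(0);
        struct epoll_event ev = { .events = EPOLLIN, .data.fd = sock };
        epoll_ctl(ep, EPOLL_CTL_ADD, sock, &ev);

        struct epoll_event out[64];
        static char buf[4096];                 /* shared scratch buffer */
        for (;;) {
            int n = epoll_wait(ep, out, 64, -1);
            for (int i = 0; i < n; i++)
                read(out[i].data.fd, buf, sizeof buf); /* IO happens now */
        }
    }
    #endif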
This is exactly what I'm talking about. The attitude is that overlapped IO is "better" and screw you if you don't want to redesign your existing software to use it.
First, here's a real example where it isn't better. You have some app which holds a thousand-odd sockets and receives packets infrequently. With epoll you need one buffer for when the odd packet arrives, regardless of which socket is ready. With overlapped IO the amount of buffer memory you need is greater by a factor of a thousand, because each idle socket still requires a buffer to be allocated to it.
But that's not really the point. Having overlapped IO available is fine -- it is better for some applications. The problem is not having epoll, not only because epoll is sometimes better, but because it makes portability unnecessarily difficult. In most cases the performance difference between the two is irrelevant, and the important criterion is how much work I'm going to have to do to make them both behave the same way on each platform.
>With overlapped IO the amount of buffer memory you need is more by a factor of a thousand because each idle socket still requires a buffer to be allocated to it.
Use the select function then, and wait for readiness instead. No need to allocate buffers for sockets that receive messages infrequently.
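That suggestion would look something like this in C (Winsock has select() too, so the shape is the same on both platforms; `sock` is assumed connected and error handling is omitted):

    /* Sketch of the suggestion above: use select() to learn a socket
       is readable, and only then touch a buffer. */
    #ifdef _WIN32
    #include <winsock2.h>
    typedef SOCKET sock_t;
    #else
    #include <sys/select.h>
    #include <sys/socket.h>
    typedef int sock_t;
    #endif

    void wait_then_read(sock_t sock)
    {
        static char buf[4096];     /* one buffer shared by all sockets */
        fd_set readable;

        for (;;) {
            FD_ZERO(&readable);
            FD_SET(sock, &readable);
            /* Blocks until readable; no buffer is committed while idle. */
            select((int)sock + 1, &readable, NULL, NULL, NULL);
            if (FD_ISSET(sock, &readable))
                recv(sock, buf, sizeof buf, 0);
        }
    }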
epoll is a horribly designed API. It should absolutely not be replicated on other platforms.
Which bit of Windows do you feel is DOS-like? Did you ever use DOS? The DOS underbelly of Windows disappeared with the release of Windows XP over a decade ago. Windows 95 and 98 were DOS-based, or at least had DOS underpinnings to some extent I think.
From an API point of view, the Win32 API is incredibly stable and reliable, and still runs programs from decades ago. This contrasts sharply with Linux, where the APIs are in a constant state of irritating flux. From a serious development perspective, this might be why the commercial application market has flourished under Windows and why commercial Linux applications are few and far between, and (usually?) treated as a side attempt before falling by the wayside (as it appears that few Linux users will hand over cash for an application).
I do not install mingw or cygwin or any other "Linux-land" compilers/systems on Windows because it feels like I'm using the wrong system - just install Linux if I want all that!
It would be better to just use native Visual Studio in Windows-land and keep everything separate, no? Ordinary users that I am writing software for will typically have the Visual-C++ runtimes already installed so bundling of different GNU DLLs is redundant.
We are now starting a large Python/Django project. All of our software development to date, save iOS, has been done on Windows machines. It is quickly becoming a PITA to develop these kinds of projects on a non-unix platform. Seriously considering switching everyone to Mac Pros despite the cost (10 to 12 seats). Yes, there are workarounds. Not sure they solve the problem. More than willing to listen to suggestions.
I would switch to the Mac Pro, even an older tower one. They still have bonkers performance (it's my main development machine for native C++ and Windows development in a VM). But you don't feel like you're fighting with the system to do "unix" things like you do on Windows, and it doesn't feel like you're trying to fit a square peg into a round hole, which shoving all the GNU tools etc. onto Windows certainly feels like to me.
Or I suppose you could go with Linux and fight the shifting desktop sands? (That's why I got fed up with it and switched mainly to Mac OSX; despite the OSX changes, they're gradual and not insisting that we drop the dock or window behaviour etc.)
That's interesting. Still, there are a huge number of copyright holders of the Linux kernel, and some may have a different opinion on how the legalese applies.
And would all Windows developers leave? Most I bump into see Linux as a toy still, despite the ever-growing army of vocal Python developers who favour development on the platform (if you love slow GUI programs, Python sure is the way to go!)
What about just real POSIX and some kind of X interop or compatibility baked in? My (poorly informed) guess is this would allow a lot of low-hanging fruit from Linux, BSD, etc to be ported properly.
Well, no, they didn't remove it; it's been a separate component for a long time, and with Windows 8 they restricted it to Windows 8 Enterprise and Server 2012. (Though I think the Windows 7 version can actually be used in Windows 8.)
It was gaining popularity at the point where they killed it; there was this real buzz, a feeling that this really cool thing was finally spreading to more people.
> It was gaining popularity at the point where they killed it
Was it? As far as I can tell, it had been losing popularity for a long time before Windows 8, and it was rare and getting rarer that any *nix-based project would recommend using it on Windows, though occasionally you'd find people outside the main projects with recipes for making it work (or horror stories of their attempts to do so.)
I felt like it was. Things like the gentoo-on-windows project were new and lively, and that company that was backing it was getting a bigger and bigger set of packages in their repo.
Inevitably that effort would still fall short, and therefore not really matter at all. The leverage of Linux is the Linux community, and the access. Not ls vs dir, IMO. In any case, if they provided the Windows 8 experience built right on top of Linux (and all the binaries and everything still worked) it would be a marvel.
In terms of managing programs/packages, what is wrong with Add/Remove Programs / Programs and Features? I've always found these to be satisfactory. Or am I missing the point here? In terms of automatically retrieving the correct dependencies, I can see this being pretty useful, even if most programs already automatically install their dependencies or at least let you know that you don't have them. Maybe I'll find myself in .NET Framework or MS C++ redistributable hell a little bit less. The same problem, with missing or incorrect shared libraries, seems to happen just as much when I am on a Free/Net/OpenBSD system. And the workflow for installing a Windows program is fundamentally different from the *nix ecosystem; as the former is mostly closed while the latter is open, you have to be much more careful about what you install on Windows.
Maybe I am using Windows differently than a lot of other people, even if I consider myself a "power" user. I've always felt much of Windows' power and usability came from its GUI-focused experience (forgetting the maddening changes that can occur between versions).
P.S.: after writing all that, I looked at Programs and Features (on Windows 7 now) and I think the biggest advantage of something like this would be mapping out dependencies, even if I don't think I've ever deleted one by mistake. Still, it would be nice.
Installing and updating Windows programs is painfully slow and complicated compared to Linux. Instead of searching for a program and clicking "Install", you have to find and navigate to the website, download the installer, execute it, grant permissions to run, uncheck adware boxes, choose install options and click exit. For every single program. Many programs still make you do the same for every update and that's one of the reasons most users don't keep their software up-to-date.
Installing and managing software is easily my biggest gripe with Windows right now.