Although it is probably good for the companies still using Windows XP to receive those updates, I do not think patching Windows XP was such a smart move, because now those companies have no pressing need to upgrade and will keep running outdated Windows machines.
If Microsoft had not provided this update for older OSes, I believe more people would upgrade because they simply had to. Smaller hacks might not be enough for the managers of those machines to allocate resources for an upgrade, but this hack might have been just fatal enough to push them to upgrade those machines.
And we should all expect to see such a hack again. This is not a once-in-a-century event. Such costly hacks could easily happen a few times a year.
Alternatively, Microsoft should continue to support Windows XP as long as there are a large number of critical systems that rely on it. They are unable to upgrade because it would break a lot of legacy software. I believe Windows XP should have been supported for at least 50 years, considering how many people still depend on it. IMO, Microsoft has acted irresponsibly, and the decision was only driven by money. Sure, 2001 was a long time ago, but so what? What's wrong with a PC that runs the same OS for 100 years? Sometimes things work fine just the way they are. For all intents and purposes, most machines running Windows XP are fast enough. Office workers aren't running some insane virtual reality system, they're responding to emails and entering data into some Visual Basic application.
I understand that newer versions of Windows do have stronger security, so it is better if people can upgrade to a more secure version of Windows. But it would have been better if those security features had somehow been back-ported. But seriously, can the current version of Windows please be considered "finished"? Why not spend the next 50 years just maintaining Windows 10, just the way it is? Imagine if they devoted all of their resources to finding and fixing all of the possible security issues until it's virtually bulletproof, and the price of a zero-day gets to a billion dollars. I think it's a shame that Microsoft constantly releases unnecessary upgrades and tries to get people to keep buying new licenses.
> They are unable to upgrade because it would break a lot of legacy software. I believe Windows XP should have been supported for at least 50 years, considering how many people still depend on it.
It's not Microsoft's fault that people depend on Windows XP. IMHO it's the fault of companies buying hardware and software from manufacturers who are unwilling or unable to upgrade their product to run on newer versions of Windows.
To put it another way: the only reason there is a huge demand for COBOL programmers is because banks are too spendthrift to rewrite their software in more modern languages.
> IMO, Microsoft has acted irresponsibly, and the decision was only driven by money.
Welcome to the world of successful businesses. They don't do things for altruistic reasons, they do it because it makes money.
> Imagine if they devoted all of their resources to finding and fixing all of the possible security issues until it's virtually bulletproof, and the price of a zero-day gets to a billion dollars. I think it's a shame that Microsoft constantly releases unnecessary upgrades and tries to get people to keep buying new licenses.
A lot to unpack here.
1) I personally don't think it's possible for anyone to design a general purpose OS as complex as Windows that is bug free.
Just look up how small the space shuttle software was (IIRC ~600,000 LOC) and how mind bogglingly expensive it was.
2) Businesses could already avoid a lot of this if they didn't view IT as a cost centre and instead as an investment.
3) Yes Windows 10 is a privacy nightmare. But Microsoft has made real strides in OS security since XP. It's wrong to claim that all they've done is put a minimalist theme on the same old OS.
From a security perspective they absolutely are not "unnecessary upgrades"
> It's not Microsoft's fault that people depend on Windows XP. IMHO it's the fault of companies buying hardware and software from manufacturers who are unwilling or unable to upgrade their product to run on newer versions of Windows.
It is Microsoft's fault. A software customer doesn't know that a vendor is going to go out of business or get bought by a competitor that discontinues their product and promotes an incompatible alternative with seven figure transition costs.
Microsoft are the ones who worked so hard to make sure that software for Windows isn't compatible with not-Windows, creating all their own alternatives to POSIX, OpenGL, Open Firmware and everything else so that it's as difficult as possible for software compatible with Windows XP to be compatible with any Unix or Linux, leaving the user out in the cold if it also isn't compatible with Vista or later.
> To put it another way: the only reason there is a huge demand for COBOL programmers is because banks are too spendthrift to rewrite their software in more modern languages.
The reason there is huge demand for COBOL programmers is that it's more reasonable to hire a COBOL programmer than to discard a working system with a multi-million dollar replacement cost. "Just throw away everything you own and start over from nothing" is only rarely cost effective.
> It is Microsoft's fault. A software customer doesn't know that a vendor is going to go out of business or get bought by a competitor that discontinues their product and promotes an incompatible alternative with seven figure transition costs.
It is categorically NOT Microsoft's fault that software vendors are bought or go out of business. It's on the software customer to ensure that they don't get stuck with vendor lock-in.
Reverse the situation: A bunch of critical systems which run Linux 2.4 are being compromised by cyber criminals via a kernel exploit. You're going to argue that it's Linus' fault for providing such a great kernel and not supporting it forever?
Or that it's Linus' fault vendor XY who hasn't existed since 2 mergers ago chose Linux 2.4 for their product and now aren't around to provide updates for legacy software?
> Microsoft are the ones who worked so hard to make sure that software for Windows isn't compatible with not-Windows, creating all their own alternatives to POSIX, OpenGL, Open Firmware and everything else so that it's as difficult as possible for software compatible with Windows XP to be compatible with any Unix or Linux, leaving the user out in the cold if it also isn't compatible with Vista or later.
And so does every other operating system on the planet. You cannot take a MacOS binary and run it on Linux. And until very recently you couldn't take a Linux binary and run it on Windows.
You could argue that with Linux, the only thing preventing it from running Win32 or MachO binaries is that those operating systems are closed source, but this is the world we live in. If you want a "universal" binary, write it in something like Java.
> The reason there is huge demand for COBOL programmers is that it's more reasonable to hire a COBOL programmer than to discard a working system with a multi-million dollar replacement cost. "Just throw away everything you own and start over from nothing" is only rarely cost effective.
Yes, and I feel that I addressed this when I said "because banks are too spendthrift to rewrite their software"
It's a business decision. Currently it's cheaper for banks to hire COBOL programmers at obscene rates to fix their software. Eventually, there either won't be COBOL programmers, or they'll be too expensive, and the bean counters will dictate it's time to rewrite.
> Reverse the situation: A bunch of critical systems which run Linux 2.4 are being compromised by cyber criminals via a kernel exploit. You're going to argue that it's Linus' fault for providing such a great kernel and not supporting it forever?
I think there's a simpler way to think about this. Windows XP still functions for these businesses, and many of them would gladly pay an ongoing subscription fee for support. Sure, Microsoft wants to expand, but I don't see why that has to come at the expense of supporting their well-loved early-2000s release. I run software that hasn't changed substantially in more than 20 years on a daily basis, and I don't see why Microsoft has to pretend that that's not a use case.
> > Microsoft are the ones who worked so hard to make sure that software for Windows isn't compatible with not-Windows, creating all their own alternatives to POSIX, OpenGL, Open Firmware and everything else so that it's as difficult as possible for software compatible with Windows XP to be compatible with any Unix or Linux, leaving the user out in the cold if it also isn't compatible with Vista or later.
> And so does every other operating system on the planet. You cannot take a MacOS binary and run it on Linux. And until very recently you couldn't take a Linux binary and run it on Windows.
Open Firmware, POSIX, and OpenGL are explicitly not related in any way to binary compatibility. You can run unmodified OpenGL programs (with some abstraction) on Linux and OS X out of the box, and with some fiddling, on Windows as well.
It's clear that this is an argument that Microsoft has refused to support standard APIs, not standard ABIs.
How many apps are there really that run on the Win32 API of Windows XP, but not on that of Windows 7/8.1/10?
Not saying there aren't any, but nearly everything I've seen fail has been coupled to hardware drivers, where other platforms are just as bad, or to other outdated APIs, e.g. ancient versions or insecure configurations of Java. This is a big problem in IT, but I don't think it's fair to blame Microsoft more than others for it. Maybe for making as much of a mess of Vista as they did, but 7 was still available more than early enough for a slow migration.
(And at least the NHS had an extended support contract for Windows XP, but ended that at some point, despite (as far as I know) it still being available from MS for other big customers)
> It is categorically NOT Microsoft's fault that software vendors are bought or go out of business. It's on the software customer to ensure that they don't get stuck with vendor lock-in.
It is Microsoft's fault that the programs only run on Windows XP. If they had used standard open APIs it would have been much easier to port the applications to other platforms to begin with, and more of the original developers would have done so before going out of business.
If you want to argue that customers should not buy Windows-specific software you'll get no argument from me, but that is certainly not Microsoft's position.
> Reverse the situation: A bunch of critical systems which run Linux 2.4 are being compromised by cyber criminals via a kernel exploit. You're going to argue that it's Linus' fault for providing such a great kernel and not supporting it forever?
Linus absolved himself by providing source code. If you really want to keep using Linux 2.4 and patch it yourself, you can, and some companies actually do.
> You cannot take a MacOS binary and run it on Linux.
And that project is in a weak state not for difficulty but for lack of demand, because so many programs that run on one OS already run natively on both since it's so much easier to port between them than between Windows and anything else.
> You could argue that with Linux, the only thing preventing it from running Win32 or MachO binaries is that those operating systems are closed source, but this is the world we live in. If you want a "universal" binary, write it in something like Java.
> Yes, and I feel that I addressed this when I said "because banks are too spendthrift to rewrite their software"
> It's a business decision.
That's the point. How is it not also a business decision with Windows XP, which Microsoft has forced everyone to make in a way that many would otherwise not?
> To put it another way: the only reason there is a huge demand for COBOL programmers is because banks are too spendthrift to rewrite their software in more modern languages.
I heard that medical devices are approved (by whatever regulatory body) to run an _exact_ set of software. As such, even applying a security update invalidates the approval, because it could potentially introduce bugs that place a patient's life at risk.
Not sure that those systems are connected to the Internet.
The software affected in the NHS was mainly in GP surgeries and admin offices to do with scheduling of procedures. Not directly life-threatening in the sense of a machine going wrong but indirectly damaging in the sense of procedures and consultations having to be cancelled, records not being available &c.
Looking decades into the future, I'm wondering: web applications plus fairly basic, Chromebook-style clients for the scheduling/records stuff.
From a security standpoint the best thing is to stop adding features and fix security issues as they are found so the system gradually becomes secure.
The upgrade treadmill is good for Microsoft's revenue stream, but bad for customers who already have a working system. New != better, just different. I guarantee you that W10 has just as many security holes as XP; they just haven't been found yet because it's so new.
Many of the existing XP computers are part of big monolithic systems that were decentralized so Microsoft could sell more licenses.
Think networks of retail stores using hundreds or thousands of PCs for their point-of-sale. Or government agencies like the DMV with locations in every rural county.
>It's not Microsoft's fault that people depend on Windows XP.
When XP was released, Microsoft was unarguably a monopoly. Monopolistic businesses have responsibilities that other, non-monopolistic businesses don't. The law is vague, but the FTC could reasonably force Microsoft to continue to patch security problems in Windows XP forever. After all, the defects to which the support period refers existed within that support period, regardless of when they were proven to exist.
> That means the opposite of what you want it to mean.
Thanks! I actually didn't know this. Do you have a suggestion of a better term for the behaviour I'm trying to describe? I was trying to be more elegant than "cheap"
> It's not Microsoft's fault that people depend on Windows XP.
It absolutely is. Not releasing the Windows XP successor in 2004, as was promised to customers and shareholders, is their fault. Microsoft only released a successor six years after XP's release, three years later than promised. And it doesn't help that Windows Vista was a disaster at its release.
> Sure, 2001 was a long time ago, but so what? What's wrong with a PC that runs the same OS for 100 years?
My thoughts exactly. If you look outside of computers/tech, at other large pieces of infrastructure like electrical and water supply systems, you'll find plenty which are much older and still in active operation. 2001 was only 16 years ago. I have many things older than that and probably "unsupported", yet continue working with only occasional maintenance.
Thus, as computers become more integrated into our lives and a part of the infrastructure, it makes sense that they should stop "moving forward" and settle down stably at some point, yet all I've seen is the complete opposite. In fact, I'd say instability in infrastructure is immaturity --- to take an example, in the early days of electrical power, there were plenty of incompatible sockets, voltages, and frequencies. Now we have settled on a small set of standards, which have been nearly the same for the past decades.
> I understand that newer versions of Windows do have stronger security
The flip-side is that some of this security isn't really helping the user, i.e. DRM. Newer version of Windows also come with other unnecessary/unwanted features like telemetry (possibly with their own vulnerabilities), regressions in UI, etc. In other words, if you upgrade you could be more secure from remote attackers and get some new features you actually want, but you're also giving away more privacy to Microsoft and moving toward a more locked-down-against-you ecosystem. It's not all positive.
If only Linux and WINE, or even ReactOS, were in a more usable state...
I'd be a fan of Microsoft providing 3 or 5 years of support to business for free, then offering increasing levels of paid support for the next 5 and 10 years. It both preserves the revenue Microsoft depends on to operate, and gives businesses choices to keep operating their legacy systems, albeit at increased cost.
At a certain point, it's cheaper for businesses to rebuild their solutions with the latest tech Microsoft would prefer they are on, so they can re-license, rebuild their solution with modern tech, and start enjoying the benefits that come with free support for a new licensing period.
Windows XP wasn't designed for a world of Internet threats, and I think it would have been difficult for Microsoft to continue supporting it in a production environment while preserving compatibility with existing features.
Windows 10 is built for the modern world of threats. Microsoft sells Windows 10 Enterprise LTSB (Long Term Servicing Branch) with guaranteed support for 10 years to address many companies' needs. Everybody has learned from XP: Microsoft has learned that long-term support is needed, so they innovated their support model, and companies have learned that they can't expect an operating system to last 15+ years in a connected world. Hopefully, with new support options and more reasonable expectations from business, Microsoft's LTSB support model will address everyone's needs.
Windows XP was most definitely deployed in a world with constant Internet threats! That it was not designed to address the threats is probably more of a misprediction on their part.
Let's not conflate something "working" with something able to receive support from the manufacturer. If I have a car from the 1960s, I'm sure it will work, but beyond the legality of some components, if something breaks, if the manufacturer even exists anymore I would be lucky that they would help support it, and if they did, I would have to pay a large sum of money for the support.
If a customer wants to use Windows XP, they can, but if a customer wants support for it, they can pay Microsoft a lot of money for it (and I think some actually still do).
Besides, don't forget that businesses do need to make money. Are you suggesting that Microsoft should have supported/should support Windows XP forever, for free?
> Imagine if they devoted all of their resources to finding and fixing all of the possible security issues until it's virtually bulletproof, and the price of a zero-day gets to a billion dollars.
The trend in security has lately been toward isolation (sandboxing, etc.) instead of fixing security issues as they come up. Newer versions of Windows have more isolation features. Setting up a sandbox on Windows XP is a nightmare, and there's no way to shut off access to FAT drives or win32k.sys, etc.
> I believe Windows XP should have been supported for at least 50 years, considering how many people still depend on it. IMO, Microsoft has acted irresponsibly, and the decision was only driven by money.
I don't think you understand how much developer effort is required to maintain such an old OS. In addition, those developers need to get paid, and since MS is in the business of making money, supporting software that massively burns cash (I assume the maintenance is not cheap) would be blatantly foolish.
> Why not spend the next 50 years just maintaining Windows 10, just the way it is?
If I understand it correctly, this pretty much is the plan with Win10. It's supposed to be the last Windows OS.
I think you must differentiate between a general purpose OS and a single-purpose OS.
Why isn't XP good enough to run a map kiosk in a mall? Or flight arrivals screen in the airport?
Face it, the needs of many single-use applications were solved decades ago... the need to constantly undergo the endless forced upgrade cycle wastes a tremendous amount of human effort.
> Why isn't XP good enough to run a map kiosk in a mall? Or flight arrivals screen in the airport?
It is, and the stripped-down versions of XP that Microsoft sells to run kiosks still get updates [1]:
> Windows Embedded Standard 2009. This product is an updated release of the toolkit and componentized version of Windows XP. It was originally released in 2008, and Extended Support will end on January 8, 2019.
> Windows Embedded POSReady 2009. This product for point of sale devices reflects the updates available in Windows Embedded Standard 2009. It was originally released in 2009, and extended support will end on April 9, 2019.
However, MS makes it very difficult to acquire and manage those products. Generally speaking, you must buy their embedded products with a motherboard/CPU purchase from an authorized vendor.
MS's business strategy basically ensures that a whole class of "single purpose" customers can't or won't buy the way MS wants to sell.
If you try to mandate that the mall buy your special (expensive) motherboard/XPe combo, you will generally make no sales. The default therefore becomes that your customers just buy "whatever computer matches the specs" and run that. Hence you wind up with tens of millions of devices that aren't supported anymore.
At work, our automation people have one or two customers with industrial plants where the PCs connected to the PLCs still run XP, because they use some very specific piece of hardware whose vendor went out of business, so there are no device drivers for more recent versions of Windows.
(Hell, I've been at one plant where the PC side of the automation still ran on NT 4.0! The customer is reluctant to replace that machine because the interface to the plant automation is a freaking ISA card, and it has become rather difficult to procure a motherboard with an ISA slot.)
It's easy to tell people to update, but in some cases it's just not that simple. (OTOH, the plants I know about are not connected to the Internet.)
And don't get me wrong, I totally agree that it would be better in so many different ways if these machines were upgraded. But some users basically have no choice.
I sadly suspect if a large corporation with a very small ethics department or a state actor wanted to subvert these networks, they would have very little trouble doing so. Another comment mentioned Stuxnet which proves that point rather well.
Fortunately, these plants I am talking about are food plants (yoghurt, pudding, and such), so the risk of some foreign government wanting to shut down that plant is rather low. ;-)
And the risk of becoming infected by drive-by malware is contained by not letting these machines talk to the Internet.
(There is one connection to the regular corporate network, which does have Internet access, to tell the ERP system how much of each ingredient is left so the Purchasing department will order new ingredients on time. But in my benevolent imagination that connection is one teeeny-tiny hole through a humongous firewall.)
Agreed. Still not bulletproof (see Stuxnet, which transferred to PLCs over USB sticks), but you're good against classical malware (as in, not developed by a state actor).
> ...PLCs still run XP, because they use some very specific piece of hardware whose vendor went out of business, so there are no device drivers for more recent versions of Windows.
There are alternatives, but many companies don't have the budget, don't want to invest, or don't know that these drivers can be rewritten for new platforms through reverse engineering.
I don't know if you know much about the automation industry, but reverse engineering a driver so you can run it on a newer PC would be an unacceptable risk. Are you confident in your driver being able to safely control the industrial equipment? A coal train whose downtime costs thousands of dollars a minute? Are you willing to take the blame or deal with the lawsuit if something goes wrong? The other side of the coin is that some companies do have the budget and can invest in newer technologies, but the cutover to new equipment is downtime they cannot afford - so be it if there's a risk of larger downtime in the future.
Risk has a price. I am taking this kind of risk for several projects. I understand that the customer doesn't like to take the risk, even when they have the budget.
Even so, there are formal ways to show you are solving a problem with a certain probability. These kinds of projects require extensive testing.
Some companies have no choice but to run old systems.
For example, in the labs I use there is plenty of equipment hooked up to PCs running Windows 95 and Windows 98. The equipment works fine. The proprietary software for the controller and the logging software is old, and there is no version that works on newer systems. The only option is to buy a whole new kit, which is a stupid waste of money. So we just use these standalone Win 95/98 computers instead.
And that's fine... as long as you don't connect them to the network and expect them to interact with other clients. And that's often the case for these standalone machines and tools.
The problem we're having in the real world are all of the companies and schools that have 15 year old machines that they use for reading email and filling in spreadsheets that are connected to the internet because their corporate IT was a guy that liked computers in high school and was hired with zero training.
Shame too, since almost all of those cases would be well covered with Linux distributions, or now even Chromebooks.
In a lab that does clinical work, replacing a single instrument could mean having to re-validate all the tests the lab does, update protocols, get inspections from regulatory bodies, etc... it requires people to spend large amounts of time not doing their "regular" jobs.
Designing a system that is secure is part of the job. Accurately estimating the total cost of ownership is also part of the job.
Ultimately project management and ownership is responsible, but they won't be interested if we don't make our expectations clear.
Downvoting these sort of opinions is just saying it's too expensive to secure some things. To me that means we can't afford to computerize some things in the first place, at least not with the chosen tech stacks.
It depends on how you use the machine and how cautious you are. I ran an un-updated XP machine for 10 years and was never infected by anything.
OTOH, the people in the other departments (running up-to-date machines) fell multiple times for malicious email attachments, while our developer department was never infected despite the fact that we download stuff from the Internet on a regular basis (be it libraries or tools) - stuff that doesn't even need sneaky means to get executed. Yet we are in theory subject to the "no software installation without permission from the IT dept." internal rule.
So if those machines in that lab don't run an email client and are not used to browse the Internet, they are actually quite safe. The only threats that remain are worms spawning from local infected machines or infected USB pen drives.
The thing is, security can quickly become an unhealthy topic. It's so damn easy to FUD people.
My "new" Win7 machine has this "security advisory" that pops up every time I copy a file from or to a network drive, saying "this file can damage the computer", even when it's a freaking text file - and I do that all the time. (BTW, imagine what mental model of security this creates in the minds of non-technical people - it's not protection or education, it's fearmongering.)
So I went to disable this warning but I then paused for a moment, thinking - what if one day I make an actual mistake and get infected? Will I be blamed for disabling it?
It's so damn easy to say, "if you don't follow this PITA security measure, you will be held responsible for the consequences". I admit that like many I would submit to that. There's no point in gambling my job on this after all, and I have better things to do.
I think the cyber-security topic needs to be sanitized. And the first thing to do would be to tell weekend security consultants, who don't understand a thing about security contexts and threat assessment, to keep quiet a little, so that the people actually in charge of cyber-security can listen and learn from actual experts.
You seem to be reading way more into my comment than was intended.
I'm just saying it's not cheaper to ignore technical debt. It's actually a bug somewhere in the operational budget or business plan.
If the system should run for thirty years without an OS upgrade, design and budget for that up front. I promise that no one considered that when they configured a Windows XP box and threw it in a lab somewhere. And, hey, maybe that's OK in the short run. But there was no budget or business plan to replace those systems down the line either.
So the real problem was supporting/buying such equipment/software in the first place, when it only runs on proprietary software (Windows). Isn't it pretty clear that at some point the support will end and you'll end up with such problems? If it ran on Linux, you could at least use a newer Linux and it should still work (although maybe with problems). An even better solution would be if the controller software itself were open; then you could maintain it yourself and port it to newer platforms. Don't support manufacturers who give you only closed software. If you do, then that was your choice, and you should accept that it will become useless at some point and you will end up with such problems.
Yes, but there isn't a choice. You are imagining products which don't exist. Lab hardware is generally very specific and there is usually only a handful of machines suitable for a given application, all of which run out of date software even before they are purchased. The best machine our lab has for running gels appears to be from the 1980s and is using 5 1/4" floppies. Some clinical trials run for 5 years or longer - the equipment needs to be standardised for that period of time. At the moment our IT disconnect it from the network and they are used standalone, which seems sensible and would have been useful in this case.
Yes, there is a choice. You can just not buy from them. If there is no such manufacturer, then once people understand that it's valuable to have hardware/software they can maintain themselves, such a market will be created. You could also build the hardware yourselves. But right now, if you ignore this and just buy the hardware anyway, the problem will never be solved; it will always continue to be this way. It's wrong to say that you have no choice. You accept how it is, and that is your choice, and by accepting it you are supporting the status quo.
> If there is no such manufacturer, than if people understand that it's of value to have hardware/software which they can maintain themselves, such market will be created. You could also build the hardware yourselves.
I'm afraid most of the value is probably in the lab hardware itself, not in FOSS-stack fetishism. As such, you'd need to figure out how to ship better and/or cheaper lab hardware to gain market share, when an entrenched provider has already learned how to ship state-of-the-art hardware. If you really believe market forces will value FOSS for FOSS's sake that highly, perhaps you'd like to start that hardware startup yourself? You make it sound so simple... perhaps you already started one? But for most, I think the Hobson's choice you're actually proposing is between:
1) Continuing to do lab work with the proprietary hardware
2) Letting someone else replace you when you switch careers to work on your new hardware startup. Your replacement is unlikely to care as much about FOSS as you (after all, you were willing to quit over it!), and even if they do, their replacement likely won't. The perverse result of this is a decreased demand for FOSS lab equipment, not increased demand!
You might increase supply of FOSS lab equipment. Or (and I think this more likely) you'll go bankrupt before moving the needle. Speaking for myself, I can't convince myself of the untapped market potential of this niche, I don't know hardware, I'd be going up against incumbent experts, I'm not passionate about lab equipment... if I somehow acquired investor funding under these circumstances I would worry I had conned them more than I had convinced them of the merits.
Some "choice".
Better to keep doing lab work if that's what you enjoy, and perhaps agitate for less proprietary FOSS options if that's something dear to your heart. Maybe you'll cure cancer and save the lives of future developers of FOSS lab equipment.
> If it would run on Linux, you could at least use a newer Linux and it still should work
No, because if the same guys were coding for GNU/Linux, it would mean a closed-source binary compiled against a very old C library that most likely wouldn't even start on a modern GNU/Linux.
Or it would try to access kernel features, drivers, or pathnames that no longer exist in modern distributions.
Using *BSD or GNU/Linux for laboratory hardware doesn't mean that the code is made available, or, even if it is, that it is cost-effective to pay someone to port it.
Many of those XP systems actually have code available, at least in life sciences; the labs just don't want to pay to port the code to new systems.
I think most C libraries are binary backwards compatible. But even if not, you could just use the old C library for that application and it should work - I don't see why it wouldn't.
Also, the Linux kernel aims to always stay backwards compatible at the userspace boundary.
You could also restore any missing files.
Even if that is all too complicated, you could just use an old Linux distribution or kernel and backport patches yourself, although I'm not sure whether that is easier.
What I'm saying is, it's good to have the choice and the option to do that.
Of course, like you say, if you have that option but just don't use it because it takes time/money, then that is your choice.
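For dynamically linked binaries on glibc systems, this mostly comes down to symbol versioning: a newer glibc keeps exporting the old versioned symbols, so old binaries load, but a binary built against a newer glibc won't load on an older one. A toy Python sketch of that one-way check (illustrative only - a real loader reads the ELF `.gnu.version_r` section rather than strings like these):

```python
def glibc_version(tag):
    # "GLIBC_2.3.4" -> (2, 3, 4)
    return tuple(int(part) for part in tag.split("_", 1)[1].split("."))

def binary_loads(required_symbols, installed_glibc):
    """A binary loads only if every versioned symbol it requires is
    provided by the installed libc.  Newer glibc still ships the old
    versions, so the compatibility is backwards-only."""
    installed = glibc_version(installed_glibc)
    return all(glibc_version(tag) <= installed for tag in required_symbols)

# Old binary on a new system: fine.
print(binary_loads(["GLIBC_2.2.5", "GLIBC_2.3.4"], "GLIBC_2.35"))  # True
# New binary on an old system: the loader refuses ("version not found").
print(binary_loads(["GLIBC_2.38"], "GLIBC_2.17"))                  # False
```

This is also why "just keep the old libc around" can work for the dynamic case, while static binaries fail for different reasons (changed syscalls and filesystem layout), as the reply below this comment points out.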
> I think most C libraries are binary backwards compatible. But even if not, you could just use the old C library for that application and it should work - I don't see why it should not.
If dynamic linking was used, the entry points might have moved, the syscalls might have changed, and bugs that were being taken advantage of have been fixed.
If static linking was used, the calls into the kernel might have changed, or the data read from /etc, /dev, or other expectations about the target OS.
So I really, really doubt you can take a static executable compiled against kernel 0.x.y, using the a.out format, and execute it on Ubuntu 16.04 LTS, for example.
Actually, I just need to dust off my Walnut Creek CDs to prove my point.
Nice how you avoided the issue of running code targeted at 0.98.1 (the latest kernel release in 1992) against kernel 4.4.z.
Not to mention how disparate the file system structure, including device drivers, of something like Yggdrasil Linux 1.0 is compared with modern distributions.
As I said, this is easily verified by dusting off the Walnut Creek CDs and picking a random binary from them.
If that were the case, it would be easy to upgrade the kernel on old Android phones -- in fact, lots of people put lots of work into supporting older phones with newer kernels, and it's often impossible due to closed-source driver blobs (exactly the same problem that usually strikes Windows).
XP is 16 years old! Microsoft is in a tough spot, this was a worldwide problem so they had no choice.
They should make the support contracts for old versions quadruple in price every year and offer incentives to upgrade. Better for them: more revenue and less bad press for MSFT.
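Quadrupling gets brutal fast. A quick sketch of how such a fee schedule would grow (the $100k base and the 4x multiplier are made-up numbers for illustration, not Microsoft's actual custom-support pricing):

```python
def support_fee(base, years_past_eol):
    """Hypothetical support fee that quadruples for every year
    past the end-of-support date."""
    return base * 4 ** years_past_eol

for year in range(6):
    print(year, support_fee(100_000, year))
# By year 5, the hypothetical $100k contract costs $102.4M --
# at some point upgrading is cheaper than any migration pain.
```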
Upgrading from XP gives no benefits (for me), the PC would only work slower (and would get telemetry spyware). I wish Microsoft stopped developing new OSes with useless (for me) features and instead fixed bugs in older systems.
The only good things added in Windows 7 were the search field in the start menu, and the built-in firewall becoming able to filter outgoing connections. But it is not worth bothering with upgrading the system.
How about having an up-to-date browser? Firefox stopped supporting XP with v52, and Chrome with v50.
There were some problems opening Let's Encrypt-secured websites on one of these browsers after it dropped XP support; I can't remember which one.
"once again"? There have been thousands of reasons to update already; this is just one piece of malware in a giant field. These companies don't care, and in many cases it doesn't make sense to care, because the device isn't even capable of direct internet access and is just embedded XP.
They're not offline, they're just not directly connected. Instead, they're on an in-house network which relays the info back over VPN, or they can only connect to specific addresses over VPN.
> I believe more people would update because they simply had to.
In an ideal world, sure. However many organisations run older machines because of large-scale investment in now-defunct technology, which they can't afford to re-invest.
When hardware vendors leave their customers in the lurch to force them to replace perfectly good equipment, we call it "planned obsolescence" and deplore it. When software vendors do it, it's business as usual.
It's pretty clear that this completely 100% predictable and predicted attack - open source and free software people have only been talking about the incredibly obvious dangers of a proprietary IT monoculture for 20 years now - is such a damaging "welp, Stallman was right again" moment that Microsoft didn't really have much choice.
Despite my being a big fan of open source software, this does not make a difference. Microsoft released patches for newer Windows systems relatively quickly (in March, if I recall correctly).
I bet there are still tons of systems that suffer from Shellshock or Heartbleed because they are either not updated at all or running old Linux versions which are no longer supported (I bet there are still tons of RHEL/CentOS 2, 3, 4, and 5 boxes which no longer get security updates and whose companies do not have extended support contracts).
The real issue is that people are afraid of updates because they tend to break things. They do not want to invest in "slow rollout strategies" and the like.
If updates were applied immediately to 10% (or maybe even less if the company is big enough) of all machines, and if there were a way to quickly roll back the update, there would be fewer problems and the consequences of failed updates would be less severe. This way you can have your systems up to date within 48h: maybe 1% of 'key users' who do not freak out if things break, then after maybe 4h, 10% of normal users who can call IT support to roll back the update, and after 24-48h, all PCs. This would be even easier for stateless servers, because you could redirect all requests to other servers with zero downtime if the 10% fail.
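The staged rollout described above can be sketched as a simple wave scheduler: cumulative fractions of the fleet, where each wave contains only the machines not yet covered, so a bad update can be rolled back before it reaches the next wave (fleet names and fractions here are illustrative):

```python
def rollout_waves(machines, fractions=(0.01, 0.10, 1.0)):
    """Split a fleet into cumulative rollout waves: e.g. 1% key users
    first, then 10% of the fleet, then everyone.  Each wave lists only
    the machines newly added at that stage."""
    waves, done = [], 0
    for frac in fractions:
        upto = min(len(machines), max(done, round(frac * len(machines))))
        waves.append(machines[done:upto])
        done = upto
    return waves

fleet = [f"pc-{i:04d}" for i in range(1000)]
print([len(w) for w in rollout_waves(fleet)])  # [10, 90, 900]
```

In practice each wave would be gated on a health check (no spike in support calls or crash reports) before the next one ships.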
I know people that still run OpenSuSE 11.2 on their internal systems. No amount of "open source" will make customers modify what they see as a working system, especially not when any upgrade comes with a week of integration followed by tests and bugfixes to ensure flawless operation in production. The difference between open source and proprietary is, in this case, that Windows XP exists and works well enough while GNU/Hurd doesn't, and therefore will never have an outdated version in use.
I'm a FOSS supporter, but do you think FOSS systems are always updated on time? Or that distributions released in 2001 are still updated? Or that companies running critical systems on old FOSS systems always go through the pain of updating?
Just because the update is free, doesn't mean there's no risk associated with it.
I completely disagree with your reasoning, and frankly find it downright infuriating. It feels straight up partisan and political.
Isn't it entirely plausible an attack of precisely this sort could occur in a world where Linux (or macOS, or templeOS, or whateverOS) is the go-to desktop OS? Isn't Windows the preferred target for attackers because of its ubiquity? How in the world would this be mitigated by "open source"?
It seems that a root problem is not the proprietary OS, but the proprietary and abandoned drivers, hardware management tools, and patient record systems the obsolete OS is required to support.
Open source might be part of the answer to this, or some kind of legal 'right to migrate'.
If all of your patient records are in some ancient software, the new vendor would probably be happy to get them out again if there were documents or a codebase saying how.
If you need XP to run ExpensiveScannerManager95, and you had a legal right to get the code somehow, I'm sure you could find an SME that would port the driver to Windows 10.
Maybe we / our companies and governments need these legal rights now. But what exactly should they be?
> Open source might be part of the answer to this, or some kind of legal 'right to migrate'.
It isn't. Everyone who has had to decide which version of a distribution to run should know this. It's fine as long as you run the newest or don't need new things. But once you need something specific, and especially once you start installing things outside the package manager, things go downhill quickly.
I wish people wouldn't use this argument in favor of open source, because if you make institutions choose between open source and proprietary solutions based on "updates", it's app stores, cloud software, and subscriptions that will win.
>>Everyone who tried to decide over which version of a distribution to run should know this
I have used Linux as my primary operating system for more than 15 years, I have been using Arch as my primary distribution for more than 5 years. I do not know this.
>It's fine as long as you run the newest or don't need new things.
So which is it: am I fine if I run the newest, or if I do not need the newest? Your statement is a contradiction.
>But once you need something specific and especially once you start installing things outside the package manager things go down hill quickly.
No, not really... I install things all the time outside the package manager; of course, I know what I am doing, so...
>because if you make institutions choose between open source and proprietary solutions based on "updates" it's appstores, cloud software and subscriptions that will win.
How so? App stores do not solve the lock-in problem the OP is talking about; if anything, they make it worse.
It's not very appealing to respond when you don't give any reasons. I work with making and maintaining Linux distributions for enterprise, and previously embedded, systems (including desktop). We commission open source work, buy 'support' from major vendors and upstream our own changes. I don't share your views and judging by the development in things like e.g. configuration management I don't think I'm alone.
> So which is it: am I fine if I run the newest, or if I do not need the newest? Your statement is a contradiction.
I don't see the contradiction, maybe I didn't express myself very well. The problem is when you mix old and new software and distributions. As long as you run a single release (old or new) and all software is for that release you're fine. When you have to deal with many different versions of third party software, libraries, interpreters, shells, build systems etc. is when you run into problems. Just like in the case with "ExpensiveScannerManager95".
>>I don't share your views and judging by the development in things like e.g. configuration management I don't think I'm alone.
How does the development of configuration management tools for Linux support any of your statements? I fail to see the connection. Linux has needed enterprise configuration management tools for a while; it is one area where Windows is better, as there are many, many configuration management tools for Windows.
>>maybe I didn't express myself very well.
I think this is true, because I still do not understand:
1. What you are really saying.
2. Why you believe Windows is better at any of these things than Linux.
3. How it is relevant to what we are talking about.
Yes, when you mix old and new things you may have problems, depending on the system. I however maintain that you have FEWER problems with Linux than you do with Windows; having managed both in a large enterprise environment, Windows is a finicky, broken system that does not play well with anything.
I spend the majority of my time fixing broken shit on Windows. The idea that Linux is worse is laughable.
While I agree open source is the answer, Linux as it stands isn't -- which we can see from the mess which is abandoned and un-upgradable Android phones. I'm not saying it's Linux's fault, but it certainly hasn't proved to be a magic solution either.
In the case of Android, things like drivers or built-in software are often closed source, which prevents the community from fixing them or porting them to newer Android versions.
While these attacks are not impossible on Linux/BSD, there are inherent weaknesses in the design of Windows, especially Windows XP/2003, that make these attacks more probable.
Also, due to the nature of Linux being a monolithic kernel and open source, there tend to be fewer backward compatibility issues, making it easier to update systems where today companies refuse to update Windows because it is not compatible with older hardware/software.
In fact, Linux often has the reverse problem, in that hardware support for new technology lags behind because hardware vendors focus on Windows first.
One argument for Linux here is that people in the know could have patched it themselves and recompiled the kernel or userland utility causing the problem. Or people after the fact, without having to wait for Microsoft. With Windows, you get what you're given, when they want to provide it (essentially).
This has almost nothing to do with proprietary vs. open source. A patch for the vulnerability exploited here had been available for months.
The real problem is that organisations had devices sufficiently connected to be vulnerable that had not the patch applied. That leads to questions about software update policies within those organisations, and that in turn leads to some quite difficult questions about regulated medical devices and how they are supplied and maintained.
I don't think Linux distributions from 2001 are receiving security updates today. The only thing going for a free(as in beer) OS is that upgrades are free, but the main reason that so many corporate systems are still on XP is compatibility, not the cost of the upgrade license which is peanuts for large orgs affected by this. And Linux distros from 2001 would still have the exact same problem.
Consider the fact that Windows has the best backward compatibility in the business, while even drivers break across relatively minor Linux kernel versions and compatibility is likely to be a bigger problem with Linux.
>Consider the fact that Windows has the best backward compatibility in the business,
That is a complete and utter myth. Windows has terrible backwards compatibility; changes to the Windows Driver Model and other changes require drivers and software to be completely rewritten between generations of Windows.
>while even drivers break across relatively minor Linux kernel versions and compatibility is likely to be a bigger problem with Linux.
Where do you get this? Drivers are included in the Linux kernel; it is impossible for a driver to "break" across minor versions of Linux - if a driver breaks, the kernel fails and is not released.
>Where do you get this? Drivers are included in the Linux Kernel, it is impossible for a driver to "break" across minor version of Linux, if a driver breaks the kernel fails and is not released.
Linux comes with a limited set of device drivers in the main source tree, just like Windows' bundled drivers. Most of this thread is about rare medical equipment or proprietary drivers/programs from companies that have gone out of business.
Also, the Linux kernel's internal ABI routinely breaks drivers, unlike Windows, where that happens much more rarely.
I think that assumes you believe Windows driver problems are rare. Every time I get a new model of computer or hardware, I have to spend many, many hours testing, finding, and packaging drivers to make sure the new hardware plays nice with our system and deployment tooling and does not break other shit.
I'd like to see the set of sysadmins that belong to both the group that is running a distro from 2001 and are willing to manually update the kernel on those systems to a later one.
Yes, the Linux kernel being what it is makes for good/great backwards compatibility.
But that's so far from the point it's not even funny. This is about update policies and internet security at the organisations involved.
It was reportedly available to those who were still officially supported, though, possibly as far back as February.
As others have suggested, Microsoft has historically offered support (in the sense of at least security patches) for each generation of Windows for much longer than any of the major FOSS operating systems. Obviously you don't get free, unlimited, eternal support with any version of any OS, but even then Microsoft has apparently made arrangements with those who really didn't want to update to a more recent one than XP to continue offering support in return for additional funding.
As I said before, the real problem here is how to deal with the conflict between wanting to keep connected systems up-to-date with security patches, while at the same time not breaking their essential functionality. Medical systems used in regulated environments where failures may literally be a matter of life or death are pretty much the ultimate example of this difficulty.
> Additionally, we are taking the highly unusual step of providing a security update for all customers to protect Windows platforms that are in custom support only, including Windows XP, Windows 8, and Windows Server 2003.
It's impressive that they had the patches, but chose to put the security of their customers below making more money by forcing said customers to upgrade.
Their (ex) customers placed their own security on a lower level themselves. WinXP has been unsupported for quite some time, and users were given plenty of time to switch or upgrade.
What's impressive is how people manage to spin this into an anti MS rant even when Microsoft is doing the right thing (stopping support for paleolithic software).
Windows 8 was shipping on brand new computers three years ago.
And while upgrading your OS is nice in theory, it often means abandoning perfectly-good hardware because driver support for multiple versions of Windows is terrible. In this age of barely-getting-faster CPUs, how long do you think a piece of hardware should be usable?
> Windows 8.1 Update is supported until 2023 AFAIK.
Talking about 8, not 8.1
> Purely out of interest what kind of hardware is incompatible between the two versions?
Many drivers tend to break as Windows goes through changes. When I tried a preview of the Windows 10 Creators Update, I had a driver stop working right. There were also serious nVidia problems that broke the old driver versions; imagine that happening with your typical company that stops providing drivers after a year.
Though the driver comment wasn't specifically about 8. Good luck upgrading XP to not-XP, even if your hardware can outperform a modern Surface.
Windows 8.1 Update is basically a roll-up update service pack to Windows 8 and is a free upgrade. That's like complaining that the released patches work on XP SP2/SP3 but not the original XP.
If someone fails to update Windows 8 to 8.1 Update or has magically written software that works on Windows 8 but not 8.1 Update, it's on them.
>Many drivers tend to break as windows goes through changes. When I tried a preview of the windows 10 creators update I had a driver stop working right. There were also serious nVidia problems that broke the old driver versions; imagine that happening with your typical company that stops providing drivers after a year.
Those drivers are typically either using bad coding practices or relying on unsupported features or bugs. Maybe you should contact the vendors as a customer and make your displeasure known. If enough people do that, they may actually do something about it.
1) Drivers break because they were poorly written, not because of Microsoft. (I know it because I write drivers)
2) Windows barely ever changes. The lengths through which the MS devs go to preserve compatibility are insane. It's incompetent driver writers who write buggy code who should take responsibility. If you were even minimally familiar with the Windows API, you'd know. And if you aren't, here's a quote from a Windows developer:
"I could probably write for months solely about bad things apps do and what we had to do to get them to work again (often in spite of themselves). Which is why I get particularly furious when people accuse Microsoft of maliciously breaking applications during OS upgrades. If any application failed to run on Windows 95, I took it as a personal failure. I spent many sleepless nights fixing bugs in third-party programs just so they could keep running on Windows 95."
This is more like a pharmaceutical company taking a financial hit to help in a sudden epidemic by giving out antidotes/vaccines for free, for the sake of public interest and the ecosystem.
Just because they had the vaccines ready in warehouses or could manufacture more easily doesn't mean that their customers "deserved" them for free before the epidemic hit.
If the customers actually desired security, they would've paid for XP/2003 patches or upgraded to a different supported OS. Those customers messed up on their own, and Microsoft is giving them an out here.
I agree with you broadly but your analogy is flawed because patches have zero marginal cost (once developed, code can be infinitely duplicated at no additional cost) whereas vaccines are physical entities and therefore giving one to somebody entails not giving that same item to somebody else. If Microsoft had already developed these patches why not distribute them widely, since it costs nothing to do so and takes nothing away from paying customers?
> Just because they had the vaccines ready in warehouses or could manufacture more easily doesn't mean that their customers "deserved" them for free before the epidemic hit.
They can't force XP etc users to upgrade. Custom support accounts probably got the patches some time ago. This release is for all users of XP etc. I'm sure that most of them have no support contracts. And many are likely running pirated versions. Maybe this is PR-driven. But I can't imagine how it directly makes money for them.
You can't support everything forever, even with the kind of resources Microsoft has available. At some point you've got to tell people to upgrade off the old shit, we're not supporting it anymore (unless you pay us exorbitantly to do so...)
They're still providing these to Windows XP Embedded customers like those running it in ATMs, POS systems and industrial control systems until 2019. It's probably been out since march like the other OSes for those users, so they just released the same patch to everyone else.
I surely won't be doing it myself, but I can imagine some spook making this small personal sacrifice of becoming an employee at some Windows XP shop just to smuggle patches to his mothership for vulnerability analysis.
I hope that the fact this patch was signed in February doesn't imply that it was published in February and available to every semi-competent cyberwarfare unit in the world.
QA maybe? Imagine the shitstorm headlines Microsoft would get if they managed to accidentally brick every Windows XP computer in the field with an automatic security update.
In my opinion, hospitals should never run any kind of software that accesses complex protocols locally. They should run everything, except real-time critical devices, as virtualized remote applications, so that security is ensured at the data-center level instead of at the client level.
The "complex protocol" is actually "plain" Windows file sharing, which is practically always enabled in local networks. That's how computers in a local network are traditionally supposed to be maintained.
I've managed a bunch of computers which weren't configured that way, but configured like that you lose everything Microsoft created for managing computers on a local network -- to be effective you'd have to maintain and develop your own tools, which most companies wouldn't like you to do. If the users are supposed to do normal work on the given computers, not enabling file access is much harder to achieve.
I know there are "everything virtualized" approaches, but they are really expensive.
Windows file sharing (SMB/CIFS) is complex compared to delivering remote applications over HTTPS (TLS/HTTP) through a single port, which is much simpler and easier to secure. So compare: local applications running many remote client protocols, versus remote applications over HTTPS (TLS/HTTP) with all the complexity on the server side (e.g. an in-building data center, or a remote data center, with high availability, fault tolerance, etc.).
Personally, I consider the "nothing done on the clients" approach problematic on many levels. I like what Apple is doing: trying to make the client phones do the processing there, not moving everything to the cloud, while still keeping the phones somewhat harder to attack "en masse". But there they have a user base that has grown effectively from nothing; the whole personal computing approach has a different historical development and different expectations. The oldest known computing approach was actually "dumb terminals", but moving completely back in that direction is quite wasteful.
In the end, I blame Microsoft for not recognizing what their users actually want: I know a lot of companies which actually pay "the Microsoft tax" (as far as Microsoft's accounting is concerned, they "use Windows 10") while in fact using Windows XP and anything but 10.
And they are right to do so. The problem with everything after XP wasn't that companies wouldn't pay for support. The problem was that Microsoft "innovates" in areas that businesses find directly harmful. Businesses would of course like updates, better and safer protocols, and new security settings if they were delivered, but they don't want all the annoyance that has nothing to do with the infrastructure, like a Windows or Windows Server which you have to use through the new "phone" UI.
In short, there are many reasons there's a lot of Windows XP use, and Microsoft simply decided that they don't care.
Linux is of course even worse; even with Red Hat's efforts to provide long-term stable OS versions, most of what users consider "just apps" typically depend on so much random stuff that maintaining stability is unnecessarily hard.
Finally, Apple traditionally doesn't care for the company use of their products much.
Which leaves most infrastructures without any "straightforward" choice. And "redeveloping everything" every time the OS companies decide to "innovate" is really not possible.
That's where we are now. There's simply not enough awareness among the OS companies that "every non-programming entity" wants a stable infrastructure. Instead, the attention-deficit goals of the managers of the moment are typically chased.
Designing expensive vertical market hardware that connects to the Internet around an embedded copy of Windows that never, ever gets updates is obviously insane, and yet that's the world we live in. It's well outside hospitals, etc - this is a world in which the SCADA worm was even possible.
More specifically on the NHS, it appears there was a decision in 2015 not to update some OSes because of Conservative budget cuts. I'm trying to track down details.
yeah. They finally said "look, get off this thing" which is fair enough, but they then cut funding to the bone, and IT is always the first thing skimped on.
I'm told by a friend in IT in an NHS Trust that the NHS actually came off quite lightly - all the affected systems were front-end PCs that don't store patient data locally, the patient data was safe on back end databases, so he spent Saturday reimaging a few hundred PCs and not one satoshi of ransom was paid to the attackers. Hopefully they won't get complacent about the bullet they dodged. (Ahh, who am I kidding.)
How would you handle emergency situations where communications could be disrupted?
Instead of remote apps, I think we need something more like the edge intelligence Microsoft demoed at Build: the central data-centre pushes containers down to the devices and monitors them. You get the centralised control without the latency and disruption that remote communication would add, which would be unacceptable in many medical scenarios.
PCs without a network are useless in a hospital context anyway, as everything is already networked even when applications run locally (the database is not stored locally). It is easier to ensure connectivity through simple protocols (TLS/HTTP) running remote applications than to run local applications plus complex "remote monitoring" stuff. Regarding latency, if the servers are in the same building you can get < 5ms latency with the default configuration (e.g. accessing Windows remote applications using HTML5 web clients).
This isn't true in a medical environment - the devices have value "in the room", whether it's device control, test result interpretation, or organisation of staff, procedures, or patients; you can't just lose every computing device during a DDoS and switch to paper.
edit: regarding the "same building latency", do we want every hospital, clinic, and doctor's office running its own local datacentre? That would come with its own availability horror stories. For something like the NHS, a multi-region centralised AWS-style datacentre makes sense.
In a medical environment there are two kinds of devices:
- Critical devices (which in most cases don't even run Windows): already safe, because they use higher security standards.
- Administration devices (patient reports, etc.): these don't have a local database, and if you don't have a connection, your computer is useless. That's the reason for "the computers are not working" at hospitals when the network is down. So a network failure would be a denial of service both for local applications accessing a remote database and for pure remote applications - with the difference that with pure remote applications the attack surface would be near zero on the client side.
That is an oversimplification and the precise point of our disagreement. I don't believe "administrative" devices are non-critical to providing care. The last time I saw an NHS doctor examine an x-ray, it was on an XP box. Is organising ER triage non-critical too?
Distribution, redundancy and routing around faults should be our vision for these systems and IMHO edge devices get closer to that. There are many ways a hospital can still shunt data around and use it locally in an emergency without giving up due to failure of remote systems.
You're taking it to a weird extreme of magic. There are reasons to run things locally and there are reasons for centralisation. Medical centres (at least in my area) often run on thin clients, because that's a better solution than having local technical staff in each village and town. This makes things like security controls easier to manage. It also makes things like backups part of the contract rather than part of the infrastructure you need to buy. On the other hand, when the internet goes down, your results may not be available.
Look at tradeoffs. There are no magic solutions. Pretending that cloud services don't solve any problems is as bad as pretending they solve all.
If a bug is wormable & your OS is still in widespread use then this ought to be the least you can do. If you’re unwilling to put the effort in, then open source the OS in some form so that someone else can.
There are vast numbers of XP boxes out there. They represent a risk to all of us.
Where is the responsibility of the companies and organizations that want to run this ancient and insecure software in such mission-critical applications? Microsoft does offer a path for organizations that can't upgrade to a newer OS: they charge an onerous support fee. This is intentional, to make it painful for these companies to continue using a product that Microsoft knows is insecure. They are trying to incentivize good behavior (i.e. upgrading), like the government does with taxes on alcohol and cigarettes.
If you are in a highly regulated environment like the UK NHS, then there is no excuse: either stay current, pay Microsoft the proper fee to support the OS you choose to continue to run, or take other measures to ensure that your systems are protected, such as keeping them on an isolated/secure network with no Internet connectivity. We have solutions for this stuff; Microsoft isn't the bad guy here. The people who consciously made the budgetary decision to disregard their customers' / patients' data and welfare are responsible for this.
I'm no Microsoft fanboy, but blaming Microsoft for this is like blaming Ford for a traffic death that occurs today in a car that was manufactured in the 1950s, before seat belts were standard equipment. We now know seat belts save lives; if you choose to take the risk of driving a car without them, that's on you, not Ford.
I will admit I've just rapidly paged through that PDF, but it looks like I'm reading a Xen introductory paper.
Xen is open source.
I found some PV IO drivers at https://wiki.xen.org/wiki/Xen_Windows_GplPv which mention XP (search for 'XP' including (!) the single quotes), and a quick Google search immediately gives hits on running XP as an HVM guest.
I'm (genuinely) curious what you're describing/referring to here. What project disappeared?
He's talking about Windows on Xen, which existed at one time but was never released, like a lot of research projects. AMD-V and Intel VT made it mostly moot, though.
So you mean like... NTOSKRNL et al essentially retrofitted to run in a kind of userspace?
Nice.
I don't expect that kind of thing to ever leave a research environment though. It would mess with too many people's heads and give people too many ideas of running bare-metal kernels other than NT.
Now that I think about it, I realize the reason HW virtualization really took off is that it let vendors keep their operating systems as actual operating systems in the traditional sense of the word, making for fewer legal issues (among many other reasons).
Also, I thought Xen was essentially just a super-thin layer to kickstart VT-x/AMD-V. I didn't know it could do anything else. In fact, I thought there was only emulation and hardware-assisted virtualization. Is there a middle ground I'm not aware of?
Yes. It's paravirtualization.
Oh, Drawbridge is full NT in user space; it's in production now for SQL Server on Linux. But Dk (Drawbridge) is much newer. :)
TIL. That's really cool. Now I'm wondering if there are any small fully-paravirtualized hypervisors and guests I can play with. I guess Linux's support for various forms of I/O acceleration is more or less it.
I didn't know Drawbridge was that amazing - that's incredible.
And now I'm starting to understand Microsoft's vision: they have WSL to get Linux infrastructure onto Windows, and Dk to get selected Windows infrastructure onto Linux. Impressive.
But now that I think about it that way, I realize Dk will only ever be an internal framework - if it got released we'd basically have a "perfect Wine", and it would allow quite a few too many applications to move off of NT.
Someone the other day wrote an excellent comment explaining why you can't just replace Windows with Linux in some professional environments like hospitals: medical hardware drivers that are only available for Windows.
I'll try to find and link that comment that managed to make a better point than I did.
This is what we did 10 years ago when I worked in a hospital.
We knew these devices were insecure by default. Some even shipped with a network enabled MS SQL Server with a blank sa-password. Quite literally a free root-kit.
Scientists and doctors working on these machines were forced to use portable storage (floppies, ZIP-drives or CD-RWs).
It was cumbersome, but no network was a strict policy, and it was there for a reason.
That's refreshing to hear that whoever had the authority in that situation also had a brain.
I wonder how tricky it would have been to set up a MAC- and plug-location-based VLAN to isolate those devices onto, with a very very carefully locked down machine sitting between the devices and the rest of the network. Deep packet inspecting firewall, copious logging, antivirus turned up to 11, the works.
I ask because I'm curious how well a theoretical setup like the above would have worked out for the described scenario - I'm sure there are similar environments where it may be impossible to get having no network approved by management.
Not all vulnerabilities are equal. If I understand correctly, the reason this one enabled extremely easy spread of ransomware is that, reading the description of MS17-010, it is enough for the computer to be part of the local network and have file sharing enabled to be infected.
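To illustrate why that exposure matters, here's a minimal, hypothetical Python sketch that checks whether a host has TCP 445 (the SMB port this class of worm spreads over) reachable. The host address in the comment is a placeholder, and an open port only shows the precondition for infection, not the vulnerability itself:

```python
import socket

def smb_port_open(host: str, port: int = 445, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    An open port 445 does NOT prove the host is vulnerable to
    MS17-010 - it only shows the SMB service is reachable, which
    is the precondition the worm needs to attempt exploitation.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical LAN host:
# smb_port_open("192.168.1.23")
```

Running something like this across a subnet (with permission) gives a quick picture of how much of the network a worm with an SMB exploit could even try to reach.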
OK, I had to scrap an excellent HP laptop that had Vista (I didn't buy it in the window when Microsoft offered free upgrades to Windows 7). My doorstop was not past its prime. If car companies treated customers like Windows does, the brakes would fail at 4 years and the 40-page online agreement would say the purchaser can't hold the auto manufacturer liable for intentional brake malfunction. Kind of like GM screwed Americans by filing for bankruptcy, getting federal funding, and saving 10 cents on the ignition switch that killed many people. Microsoft and GM could be good companies; they choose not to be.
Enter rolling distributions, where each upgrade is guaranteed to work. Even if you can't upgrade from version A to C, you can upgrade from A to B to C. Why can't Microsoft products also upgrade this way, even between major versions?
Except you can find YouTube videos where people start on early versions of DOS in a VM and upgrade all the way to Windows 10. Name one company that does more work to ensure backwards compatibility. You can even read Raymond Chen's blog about times when Microsoft developers wrote shims to emulate bugs in previous versions so that specific applications would continue to work.
And no, Microsoft can't guarantee everything developed by a 3rd party will continue working, and nor should they.
The bottom line when it comes to places like the NHS is that they decided to cut costs by neither entering into a custom support agreement with Microsoft (so that they could continue to get security patches for XP) nor upgrading their systems to run on newer versions of the OS.
I'm starting to seriously dislike HN's lack of moderation transparency. I don't know who changed the title - if it was the post author or a moderator - and when.
For the life of me, and embarrassingly, I could not locate this patch (or any patches) to download manually for offline install on a 2008 server that would not get online updates (the reasons I'll leave aside).
Then it turns out parent is the ransomware vendor and the linked file turns out to contain the ransomware, with a few letters in the URL substituted for Unicode lookalikes so it appears to be a legitimate Windows update.
I'm not saying that's truly what's happening, but it's easy to imagine. I'd verify I'm connecting to the right domain and double-check with e.g. VirusTotal if I were you.
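As a sketch of how one might spot such lookalike domains programmatically (the function names here are illustrative, not from any particular tool): Python's built-in `idna` codec converts a Unicode host name to its punycode/ASCII form, and any host whose punycode form differs from what's displayed contains non-ASCII characters worth a second look.

```python
def punycode_form(host: str) -> str:
    """Return the IDNA/punycode (ASCII) form of a host name."""
    return host.encode("idna").decode("ascii")

def looks_deceptive(host: str) -> bool:
    """True if the host is not plain ASCII, i.e. its punycode form
    differs from what the address bar may be rendering."""
    return punycode_form(host) != host

# The first letter below is Cyrillic U+0430, not the Latin 'a':
# looks_deceptive("\u0430pple.com")  -> True
# looks_deceptive("apple.com")       -> False
```

This is roughly why some browsers render suspicious hosts in their `xn--` form rather than as Unicode: the punycode spelling makes the substitution visible.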
That's very true. Such attacks are predicated on an ignorant and/or lazy target demographic, I guess.
Incidentally, when I copied the link out of Chrome (57) it pasted the punycode link even though it showed "apple.com" in the omnibox. So then I carefully copy-pasted just the domain and TLD to work around Chrome's link-copying magic, submitted, and... discovered that Arc punycode-ifies Unicode domains.
So that was interesting, but it kind of killed the impact of the point I was making.
Wait - a critical security update distributed via plain HTTP instead of HTTPS...? I checked all the links provided for my computers, too - they're all http://
Direct links for English have been provided (as stated in a sibling comment). Yet the catalog server (for downloading localized versions) seems to be under heavy load or otherwise hard to reach...
Furthermore, at least one download points to the wrong file: I need the patch for Windows XP SP3 x86 in German but always receive the patch for the embedded version, which obviously doesn't work.
It's an interesting conundrum. One could argue this sets a precedent. Then it becomes a question of "at what levels of severity do they patch vulnerabilities?".
Exactly! IMHO Microsoft shouldn't have released the patch to people that aren't paying for extended support. It is just going to encourage cheap CIOs to continue to ignore their dying infrastructure. They just doomed themselves to another 10+ years of Windows XP support.
I can't get this to work on Win 8. The application starts up, then says "Searching for updates on this computer".
I have been waiting for like 15 minutes now
Windows Update can get itself into a state at times where it takes a long time to 'do its thing', I'd leave it an hour or two. There's a debug log under the Windows folder somewhere.
I am on 8.1; the problem is I tried upgrading to Windows 10 a year ago. I was one of the "early adopters". I can't remember exactly, but there was an incompatibility with Visual Studio with Xamarin. I was relying on it to develop an app at the time and could not afford downtime, so I reverted back to 8.1.
8.1 still gets updates until 2023. They only mean 8.0 with those packages. :)
They should add a clearer error message when trying to install an 8.0 package though.
That's news to me. The free upgrade period is over[1].
The only free Windows 10 upgrade is the one for users of assistive technologies[2]. Of course, you can just pretend to be a user of assistive technologies, but I'm uncertain about the legal ramifications of doing so, regardless of whether or not you'll be caught.
Additionally, Windows 10 silently accepts Windows 7 and 8 product keys[3], but the legal situation is equally nebulous in the face of [1].
I wish more people would learn what can and can't be removed from a URL before sharing it. It's not that difficult, and it's easy to test that the "minified" version still works.
On sites like Reddit (and maybe HN), links with those extra parameters aren't flagged as a duplicate, and end up getting posted multiple times with multiple comment threads.
Because they are data for analytics. Not everyone is a fan of that, especially when the URL points to the servers of a company with such a terrible privacy policy.
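For what it's worth, stripping the analytics parameters can be automated. Here's a hedged Python sketch; the parameter prefixes listed are common conventions, not an exhaustive or authoritative list:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Common tracking/analytics parameter names and prefixes (not exhaustive).
TRACKING_PREFIXES = ("utm_", "fbclid", "gclid", "mc_", "ocid")

def strip_tracking(url: str) -> str:
    """Remove known tracking query parameters, keeping everything else."""
    parts = urlsplit(url)
    kept = [(k, v)
            for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.startswith(TRACKING_PREFIXES)]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# strip_tracking("https://example.com/a?utm_source=hn&id=5")
# -> "https://example.com/a?id=5"
```

Of course, this only removes the parameters you know about; testing that the minified link still resolves to the same page is still on the person sharing it.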