I'm sorry but come on. The writing has been on the wall for decades now. If you run Windows on your computer, it's not your computer, and you have to comply with whatever Microsoft says or does.
I haven't had Windows on my computer since Windows 7. Instead I've been using modern Linux distributions. This is freedom. My computer doesn't slow down, is lightning fast, and just works, year after year.
> If you run Windows on your computer, it's not your computer, and you have to comply with whatever Microsoft says or does.
In this case, Windows 10 will continue to run on your computer perfectly fine. Microsoft are not changing anything about Windows 10, or the way people use Windows 10, or removing any Windows 10 'ownership' that people have right now.
The complaints are that Windows 11 has additional requirements (a modern CPU with a TPM 2.0 chip), and that older computers don't have that. Essentially people are sad that they can't upgrade because their computers don't meet the minimum spec. TPM (https://en.wikipedia.org/wiki/Trusted_Platform_Module) is a bit controversial, but you can run Windows without it. Just not Windows 11.
Although things might be different with Windows 10 and 11, people have had plenty of bad experiences with the previous versions.
For example, if we change the version number, we get a slightly different picture: "...Windows 7 will continue to run on your computer perfectly fine. Microsoft are not changing anything about Windows 7, or the way people use Windows 7, or removing any Windows 7 'ownership' that people have right now."
Remember Windows 7 showing constant nag screens urging you to upgrade? Remember Microsoft pushing their godawful Metro design, alongside the lack of a proper Start menu, before that was patched out? Remember people discovering that their OSes had been forcibly upgraded after leaving their computers unattended for a while?
Given all of that, a bit of distrust is to be expected.
> In this case, Windows 10 will continue to run on your computer perfectly fine. Microsoft are not changing anything about Windows 10, or the way people use Windows 10, or removing any Windows 10 'ownership' that people have right now.
This isn't exactly comforting or acceptable. Windows 10 will stop receiving security updates in what, 5 years? I have a desktop computer from 2014 that I use daily, there is absolutely nothing wrong with it. The amount of (unnecessary) ewaste Microsoft will create by going through with this decision is staggering.
Microsoft insisted that Win10 was their last Windows OS; that it would be the current OS, and continue to receive updates, in perpetuity. I for one have relied on that declaration.
Doesn't that count as deceptive marketing? I mean, the timespan between Win10 and Win11 is comparable to the timespan between Win8 and Win10. Does Microsoft lack a corporate memory? Have they said anything about this discrepancy between their forward-looking announcements and their actual behaviour?
Do they think it doesn't matter what BS they spin to their prospective customers?
MS is between a rock and a hard place when it comes to security and legacy support, but I still think they approached this the wrong way. Windows is both the most popular PC OS and a huge target because of the legacy baggage they've been carrying around for years.
On the other hand, every other device on the market now (mobile phones, tablets, everything Apple) is almost completely locked down from the user's perspective, so MS probably feels they are safe in joining the rest of the world in tying the user's hands.
I understand the move towards better security but they should also consider extending Windows 10 support beyond the current end of 2025 deadline, given the harsh requirements for upgrade. You could buy a brand new MS device today and see it turn into an unsupported paperweight in just 4 years. The security pros for Secure Boot shouldn't come at the cost of millions of useless and unsupported PCs so early in their (historically long) lifecycle.
I bet they'll allow the Windows 11 upgrade for all computers once Windows 10 stops receiving updates. They just want to force manufacturers to make all motherboards with TPM 2.0. There are no technical reasons to forbid those upgrades, just political ones.
I'm probably not as technical as the average HN commenter so maybe you can fill me in...
I've had personal computers for years and none of them have been on corporate networks, so I don't see how MS should care about me in that respect - and for actual businesses, aren't their hardware refresh cycles generally less than 5 years? Is all this worth it to shave off that 1%?
Everything will be in the cloud and connected with a subscription, paid or with ads.
Your computer will only be a client connected to those services, and just as users get banned from different internet services for breaking the user agreement, you will be banned from your computer for breaking the user agreement. The distinction between the computer and the internet services will be very fuzzy because they will be intertwined.
Your PC is 7 years old today and Windows 10 will receive updates for another 4 years; that's 11 years, and you can keep using it after that, just without security updates (which I think Microsoft will eventually give in and extend for longer). 11 years (or possibly more) is a good lifetime for a PC, considering people now change cars more often than that.
> Your PC is 7 years old today and Windows 10 will receive updates for another 4 years; that's 11 years, and you can keep using it after that, just without security updates (which I think Microsoft will eventually give in and extend for longer). 11 years (or possibly more) is a good lifetime for a PC, considering people now change cars more often than that.
To go through your points, people are still buying new computers today which Windows 11 will not support, so we're looking at a 5 year life cycle for that hardware if those users do not do something like switch to Linux, which we know most people will not.
Using a PC that is no longer receiving security updates is a terrible idea if it's connected to the internet; continuing to use the device that way is not a viable option.
Most importantly, even if my computer happens to be 11 years old at that time, if it still works, it still works. I 100% reject the idea that people should retire perfectly fine computers over unnecessary and arbitrary hardware requirements.
What really gets me upset about this whole situation is the marketing around Windows 10, and how it would be the last version of Windows ever.
> In this case, Windows 10 will continue to run on your computer perfectly fine.
Has it been confirmed anywhere that Win10 installations won't be automatically updated to Win11? Given the automatic nature of Win10 updates, I'd expect to wake up one morning and be greeted by Win11 on my gaming PC (and since Win11 is just a marketing name for what would normally be a Win10 feature update, this is likely to happen, at least for Win10 Home users).
I might be wrong, but I don't think Win10 even updates itself to new releases. I stayed on a 2017 release of Win10 for 3 years. It kept receiving updates, and there was an option for the major update, but it never forced me to take it.
The rollout of Windows 10 itself contradicts that. Microsoft silently switched the upgrade prompts from opt-in to opt-out, so people trained over months to click away the Windows 10 adware spam would suddenly find themselves with a Windows 10 installation.
They enforce secure boot. Many vendors only have the MS CA implemented to check OS integrity. So you would have to get your OS certified by Microsoft...
Can people stop already? It's NOT true. Windows 11 runs perfectly fine without it. There is no enforcement at all. All the hardware checks are in the setup/installer, which is not required - it's just one way to install the OS, and of course it can be circumvented easily if someone really wants the installer/update experience for some reason.
Same for TPM. Not even TPM 1.2 is enforced, and that was a requirement for Windows 10 already.
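For context, the widely reported way to skip those installer checks is a handful of LabConfig registry values created from the setup environment. A sketch of the .reg fragment people circulate (value names as publicly documented; verify them against the installer build you actually have):

```
Windows Registry Editor Version 5.00

; Created under HKLM\SYSTEM\Setup before running the Windows 11 installer
[HKEY_LOCAL_MACHINE\SYSTEM\Setup\LabConfig]
"BypassTPMCheck"=dword:00000001
"BypassSecureBootCheck"=dword:00000001
"BypassRAMCheck"=dword:00000001
```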
I looked at the code - it downloads a .reg file from the web and modifies your registry with the content of that .reg file in admin mode. At the moment that .reg file modifies the registry to allow bypassing TPM checks, but at any future time the author can replace that .reg file with something malicious. I wouldn't download that program.
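That caution is easy to act on: a .reg file is plain text, so you can download it and review what it would change before importing anything. A minimal sketch (a hypothetical helper, not the tool's actual code), assuming standard .reg syntax:

```python
import re

def audit_reg(text: str) -> dict[str, list[str]]:
    """Summarize which registry keys and value names a .reg file
    would create (or delete, for [-HKEY...] sections), so it can be
    reviewed before importing."""
    changes: dict[str, list[str]] = {}
    current = None
    for raw in text.splitlines():
        line = raw.strip()
        key = re.match(r'^\[(-?)(.+)\]$', line)  # [HKEY...] or [-HKEY...]
        if key:
            current = ("DELETE " if key.group(1) else "") + key.group(2)
            changes[current] = []
        elif current and "=" in line and (line.startswith('"') or line.startswith("@")):
            # value line: "Name"=type:data  (or @=... for the default value)
            changes[current].append(line.split("=", 1)[0].strip().strip('"'))
    return changes

sample = '''Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\\SYSTEM\\Setup\\LabConfig]
"BypassTPMCheck"=dword:00000001
'''
print(audit_reg(sample))
```

Running this over the downloaded file shows exactly which keys and value names would be touched, so a later swap to something malicious would at least be visible before import.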
I'm aware of that. It also loads a DLL from the Windows 10 installer. I assume that's intentional, so the files can be updated if MS adds more checks. It's open source for a reason.
They enforce Secure Boot on Windows 11. It was nominally a requirement on Windows 8 and 10 but was never actually enforced.
There's nothing to stop you from using Windows 10 or Linux on your hardware unless MS goes all anti-competitive and forces OEMs to build hardware that only runs Windows 11.
> unless MS goes all anti-competitive and forces OEMs to build hardware that only runs Windows 11.
They didn't force OEMs, but some devices can only run Windows 8.1 with no ability to turn Secure boot off.
> ARM-based Certified For Windows RT devices, such as the Microsoft Surface RT device, are designed to run only Windows 8.1. Therefore, Secure Boot cannot be turned off, and you cannot load a different operating system.
> Many vendors only have the MS CA implemented to check OS integrity
Secure Boot was introduced with Windows 8, ages ago. Microsoft and PC vendors promised that on all Intel PCs the user can install their own certificates. I haven't heard that this has changed. So the problem affects only ARM PCs, which aren't particularly common.
As a Linux user I use Secure Boot. On some machines with big distros that ship shims signed by Microsoft; on others, with Arch or Yocto, with my own keys.
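On Linux you can check which state a given machine actually booted in: the firmware exposes a SecureBoot variable through efivarfs (the path uses the standard EFI global-variable GUID). A minimal sketch:

```python
from pathlib import Path

# SecureBoot variable under efivarfs; the GUID is the EFI global-variable GUID
SB_VAR = Path("/sys/firmware/efi/efivars/"
              "SecureBoot-8be4df61-93ca-11d2-aa0d-00e098032b8c")

def secure_boot_state() -> str:
    """Report the Secure Boot state the firmware exposes to the kernel."""
    if not Path("/sys/firmware/efi").is_dir():
        return "legacy-bios"               # booted via BIOS/CSM, no EFI vars
    if not SB_VAR.exists():
        return "no-secureboot-variable"    # EFI boot, but variable absent
    data = SB_VAR.read_bytes()             # 4 attribute bytes, then the value
    return "enabled" if len(data) >= 5 and data[4] == 1 else "disabled"

print(secure_boot_state())
```

The state names here are just labels for this sketch; `mokutil --sb-state` reports the same information on most distros.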
Who uses remote attestation for what purpose? I have never used it.
Secure boot is simple and lightweight. Should I just wait until some damage happens? I agree it's not the most critical measure. I have at least one machine where it doesn't work and many where initramfs remains unchecked. Have not lost my sleep yet.
Almost all mainstream Linux distros can do UEFI Secure Boot out of the box, through a Microsoft-signed shim, or via a GRUB signed by Canonical or Fedora (whose certificates are in turn trusted through the Microsoft-signed chain).
Secure boot does not prevent a user from installing Linux. That’s just FUD.
I don't think making people aware of the logistical problems of having to get your OS signed by Microsoft in most practical cases is FUD.
Secure Boot or TPM can make sense for the cloud, since you might want to check whether the provider changed your OS. MS used ransomware to justify Secure Boot. That is FUD. Boot and BIOS trojans have become extremely rare, and it doesn't provide enough protection anyway.
It may be FUD, but people's concerns are valid. This is clearly another stepping-stone in the direction of removing control of the hardware/computer away from the user (under the guise of Trust, Safety, Privacy and Security). We just don't yet see the bigger, long term end state.
But if you ask me, "they" essentially want an un-tamperable and un-recordable pipe between the hosted server and the monitor that emits photons to your eyeballs.
I agree completely. I believe this is an extremely bad failure of product and company strategy at MS. They basically own the PC OS market but are massively jeopardizing their own position. I meet devs who target Windows less and less.
That said, I might not really cry myself to sleep if people leave Windows behind.
Do you really think random people will leave Windows behind? And do what? Most average PC users don't even know there's an alternative (aside maybe from Macs, which they probably consider overpriced, so not a direct alternative).
I think people who complain about this situation are people who know enough about computers to know that an "older" but higher-end computer would run just fine. The random Joe probably isn't running a 6th gen maxed-out i7, but some cheap low-end PC that they're used to having to change fairly regularly because it breaks for no reason.
I'm not defending MS here, but I think their strength is based on enterprise clients, where people change computers much more often, based on their support contracts, and on run-of-the-mill users who don't even think of "Windows" as being different from "the computer".
Power users are either "stuck" somehow on Windows (games, or other software that requires Windows) or are already on Linux. In the first case, it may be an issue, but they'll probably just grumpily whip out their card and upgrade. What else are they going to do?
I'm in the second situation, running a 3rd-gen i7, and have no intention of upgrading it as long as the computer works. But I only occasionally boot it into Windows to play some game. When games won't support Windows 10 anymore, I'll just play some other game.
So tech people have a big responsibility to make these "random Joes and Janes" educated, at least just enough that they can make more informed choices and learn further, if they should so choose. Is each tech person at least making a few close non-tech friends/family aware of the hot issues in computing? And I'm not talking about that recalcitrant 90-year-old grandma or that pop singer who doesn't even know or care what an icon means.
But there are plenty of people out there who are otherwise highly educated or intelligent who simply don't have sufficient domain knowledge when it comes to computers, to make more informed choices instead of being herded like sheep by the tech-giants.
And I can't think of a single go-to resource on the net that can explain the intricacies of computers and apps to the average (but intelligent) Joe, with a particular view towards enabling more freedom and privacy for them.
> But there are plenty of people out there who are otherwise highly educated or intelligent who simply don't have sufficient domain knowledge when it comes to computers, to make more informed choices instead of being herded like sheep by the tech-giants.
That may be true, but I also get the feeling that they choose to spend their energy worrying about other things, as long as the computer does the job they want it to do.
Hell, I have a friend who's actually a techie, tried Linux, actually even has it installed on his daily driver (dual boot on a single drive in a laptop), he just never uses it. He's a Java dev, so all his stack would probably work on Linux with little or no fiddling.
But he just does not care. Windows doesn't bother him enough to make the change, so he didn't change.
I think this is the most important part: Windows mostly works for most people. Paradoxically, I think that the fact that the OS is less and less important is actually a bad thing for Linux adoption. Because people just don't care about the OS as long as the browser works. And on Windows, it works well enough. So why would they change?
There are of course philosophical reasons, but people don't appear to care. There are billions of Facebook users. This shows people don't care about these issues. I don't know if it's because they intrinsically don't care, or because we've done a bad job of educating them. Maybe a bit of both.
But it sure feels like an uphill battle, even among "techies".
> Paradoxically, I think that the fact that the OS is less and less important is actually a bad thing for Linux adoption. Because people just don't care about the OS as long as the browser works. And on Windows, it works well enough. So why would they change?
I disagree that it's a bad thing for Linux. If you install Linux on a person's machine where they only browse Facebook, they won't notice, and they save ~$120. It worked fine enough for a couple of non-techies I know (kindergarten teachers).
The same tech people that are now responsible for the Web having turned into ChromeOS, or the tech people that keep the Web open by advocating Safari and Firefox?
No we are NOT responsible for other peoples' life/choices. Period.
This is no longer the 90s; every year there are articles proclaiming "20?? is the year of Linux!" Linux has actually been around longer than W95/NT, and user-friendly distributions have been available for 10-15 years now. More or less, a laptop with a Linux installation will work until it is thrown away.
However, installing an operating system (be it Windows, Linux, or OSX) is not something everyone can do, nor something that can be expected of everyone.
So, in my opinion:
1) Linux usage/adoption cannot increase unless it can come preinstalled on laptops and be supported in a similar way to Windows/OSX.
2) The importance of freedom is a personal matter which we cannot impose on people.
> Do you really think random people will leave Windows behind? And do what? Most average PC users don't even know there's an alternative (aside maybe from Macs, which they probably consider overpriced, so not a direct alternative).
This is a pretty dated argument. Chromebooks outsell Macs. Tablets (iPad or Android) are another well-known substitute for PCs for many. Macs are not that expensive compared to similar Windows hardware, and there's an entire low-end tier of alternatives priced below Windows machines. 15 years ago Macs were expensive and nobody knew about Linux, so people drove to the store and bought a cheap Windows device. That hasn't been accurate for a long time though.
I agree with what you say, but this makes me think my argument wasn't clear enough.
I think that "regular users" don't make a difference between operating systems, as in it's not something they consider and, most importantly, it's not something they change after the fact. As a sibling said, people upgrade when they change computers.
I'm not arguing whether Apple's hardware prices are justified or not (I personally think they were, up until a few years ago – I own multiple MBPs). I'm specifically talking about cheap computers. The fact is I can walk into a random supermarket in my parents' small town and walk out with a Windows laptop for a few hundred euros. Not sure where you can find "low-end Macs priced below Windows machines" (or maybe I misunderstood your point). Chromebooks do look like an alternative, though, and can be found just as easily.
Of course, many people are taking up alternatives to Windows PCs, like Chromebooks or tablets. But I doubt they do this because they can't run Windows 11 specifically. They probably realized they only browse the internet casually, so they don't need a "full-blown" computer. Bonus points for Chromebooks being cheap and for tablets being light. But I think they only make this change when it's time to buy a new computer, not in response to some MS decision.
And more importantly, they won't install some other OS on some PC they have when they realize it can't run windows 11. They'll just keep running windows 10 until the pc won't boot anymore.
This is the same exact thing that has been happening the past 40 years with this company and Windows.
Windows 10 was my favorite version because it was the straw that broke the camel's back for me, and I would never even consider installing Windows at this point.
KDE Plasma is so vastly superior to Windows it isn't even close. I don't even work in IT either. A Dev intentionally using Windows is just embarrassing.
Yeah, it would be amazing if Adobe, DS, Autodesk, and others would somehow realize they should decouple their software from the OS and make a Linux version (that could even run on Windows).
Though right now the only problem I have is Windows restarting overnight for updates. Closing half the open programs.
Even though I explicitly set it to not auto-restart if a user is logged on. Then again, it ignores the active hours setting, too.
> Even though I explicitly set it to not auto-restart if a user is logged on.
This really annoys me. I mean, why can't they just pop up a warning that I need to reboot? Or perhaps (not so good) at least take note of which apps are running, and re-launch them after the forced reboot?
Windows has been getting suckier with each release since Win2K. None of this suckiness is security, or seriously-cool enhancements. It's all caused by marketing exigencies.
Microsoft's awfulness makes me want to cry.
[Edit: changed "since WinNT" to "since Win2K" - Win2K wasn't suckier than WinNT - as far as I know, it was just WinNT re-skinned]
I think even if it was a good investment to release that kind of high-value software on Linux and support it there (and it might be), certain decision-makers may choose not to for ideological reasons. FOSS is often treated either as a resource for exploitation or as some kind of anti-capitalist boogeyman. I swear I've been put on some people's shit lists just for suggesting that we publish something under a FOSS license for visibility and goodwill.
Those vendors used to have UNIX versions and gave up on them, and some of them still do have a GNU/Linux version, only to keep a bunch of very important customers happy, otherwise they would have given up already.
Me too. Today I turned on my Windows 7 machine for the first time in months. I checked my old Facebook account. I deleted some unwanted friend requests. I shut the Windows machine down and went back to the Linux machines.
> Today I turned on my Windows 7 machine for the first time in months. I checked my old Facebook account.
That's brave of you. I hope you had installed the latest security updates, which for a Windows installation that hasn't been updated in months can take several hours and multiple reboots, with a 1-in-10 chance of updating successfully.
Update mechanisms are another reason for someone to run Linux, especially a rolling-release distribution.
I totally agree. Ditched Windows a couple of years ago after using Windows and Linux. I didn't agree with Microsoft constantly changing privacy related settings through updates.
Linux for work and development. PS5 for gaming. I am happy that my complete household is Microsoft free.
Linux is great and a lot of progress has been made on the gaming front lately. However, if you want to play all the latest AAA titles at their highest fidelity, there's no choice but to use Windows.
Yes, and? Computers are so much more than gaming devices, it makes no sense to restrict yourself just by that one facet. Or should I use a PlayStation as my main computer just because it has exclusive titles I like?
Nobody said anything about only using a computer for gaming. It's just that for many of us, gaming is the only reason we're still running Windows at home on our desktop computers.
At the moment, it would seem Apple is at least a bit more responsible with their walled garden. Their users don't get served ads, aren't forced to have Teams always running, and are in general subject to far less bloatware.
Of course you can make the hardline FOSS argument, but compared to, say, the phone OS market, Windows is actually remarkably open, particularly given how dominant it is and especially was.
Imagine if Windows had charged every developer who ever built anything on windows 30% of their revenue. They got almost bludgeoned to death for shipping a default browser 20 years ago.
Windows deserves a lot of criticism but we take it for granted that a monopolist basically just sold an operating system and let everyone run and build whatever they wanted on top of it, we could've been fucked way worse.
> Instead I've been using modern Linux distributions. This is freedom. My computer doesn't slow down, is lightning fast, and just works, year after year.
That doesn't match my experience of modern Linux at all. Plenty of forced "upgrades" that broke things, sometimes leaving me with an unbootable system (systemd).
Nobody should run a linux distro that has systemd in it, but thankfully you have many many options for that that don't involve running an OS with terrible hardware support (and a small community to boot).
> My computer doesn't slow down, is lightning fast, and just works, year after year.
You can say that about pretty much any OS, depending heavily on what exactly you do with it, and whether you happen to be lucky enough to be the one without a single problem. Especially the 'just works' bit rings far fewer bells for me when it comes to Linux than it does for other OSes. Which kind of harms the feeling of freedom I get from it.
As someone who has worked in IT support and has parents who use computers despite not really knowing how, I can say this is not true.
I had a ton of issues with Windows when my mother used it (despite me being a long-time Windows user and offering her remote assistance). At one point I got so fed up with yet another update that fucked things up (= changed the UI she was used to) that I decided to install Ubuntu. My mother was happy with the new system, and the number of times I had to support her was one tenth of what it was before.
OSX is also not without pitfalls; updates especially can break a lot of existing setups (hardware...)
Linux is great for real power users or real beginners, not so great for intermediate people who know enough to fuck a system up for good. But if you are a beginner or a power user it just works unless you fuck it up.
With Windows and OSX it just works till they decide to fuck it up, even if you are trying to actively prevent it.
You're missing my point; sorry if that was not clear from my wording. It is indeed perfectly possible that you experienced, e.g., Linux working flawlessly for you and had other OSes 'fuck it up'. But just like OSX isn't without pitfalls, neither is Windows, nor Linux, nor anything else. Not everyone encounters those problems; not everyone has no problems at all. There's enough anecdotal evidence around showing that (including your post). I don't think any of that is provably untrue.
Just like the sibling comment, some more anecdotes painting the 'it depends' picture: my main Windows 7 dev machine has been running for about a decade now. It's not slower than it used to be and just works, year after year. My main Ubuntu (LTS 14 through 18) machine has also been doing pretty well over roughly the same timespan - same hardware, but it took substantially more hours to keep running. In the meantime I also had an MBP with OSX; the software overall is pretty decent, but I apparently got unlucky and the hardware died on me after just 3 years, IIRC. Other machines I sometimes use didn't all do so well either, but there's really no clear loser.
That strikes me as highly anecdotal. Another anecdote for you; my Windows 10 Thinkpad which is used for 10 hours every day remains as quick and reliable as the day that I got it and generally goes for months without being restarted.
Well sure, because it is. As mentioned, I also worked in both first- and second-level support in an environment that had Windows, OSX, and Linux desktops, with widely varying degrees of proficiency among the users of all operating systems.
The anecdote about my mother is mirrored in the many experiences I had on the job. Sure, that is still no hard evidence, but I never claimed it was more than my opinion - an opinion derived from working on precisely these issues.
Btw, there was one particularly bad Windows user (the type that comes with viruses and toolbars, the type that thinks someone can control their computer if you plug their mouse into another computer). We had nothing but hassle with that person until we moved them to Linux, where a big pain turned into a much smaller pain.
This is again just another anecdotal example, but I could go on.
Microsoft really is trying hard to make Windows 11 fail. They still haven't realized most people don't feel the urge to update unless it's 100% painless or necessary. If you alienate or disgruntle the "techies", i.e. those who recommend to people what to do with their tech, you are going to get people who remove 11 to install 10, or holdouts on older versions. That's the way it is. IT departments won't maintain two separate versions of Windows, so expect 10 to remain the default everywhere until 2025.
If they’re not automatically installed, they’ll never be installed. They’ll upgrade to Windows 11 when they buy a new computer. The idea of downgrading won’t cross their mind because the operating system and hardware are not distinct, decoupled components—if they don’t like Windows 11 they’ll just learn to live with it because that’s just how new computers are.
If what you've described were indeed the case, Vista would have taken off eventually, but it didn't - it barely touched 20% of the market in almost 3 years of being exclusively sold by OEMs on new machines. In the enterprise market, it probably never reached 1% of all installs.
The reason for this phenomenon, in my experience, is that most computers are actually managed by a vast public of "techies", i.e. people who are computer literate enough to reinstall an OS and provide support to their computer-illiterate relatives and friends. These people tend to vastly overassess their IT knowledge and are very vulnerable to "rumors" spread on forums and blogs. This makes them very prone to becoming zealots about random topics they hear on the internet (like "Windows 8 sucks" or "Linux is bad").
When a non-techie gets a new computer with a new Windows version, they often can't figure out how to accomplish even the most basic things - or rather, they generally don't care and don't want to invest the time to tinker with the new system. They then rely on their own "trusted" person for that, and one frequent answer they get is that the new Windows version is crap, along with an offer to downgrade their computer. I've seen this happen a lot with both Vista and 8; EVERYONE was downgrading PCs back then without a good reason. I even saw PC shops recommending a downgrade before selling a computer, with leaflets advertising the service on the front desk.
I saw a high school buy Core 2 Quad machines in 2008 and wipe away Vista in favour of XP, even though they only needed the machines for Office and Autocad, and there was thus absolutely no reason whatsoever for doing so, given how beefy the machines were back then and that Vista SP1 had basically fixed the OS. The main motivation behind this was the stigma around Vista, and the fact the IT department was full of low-skilled technicians that got their training from pinned threads in PC forums. It was rare to find anyone using Vista back then, even on new computers, unless they were basically grandparents whose grandchildren lived far away.
From what I've learned over the years, the first month or so after a new Windows version comes out is crucial, and it foreshadows how well it's going to fare with users. If there's even a slight hint of doubt that it could be bad, users immediately reject it; it gets stigmatized and is basically dead in the water.
It's very different in the Apple ecosystem, in my opinion. People are usually nonchalant, even excited, about upgrading. I think that's how Microsoft wants people in the Windows ecosystem to feel, but it's just not refined enough for that to happen.
That's not my experience at all. People are just as wary about upgrading Apple devices. I've seen it among friends and family. They're afraid things will break or their device will slow down or everything will move around and they'll have to relearn how to use their device again.
And rightfully so. The iOS 7 upgrade turned my iPhone 4S from blazing fast to slow molasses, and there was no way back. At least you can downgrade most OSes (not sure about macOS).
Practically. And people will be so elated with Win 12, once the stupid changes have been normalized somewhat. I feel like I'm being courted by a pickup artist.
They already disgruntled a lot of devs with their environments. .NET Core still lives, but everything else seems like a wasteland. There are still a lot of Visual Basic and C# devs, but it at least seems their number is decreasing.
It's a real problem to find someone who knows the tech when you have to interface with it for some mundane Office task.
Interfacing with their Graph API is a horror show that I'd recommend to any dev as a lesson in how not to do security.
“Deleted all comments” is clickbait, IMO. This is a YouTube video, so the choice is either to allow commenting, or to disable the comment section, taking all existing comments with it. The title reads to me as implying MS is doing some kind of censoring. Yeah, that is one of the results of what they've done, and it might even be the main intention, but heavily implying it in the title without evidence is bad journalism.
Actually, that's not true. I entered a mildly critical comment on a YouTube video the other day, very spot on if I do say so. It immediately got a bunch of upvotes, then that stopped completely. My comment was on top still, but that's just how YouTube displays your own comments.
But when checking the video without logging in to YouTube, my comment was gone.
So it looks like comments can be quietly deleted without the commenter even knowing it.
The censorship isn't indiscriminate, it actually discriminates against and suppresses users who trend towards flamebait. Without careful moderation, this site would quickly turn into a worse version of Reddit.
There are two possibilities, and the following isn't a valid choice, so...
"Microsoft disables comments because of overwhelmingly positive reception of Windows 11!"
They don't want to deal with criticism and likely consider everything a sunk cost, so it is what it is and nothing will change.
It's kind of funny. I recall watching the recent Apple developer conference live on Youtube and the comments section had been turned off. What are these giant tech companies afraid of?
Comment sections on youtube are usually at best useless, and at worst very toxic.
I would disable them as well, not because of 'being afraid of anything' but because it's a net loss to have them enabled, and nothing of value is lost if they are disabled.
Well that's just your own personal opinion. I consider real time audience feedback to hold some intrinsic value to the live presentation. The fact that you think that enabling comments would result in a net loss means that at the very least there is something to be afraid of.
Are you confusing live chat with comments? My post (and your parent post) was about the comment section, which is not real-time. You can’t comment under a live event, only on its recording after the event has finished. The live chat section (appearing next to the video in the desktop web UI, not under it) is what’s real-time.
I'm sure they have promo videos for their clients (governments/health orgs). Unless there is a direct-to-consumer vaccine on the market that I'm not aware of.
I said this on a previous Win11 post here, and I'll say it again. I don't want telemetry on my machine, I don't want to jump through hoops to turn it off, I don't want Teams on my machine (are we back in IE6 territory again?), I don't want a 'must be connected to the internet' operating system.
After 20 or so years of using Windows at home, I think I'm probably done with Microsoft.
Linux and KDE satisfies all requirements now, and in fact my machine is dual boot and I haven't booted into Windows in weeks.
The only thing I'll miss is Visual Studio, but I have a Jetbrains license, so that's fine.
It makes me wonder: why does Microsoft take this risk now? Switching to Linux was so much more painful 20 years ago. Switching to Mac is probably easier than ever. So why, instead of encouraging their users to stay, do they do everything to alienate them? Of course, the majority will stay because of inertia, especially in the enterprise. But it paints a bleak future for Windows.
With Windows 10 Home forced upgrades I saw again something that I hadn't seen since Windows 95: a growing feeling among users that their OS, which is just supposed to work, is overtly hostile to them. You can hold off a reboot just a bit, and then it will reboot against your will, and you just hope all your work, settings, and so on were saved. Your OS is getting in your way; it prevents your work from being done, just like Windows 95 did with its constant crashing. Once these things pile up, users seriously start looking for an alternative. My parents, for example, switched to iPad Pros and haven't regretted it. This might not be possible for all users yet, but we're getting there, and Microsoft seems to be as blind to this aspect as in the 90s.
Does the user have any control of the operating system on an iPad? Why should Windows hardware be any different? You are making a contradictory argument.
What has happened is that the computer has changed: it is no longer the big old tower but mobile phones, tablets, and laptops. And the laptop and tablet space is already merging, as laptops get touch displays and tablets get keyboards.
Where the tower could be upgraded part by part, including the operating system, the new type of computer can't be. The consumer usually buys a new one within a few years. This is where Windows 11 fits in.
I wouldn’t be surprised if you rent your computer in the future; you already get phones, which are computers, with carrier subscriptions. This will of course reduce user control even more. Combine that with cloud computing, and your computer is more of a client on a corporate-controlled network.
My point is that if users feel their OS is more and more hostile, they will slowly start migrating away. Going towards Apple is a disaster from the point of view of open systems. Nevertheless, this is where people will mostly go. Only power users can allow themselves to migrate to open systems like Linux, BSD and more exotic ones.
Sure, but your comment doesn't make much sense if switching to an iPad was an improvement with regard to hostility; logically, that would mean Windows should behave just like an iPad, i.e. with no user control.
What Windows "suffers" from is that it is still in a transitional state, going from an open platform (open in the sense that the user decides what to run and install) to a closed platform (the iPad).
Traditional desktop users expect an open platform and thus get annoyed over forced updates, stores, and similar things, but those things are considered normal on a closed platform.
The thing is that the traditional desktop user is vanishing, as your parents are an example of. The "market" wants a closed system, so Windows is adapting. Eventually Windows will have completely moved to a closed platform and will be used and viewed like an iPad (the Xbox is already such a device), and the annoyances will no longer be annoyances.
Yes, the traditional PC users will migrate to Linux and BSD, but globally they will be such a small number as to be a rounding error, and eventually it will get harder and harder to stay on those platforms as the hardware industry serves only the closed platforms.
> Yes, the traditional PC user will be migrating to Linux & BSD, but they will be such a small number globally it will be a rounding error, and eventually it will be harder & harder to stay on those platform when the hardware industry is entirely serving the closed platforms.
I think there is some hope as the example of Valve shows.
I wouldn't say Windows is getting in my way any more than KDE, or MacOS (I own an Apple laptop, a Windows laptop, and a Win/Linux PC).
It's just not what I want or need in an OS any more. I develop on and for Linux.
I don't expect my friends to switch from Windows because they're not that techie.
But consider that in 2001 you'd go to a download page and the option would be "Windows only", and in 2021, you can go to a download page and it can now be "Mac only" or no Windows option at all.
> Apparently everyone is going to switch since Windows XP.
Haha. Yes, next year of the Linux desktop. Like 'free beer tomorrow' outside your favourite bar.
As I said in my other comment, I believe developers are gradually moving to other platforms. I don't expect Windows to go away, because it's a massive platform and therefore you can make money writing code for it, but I do see it becoming less and less relevant.
Developers will go to and target where the users are. Despite the archaic coupling Apple puts in place between macOS versions and Xcode, developers still buy new Macs every few years to target iOS, because that's where the money and the users are.
I’ve been very surprised by Macs. If you told me forty years ago that I’d be using a MacBook one day I wouldn’t have believed you - they were always the fringe systems that couldn’t integrate with anything except the alien spaceships in Independence Day and the users were always marketing darlings that could not find anything that wasn’t visible on the screen in front of them.
However, fast forward to today and Macs integrate very well into the environment, the development tools are free, you can throw iTerm2 on and have a very nice Unix environment, and add in Docker and you have a pleasant Linux environment at your disposal as well. Microsoft stuff works well enough (Teams on Linux is a real pig that consumes all memory and CPU, and drops audio randomly, but that is completely on Microsoft).
Apple has its own issues, but if you're not developing macOS kernel extensions you can steer clear of most of it. Pay your $100 for AppleCare or you're a third-class citizen. I've seen Apple replace iPhones that had obviously been clawed open with a fork, with a smile, if you had AppleCare, but they won't do jack if you don't have it.
> If you told me forty years ago that I’d be using a MacBook one day I wouldn’t have believed you - they were always the fringe systems that couldn’t integrate
40 years ago, the Mac had not been released yet. Hell, neither had the IBM PC, MS-DOS 1.0, or even the Commodore 64.
At the time, people would endlessly complain about every new version or even small change, too. I don't remember the complaining being worse for the (in hindsight) worse versions. Plus, it's only the complaints that you hear about; very few people will rave about how great something is. Personally, I thought Windows 7 was a major step backwards from XP and avoided it, skipping straight to 8, which I thought was great.
More and more, we see really bad UI and features in the latest Microsoft products (other companies' too). Nobody wants these features except some marketing people. Nobody uses them; they're just annoying as hell, like the current save-file dialog in Office, the 3D Objects directory in the home folder, etc. It might work for a while, but after a time your product is just a piece of broken shit.
The outrage that they release a new major version that requires new hardware is…weird.
This is exactly like someone who bought a PS4 realizing that some new games require a PS5 (change my mind)
If Microsoft had made this a requirement in the 2022 update of Win10, that would have been outrageous. And if they stopped patching Win10, that would be bad too. But it's neither of those. It's entirely new software for new hardware.
> The outrage that they release a new major version that requires new hardware is…weird.
My impression is that the outrage is because the requirement is... weird. As in artificial. It will support new, lower-end CPUs, but not older, more powerful CPUs.
I seem to remember the cutoff was 8th-gen Intel. A friend has an XPS with a 7th-gen i7H. That thing is much more powerful than the dinky i5U I have in my work laptop, but since mine is 8th gen, it will be able to run Win 11 and his won't. My desktop has an overclocked 3rd-gen i7 that runs circles around both of the laptops combined. It can't run Win 11 either.
I get that there's the TPM thing, although it's not clear to me why it's so important, or whether it's impossible (as opposed to just uncommon) to have a TPM 2.0 with an older-than-8th-gen CPU.
I wouldn't hold my breath. I'm not a Windows user, but my understanding is that the legacy code allows people to run old software, so I wouldn't expect this to go away just because the required minimum CPU is fairly new.
If anything, it should stay in place, as an argument to push adoption of Windows 11: "yeah, you need brand-new computers, but don't worry, all your old software will still work as before".
> but my understanding is that the legacy code allows people to run old software
There might be hardware support that can be cut: if specific x64 features are guaranteed to exist on the newer generations, or bugs are known to be fixed, you can drop the fallback code without breaking anything.
That's the invisible hand of Intel pushing their good buddy MS to bury the Meltdown/Spectre-vulnerable family of products from officially supported software. EDIT: Or at least those families of products too old to be patched to a sufficient degree. Once you view the product requirements through that lens, it makes a lot more sense. Additionally, the TPM/Secure Boot requirements allow them better "control", for lack of a better word, over OEM Windows licensing, but masquerading it as a security feature (even though W10 supports it) kinda blew up in their face.
Between that and trying to push a new MS store that lets them take a cut of third-party software a la the Play Store/Steam/etc., that is the real reason Microsoft suddenly switched gears from W10 being the "last" version of Windows to "hey guys, everyone upgrade to W11, it's totally different, see, we moved the start menu!"
> MS to bury the Meltdown/Spectre vulnerable family
> Once you view the product requirements through that lens, it makes a lot more sense.
No, it doesn't. You can disable these mitigations in the kernel right now, and there is no guarantee that Intel won't have vulnerabilities in future microarchitectures. So you need the support to enable/disable future mitigations in the kernel anyway.
Yes, it does, because this is an officially supported hardware list (read: a liability to support, which heavily influences enterprise and business purchasing and upgrade decisions). It's important to remember that "support" in this context does NOT mean function; that's an entirely separate and unrelated issue. Put another way, disabling those mitigations does not prevent the product from functioning, but it will allow MS and Intel to deny support and push a new product as a fix instead.
That you can disable the mitigations, or that the new family of products may well be flawed (in a similar manner, or in an entirely different one), is frankly irrelevant, and suggesting as much entirely misses the point.
This is a business (Intel) pushing a product through a partnership (With MS) with the idea that they'll both see boosted returns from upgrades to newer hardware families and in turn new OEM/Enterprise licensing. Why fix the problem when you can bury it under new quarterly sales.
If they were saying you'd need more disk space or a faster CPU, I'm sure people would be more understanding. But the new restrictions feel very "artificial" - there doesn't seem to be an obvious need for them to not support old (but relatively recent) CPUs any longer, and I cannot fathom why having a TPM should be a hard requirement.
> The outrage that they release a new major version
I have the feeling that it's a bit inherent to OS releases in general. I'm very interested in macOS release news, but if I come here for the discussion, it's overwhelmingly negative. I'd like to see positive discussion about how new features can benefit me, but it's just not there.
How serious is MS about requiring TPM? I haven't read too much into it but every time I hear people talking about it they say there will be workarounds or MS will eventually budge. The same with requiring online accounts.
That's funny. I have virtualization-based security (Core isolation, memory integrity) turned on, and working, (but had to uninstall a couple of incompatible drivers and use generic ones instead), on my old Haswell laptop. And Haswell is old by this point...
My friend said he installed Windows 11 preview build on an old spare computer for the evaluation.
"But how did you solve the TPM requirement?"
"I just bought a cheap TPM module from a dubious Amazon marketplace seller. I can't examine this additional device, so in my opinion the system is less secure."
Knowing how sloppy MS is, they are going to get pwned in a month. Let's all remember that the community found a way to install macOS on PCs a few weeks after the first x86 build was made public.
It's more of an acknowledgement of how impossible it is to stop people from running what they want where they want, unless your software is not meant to run on off-the-shelf hardware in the first place (see iOS).
This stuff has always been configurable from the policy editor, and I can't imagine that ever going away, but I can imagine them removing the policy editor from the free editions. Then again, someone would just make a free policy editor (if there isn't one already).
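For what it's worth, the hardware-check bypass that Microsoft itself ended up documenting for upgrading on unsupported machines is just a registry value (apply at your own risk; Microsoft warns that machines upgraded this way aren't guaranteed to receive updates). A sketch:

```
Windows Registry Editor Version 5.00

; Documented by Microsoft for upgrades on unsupported hardware.
; It relaxes the TPM 2.0 and CPU-generation checks, but the machine
; must still have at least TPM 1.2.
[HKEY_LOCAL_MACHINE\SYSTEM\Setup\MoSetup]
"AllowUpgradesWithUnsupportedTPMOrCPU"=dword:00000001
```

So the enforcement is soft by design, which rather supports the view that it's a supportability line, not a technical one.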
Last time I checked it was wrongly reported as malware and blocked at the browser level.
At least it's accessible now but there may be some latency in Defender getting updated.
The tool is sound. Unless your company explicitly disallows it, you should be able to bypass Defender and force the install, although there are some dark patterns that may not make it obvious how.
The author of the tool is aware of the problem, from the github page:
_N.B. A few antivirus programs incorrectly flag Policy Plus as malware. Policy Plus is a powerful tool and so may cause problems if used recklessly, but it is not malicious. If you would prefer to not trust binaries, feel free to read the code and compile Policy Plus from source. You can also verify that a build was created from the published code by examining the output of a GitHub Actions run: the input commit hash can be found under "checkout master" and the output executable hash can be found under "compute hash."_
They are banking on normies not being aware of such power tools, let alone having the patience/competence to go through those workarounds. And they are right: normies don't usually care or bother; they will just order that Windows 11 laptop.
I'm kind of out of the loop, but I expect that if I cannot get Windows 11 then eventually my Windows 10 machines would no longer be supported with updates and so on, right?
I guess I will check if I can play my games and do my work using Linux... I feel confident that I should be able to.
Proton compatibility makes almost all games work on Linux. For work, if you're into programming, it's going to be a much better experience on Linux. If you're doing something else, you'll find most Windows software works on Linux, or you'll find free alternatives that work about as well.
> if you're into programming, it is going to be a much better experience with linux
This is an opinion, not a fact. I have had a much better experience with Windows tooling for C++ and .NET over the last 7-8 years than I have with tooling on Linux. Visual Studio and a modern terminal emulator (ConEmu) are an excellent pairing.
People hate change, and the reality is that Windows is at the core of every casual user's experience. Devs will bitch and complain and threaten to switch to Linux, but the reality is that you've either already switched or you won't. I've been developing on Windows for over 10 years and wouldn't switch for the world. In my latest gig I SSH into a Linux dev machine through VSCode, and this is the best setup possible.
Not sure why I keep hoping that they'd depart from the dependency on past Windows architectures and produce a modern operating system. It's not that they don't know how; their labs have had some very interesting concepts over the years.
Yes, and it appears that they are trying to do this by moving to "Windows Drivers" and other technologies before they are ready to containerize/sandbox everything.
With the TPM requirement, MS successfully shot itself in the foot.
With TPM modules sold out around the world for the next 12-18 months, and Windows 10 still being offered to OEMs, Microsoft has ensured that legal installs of Win11 will be almost unheard of.
They have just ensured the revival of the long-since-tamed warez culture.
P.S. Some lucky chip hoarders made a fortune, just look at this:
Most x86 CPUs from the last 3-4 years include firmware-level TPM support and do not require the modules you are talking about, so there will be uptake among people with computers made in that timeframe.
(I am not defending Microsoft's decision, just pointing out that your claim is incorrect)