I don't understand what "being managed entirely hands-off" means. I can assure you that IT does not manually intervene on every desktop system in the fleet, so I would call that exactly "being managed hands-off". I am certain there are always a few systems that do need some manual intervention for whatever reason, but that seems to just be the way with computers. It's not as if Ubuntu updates never break, or as if nobody ever installs a broken package before it is retracted.
By 'entirely hands-off' I mean that the computer just keeps on running. On Linux, most packages are part of the repository and update automatically. On Windows, those packages have to be updated manually. Even if everything is automated, the computer will have to be restarted quite regularly. At least to me, Windows seems to be the more difficult platform to manage, especially when you can choose the hardware for a fleet of desktop systems to be perfectly Linux-compatible.
Where does the complexity on Linux come from that makes those systems more difficult to manage?
Even on Linux, you need restarts if you want updates to shared libraries to actually take effect. You can't just apply a critical security update to OpenSSL, for example, and hope that the user restarts their programs at some point. If you care about the update being applied, you need to know that by some date all of the programs on that system have actually been restarted, and a scheduled reboot is by far the simplest way to guarantee that.
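To be clear, checking which processes are still running old code is easy enough to script. A minimal sketch (Linux-only, run as root so every process is visible): once the package manager unlinks the old .so, any process still using it keeps the mapping, and /proc/<pid>/maps shows it with a "(deleted)" suffix:

    #!/usr/bin/env python3
    """List processes still mapping shared libraries deleted by an update."""
    import os

    def stale_processes():
        """Yield (pid, name, libs) for processes mapping deleted .so files."""
        for pid in filter(str.isdigit, os.listdir("/proc")):
            try:
                with open(f"/proc/{pid}/maps") as maps:
                    # Field 6 of a maps line is the pathname; the kernel
                    # appends " (deleted)" when the file has been unlinked.
                    libs = sorted({line.split(None, 5)[-1].strip()
                                   for line in maps
                                   if ".so" in line
                                   and line.rstrip().endswith("(deleted)")})
                if libs:
                    with open(f"/proc/{pid}/comm") as f:
                        yield int(pid), f.read().strip(), libs
            except OSError:
                continue  # process exited mid-scan, or we lack privileges

    if __name__ == "__main__":
        for pid, name, libs in stale_processes():
            print(f"{pid} {name}: still maps {', '.join(libs)}")

Tools like needrestart on Debian/Ubuntu do essentially this check for you, but someone still has to act on the output, which is why the scheduled reboot wins.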
Then there's the question of pushing a change to all managed computers. Maybe it's not a package update; maybe you want to change some SELinux policy for all users, or update the DNS server or the default search domain, and so on.
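Sure, you can script that yourself. A rough sketch, assuming key-based SSH access and passwordless sudo for a management account (the hostnames and file content below are placeholders, not a real setup):

    #!/usr/bin/env python3
    """Push a config file to a list of managed hosts over SSH (sketch)."""
    import subprocess

    HOSTS = ["ws-001.example.com", "ws-002.example.com"]  # hypothetical fleet
    CONFIG = "search corp.example.com\nnameserver 10.0.0.53\n"  # example only

    def push(host: str) -> bool:
        """Write CONFIG to /etc/resolv.conf on one host; True on success."""
        try:
            # 'sudo tee' writes stdin to the file without needing root login.
            result = subprocess.run(
                ["ssh", host, "sudo", "tee", "/etc/resolv.conf"],
                input=CONFIG, text=True, capture_output=True, timeout=30)
        except subprocess.TimeoutExpired:
            return False  # host unreachable or hung
        return result.returncode == 0

    if __name__ == "__main__":
        for host in HOSTS:
            print(f"{host}: {'ok' if push(host) else 'FAILED'}")

But now you own the error handling, the retries, the reporting and the audit trail, which is exactly what the commercial tools (or Ansible, Puppet and Salt on the Linux side) sell you.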
Never mind the question of how you instruct one of those Linux computers to delete all the data it holds the next time it connects to the internet (to handle the case of a stolen company laptop).
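The DIY version of that is a phone-home agent run from a cron job or systemd timer. A sketch of the client half, assuming LUKS full-disk encryption; the endpoint, the "wipe" response and the device path are all invented for illustration:

    #!/usr/bin/env python3
    """Phone-home kill switch for a managed laptop (sketch, not a real API)."""
    import subprocess
    import urllib.request

    STATUS_URL = "https://mdm.example.com/api/v1/status"  # hypothetical endpoint
    LUKS_DEVICE = "/dev/sda3"                             # adjust per machine

    def should_wipe() -> bool:
        """Ask the (assumed) MDM server whether this machine is flagged."""
        with open("/etc/machine-id") as f:
            machine_id = f.read().strip()
        try:
            with urllib.request.urlopen(f"{STATUS_URL}?id={machine_id}",
                                        timeout=10) as resp:
                return resp.read().strip() == b"wipe"
        except OSError:
            return False  # offline: do nothing, try again next run

    if __name__ == "__main__":
        if should_wipe():
            # luksErase destroys every key slot; the ciphertext stays on
            # disk but can never be decrypted again (a crypto-erase).
            subprocess.run(["cryptsetup", "luksErase", "--batch-mode",
                            LUKS_DEVICE], check=True)

Note that this erases the keys rather than the data: with the key slots gone, the ciphertext left on the partition is useless.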
There are so many things you need in an enterprise setting for which common (though probably quite expensive) tools are available on Windows. Maybe some of these exist for Linux as well (I would expect Red Hat to offer some), but I'm not sure. Linux administration is usually aimed at servers much more than at desktop computers.
Why would you want to delete the data? A remote wipe can only run once the machine is back online, by which point an attacker has already had every opportunity to access it. If the data is encrypted, it can remain on its partition, because without the key it is inaccessible anyway.
Interestingly, apples do have to be compared to oranges here. On Linux, it is easy to identify the programs that are using a given library, so it is also easy to restart just the services that were patched. In general, these things can be scripted, which is why few dedicated tools are available. But that requires somebody who understands the system. From a business perspective, that may be more expensive, or it may not be, given that the commercial tools are expensive themselves.
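To make the targeted restart concrete: on a systemd machine you can resolve every process that maps the library to its service unit and restart just those. A sketch (run as root; the library name is only an example):

    #!/usr/bin/env python3
    """Restart only the systemd services whose processes map a given library."""
    import os
    import re
    import subprocess
    import sys

    def units_using(library: str) -> set:
        """Return the .service units with a process mapping `library`."""
        units = set()
        for pid in filter(str.isdigit, os.listdir("/proc")):
            try:
                with open(f"/proc/{pid}/maps") as f:
                    if library not in f.read():
                        continue
                # The cgroup path ends in the owning unit, e.g.
                # 0::/system.slice/cron.service
                with open(f"/proc/{pid}/cgroup") as f:
                    match = re.search(r"/([^/]+\.service)$", f.read(), re.M)
            except OSError:
                continue  # process exited mid-scan
            if match:
                units.add(match.group(1))
        return units

    if __name__ == "__main__":
        lib = sys.argv[1] if len(sys.argv) > 1 else "libssl.so"
        for unit in sorted(units_using(lib)):
            print("restarting", unit)
            subprocess.run(["systemctl", "restart", unit], check=False)

Interactively, lsof pointed at the library path gives you much the same process list.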
Why? I have never seen Windows being managed entirely hands-off, whereas Linux just works.