The monitor you have didn't properly implement the Thunderbolt spec, and since Windows adheres to the spec more loosely than macOS does, things work fine there.
This happens with web browsers every decade or so. "Browser X" follows the JavaScript spec to the letter, which breaks millions of poorly written websites, so "Browser X" has to relax its conformance or lose market share, and thus we have lots of sites that are out of spec.
Your explanation is plausible, but do you have any evidence to back it up? What's most curious is that people are complaining about a specific OS version in connection with the problems. Did the spec handling change between OS releases?
My intent was not to move the goalposts; I just wanted you to elaborate. Even though the explanation is trivial to you, it may not be to others.
(A corollary to this is the forum post that starts with a technical problem and ends with the OP saying "figured it out!" and no further explanation :)
Assume this is true. Why is it a good thing? If the looser spec handling on Windows fixes the bug without introducing other problems, then from the user perspective, Windows is doing the correct thing and OSX is failing.
But then standards end up being meaningless, and people running not-Windows get screwed (like UEFI, ACPI, and various other nonsense that "works fine" on Windows but not on Linux, etc.)
That assumes Microsoft broke the spec and the hardware was designed around it. The scenario in the thread is the reverse: the hardware broke the spec and Microsoft just made it work. I don't think that's much different from a lot of other software. Look at all the application-specific code and fixes added to graphics drivers, for example.