> This seems like not only a boneheaded move but also a very user hostile one. Not the one I would have expected Apple to do.
Apple defines user hostility. They just get away with it for reasons I have never understood. Apple doesn't want pictures or videos of a mouse plugged in while being used? Put the charging port on the bottom of the mouse. Doesn't want to adopt the industry-standard USB-C? Put several different permutations of ports on devices, often incompatible without heavily marked-up dongles. Wants users to buy into the wireless headphone ecosystem? Remove the audio jack from all devices except laptops, desktops, and the lowest-end iPad. They refuse to fix longstanding issues with their Bluetooth and wireless cards. They refuse to cooperate with and support third-party external monitors. They significantly break backwards compatibility and apps with every OS update, often intentionally.
Apple has done whatever the hell they've wanted to for a very long time. They just hide it behind what I see as inane marketing but nonetheless marketing that has been very successful.
The Lightning port preceded USB-C by many years. And it was a huge improvement in every way over everything else available at the time.
Apple didn’t invent Thunderbolt or USB-C. Intel did. Apple is the biggest adopter, but most blame for all the crazy shitty problems with USB-C should be laid at Intel’s feet, just like they should take the blame for all the crazy weird shit with older USB standards.
I’m still pissed at the loss of a 2.5mm physical jack. Don’t get me started on this one.
Bluetooth is a big bag of shit, and way worse than Blu-ray. At least Apple is trying to fix some of these problems with their U1 and W1 chips.
As for external monitors, HDMI works everywhere HDMI works. And USB-C or Thunderbolt to DisplayPort adapters are easy enough to get. Apple has done nothing wrong there.
Apple works harder at maintaining backwards compatibility than any other major vendor in existence. Your extraordinary claims require extraordinary evidence, of which you have provided none.
Looking back at this post, your score appears to be one for six, so far. That’s not great, but you could improve on it if you were to provide adequate evidence of your claims.
Most monitors don't support more than 60 Hz, so 120 Hz is still a bit iffy sometimes. Apple can't help it if some monitor vendors take their time getting their act together.
None of the 64-bit versions of Windows run 16-bit code. And lots of games are broken because their DRM used a kernel driver or some other stupid assumption.
I've got no dog in this fight, but consider that LCD monitors faster than 60 Hz have been around for, I'd estimate, at least 13-14 years.
I had a 2233RZ in 2009 - one of the first at 120 Hz. Prior to that, 75 Hz was commonplace.
The modes are presented by the display through EDID.
This is messy at times, but they [Apple] simply need to look there. Windows/Linux respect these just fine.
This is more on Apple than it is the vendors. The only mistake the vendors made was marketing these to the gamer niche. Office work benefits from high refresh rate too
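For anyone curious what "looking at the EDID" actually involves: the refresh rates above get advertised in the display's EDID block, largely via 18-byte detailed timing descriptors. Below is a minimal, illustrative Python sketch of decoding one such descriptor per the EDID 1.4 layout; the sample bytes are a hand-constructed descriptor for the standard 1920x1080 @ 60 Hz timing, not dumped from real hardware.

```python
def parse_dtd(dtd: bytes):
    """Decode an 18-byte EDID Detailed Timing Descriptor.

    Field layout follows the EDID 1.4 spec: pixel clock is a little-endian
    16-bit value in units of 10 kHz; active/blanking fields are split into
    low bytes plus high nibbles packed into shared bytes.
    """
    pixel_clock_hz = int.from_bytes(dtd[0:2], "little") * 10_000
    if pixel_clock_hz == 0:
        return None  # pixel clock of 0 marks a display descriptor, not a timing

    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)  # horizontal active pixels
    h_blank  = dtd[3] | ((dtd[4] & 0x0F) << 8)  # horizontal blanking pixels
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)  # vertical active lines
    v_blank  = dtd[6] | ((dtd[7] & 0x0F) << 8)  # vertical blanking lines

    # Refresh rate = pixel clock / (total pixels per frame, including blanking)
    refresh = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return {"width": h_active, "height": v_active, "refresh_hz": round(refresh, 2)}

# Hand-built descriptor for 1920x1080 @ 60 Hz (148.5 MHz pixel clock,
# 2200x1125 total including blanking) -- illustrative, not from a real monitor.
dtd = bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40,
             0x58, 0x2C, 0x45, 0x00, 0x00, 0x00, 0x00, 0x00, 0x1E, 0x00])
print(parse_dtd(dtd))  # {'width': 1920, 'height': 1080, 'refresh_hz': 60.0}
```

Buggy EDIDs (wrong pixel clocks, garbage descriptors) are exactly why OSes end up carrying per-monitor quirk lists on top of parsing like this.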
You had a 3D monitor for gaming. You need a high refresh rate for 3D with shutter glasses, since each eye only gets half of it.
Office work benefited in the CRT era, when <85 Hz would cause perceptible flickering and give you a headache. Since TFTs are slower, there's no flickering anymore. Scrolling benefits slightly from a higher refresh rate, and of course the top 10% of gamers will care about the latency, but I don't think I'll be typing any faster on a gaming screen ;)
> None of the 64 bits versions of Windows run 16 bits code.
Because it's effectively impossible due to how x86-64 CPUs work: long mode drops virtual-8086 mode, which is what NTVDM relied on to run 16-bit code.
And the statement does nothing to refute my point: that code compiled for Windows keeps working four times as long as it does on macOS. It's an empirical fact that can be measured in the real world; denying it is like denying gravity.
Backwards compatibility is a priority for Windows because all the line-of-business applications must keep chugging along. Corporates don't upgrade their office PCs if it breaks their terrible software.
> Apple can't help some monitor vendors take their time to get their stuff together.
What's wrong with them? They work fine on Windows, Linux, and Android(!). Sounds like an Apple problem.
> code compiled for Windows keeps working 4 times as long
That might have been true in the NT/XP days, but how long will 'the new Microsoft' - which decided '10 would be the last Windows version' and now says it will be phased out in 3 years - keep that promise?
Corporates are going all-in on terrible cloud offerings now so the best move for MS is to lock down their desktop OS.
> Whats is wrong with them?
There are lots of monitors with incorrect EDID information or buggy behaviour. Yesterday I crashed a Dell monitor's firmware while trying to find the optimal displayport settings. That's not a macOS problem...