Apple was able to do it the way they did because there are so few choices in Apple hardware. Sure, the "double-or-nothing" approach simplifies things, but it's not a practical approach for large ecosystems like Windows or Android, where resolutions vary a lot more.
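For what it's worth, here's a toy sketch of why the fixed 2x factor is the simpler case (the function name is just illustrative, not any real toolkit API):

```python
def to_physical(logical_px: int, scale: float) -> float:
    """Map a logical (point) coordinate to physical pixels."""
    return logical_px * scale

# Double-or-nothing: every logical pixel maps to a whole number of physical
# pixels, so 1px hairlines and 2x bitmap assets land exactly on pixel boundaries.
print(to_physical(13, 2.0))   # 26.0 -> crisp

# Fractional factors common on Windows/Android hardware (1.25x, 1.5x, ...)
# often land between pixels, so the toolkit has to snap or anti-alias.
print(to_physical(13, 1.5))   # 19.5 -> must be rounded or blurred
```

With lots of different panel densities in the ecosystem you can't promise everyone a clean 2x, which is the point about Windows and Android above.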
But that would only work on new hardware, not on the millions of existing machines people will upgrade. The OS would still have to support the old resolution model for those older machines, which reduces the incentive for hardware makers to ship higher-resolution screens, since they could get away with the same old crappy panels.