As someone who has used a retina display in daylight: it is unusable (bring up a black terminal and it's a goddamn mirror). It compares favorably only with Apple's notoriously horrible glossy displays, well known for having some of the worst imaginable glare problems (unusable under many kinds of office overhead lighting). How the author found this better than the matte screen is beyond me.
That all said, I am happy about Apple pushing forward on decoupling resolution from perceived size... it is long overdue. Combine that with Apple's supply-chain clout and I can't wait to see ultra-high-resolution screens everywhere.
Can this kind of decoupling be accomplished on Windows and/or Linux?
I ask because I have a 1920x1080 (native) resolution LCD which looks wonderful for movies and games, but makes text and other OS artefacts unusably small.
Windows will let you set the text size (Medium, Large), but all this seems to accomplish is breaking non-DPI-aware applications (e.g., forcing text off the right-hand side of the drawable area).
It would be wonderful if I could purchase, say, a 2048x1536 display which Windows could run in a 1024x768 "mode" but with four physical pixels behind every dot of content on the screen.
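Roughly speaking, the plumbing for this already exists on Windows; the problem is that applications have to ask for the real DPI rather than assume 96. A minimal Win32 sketch, assuming Vista or later (the 300-unit layout width is just an illustrative number):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Opt out of DPI virtualization, so Windows reports the real
           DPI instead of bitmap-stretching the window at the
           "Medium"/"Large" text-size settings. */
        SetProcessDPIAware();

        HDC screen = GetDC(NULL);
        int dpi = GetDeviceCaps(screen, LOGPIXELSX); /* 96 = 100% scaling */
        ReleaseDC(NULL, screen);

        /* Lay out in device-independent units; convert at the edges. */
        int logical_width = 300;                     /* illustrative */
        int pixel_width   = MulDiv(logical_width, dpi, 96);
        printf("%d units -> %d px at %d dpi\n",
               logical_width, pixel_width, dpi);
        return 0;
    }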
It can't really be accomplished well even on OS X yet. But I am delighted Apple is going for it, because the applications will catch up.
Right now it works really well with Apple's font-rendering engine, and not much else. But application developers will catch up -- as will Microsoft and Linux (some Linux mailing lists are already blowing up with how to handle it). X can handle it, no problem; it's the toolkits and libraries that can't.
Does anyone else think it weird that the first mainstream hi-res displays weren't pushed by Microsoft? Let me explain...
Back when WPF was still called Avalon, Microsoft were pushing the technology as a solution for the soon-to-be-arriving high-dpi displays (which, we were assured, were right around the corner). As an example, see a contemporary blog post: http://blogs.msdn.com/b/marcelolr/archive/2004/12/03/274213....
Given that Microsoft were (are?) desperately trying to hold back the tide of web-based applications, wouldn't displays like this, combined with technology such as WPF, have been a killer feature of a desktop OS? And hence, shouldn't they have been a high priority for Microsoft?
So why did Apple get there first? Why didn't Microsoft research/fund/introduce these high-dpi displays?
> Back when WPF was still called Avalon, Microsoft were pushing the technology as a solution for the soon-to-be-arriving high-dpi displays
Although they were quieter about it, it has been quite clear that Apple has been interested in resolution independence since at least 10.4, when such features started to show up.
Remember that very few third-party Windows applications use WPF. Most first-party stuff, even, doesn't use WPF. We're seeing on the retina Macbook that, while most Cocoa apps work well enough, most non-Cocoa apps are at least somewhat broken; imagine how much worse this effect would be on Windows, where so few apps use the current UI toolkits.
It's a chicken-and-egg situation, agreed. It just seems really strange to me that they spent so much effort pushing and developing the software side without putting any effort into the hardware, when the two clearly have to go hand in hand.
Microsoft had the software stack ready; imagine if they had released (via partners) high-DPI laptops and screens shortly afterwards. That might actually have made a Windows machine a compelling and desirable product, instead of just an expensive way to run a browser.
Apple had a rather similar software stack ready, though; both worked on the basis of a point being a fractional number of pixels. The trouble with this is that while it tended to more or less work for simple things, for complex things, particularly where the programmer was drawing stuff, it was (a) somewhat inefficient and (b) very hard to get right. Apple used to give talks at WWDC, back in 2006 or so, about how to get ready for resolution independence, but it was clear that few developers were doing so successfully.
This new approach, where a point is an integer number of pixels, possibly with scaling after the fact (as with the non-standard resolutions available on the retina MacBook), is far simpler for developers, so it is more likely to work out okay.
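A toy illustration of the difference (not Apple's actual code, just the arithmetic): at a fractional scale, identical one-point rows cover varying numbers of device pixels, while an integral 2x scale is perfectly uniform:

    #include <math.h>
    #include <stdio.h>

    /* Map a point coordinate to a pixel coordinate at a given scale. */
    static long px(double points, double scale)
    {
        return lround(points * scale);
    }

    int main(void)
    {
        for (int row = 0; row < 5; row++) {
            long frac = px(row + 1, 1.25) - px(row, 1.25); /* e.g. 120dpi UI */
            long twox = px(row + 1, 2.0)  - px(row, 2.0);  /* the @2x model */
            printf("1pt row %d: %ld px at 1.25x, %ld px at 2x\n",
                   row, frac, twox);
        }
        /* At 1.25x the rows come out 1 or 2 pixels tall, so hand-drawn
           UI renders visibly unevenly; at 2x every row is exactly 2. */
        return 0;
    }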
A thousand times this. I still own a 22" Trinitron (20" working area) that beats any "FullHD" LCD hands down, by resolution and by any other measure as well -- from lag to colorspace -- and it wasn't even top of the line back then. Heck, it even starts up faster than many LCDs.
I refused to let go of it for so long exactly because there was no LCD that would be an upgrade, with the sole exception of the expensive T221. Guess now's the time :-)
Sony Multiscan Trinitrons weren't exotic -- you could purchase them in any reasonable electronics store. I have several that go up to 2048x1536 @ 75 Hz. ViewSonic had monitors that did the same. While I look forward to higher-resolution LCDs becoming a standard option, LCDs set us back resolution-wise for quite a long time. LCDs do have other advantages over CRTs, of course.
I had and loved a similar NEC CRT. But 2048x1536 on a 22" display is ~116ppi. Apple's non-retina laptops have been higher than that for years (with the MacBook Air being highest, at ~135ppi), and Sony, at least, has offered configurable 1080p displays in 13" laptops (~165ppi) for some time.
That display was pretty nice a decade ago, when everyone was moving to LCD monitors that were vastly inferior in every measure beyond size and weight, but, at least in terms of ppi (and on consumer-grade equipment), display technology eclipsed the CRT quite a while ago.
edit: I'd forgotten that CRTs were measured with the bezel included, so at 20" viewable, the display is 128ppi. So closer, but still pretty far behind.
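The ppi figures here are just Pythagoras over the panel dimensions; a quick sketch, with the diagonals being the usual approximate spec-sheet values:

    #include <math.h>
    #include <stdio.h>

    static double ppi(int w, int h, double diagonal_inches)
    {
        return sqrt((double)w * w + (double)h * h) / diagonal_inches;
    }

    int main(void)
    {
        printf("CRT, 22\" tube:     %.0f ppi\n", ppi(2048, 1536, 22.0));
        printf("CRT, 20\" viewable: %.0f ppi\n", ppi(2048, 1536, 20.0));
        printf("11\" MacBook Air:   %.0f ppi\n", ppi(1366,  768, 11.6));
        printf("13\" 1080p Sony:    %.0f ppi\n", ppi(1920, 1080, 13.3));
        printf("Retina MBP:        %.0f ppi\n", ppi(2880, 1800, 15.4));
        return 0;
    }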
Microsoft isn't a hardware company. Apple's supply chain dominance is what brought this to market.
Microsoft sells some rebranded hardware products (manufactured and designed by Samsung, Acer, IBM, ATI, Nvidia and Asus)... but it makes PROFIT by selling software.
Apple makes software, but it makes its PROFIT by selling hardware.
Depends how you define it. But the original Xbox was a PIII + Nvidia + other PC hardware... none of it made by Microsoft.
I give them more credit for the 360, because they worked more closely with partners and moved off the WinTel platform, opting for PowerPC chips with ATI graphics... but again, not manufacturing the hardware.
I define companies by WHAT MAKES THEM MONEY. I can't tell if people are pedantic or dense.
Apple makes money by selling hardware. Microsoft makes money by selling software. Google makes money by selling ads. Amazon makes money by selling goods.
You aren't "defining" companies, you are /classifying/ them and you're classifying them into arbitrary boxes based on your strange, subjective and clearly disputed perceptions.
Let's move on from your inability to admit the silliness of classifying a large company with many different divisions as a "software company" and go back to your original contention: that Microsoft were unable to bring a new hardware device to market because they did not have enough "supply chain dominance". Whatever "supply chain dominance" means, you're wrong: Microsoft have, in the past, released hardware products. The Xbox is an obvious example. So is Microsoft Surface. There is nothing more white-label about these products than there is about the MacBook Pro.
Apple designs its own hardware, and now even does some fab work with regard to its chips.
Microsoft doesn't move enough hardware to have anywhere near the supply-chain power of Apple. Microsoft is a pathetic flea in the hardware world compared to Apple. Hardware (manufactured and DESIGNED by others) makes up less than 20% of Microsoft's business.
Apple is first and foremost a hardware company. Apple is known for its incredibly well run supply chain and operations which give them amazing power to lean on manufacturers.
Apple will guarantee and lock in massive purchases, removing the vast majority of risk from the hardware vendors and allowing them to take risks on stuff like the Retina display.
Nearly all of Microsoft's non-trivial hardware plays, pushing and partnering with others, have failed. They promised to market and move tablets for years... FAIL. Windows Phone was going to change the world... FAIL. Zune... FAIL. Hardware vendors are rightfully terrified of Microsoft as a partner, even when they fully commit -- you know why? MICROSOFT IS NOT A HARDWARE COMPANY.
As someone who has been waiting for a higher-resolution laptop screen for years, what Apple has done is a godsend. At least now other laptop manufacturers will take high-res demands seriously.
I'm always surprised when I see internet reviews where people talk about no longer being able to see individual pixels. I can't see individual pixels even if I look closely at my screen... and everyone I've talked to in person has said the same thing.
As someone who has pretty good eyesight (for the time being, at least), I am pretty excited about the prospect of coding in the terminal on a super-HD screen (I hate having to buy into Apple's marketing speak). Right now I struggle to have three 80-column text files open in vim on my Lenovo X201, and most other people have a real hard time reading the small font on my screen, while I have just gotten used to it. A super-HD screen would mean more vertical and horizontal lines of code for me without the readability problems of going down in font size, assuming I can adapt to the smaller text.
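The back-of-the-envelope arithmetic, with the glyph widths being illustrative guesses rather than measurements:

    #include <stdio.h>

    /* How many 80-column panes (plus a 1-column divider each) fit
       across a screen, given a monospace glyph width in pixels? */
    static int panes(int screen_px, int glyph_px)
    {
        return (screen_px / glyph_px) / (80 + 1);
    }

    int main(void)
    {
        printf("X201 (1280px) at 5px/glyph:  %d panes\n", panes(1280, 5));
        printf("X201 (1280px) at 7px/glyph:  %d panes\n", panes(1280, 7));
        printf("rMBP (2880px) at 10px/glyph: %d panes\n", panes(2880, 10));
        return 0;
    }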
On my old 16-inch notebook I cannot notice pixels at all, and font rendering is just fine. OK, I have an eye problem, but I still fail to see the point of a very high resolution on a small screen. What is the catch?
One of my first thoughts after hearing the original iPhone announcement was "what a low resolution screen".
After all, the Nokia 770, released a couple of years before the iPhone, had an 800x480 screen.
Most full websites back then were usable on an 800-pixel-wide screen. Because it had a lower-res screen, the iPhone required mobile versions of sites, which I thought would doom it due to the chicken-and-egg problem.
Remember, back then the iPhone didn't allow third-party apps; the "blessed" way of adding capability to your iPhone was through web sites. Who would buy an iPhone if it didn't have any apps or mobile web sites? Who would build mobile web sites if nobody bought an iPhone?
"Touting increases in DPI as game changing couldn't be farther from the truth."
I think that remains to be seen. If the existence of this laptop in the market drives consumers to seek out laptops with this resolution of display, and other laptop makers are required to offer laptops of similar resolution, and web developers convert their graphics to work well at "Retina" resolutions... then you could make the case that it's game-changing. But it will be a while before that's known.
Actually, it's the end of the high-resolution PC, since there's no point in adding more detail once your eyes are physically unable to detect the difference. Also, we're already able to buy 3D displays, so that's another 'old' technology too. Smart 3D TVs: old.
I imagine someone's going to be selling 3D contact lenses soon, so all surfaces will be 3D video devices. The next step is to make them intelligent...
This is very smart marketing for Apple. They have the assets (reputation, substantially different UX) that let them position their releases as new products rather than improved versions of existing products.
That said, most companies would fail completely if they tried this. The average startup wants their customers to quickly and easily recognize their product. YMMV.
Retina displays are suddenly the buzzword. Is it a new concept? No. Do we really need it? Debatable. Do we want it? Undeniable. Credit where it's due, once again, to the Apple marketing machine...
It's not so much marketing as that _they are actually making these things_. There has been a small but rather fanatical demand for high-DPI screens for a good while, particularly since the demise of CRTs. Note how excited people got about the resolution-independence features introduced in Mac OS X 10.4 (and subsequently scrapped; 10.7 does it very differently).