Hacker News
Retina MacBook Pro Review: The Age of the High-Resolution PC (stevestreza.com)
24 points by basil on June 18, 2012 | 44 comments



As someone who has used a retina display in daylight, I can tell you it is unusable (bring up a black terminal and it's a goddamn mirror). It only compares favorably with the notoriously horrible Apple glossy displays... well known for having some of the worst imaginable glare problems (unusable under many kinds of office overhead lighting). How the author found this better than the matte screen is beyond me.

That all said, I am happy about Apple pushing forward on decoupling resolution from perceived size... it is long overdue. Combined with Apple's supply chain clout, I can't wait to see ultra-high-resolution screens everywhere.


Can this kind of decoupling be accomplished on Windows and/or Linux?

I just ask because I have a 1920x1080 (native) resolution LCD which looks wonderful for movies and games, but makes text and other OS artefacts unusably small.

Windows will let you set the text size (Medium, Large), but all this seems to accomplish is breaking non-DPI-aware applications (e.g. forcing text off the right-hand side of the drawable area).

It would be wonderful if I could buy, for example, a 2048x1536 display which Windows could run in a 1024x768 "mode", but with four pixels for every dot of content on the screen.


It can't even really be accomplished well on OS X yet. But I am delighted Apple is going for it, because the applications will catch up.

Right now it works really well with Apple's font rendering engine, and not much else. But application developers will catch up -- as will Microsoft and Linux (some Linux mailing lists are already blowing up with discussions of how to handle it). X can handle it without problems; it's the libraries that can't.


Don't know about Windows, but it certainly can be on Linux. The only problem is the marketing: people buy "HD" screens now.

Apple sells "retina" screens, and people buy them.

Our world is quite sad.


Does anyone else think it's weird that the first mainstream hi-res displays weren't pushed by Microsoft? Let me explain...

Back when WPF was still called Avalon, Microsoft were pushing the technology as a solution for the soon-to-be-arriving high-dpi displays (which, we were assured, were right around the corner). As an example, see a contemporary blog post: http://blogs.msdn.com/b/marcelolr/archive/2004/12/03/274213....

Given that Microsoft were (are?) desperately trying to hold back the tide of web-based applications, wouldn't displays like this, combined with technology such as WPF, have been a killer feature of a desktop OS? And hence, shouldn't they have been a high priority for Microsoft?

So why did Apple get there first? Why didn't Microsoft research/fund/introduce these high-dpi displays?

Edit: More links from a Coding Horror post from the same year: http://www.codinghorror.com/blog/2004/11/trapped-in-a-bitmap...


> Back when WPF was still called Avalon, Microsoft were pushing the technology as a solution for the soon-to-be-arriving high-dpi displays

Although they were quieter about it, it has been quite clear that Apple has been interested in resolution independence since at least 10.4, when such features started to show up.

Remember that very few third-party Windows applications use WPF. Most first-party stuff, even, doesn't use WPF. We're seeing on the retina MacBook that, while most Cocoa apps work well enough, most non-Cocoa apps are at least somewhat broken; imagine how much worse this effect would be on Windows, where so few apps use the current UI toolkits.


It's a chicken-and-egg situation, agreed. It just seems really strange to me that they spent so much effort pushing and developing the software side without putting any effort into the hardware, when the two clearly have to go hand in hand.

Microsoft had the software stack ready; imagine if they had released (via partners) high-DPI laptops and screens shortly afterwards. That might actually have made a Windows machine a compelling and desirable product, instead of just an expensive way to run a browser.


Apple had a rather similar software stack ready, though; both worked on the basis of a point being a fractional number of pixels. The trouble with this is that while it tended to more or less work for simple things, for complex things, particularly where the programmer was drawing stuff themselves, it was (a) somewhat inefficient and (b) very hard to get right. Apple used to periodically have talks at WWDC about how to get ready for resolution independence back in 2006 or so, but it was clear that few developers were doing so successfully.

This new approach, where a point is an integer number of pixels, possibly with scaling after the fact (as with the non-standard resolutions available on the retina MacBook), is far simpler for developers, and so more likely to work out okay.
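A rough sketch of what the integer-scale idea means in Cocoa terms, assuming the NSScreen backingScaleFactor API and written in Swift purely for brevity (this is my own illustration, not anything from the review):

    import AppKit

    // With the integer-scale approach, apps keep laying out and drawing in
    // points; the backing store is simply scaleFactor times larger in each
    // dimension, so unmodified drawing code still lands in the right place.
    if let screen = NSScreen.main {
        let scale = screen.backingScaleFactor    // 2.0 on a Retina panel, 1.0 otherwise
        let pointRect = NSRect(x: 0, y: 0, width: 100, height: 100)  // geometry in points
        let pixelWidth = pointRect.width * scale   // what actually gets rasterized
        let pixelHeight = pointRect.height * scale
        print("drawing \(pointRect.width)x\(pointRect.height) pt as \(pixelWidth)x\(pixelHeight) px")
    }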


The "first" main stream hi-res display? I have some Sony Trinitron CRTs in my basement that would like to talk to you.


A thousand times this. I still own a 22'' Trinitron (20'' working area) that beats any "Full HD" panel hands down, by resolution and by any other measure as well -- from lag to colorspace -- and it wasn't even top of the line back then. Heck, it even starts up faster than many LCDs.

I refused to let go of it for so long exactly because there was no LCD that would be an upgrade, with the sole exception of the expensive T221. Guess now's the time :-)


Sorry, didn't really clarify what I meant by mainstream: easily available to purchase, by non-techie consumers, from a high-street shop.


Sony Multiscan Trinitrons weren't exotic -- you could buy them in any reasonable electronics store. I have several that go up to 2048x1536 @ 75Hz. ViewSonic had monitors that did the same. While I look forward to higher-resolution LCDs becoming a standard option, LCDs set us back resolution-wise for quite a long time. LCDs do have other advantages over CRTs, of course.


I had and loved a similar NEC CRT. But 2048x1536 on a 22" display is ~116ppi. Apple's non-retina laptops have been higher than that for years (with the MacBook Air being highest, at ~135ppi), and Sony, at least, has offered configurable 1080p displays in 13" laptops (~165ppi) for some time.

It was pretty nice a decade ago, when everyone was moving to LCD monitors that were vastly inferior in every measure beyond size or weight, but, at least in terms of ppi (and on consumer grade equipment), display technology eclipsed the CRT quite a while ago.

Edit: I'd forgotten that CRTs were measured with the bezel included, so at 20" viewable, the display is 128ppi. Closer, but still pretty far behind.
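(For reference, the back-of-the-envelope math is just pixel diagonal over physical diagonal: sqrt(2048^2 + 1536^2) = 2560 pixels, so 2560 / 22 ≈ 116ppi at the quoted 22" size, and 2560 / 20 = 128ppi over the 20" viewable area.)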


ViewSonic had a 19" display (with 18" viewable) that did 2048x1536. I think that was the smallest size the 2048x1536 CRTs came in.


Microsoft isn't a hardware company. Apple's supply chain dominance is what brought this to market.

Microsoft sells some rebranded hardware products (manufactured and designed by Samsung, Acer, IBM, ATI, Nvidia and Asus)... but it makes PROFIT by selling software.

Apple makes software, but it makes its PROFIT by selling hardware.



All whitelabel products.


The Xbox is a whitelabel product?


Depends how you define it. But the Xbox (the original) was a PIII + Nvidia + other PC hardware... none of it made by Microsoft.

I give them more credit for the 360, because they worked more closely with partners and got off the Wintel platform... opting for PowerPC chips with ATI graphics... but again, they weren't manufacturing the hardware.


"again, not manufacturing the hardware" -- well, you noticed all Macs have nvidia or amd/ati GPUs and Intel processors, and are assembled by Foxconn?

"That's because Apple isn't a real hardware company, it's all white-labelled." I guess?


I define companies by WHAT MAKES THEM MONEY. I can't tell if people are pedantic or dense.

Apple makes money by selling hardware. Microsoft makes money by selling software. Google makes money by selling ads. Amazon makes money by selling goods.

Graphic for the insanely dense: http://visualign.files.wordpress.com/2012/02/applemicrosoftg...


You aren't "defining" companies, you are /classifying/ them and you're classifying them into arbitrary boxes based on your strange, subjective and clearly disputed perceptions.

Let's move on from your inability to admit the silliness of classifying a large company with many different divisions as a "software company" and go back to your original contention: that Microsoft were unable to bring a new hardware device to market because they did not have enough "supply chain dominance". Whatever "supply chain dominance" means, you're wrong -- Microsoft have, in the past, released hardware products. The Xbox is an obvious example. So is Microsoft Surface. There is nothing more "whitelabel" about these products than there is about the MacBook Pro.


It is different: "Microsoft Surface" is a damn Samsung product you can see on Samsung's site. http://www.samsunglfd.com/product/feature.do?modelCd=SUR40 ... and how many units do you think will ever ship over its entire lifetime?

Apple designs its own hardware, and even does some fab work now with regard to its chips.

Microsoft doesn't move enough hardware to have anywhere near the supply chain power of Apple. Microsoft is a pathetic flea in the hardware world compared to Apple. Hardware (manufactured and DESIGNED by others) makes up less than 20% of Microsoft's business.

Apple is first and foremost a hardware company. Apple is known for its incredibly well-run supply chain and operations, which give it amazing power to lean on manufacturers.

Apple will guarantee and lock in massive purchases, removing the vast majority of risk from the hardware vendors and allowing them to take risks on stuff like the Retina display.

Nearly all of Microsoft's non-trivial plays at pushing and partnering with hardware makers have failed. They promised to market and move tablets for years... FAIL. Windows Phone was going to change the world... FAIL. Zune... FAIL. Hardware vendors are rightfully terrified of Microsoft as a partner, even when it fully commits -- you know why? MICROSOFT IS NOT A HARDWARE COMPANY.

http://www.businessweek.com/magazine/apples-supplychain-secr... ... and Google "Apple Supply Chain" to find literally dozens more stories about how good they are at this.


It's also worth noting that Apple acquired Intrinsity, a semiconductor company, a few years ago. They actually design ARM chips. Samsung is a customer.


That's the product that used to be called "Microsoft Surface", which was a table-sized touchscreen computer, not the new Microsoft Surface.


As someone who has been waiting for a higher-resolution laptop screen for years, I think what Apple has done is a godsend. At least now other laptop manufacturers will take high-res demands seriously.


I'm always surprised when I see internet reviews where people talk about no longer being able to see individual pixels. I can't see individual pixels even if I look closely at my screen... and everyone I've talked to in person has said the same thing.


As someone who has pretty good eyesight (for the time being at least), I am pretty excited about the prospect of coding in the terminal with a super HD screen (I hate having to buy into Apple's marketing speak). Right now I struggle to have three 80-column-wide text files open in Vim on my Lenovo X201, and most other people have a really hard time reading the small font on my screen, while I have just gotten used to it. A super HD screen would mean more lines and columns of code for me without the readability problems of going down in font size, assuming I can adapt to the smaller text.


On my old 16-inch notebook I cannot notice pixels at all, and font rendering is just fine. OK, I have an eye problem, but I still fail to see the point of having a very high resolution on a small screen. What's the catch?


I clearly notice the pixels on anything lower than 200dpi and can only completely ignore them past 250dpi.

It's the point where antialiasing becomes unnecessary, basically.


The intro/overview is a complete load of codswallop... Touting increases in DPI as game-changing couldn't be farther from the truth.

Never mind the painting of Apple as innovators.


One of my first thoughts after hearing the original iPhone announcement was "what a low resolution screen".

After all, the Nokia 770, released a couple of years before the iPhone, had an 800x480 screen.

Most full websites back then were usable on an 800-pixel-wide screen. Because it had a low-res screen, the iPhone required mobile versions, which I thought would doom it due to the chicken-and-egg problem.

Remember, back then the iPhone didn't allow third-party apps; the "blessed" way of adding capability to your iPhone was through web sites. Who would buy an iPhone if it didn't have any apps or mobile web sites? Who would build mobile web sites if nobody bought an iPhone?

Apparently, I was wrong.


"Touting increases in DPI as game changing couldn't be farther from the truth."

I think that remains to be seen. If the existence of this laptop on the market drives consumers to seek out laptops with displays of this resolution, and other laptop makers are required to offer similar resolutions, and web developers convert their graphics to work well with "Retina" resolutions... then you could make the case that it's game-changing. But it will be a while before that's known.


Actually, it's the end of the high-resolution PC, since there's no point in adding more detail if your eyes are physically unable to detect the difference. Also, we're already able to buy 3D displays, so that's another 'old' technology too. Smart 3D TVs - old.

I imagine someone's going to be selling 3D contact lenses soon, so all surfaces will be 3D video devices. The next step is to make them intelligent...


I kind of want a Retina display, but I made a promise to myself that my next laptop would be a Linux laptop. Perhaps I could run Linux on a MacBook Pro, but it would look like this: http://www.omgubuntu.co.uk/2012/06/what-does-ubuntu-look-lik...

How long until other desktop environments catch up?


It's not an MP3 player. It's an iPod.

It's not a PDA. It's an iPhone.

It's not a tablet laptop. It's an iPad.

It's not a high-resolution display unit. It's a retina MacBook.


This is very smart marketing for Apple. They have the assets (reputation, substantially different UX) that let them position their releases as new products rather than improved versions of existing products.

That said, most companies would fail completely if they tried this. The average startup wants their customers to quickly and easily recognize their product. YMMV.


Saw a Retina display yesterday; I can't see what the fuss is about.


Is it just me, or is the font in this article ironically horrible?


It's not just you. I could barely read it.


In Chrome I cannot read it, but the text appears fine in Firefox.


It is indeed an issue in Chrome, as reported.


Retina displays are suddenly the buzzword. Is it a new concept? No. Do we really need it? Debatable. Do we want it? Undeniable. Credit where it's due, once again, to the Apple marketing machine...


It's not so much marketing as that _they are actually making these things_. There has been a small but rather fanatical demand for high-DPI screens for a good while, particularly since the demise of CRTs. Note how excited people got about the resolution independence features introduced in Mac OS X 10.4 (and subsequently scrapped; 10.7 does it very differently).



