Pixel Perfect (daringfireball.net)
143 points by olivercameron on Aug 14, 2012 | 135 comments



I find it quite surprising that the author is so passionate about typefaces, but serves his own blog in Verdana. Good thing it is not called JaringJireJall.net [1]

That being said, I really think the days of sans fonts are close to over. 10 years from now, we will all be reading on 300 dpi screens, and taste in typography will return to being the dominant feature of design.

I can't wait until I can read the Economist online in the correct typeface. I can walk down an airplane aisle and recognize that font in less than a second. The New Yorker also uses a font that seems to be unique. The Atlantic needs a custom typeface.

I am so excited that high-DPI screens are going to be the norm. The web design world as we know it today is about shitty decorative flourishes and artwork. 10 years from now it will be about designing beautiful glyphs that represent the brand.

[1] J is particularly gross in Verdana.


You sure told him.

May sound funny coming from a typography-obsessive whose website has used Verdana as the text face for its entire nine-year existence, but I just don’t care much for Verdana for anything other than use on relatively low-resolution displays at small sizes. (E.g. Verdana’s uppercase I and J. Ugh.) Interesting expansion of the two families, though.

http://daringfireball.net/linked/2011/11/16/georgia-verdana

Ikea Switches From Futura to Verdana

Horrible decision. Baffling, really. I have never seen Verdana look good in any way other than in small sizes on-screen. (Via Adam Lisagor.)

http://daringfireball.net/linked/2009/08/26/ikea-futura


I also find daringfireball's typeface to be too small to read comfortably on modern displays. (I understand picking Verdana back in 2002, but it's not so nice now.)


Un-anti-aliased (aliased?) Verdana at a certain size (11 or 12 point? Can't recall) was the very best pixel-optimised font for the web when Gruber designed his blog.

He hasn't redesigned since.


His blog is nearly unreadable on my non-retina display. I can only imagine how small the font looks on a Macbook Pro.


You can change it via a link at the bottom to http://daringfireball.net/preferences/


⌘+


I use Readability to convert the content into something readable on my devices (computer, Kindle, mobile phone...).


"sans fonts are close to over"

I'm curious as to why you believe this. Some of the most versatile fonts are sans-serif, e.g. Helvetica, Gotham, DIN, Gill Sans, Futura, and Franklin Gothic.


You are right, I don't believe that. I was exaggerating, and mostly thinking about how impossible it is to make a serif typeface that works on low dpi displays. TNR is the only serif face that is well hinted in my opinion.

There are much better options with sans typefaces, which often look nice on low dpi displays. My point was mainly that we are currently choosing among sans and TNR, rather than among all typefaces.


That's true, but I've always thought of it this way: serif generally feels warm and historic, while sans-serif can often feel modern and emotionally detached.

Quite a number of European and American fashion sites use sans-serif because they want to appear bold but emotionally subdued, letting the photos of the products and models do more of the talking.


Serif fonts have been under-represented for years, because our displays just aren't good enough to do them justice. When the pendulum swings, it always swings too far - look for serif fonts to become the next big thing.

Personally I think this prediction is a little premature because retina displays aren't widespread yet. I've got my eye on Open Sans for the near future.


You mean like the site you are looking at right now?

    body {font-family:Verdana; font-size:10pt; color:#828282;}
    td   {font-family:Verdana; font-size:10pt; color:#828282;}


Neither this site nor its operator claim to be passionate about typefaces.


I'm interested in this. As someone who doesn't pay that much attention to typography (beyond the obvious 'Papyrus is bad' stuff) what makes individual typefaces preferable over a standard typeface for all similarly structured content?

(Besides branding, that is.)


Don't worry about it too much. Most people don't care. I am the kind of guy that walks into the Google offices and immediately thinks "What the fuck is this place, a daycare center?" Most people disagree, and think the Google decor is wonderful. My opinion is likely not mainstream.


Edward Tufte's forum has some good discussions on type

This one is short

http://www.edwardtufte.com/bboard/q-and-a-fetch-msg?msg_id=0...

but see the collection of additional threads at the bottom of that thread.*

Also, Bringhurst: http://www.amazon.com/Elements-Typographic-Style-Robert-Brin...

------------------

*I like the "local masters" concept Tufte has (full disclosure, I helped prototype it), and Wikipedia does something similar.


Every font is designed for a purpose. Serif fonts are designed to help the brain recognize the different letters. Comic Sans was designed for informal use. Helvetica was designed for signage. Verdana was designed to be readable on (72 dpi) computer screens. And so on.

So as screen resolutions increase, a serif font can help the brain recognize small letters, increasing readability.


Serif fonts are designed to emulate Roman inscriptions, which had serifs to help water drain out of them without wearing them out (and thus reducing maintenance costs). The brain stuff is at best speculative and not backed by any data (and in any event is a post hoc rationalization).

In general, readability tests (per the original article) have produced equivocal results which seem to show mainly that people prefer the fonts they grew up with.


It's pure marketing. There's no objective, measurable difference.


Yes there is. Certain typefaces are objectively easier to read than others, and some work better for certain content than others. Some are best for body text, some are best for headlines, etc. Some are optimized for low resolution screens, some for print (and as such work well on high resolution screens).


A great recent example that appeared on HN was the typeface designed for road signs, which the designers then put through Germany(?)'s test suite for legibility at distance in various road conditions and determined the new font was objectively legible at greater distance than all previous standard road-sign fonts with which it was compared.

[By request, I found the original article: http://ilovetypography.com/2012/04/19/the-design-of-a-signag... ]


Do you happen to have the link to that? Must have missed that and I'd like to read it.



I remember reading a post several days ago mentioning that a particular typeface could help a paper score higher.


This is from long ago: in school, I once passed an exam on grace marks given for my graceful handwriting (this was in the 'Devanagari' script, which naturally allows for more calligraphy than the Roman script).


> I can't wait until I can read the Economist online in the correct typeface.

It's possible, but with a lot of work. It's called EcoType. If you download a PDF of one of their "Special Reports" then you can use a font-extraction tool to generate a font file (possibly missing a few rare characters), then create a custom user style sheet for your browser that sets Economist.com body text to that typeface.
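
A user stylesheet along these lines would do it (a sketch only; the font name, file path, and selector are hypothetical, since I don't know Economist.com's actual markup):

    /* All names and paths here are made up for illustration. */
    @font-face {
        font-family: "EcoType";
        src: url("file:///Users/me/Fonts/EcoType.otf");
    }
    /* Apply the extracted face to article body text. */
    .article-body p {
        font-family: "EcoType", Georgia, serif !important;
    }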

Whether it's worth it is a whole other question. ;)

Thankfully, the New Yorker just came out with their iPhone app which is typographically gorgeous.


The New Yorker does indeed use two custom typefaces -- Irvin for headings, and a Caslon variation for body text. (Wikipedia says it's Adobe Caslon, but I seem to recall it having been a customized version, at least at some point in the past.)

On the website, the New Yorker seems to be using Irvin for its headlines, but Times New Roman for body text.


There is nothing wrong with Verdana if used properly. Verdana was designed specifically for digital displays at small sizes, which is how it is used on daringfireball. And HN, actually.


I hope you're right about the typographic future. This link argues that serifs are not more legible than sans serifs: http://alexpoole.info/blog/which-are-more-legible-serif-or-s...


Possibly because at 11px Verdana renders nicely. I guess at the time he came up with the design, there was not really an alternative to using the Core Fonts, and this was one of the types that rendered fine at that size. FWIW, headings are in Gill at 17px, and I don't find the balance unpleasant.


Not to be too pedantic, but The New Yorker uses Caslon, which, while gorgeous and second in my mind only to Garamond, is a rather old typeface. It is widely used because it is 1) very readable and, possibly more importantly, 2) uses a lot less ink than other similar typefaces.


So more pixels is great. Is it really that great?

I feel like there are people who look at their devices and people who look at what's displayed on their devices.

For me, text is text is text. I'm looking at it, not so I can enjoy how crisp it is (or isn't), but so I can consume those words and get them inside my head.

I find (I've played around with high DPI devices) that while they do look nice, the niceness is quickly forgotten when I actually start using them for something other than looking at how nice they look. Likewise, going back to use lower DPI devices you notice the difference, until you stop looking for it and actually get work done.

I'm not saying that I don't care entirely, I just... I wouldn't go and buy a new device solely because its resolution was higher. It will be nice when these devices eventually make it to me, but I'm not really bothered either way.

If we were to talk about recent revolutions in the home computer I'd rank multi-core CPUs as a much more important change. No longer can one broken app go on a rampage and block you out.

Anyway, tl;dr: it's great he's passionate about it, I just don't see it as such an important thing.


>For me, text is text is text.

You might get a majority of people to agree with you, but I've never met anyone for whom this was actually true. It's just the first-order approximation, made without having quantitatively studied the differences that can exist in text. Text is not interchangeable, and seemingly small differences severely impact readability, both in speed and accuracy.

Displaying text is the primary function of my computer, and its most important aspect. I want that tiny subset of the human population who are finely attuned to text to get it right for me. It's far more important than doubling my RAM or doubling my CPU performance; those are commodities, trivial compared to the hard work of getting text right.

>No longer can one broken app go on a rampage and block you out.

Minor quibble: you don't need multi-core for that. Since preemptive multitasking, this has never happened to me.


> So more pixels is great. Is it really that great?

If it reduces eyestrain, then it is worth every penny!

I don't know if there is any evidence that it does, but it certainly seems likely that sharper text would reduce eyestrain. I certainly can read tiny text on my Retina iPhone much more easily than I could on my previous iPhone.

Otherwise, it's worth it if it gives you enough more joy, and not if it doesn't.

Some people are fine with music played through stock earbuds, and other people find music more enjoyable played through hi-fidelity equipment. Some people prefer the food at McDonalds and some people prefer 3-star Michelin restaurants. Some people think their old SD CRT TV is fine, and other people prefer 1080p HD. The people who like the former things may never understand why other people will pay more for the latter things. YMMV.


> Otherwise, it's worth it if it gives you enough more joy, and not if it doesn't.

That's the point. For many people it just doesn't matter, they can't tell the difference without focusing on it.

I'm typing this on an HP dv6 laptop that I really like. It came with the hardware I wanted at a price I was willing and able to pay. Does it have a retina display? Hardly. But my eyes are shot: it's night-time now, my eyes are tired, and I've taken my glasses off to give them a rest, so the text on the page is ever-so-slightly blurry at this distance. A retina display wouldn't do me much good; I can't see the dots as it stands anyway.


My eyes are shot too, but I find that the sharper text on my Retina iPhone is a godsend. I find that the sharper the text is, the easier it is for my aged-vision eyes to focus on.

Also, even if many people don't care about the difference, I think that this is a great advancement. Imagine if all the books and magazines were suddenly reprinted using pixelated fonts. Do you think that people would be happy with this? Do you think they would say, "Doesn't really make a difference to me"?

No, of course not. People would be rightfully outraged! A decade from now, everyone will wonder how we ever suffered with pixelated screens for so long.

The issue is not that many people don't care about the difference. The issue is that many people do care about it! The near-future move to sharp text on computer screens will be a wonderful thing for them, and it won't be a bad thing for anyone else. (Modulo pricing differences, which I suspect will rapidly approach zero.)

It's a completely win/win outcome.


I have an iPad 3 and an iPad 2. The "retina" display is certainly a nice upgrade, but it's hardly essential. This is obviously where the industry has been heading for a while, and the only interesting thing here is that Apple's supply chain dynamics make it possible for them to get 6-12 months ahead of the competition.


Some text looks like crap on the old screen (iPad 1), with visible aliasing and color fringes - making zooming absolutely required for reading. Not so with the retina display.


Anyone who has used one of these knows they're awesome, and that words do not do it justice.

I think at this point, Grubber just needed to commit that entry to the historical record as a member of the top tier of Apple reviewers/fanboys/critics.


gruber


Some people make a point of appreciating the small joys of life. Others do not. Much effort and attention is lost on the latter.


I think it's disingenuous to imply that because someone doesn't notice or care about the increased PPI of a laptop, they don't appreciate the small joys of life.


OK, replace 'the small joys' with 'some small joys'; the point is still the same.

It's similar to what John Carmack said when he was testing out the Rift VR Headset with Rage.

He said he was fascinated just looking at the bricks in the walls and being able to 'look around' objects in the world, which gave him a new appreciation for all of the art assets that, for 99% of people, are just a blur as they run past shooting monsters.


Some people focus on the substance rather than the style. Others do not. Much effort and attention is lost on the latter.


I find myself a million times more "productive" in Sublime Text 2 with a light theme (Soda Light) than I was previously with a dark theme (the default one). I do look for substance, but we are all machines and can't fight our limitations. My eye thanks me when it can see an "orange" function instead of having to hunt for a "black" (i.e. misspelled) fuction in code. It's not my fault - eyes appreciate semantically color-coded (or typeface-coded) blocks of code/articles/etc. (That's poorly phrased; as a non-native English speaker it's hard to articulate, but I'm sure you understand what I'm trying to say.)


Catching resolutions up to print DPI is not "style", it's purely a quantitative exercise.

The screen is the only component of computer hardware that we use at full capacity all of the time. It is much more important to get displays up to print resolution than a small clock speed increase.


> For me, text is text is text.

But legibility changes any and all text, and a better display means better legibility (or at least the potential for it). As with mattresses and office chairs, I believe in investing in screens, given how many raw hours of my life and attention are given to them.


How many times have I faced ridicule for spending more on computers than cars? Presumably others here are in the same boat.

They depreciate at similar rates and I spend much more time with computers.


Don't forget about data visualization. The dense display is great for time series work (even a small, 1000-point time series can overwhelm a standard-density laptop display), or for image analysis.

Of course, the great-looking text is certainly a plus as well.


I'm probably just blind, but I have trouble seeing the difference in display quality based on resolution. And I don't think I'm alone.

When I was at the Apple Store after they launched the new iPad, there were several people who confused the iPad 2 for the new iPad. And even after being shown they were looking at the wrong one, they didn't go, "Oh... this one looks much better!"; rather, I heard from multiple people a line like, "Oh, what's the difference?"

I've looked at the HTC Titan side by side with an HTC Rezound (one of the lowest PPI versus probably the highest PPI phones) and it's really hard to tell the difference unless you're really looking hard. I think in part because even the worst PPI on phones is actually quite good.

I haven't seen the Retina display on the MacBook Pro, but I do hope that it lives up to the hype, as I was sorely disappointed by how incremental the iPad retina display was (versus the iPad 2).

(Note, the iPhone jump to Retina Display was a visibly dramatic improvement.)


I was just looking at the Retina display vs. regular display at the Apple store. I've also compared an iPad v3 next to a v1. An iPhone 4 next to a 3G. My eyesight is better than 20/20 (yay laser surgery!). I honestly can't see an appreciable difference. Pictures do look better, sure. My current screen has a 140DPI. Text and windows for reading and development are fine, and I often use tiny fonts. I don't see the hype.


>Text and windows for reading and development are fine, and I often use tiny fonts. I don't see the hype.

That's because you're looking at elements hinted for and designed for a blocky pixel grid. Try rendering a typeface designed for print, and the difference is night and day.

There is no hype. Until displays can go toe-to-toe with print, we have a long way to go. Hold a magazine or a book up next to a standard-resolution display (with the same body font!) to illustrate how inept standard-resolution displays are.


Your eyesight must be terrible. The retina iPad is a vast improvement over the iPad 2 for me.


Note that the whole point of the "retina" branding is that beyond a certain viewing distance you will no longer see any improvement. If those people were standing sufficiently far back (as you might when looking at a store display iPad) then the retina effect will kick in.
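
Rough numbers, assuming the usual one-arcminute-per-pixel criterion for 20/20 vision (tan of one arcminute is about 1/3438):

    minimum "retina" distance ≈ 3438 / ppi  (inches)
    iPhone 4 (326 ppi): ~10.5 inches
    iPad 3   (264 ppi): ~13 inches

Any farther back than that and the extra pixels are invisible.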


The difference between the retina and non-retina products is shocking once you're accustomed to the retina version.

If I pick up my wife's iPad 1 or my old iPod touch, it's like looking at a jumbotron. It's the first thing I see. "Ohh, a big clunky grid of pixels... I forgot!"

For now I haven't switched my computer screen, but I worry after a few weeks on a retina monitor, I'll have the same reaction to other displays.


You can't see the difference in text, particularly small text? NYTimes.com, for example, is much harder for me to read on my wife's iPad 1 vs. my new iPad or Nexus 7. Comparing photos, I don't notice the resolution so much as the increased color gamut.


You are of course not truly alone, but you must be unusual.

As an example, consider the upvote/downvote arrows on this site. They have very fuzzy boundaries on my laptop (128 ppi). It almost looks smudged. They are perfectly clear on my new iPad (264 ppi).

This is not surprising, it's 2x the resolution in each direction.


It surprises me. The upvote and downvote arrows are gif images, so all you're seeing is a blown-up gif. I'm in the process of replacing similar triangular gifs with near-identical-looking Unicode characters precisely so that they will look better on mobile devices that make more use of zoom. And in my experience (on iPhone, iPad and Android phones and tablets, though admittedly not the new retina MBP), the current gifs look pretty bad when blown up.

I think you've fallen for a placebo and somehow prefer the blurry resized images to the "pixel-perfect" originals.


You can include "hi-dpi" versions of images, which a retina-display device will search for before falling back to the original image; e.g. upvote.gif will cause Safari to check for upvote@2x.gif (or something like that - I can't remember the specifics).

Assuming HN follows this convention (and given the number of iOS/OS X users on here, I would be surprised if it didn't), he will be receiving the high-resolution image, and hence it will look "pixel perfect".
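
In CSS, the usual way to get this effect is a device-pixel-ratio media query - a minimal sketch (class and file names hypothetical):

    /* Base image for ordinary screens; display size stays 10x10 CSS px. */
    .votearrow {
        width: 10px;
        height: 10px;
        background: url(upvote.gif) no-repeat;
        background-size: 10px 10px;
    }
    /* High-DPI screens get an image with twice the pixels, same size. */
    @media (-webkit-min-device-pixel-ratio: 2) {
        .votearrow {
            background-image: url(upvote@2x.gif);
        }
    }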


Once again, I find myself surprised by someone's lack of surprise. Do you really think Hacker News uses hi-dpi images? My mind would be blown if it turned out it did. The HTML here is very old-school.

Note also that the zooming done on mobile browsers (and, as I understand it, most modes on the retina MBP) doesn't simply multiply by 2 like native apps on the iPad/iPhone, so even if you provided a double-resolution image it would be unlikely to be displayed "pixel perfect"; the increased resolution (in image and screen) would just help hide any resizing artefacts. That's why using Unicode characters is a better idea when possible, as they are vectors and scale nicely, particularly on high-DPI devices. For example (a sketch; the class name is hypothetical - see below).
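
    /* A text triangle is vector, so it stays sharp at any zoom
       level or pixel density - no @2x image needed. */
    .votearrow::before {
        content: "\25B2";  /* BLACK UP-POINTING TRIANGLE */
        color: #999;
        font-size: 10px;
    }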



If I could downvote myself, I would, because what I said is wrong. I was thinking about another site.


I'm with Gruber. The display is sexy. And good typography is beautiful on it. But my experience with Safari at the Apple Store was that the FPS when scrolling through pages is noticeably low. (Native apps I tested, like REAPER, were also choppy enough to be frown-inducing.) I know it seems like a strange thing to complain about (who cares what FPS a desktop app runs at?), but it's something you have to see and feel to judge the impact on your overall experience, IMO.

Lion inverted trackpad scrolling so that when you drag your two fingers down, the content follows them, like there is a physical connection between your hand and the page. On the Retina MBP, that illusion was mostly lost for me. I'm actually curious to know if owners of the Retina display get used to this chugginess, or if it's still kind of a big deal, even after awhile.

I know only a little bit about hardware composition in WebKit, and webpages are definitely not assembled "entirely" on the GPU. I suspect that there are gains to be made there, just in terms of how much work is being offloaded. Having desktop Safari do what my old iPhone 3GS did (separate the thread handling scrolling from the thread doing the painting, so that you could scroll ahead of where the renderer had filled in content and see a checkerboard pattern) might be a better-feeling alternative. Then again, this could already be happening; I haven't kept up with WebKit changes. Credit to Apple for pushing the boundaries. But I want my buttery-smooth page renders, damnit.

/end entitled whining


As of Safari 6 on Mountain Lion, scrolling is threaded and nonblocking on Mac, but it doesn't yet work for all pages. So yes, you can scroll ahead of what is painted. The most common exception is pages with fixed positioning.


Was it running Lion or ML? I recall some reviews mentioning that ML had specific optimisations to improve scrolling performance.


It was Lion. I'll revisit my impressions. It'd be nice to collect some hard data too...


Unfortunately, a laptop is still a laptop: small keyboard, inferior mouse (or lack thereof), single display, performance constrained by battery life. The performance margins are getting slimmer, but for a little over the price of the high-end MacBook Pro (about $3000 with tax), I could build you a 64-core AMD G34 system with 64 gigabytes of RAM that would utterly destroy the MacBook performance-wise. Obviously, to each their own. I don't think the author is protein-folding or raytracing, so this computer probably suits his needs better. Alternatively, I could build a six-screen Beowulf cluster using 6 Nexus 7 tablets for $1200 (Tegra 3 performance is getting close to Core Duo performance).


You're probably right about a couple of things here. First of all, Apple's products are notoriously expensive. I once got myself a list of the hardware in the Mac Pro and looked up the individual prices of the components. It added up to roughly half of what Apple charges you. For that sort of money you have a huge margin to make things quicker.

Also, laptops are not performance computers. The idea of a "high-end gamer laptop" makes me shiver, and every time somebody asks me which laptop to buy for high-end gaming I tell them "a desktop". The thing is, though, that a proper laptop is good enough for a lot of work-related tasks and definitely good enough for consumer use.

People like their laptops because they can carry them around and use them wherever they want, and they know that the price for this mobility is worse performance. Most people don't care, though. A laptop is quick enough for most things they do, so there's no reason to care.

Personally, I like my laptop: it's got a near-fullsize keyboard and a whopping 18.4" display, and I don't care about a mouse at all, so the touchpad doesn't bother me. It's quick enough for everything I do, large enough to be comfortable to work on, but still at a size that makes it somewhat portable.

Now obviously, if you've got performance computing to do, stay away from laptops. For everything else, they're just fine.


> for a little over the price of the high end MacBook Pro (about $3000 with tax), I could build you a 64-core AMD G34 with 64 gigabytes of RAM

Can you detail how you would achieve that? $3000 seems insanely low to me.


http://i.imgur.com/Gy9GU.png

Just priced it out on Newegg.com (see screenshot above). That does not include shipping, power supply, etc., but it gives you a general idea of the price range. Overall, you could probably do it for around $3.5k-$4k, or easily $3k if you dropped 2 processors, which would still be a beast of a machine...


[deleted]


Well, HN isn't "the internet"; it has its own traditions. I didn't downvote you, but if I vehemently disagree with you, I'll certainly do so, regardless of how thoughtful and considerate your comment is. Likewise, if I see something in the grey that I agree with, I'll go out of my way to upvote it back to at least neutral.

In general, the vast majority of comments on HN fall within the bounds of good taste, so consider one's aggregate karma a reflection of how the community here assesses the validity of your contribution.


It's a shame Google Chrome still doesn't render CSS3 fonts correctly on Windows. The strangest thing about it is that they barely even seem to acknowledge that it's an issue.

http://stackoverflow.com/questions/8950164/horrible-renderin...

If you've noticed that web fonts look terrible on Windows / Chrome, please star these issues!

http://code.google.com/p/chromium/issues/detail?id=137692 http://code.google.com/p/chromium/issues/detail?id=91899 http://code.google.com/p/chromium/issues/detail?id=102371


The keyword you're looking for is 'DirectWrite'. Here's a post from January this year indicating they're working on it: https://groups.google.com/a/chromium.org/d/msg/chromium-dev/...


As elktea said, this will be fixed with DirectWrite support. This is the issue you need to star: http://code.google.com/p/chromium/issues/detail?id=25541


> Think about that: a laptop with no performance tradeoff compared to a high-end desktop.

Compared to a high-end desktop which has barely seen updates since 2010, and includes a GPU released in 2009, sure.


The point being made was that it is a desktop-class machine. Generally, laptops have been about convenience and portability, while desktops have had the power and the price point.

Obviously there are exceptions to any rule, but in this case, the display is actually better than any desktop I've seen, and the SSD and CPU/GPU give it all the power necessary for 95% of users. The price point is definitely high, but otherwise it seems best of both worlds.


You've always been able to get a laptop that was as good as the desktops of 2-3 years ago, and never one that's as good as the current ones. Saying "desktop-class" is nonsense; this is no different from previous efforts.

Now, if you're saying we've reached the point where the extra CPU etc. of a desktop are not worth it for most users, then I agree with you. But again that's nothing specific to the macbook air; any modern laptop will have the same properties.


> Think about that: a laptop with no performance tradeoff compared to a high-end desktop.

It helps if that high end desktop hasn't been updated in two years.

For CPU performance, laptops lagging a couple of years behind desktops is more or less business as usual.


Still, we're talking about a Xeon here, not your typical desktop-variant Core Something. It was a powerhouse.


The benchmarks were lightly threaded, for which Xeons basically offer no advantage - they let you have more cores in a machine, and that's about all. For rendering and encoding, the 12 core 2010 Mac Pros would still slaughter a retina MBP. No competition.

There are many impressive things about the retina MBP, but CPU performance is not one of them.


I would say the CPU performance is very impressive. Sure, a 12-core will beat it at rendering, but the single-threaded performance is amazing, and the super-fast SSD makes it basically the fastest Mac ever in general usage.


Even worse, it's 4 years for GPUs.


I kind of had the same thought too, as I walked through the Apple Store today and took a gander at some of the Retina MBPs. I thought: in as soon as 5 years, people are going to look at the pre-Retina MBPs and wonder, how did we ever live with 160 DPI displays? All of these jaggies, this unsharp text, these blurry textures in iCal -- all this ugliness that you have to squint to see. Who would want that?

And for a moment I regretted I'd gotten an Air instead of a rMBP. But then I realized... what I have is good enough. It is. It's great. And there will be better products to come. And I'm not going to lose any sleep over it.

But it is interesting to think about a future world where pixels are antiquities, merely "remember when's".


One thing I've noticed as pixel density increases is that many 3rd-party UI elements aren't growing in size relative to monitor improvements. I remember using Photoshop on my old LC 475 in the '90s and not having to squint and hold the mouse steady to select the tool I wanted.

Please tell me someone else has noticed this? Or am I just getting old.


Monitor pixel densities haven't increased that much; they've been limited by the need to keep interface elements visible. It's only easy for Apple because they just doubled the density in both directions. Any intermediate number of pixels would have impacted usability.


So, here's a question for those who actually have one of these mega-res displays... how's the terminal? Or any other typically aliased text? Secondly, how well does it work in a dual/2+ boot situation? Obviously other systems have the drivers for such large displays, but without knowing how to scale it... Is there any way to set the hardware to simply enlarge everything x2 in the absence of serious display drivers (for example, a desktopless linux or bsd)?


re: A terminal: probably fine, albeit small. Running at 2880x1800 without any scaling isn't terrible, but it would quickly drive your eyes to escape your skull. I can't say for sure what would happen under Linux given only a terminal, but I imagine things would be pretty small until you got it using a different resolution than the native resolution.

Terminal emulators are fine, but I assumed you meant the other kind. Terminal emulators look great, as does every text editor I've used (MacVim, vim in a terminal emulator, Sublime Text 2, TextMate 2, Xcode [looks great, but it's unfortunately still a lousy IDE], etc.).

re: Dual boot, also fine in my case, though I only keep Mac OS and Windows 7 installed right now. Windows 7 in particular is an issue, as DPI scaling only affects some programs (so where on Mac OS you'd get pixel-doubled views or text, on Windows you get windows that aren't scaled at all, such as Chrome). If you have dual monitors, Windows is particularly painful, as DPI scaling is system-wide rather than per-screen. I haven't tested Windows 8 and likely won't until after it's been released.

The best route I've found is to simply set the display to 1920x1200 or lower and scale it up on Windows. Windows 7 just doesn't handle high resolution stuff particularly well -- it's not bad, but things just aren't all there. I expect Windows 8 will be better.

I couldn't say what your options are on a desktopless Linux/BSD system, but I would imagine that there's something you could do, even if that something is just using a larger font.


I totally agree about the Lucida Grande thing. It's weird seeing fonts like Lucida Grande and Verdana on such a nice display.


The Retina MBP has opened up a whole world of fonts. It's amazing how much attention I pay to them when I visit websites now.

There is a downside to such a nice screen -- it makes me painfully OCD. On my previous laptop, the screen was such crap that it didn't bother me. On this one, I notice every subtle flaw.

For instance, image retention (after swapping three of these things out, I'm now convinced it eventually affects all of the LG panels used in the MBPR [Samsung panels do not seem to have the issue]).

Then there's the fact that it's not quite bright enough for my liking. I find myself hitting the brighten button a lot when it is already at max. No matter that my old laptop looks downright dim next to this.

And despite the fact that it is an IPS display, the colors do change quite a bit with viewing angle. A solid white screen kind of shimmers in luminance and tint when moving my head.

And the black level isn't pure black.

And 220ppi isn't quite enough to complete the no-pixel illusion for me.

And the color gamut could be a smidge greater.

It's like the uncanny valley of screens. It's so close to perfect, but not quite there, that it leaves me paying a weird amount of attention to it.

But I wouldn't trade it for any other.


I've recently been studying typography, and it's been really heartbreaking to realize that what differentiates a good and a bad typeface onscreen is how well it renders. Plenty of great typefaces are just unusable. Most fonts distort wildly as you step up/down in pixel size. I didn't start in print design, but man is web design primitive in comparison!


I have to admit that rage was starting to burn inside of me until I read your last sentence - that's one of the reasons why I take so few jobs for the web these days. I always thought that web technologies would far surpass what we could do in print (which, as a lead-type junkie, sounded pretty amazing), and then I grew up and started dealing with the W3C &c. Now here we are, only in the last two years or so getting a reliable (and compliant) way of using more than just Times, Georgia, Helvetica, Arial, &c. It's slow, slow progress, but it's great to see that while those issues hash themselves out, the pixel problem is practically resolved. (Though I should note that, pixels or not, setting a spread to be displayed digitally was never that much of a problem if it didn't have to be displayed in the browser. Sure, some fonts have shitty hinting, but the ball is at least in your court.)


> Dots were how computers rendered everything: pixels on screen, dots of ink/toner on paper

The writer seems to ignore the world of X-Y plotters and vector graphics workstations, which have been around since the very early days of computing.


Color me surprised to read another frothing fluff piece for an Apple product from arch-apparatchik Gruber.

Doubling the display density of a laptop screen qualifies Apple for the Nobel Peace prize now, or something.


So? He likes the display. So do I. It's awesome. When you read a Gruber article you know what you're getting into before you even read it so I don't see why anyone would complain.

The difference between Gruber writing about the things he enjoys and someone else who, say, loves Free Software, or even another Apple lover, is that Gruber can write and is an excellent storyteller (the two don't always go together, by the way).


I'm not sure what your critique is. He's not a politician or TV host getting paid by the bad Apple to write/say nice lies about them... He writes about Apple, and people visit his website to read what he writes. It's that simple.

If you disagree with any of his points, please post them in this topic. But don't go dismissing it just because it's from arch-apparatchik Gruber.


As he himself has written on his blog, in exchange for this kind of PR he gets invited to private meetings inside Apple.


Does he? What are you referring to? The Mountain Lion presentation? Apple invited loads of journalists to that, they just also included Gruber there.


http://daringfireball.net/2012/02/mountain_lion

And instead of a room full of writers, journalists, and analysts, it was just me, Schiller, and two others from Apple — Brian Croll from product marketing and Bill Evans from PR.


..and from the very same article:

"It’s Phil Schiller, spending an entire week on the East Coast, repeating this presentation over and over to a series of audiences of one."

He wasn't the only one getting the special treatment, they were communicating to tech journalists.


Sure. And? What’s your point? Apple met separately with loads of journalists. I don’t see anything special about how they treated Gruber.


How many of these journalists are known to be critics of Apple, or write about them in a more direct, non-ass-sucking way?


They invited correspondents from The New York Times. Don't tell me the iEconomy articles weren't the biggest blow to Apple this year (after the ruling that they had to say Samsung wasn't cool enough to copy them, of course!).


Yeah you're right. An exclusive, face-to-face meeting with the senior vice president of marketing is nothing special. I'm sure things like that don't influence his writing about Apple at all.


That meeting happened in 2012. Gruber's been writing Daring Fireball since 2002. http://daringfireball.net/archive/


I've been reading him on and off the whole time. Once upon a time he was actually worth reading but I don't even bother now because I can practically write his tech-Pravda pieces in my head.


I agree.

Also, I wonder how many more rMBPs are sold as a result of people reading this? (Assuming the would-be buyer is sitting on the fence between a regular non-retina and a retina MacBook.)

The thing reads like a praise-review hidden beneath a veneer of typography fetish.


Why the hate, dude? Can you point out anything that is actually wrong with the piece?


I could quote specific claims from the piece but it's just too annoying to enumerate all the breathless reasons he gives why this is yet another Apple product that makes everything that came before it trivial and irrelevant.

I'm fed up with this kind of PR masquerading as tech journalism. He's not a journalist. He's part of Apple's third-party marketing arm and he gets compensated for writing this kind of trash with exclusive meetings etc. at Apple.


So would you be willing to go back to 9-pin dot matrix for print? That's kind of Gruber's point. Printers scaled from 144 dpi to 1200 dpi quickly, while displays stagnated at 72-100 for a very long time. We're finally seeing movement on that front, and eventually it will be really hard to go back to the blocky, pixelated 100-130 dpi screens. I know I can't.


So you can’t. Noted.


+1

How many of these blogger/journalists have broken real stories? Future of Mac Pro anybody? Sure, they like to name drop and whisper about 'informed sources' but the truth is they have nothing of substance to report. Everything is gossip, speculation, or parroting of press releases.


I don't think Gruber would ever claim that he's a journalist. Nor that he's in the business to break stories.

He brings a certain wit, and skill at writing, to observing all things Apple. His articles are traditionally read by those who love the platform; it's unclear to me why people who aren't enthusiasts for the Apple platform would read his blog, as that is the audience his articles are targeted at.

Back when he started 10 years ago, he was writing about Apple-the-underdog, and there was a certain quixotic element to his articles. Nowadays, with Apple being the 2-ton beast that it is, that sense of rooting for the challenger has gone, and with it a bit of the ideological purity that comes with being a member of the downtrodden class.

I, for one, appreciated the longer essay that he wrote on the MacBook Pro. I think his essays are his greatest strength; I wish he did more of them.


I would agree. He's essentially writing opinion pieces and preaching to the choir. Nothing wrong with that and his audience enjoy his work regardless of objectivity.

I think any half-decent tech blogger or journalist should be fighting for court room seats in San Jose right now. That's where the story is.


> Nor that he's in the business to break stories.

Indeed. I think this piece is what it seems to be: his review of the rMBP. He waited more than a month and a half before delivering his take on the new laptop. Hardly seeking to break stories.


200+ DPI displays actually do make everything that came before trivial and irrelevant. So take that point and put it in your pocket and let's all move on.


You don't even read what he writes, right?


I started reading him in the mid-2000s, but his utter lack of objectivity lost him a place in my RSS aggregator long ago.

If I want to read perceptive analysis of Apple products I prefer to read real, objective journalists like John Siracusa.


You are welcome to your opinion, but on your suggestion I just went and looked at John Siracusa's blog, and for this entire year we have: 1) a discussion of the various ways you can read his Mountain Lion review; 2) how to cook pasta; 3) summer movies; 4) a summary of 2011. That's it.

He also writes on Ars. In 2012, he has posted a single article: his review of Mountain Lion.

Might I ask, and I am being completely sincere, what DO you read regularly for "perceptive analysis of apple products" that goes beyond once a year OS reviews?


John Siracusa has a weekly podcast on the 5by5 network, Hypercritical [1]. That's where most of his "perceptive analysis of Apple products" comes from.

[1] http://5by5.tv/hypercritical


To be honest, since I work all day writing iOS apps on a Macbook and have numerous iSquare devices I'm mostly content to form my own opinions.


Let me just be straight up.

I'm normally an Apple skeptic and I work primarily on Linux.

That said, I don't think anybody else would've bothered to make retina display available to the general public until Apple forced their hand and made it available themselves...


> That said, I don't think anybody else would've bothered to make retina display available to the general public until Apple forced their hand and made it available themselves...

The technology was coming; it was just a matter of time. Apple are in the unique position of controlling both the hardware and the software, which makes it far easier for them to release this in a nice way, where the OS and much of the software work well as soon as the hardware is available.


Sure, it's always coming. Many of us were waiting for someone to make an "iPhone" for a decade. I used to explain to people at work that I shouldn't have to carry a Palm and a phone, for example. How long would we have had to wait if Apple didn't build it? Another 5 years? I'd like everything a half a decade sooner.

Apple gets ripped for not being innovative. What they really do is identify when good technology is ready for mass consumption, then ship it. Remember when they went with USB? I was a PC user back then, building my own computers. I waited years before USB became widely adopted on affordable motherboards. Within 12-24 months I'd say Apple's entire product line will be Retina. Hopefully there'll be enough buzz that it catches on with the Dells and HPs.


"I used to explain to people at work that I shouldn't have to carry a Palm and a phone, for example."

You didn't - instead, you could have carried a Blackberry, for one.


Lucky Apple, always getting there first with the inevitable.


It's been a matter of time for 10 years now.


My 2009 MacBook Pro just died and I'm debating which MBP to get. I'm leaning toward the non-retina display because of the reports of FPS sluggishness. I'm very aware of frame rates.

Also, I keep hearing that programs like Photoshop and so on aren't optimized for retina yet. As much as I'd like to be bleeding edge, and even help develop for it... I can't help but feel it's at least another year before going retina is viable.

I don't want FPS lag. I'm probably going to pick up the $1,700 model and add a 3rd-party SSD + 16 GB of RAM.


I bought a new MacBook Pro during WWDC this year. I had the option to get the RMBP, but I didn't, and I don't regret my decision. The performance vs. cost ratio of this laptop is significantly higher (plus I get a disc drive), and honestly with the high-res screen, I have to be a lot closer than my normal viewing distance to notice a difference between this and the RMBP.

Only thing I think I'll regret in the future is not getting an SSD. Oh well. I like my storage space more than my speed.


Since you got a non-retina MBP, you can upgrade your storage to an SSD yourself, and at a discount.


I found this to be true with my phone. I currently have an iPhone 4 and I want to switch soon. However, looking at the other phone screens hurts my eyes. The image seems blurry.


Gruber's full of shit. I had a 1600x1200 17" monitor in the 90s. LCDs were a step back in pixel density, and retina means they're JUST NOW catching up to where I was 15 years ago.


That's incorrect. MBPs were available with better density than your figure starting 5 years ago (1920x1200 @ 17"). I have one from 2009 and it continues to be a joy.


You still to this day can't buy a 17" 1080p monitor. LCDs are still behind my 15-year-old CRT.


my old T61p's 1920x1200 15.4" panel disagrees.



