A 12pt Font Should Be The Same Size Everywhere (github.com/kickingvegas)
101 points by kickingvegas on July 12, 2012 | 64 comments



"My appeal is to software and hardware developers to ensure that a 12 point font will be rendered the same size everywhere, regardless of screen size and density."

This is not a very well-reasoned appeal. If it were implemented, every Keynote presentation would be unreadably small when projected onto the big screen, literally the same size as the presenter holding up a piece of paper for the audience to read with their binoculars. Likewise, no more than a few characters at a time would be visible on a phone.

There is simply no serious counterargument; nobody can say making things unviewable would be a step forward.

The best you can do is complain that the name of the units is misleading. Fine. So what about introducing a new unit that is truly faithful to a real-world measuring stick? You can argue for that all you want, but guess what? Nobody in their right mind would actually use it, outside of exotic scenarios, because generally you want to avoid the above effects. You want your user interface to be visible to users on screens of different sizes meant to be viewed from different distances.

Thus, even if we were to introduce a new "TruePoint" or "TrueInch" unit, it would be largely... and I beg you to excuse the pun here... pointless.


There are several slight distinctions to be made there. A keynote/powerpoint presentation is typically displayed as though on a piece of paper (A4, say), and thus of course type should be rendered proportionate to the virtual piece of paper, no matter the display. The post is referring mostly to HTML sizes.

The second distinction is that a projector or video monitor should duplicate the master screen, and indeed for a projector you have no idea how big the final screen is, and can change its size by moving the projector around. Video projections, similar to paper, are a proportionate medium.

Not making these distinctions is definitely an oversight of the linked article. His major point is well illustrated in his example picture, where "5 inches" has been displayed with varying degrees of accuracy, but it's clear all the monitors are trying to display something _close_ to 5 inches. Not getting it exact (±½ pixel) is a failure.


I agree that a point need not always render at the same physical size (a point should be more general), but I still think there is some utility in having things defined in real-world units show up as real-world units when measured on the glass at the same z. An inch is an inch, and if it's too big for the display it should be clipped, and if it's 100 feet away it should be unreadable. It would be useful for classes of devices--instead of special-casing for resolutions, ratios, and densities, special-case for desktops/laptops, phones, tablets, home projectors, conference projectors, etc.

But what's really missing is the explicit presence of a projection surface and the ability to adjust the factors that go into the calculation when projecting. It would be nice to have a unit that lets you say "I want this item to be as tall as the subjective inch if a person holds a ruler one foot away from them." That way no matter how close or far away I'm looking at the item (assuming the display is large enough--too small and too far will get it clipped, too few pixels will get it blocky), if I hold a ruler one foot away from me the item will be "as tall" as the ruler's inch mark. (Edit: I'm having fun trying to visualize a powerful enough 64x2 display built on a mountain. Where I grew up, there is just a single letter. From afar: http://upload.wikimedia.org/wikipedia/en/d/d9/Little_mountai... And from on top of it: http://2.bp.blogspot.com/-QeeaDcCiD4o/TexykEujKiI/AAAAAAAAHR...)
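
To make that concrete, here's a rough sketch (Python, with made-up example viewing distances) of what matching the "subjective inch" works out to; the one-foot reference and the device distances are illustrative assumptions only:

    import math

    # Sketch of the "subjective inch" idea: render an object so it subtends
    # the same visual angle as a 1-inch ruler mark held 1 foot from the eye.
    # The device viewing distances below are hypothetical examples.
    REFERENCE_SIZE_IN = 1.0        # one inch on the ruler
    REFERENCE_DISTANCE_IN = 12.0   # ruler held one foot away

    # Visual angle subtended by the reference inch, in radians.
    angle = 2 * math.atan((REFERENCE_SIZE_IN / 2) / REFERENCE_DISTANCE_IN)

    for label, distance_in in [("phone at 12 in", 12),
                               ("desktop at 24 in", 24),
                               ("TV at 10 ft", 120),
                               ("projector screen at 30 ft", 360)]:
        # Physical height needed at this distance to subtend the same angle.
        required_in = 2 * distance_in * math.tan(angle / 2)
        print(f"{label}: draw it {required_in:.2f} in tall")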


> "I want this item to be as tall as the subjective inch if a person holds a ruler one foot away from them."

Sounds like you're looking for arcseconds.


> This is a not very well reasoned appeal. If it were implemented, every Keynote presentation would be unreadably small when projected onto the big screen

And that's why CSS3 introduces viewport-relative units[0]. There's no reason Keynote couldn't provide something similar, instead of removing all meaning from the existing units.

[0] http://www.w3.org/TR/css3-values/#viewport-relative-lengths


They could also define an arcsecond unit; each device would carry a typical (configurable and overridable) viewing distance used to convert internally to pt, and then its DPI to rasterize to px.
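
To illustrate the conversion chain (not any actual platform API; the viewing distances and DPI figures below are invented for the example):

    import math

    # Sketch of the arcsecond-unit idea above: angular size -> physical size
    # (via an assumed typical viewing distance) -> points -> pixels (via DPI).
    # All device parameters here are made up for illustration.
    def arcsec_to_pt_px(arcseconds, viewing_distance_in, dpi):
        angle_rad = math.radians(arcseconds / 3600.0)
        physical_in = 2 * viewing_distance_in * math.tan(angle_rad / 2)
        return physical_in * 72, physical_in * dpi   # 1 pt = 1/72 in

    # The same angular size maps to very different pt/px values per device.
    for device, dist_in, dpi in [("phone", 12, 326),
                                 ("desktop", 24, 101),
                                 ("TV", 120, 40)]:
        pt, px = arcsec_to_pt_px(1600, dist_in, dpi)   # ~0.44 degrees
        print(f"{device}: {pt:.1f} pt, {px:.0f} px")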


That's a rather interesting idea. You should suggest it to the CSS3 dimensions WG to see what they think of it.


Well yes, 12pt should be the same everywhere, but points are a terrible unit of measurement for anything computer-related.

Points are useful on paper only. They should only exist in the context of word processing and the like, where we expect things to be printed at an actual physical size, with 72 points to an inch.

For computer interfaces or web documents, we just need some kind of measurement that is a relatively known proportion to the computer or browser's interface. Fortunately, CSS "px" does that rather well -- it doesn't indicate physical pixels, but rather logical pixels, and we all know what "16px" text looks like relative to our OS, and it works great. And happily, people rarely use "pt" in CSS, and in my opinion it should never have been an option in the first place.


That's like telling me points are a terrible unit of measurement for print because we use different sizes of paper and points should be relative to the paper size.

The whole goal of points is that a point is a point is a point, regardless of whether I'm printing on a business card or a billboard. Why can't we have that uniformity for digital devices, too?


Just curious, but why not inches / cm instead of points?


Older industries often had specialty units for measurement. The Point unit was designed during the early days of printing, so the system is several hundred years old. It was by no means the only system, but it is the one that seems to have taken over the industry by the 20th century.

A point being about 1/72 of an inch or around .35 millimeters.
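
For reference, the arithmetic behind those figures (using the modern 72-per-inch convention; see the note elsewhere in the thread about the older 1/72.27 definition):

    MM_PER_INCH = 25.4
    pt_mm = MM_PER_INCH / 72          # ~= 0.3528 mm per point
    print(f"1 pt  = {pt_mm:.4f} mm")
    print(f"12 pt = {12 * pt_mm:.3f} mm  (one pica)")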

The usefulness of a specialty unit, particularly in a pre-digital, largely labor-intensive world, is that you can tailor the unit so that the common sizes are easy to remember and subdivide usefully, without having to do a lot of math involving fractions or decimals.

Most commonly used type sizes are around 12 points, or one pica (1 pica = 12 points). 12 is a good number because it has several whole-number factors: 1, 2, 3, 4, and 6. Most common font sizes are even numbers, and the most common sizes are obtained by adding, subtracting, multiplying, and dividing these numbers. Thus in mechanical type you would often see fonts at 6, 8, 10, 12, 14, 16, 18, 24, 28, 36, 48, 64, and so on. I'm not saying you never saw sizes other than these, but they were comparatively rare. Remember, each font size of a typeface had to have the letter forms cut, molds made, type cast, stored, and maintained. A big shop might have a larger collection; a smaller shop would get by with fewer sizes.

These days it's certainly no big deal to specify a 17.3 pt font size. At one time, when digital type still relied more heavily on hand-tuned pixel-based versions for screen display, the above numbers were useful because you could have algorithms for regular pixel thinning, since the numbers all factored more easily; but even these used to require a lot of hand tuning.

And even now it's useful for humans to deal with a few, easily distinguished font sizes, at least conceptually. It's easy to remember what an 18 pt font size might look like vs., say, a 24 pt character. And simply using whole-number millimeters is a bit coarse right around the range of most common variation, say 6-24 points.

The thing is that with computers it shouldn't really matter what unit you use, assuming you have enough resolution and appropriate anti-aliasing. But the unit should be an absolute measurement, not something based on an undefined pixel size.

For a while there was a sort of standard. In the early days of the Macintosh, Apple declared that one screen pixel was one point, and we would pretend that for practical purposes that was 1/72 of an inch. And it was pretty close, at least if you were only using a word processor. Designers did have to worry about the difference and understood that the image on the screen was not exactly the same size. And there were wrinkles and variations in how the various font rendering technologies dealt with the issue.

The simplest solution would be to simply require the OS to know the actual size and aspect ratio of the pixels on your screen. But getting everyone to rewrite the software to be pixel agnostic is going to be the rub.

For UI uses, points would seem to be not terribly useful. Many elements would be large numbers, so the advantage over, say, centimeters goes in the wrong direction. And you are going to see much more variation in sizes for UI elements than you see in typography. You have little bitty UI elements and you have UI elements that easily take up half the page or screen. Normal metric measurements would seem the most appropriate.

And in fact, in typography and page layout you see units other than points used to describe things. There's the aforementioned pica, which is often used to describe things like column and page widths. And of course, these days, plain old inches and centimeters are used for laying out pages all the time.

Edit: Pixels do still factor heavily into measurement on video displays, as opposed to computer displays. A video display is "stupid" in the sense that it is a known number of pixels, but the size of them is assumed to vary quite a bit, from a couple of inches to tens of meters. Of course these screens aren't conceived of as an up-close medium. You look at them at ranges from across the living room to across the football stadium.

In video it is quite common for measurements to be expressed in percentages of the screen rather than pixels or absolute measurements. You have concepts like "lower thirds" for supplementary text and images, or 20% safe areas around the edges of the screen. You can put things in the center of the field or use the old standby "rule of thirds" from photography composition. Confusingly and irritatingly, type is typically specified in pixels. And yes, this does cause problems when designing imagery for use on different devices like real-time character generators and special compositing hardware. These days you simply make the assumption that you are designing for one particular format, say 720p, specify in pixels for that, and then "do the math" when targeting the graphics for use at other formats.


An inch is too big. A centimeter is too late.


Real metric users would tell you to use millimetres instead of centimetres anyway. Powers of 1000 and all that.

(Real metric users would also spell it "centimetre", but if adopting the American spelling will expedite a changeover I think it's a fair trade.)


A millimeter (which I rather often call “milímetro” or “millimètre”) is also too late. By the way, the SI uses powers of ten.


Points/Picas are common in the print world. I assume he used them for the text example because that's what he was talking about. I don't know how well they'd translate for UI elements though.


But on any piece of paper, you know what the context is. You know if you're designing for a billboard or a business card. You never design anything that gets printed on both without human intervention in the middle.

We don't want that for digital devices, because having text be the same physical size on my iPhone and my iMac would be a usability disaster. But I know that CSS "16px" text on both devices will be readable and a little on the large side.


Some random points here. (Disclosure: I'm a professional UI designer with a strong development background. Erstwhile CS dude, but still geek to the core. I consulted with the OP on this article.)

----

You really don't know what "16px" means with respect to your OS. The DPI presented to the user is hardware dependent, not OS dependent. True, in certain tablets and phones the OS and the hardware are bound together (iPads, etc.), but not so on the desktop, where screens have different DPI (as evidenced by the OP's photos of his screen setup).

This issue is pretty subtle - there are lots of places where ems are the better way to specify type size, like print-centric experiences.

For other experiences, like 10-foot UI on set-top boxes, I could argue that pixels are as good as anything, given (a) the huge range of screen sizes and distances-to-screen the interface will find itself in, and (b) pixels are, for all practical purposes, the same as expressing things in percentages (HD being a fixed size), but easier for designers to reason about.

And finally: 12 point on the screen might be some other type size when moving to mobile (probably smaller) and should almost certainly be using different layout techniques when moving to mobile. Line length, inter-line spacing and distance to screen are all in play here, so don't get suckered into the idea that you can spec the text once and expect scaling on the platform to automagically adjust it everywhere.

Related: responsive layout. Spec'ing a type size is really only useful within a known domain range. You should think through the design as you move from platform to platform.

Quite the mess. There's no Grand Unified Theory for this stuff yet. Not sure it even makes sense to go find one. Great design isn't a matter of being uniform, but of being right for the purpose.

That said, the OP is dead on. If you're in a place where specifying points is the right thing to do, then a point should be a point, not a pixel.


CSS isn't just for screens, you know. There are also media types that include print, where points really would make sense.


Or audio.

Can you tell me what 12pt font sounds like?

http://www.w3.org/TR/CSS2/aural.html


Points aren't completely standard in print either.

Prior to significant computerization, a point was slightly smaller than 1/72 of an inch. Often 1/72.27 of an inch.

With computerization (particularly PostScript, but I think it has roots before that) it was redefined to 1/72 to take advantage of integer arithmetic.

These days publishing systems can still use a mixture of definitions.
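
For a sense of scale, the difference between the two definitions is tiny but nonzero; a quick check, taking the traditional point as 1/72.27 in per the comment above:

    ps_pt_in = 1 / 72.0      # PostScript/CSS point
    trad_pt_in = 1 / 72.27   # traditional printer's point (TeX's "pt")
    diff_pct = (ps_pt_in - trad_pt_in) / trad_pt_in * 100
    print(f"PostScript point is {diff_pct:.3f}% larger")      # ~0.375%
    drift_in = 66 * 12 * (ps_pt_in - trad_pt_in)               # 66 lines of 12 pt
    print(f"drift over 66 lines of 12 pt text: {drift_in:.3f} in")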


You seem to be taking it as given that the size of stuff should match between different displays, but you don't give any reason why that's desirable. I understand the argument for 1:1 WYSIWYG in DTP, but not for UI. I haven't been doing much DTP in the last decade.

Some people prefer larger UI and some prefer smaller — partly due to differences in vision and partly just preference — and today's industry lets them choose. Your proposal takes away that choice and thus is guaranteed to anger people.


First of all, whether or not something angers people is rarely the best basis for deciding whether to do it.

Secondly, "choice" is a generous way to describe it. It's more that sizes come out basically of their own accord, whether the user likes it or not. At least if inch measurements were respected, the designer could be deliberate about a decision. This would not invalidate user stylesheets or zooming; I can't fathom a single disadvantage here.


I'm more concerned about UI widgets where user stylesheets and zooming tend to not exist (and zooming would produce blurriness if it was used).


People may want to see pictures of products in 1:1 scale in webstores. I would.


scary... my experience is that the more platforms try to get this kind of thing right, the more they get it wrong.

back in the 90's, photoshop would try to do all sorts of gamma correction on images you were editing with the consequence that, if you didn't turn it off and make sure it was always turned off, you'd get your colors wrong 100% of the time.

for a system like that to work, every piece has to be correctly configured, and the consequence of getting it wrong is that instead of having something that's 4.1 inches on one platform and 5.2 on another, you have something that's 12.7 inches on one platform and 1.4 on another.


GIMP had the only cheap, effective, backward-compatible solution:

There was a setting that showed a ruler on the screen; you could scale it to match a ruler you were holding against the screen.

It disappeared after a while...


I don't think this would be necessary if folks ensured their X server's Screens' resolutions were configured correctly.

    $ xdpyinfo | grep -A 2 '^screen #'
    screen #0:
      dimensions:    1920x1080 pixels (483x272 millimeters)
      resolution:    101x101 dots per inch
    $
Then all X clients would know how many pixels represented a physical measurement on the screen.
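
(For what it's worth, the reported resolution above is just derived from the pixel and millimeter figures; you can check it yourself:)

    # DPI follows from the reported pixel and millimeter dimensions:
    # dots per inch = pixels / (millimeters / 25.4).
    width_px, height_px = 1920, 1080
    width_mm, height_mm = 483, 272
    dpi_x = width_px / (width_mm / 25.4)    # ~= 101
    dpi_y = height_px / (height_mm / 25.4)  # ~= 101
    print(f"{dpi_x:.0f}x{dpi_y:.0f} dots per inch")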


And how do you configure it? Most LCD manufacturers put wrong info on the labels and in the DDC/EDID info.


One uses

    Option "DPI" "96 x 96"
in an xorg.conf or similar file; this overrides DDC from the monitor. Note, not all monitors have the same density of pixels in both directions. https://wiki.archlinux.org/index.php/Xorg#Display_Size_and_D... has more.


but my monitor does not have the real DPI the label on its back says... configuring by DPI is moot.

i read a little more on the link you provided... and the closest to the old gimp way of doing that is putting the width and height of the visible area of the monitor in mm. think that might work rather well if it's correctly implemented.


I wasn't suggesting reading a label. One measures the visible picture in real life, e.g. with a ruler, and calculates the DPI. CRTs didn't have a label on the back and, anyway, one could adjust the picture size.


It would be nice; it won't happen, of course, but here's hoping.

The challenge isn't typography, it's people. People who want their document to be as wide as their phone on a phone and as wide as their tablet on a tablet, and not quite as wide as their screen on their desktop or laptop. If you force them to compute their 'zoom factor' they get annoyed. People who give them what they want get their business. And typography continues to suffer.


That's not the point of the article. Whether or not it would be badly used isn't what's being discussed.

The point is that even if it could cure cancer, right now it simply can't be done correctly on any display.


Github repo as blog post?!?! How's that for some transparency in your edits:

https://github.com/kickingvegas/12pt-should-be-the-same-ever...


And if you want to get really fancy, there's Github Pages -- http://pages.github.com/. So, you get all of the benefits of the repo approach, but your pages can be all sexy and stylish.


This sounds like a good idea at first[1], but then you realize that text is often accompanied by graphics that are not vectorized. Resolution independence when coupled with raster images is a hard problem. You can render the text at the desired size fairly easily, but you can't resize a bitmap arbitrarily without it looking pretty terrible. This is why resolution independence hasn't happened for displays despite lots of attention. It's also why Apple just doubled everything to keep it simpler.

[1] Actually, it might not even seem like a great idea at first if your first thought is to contrast your phone screen and TV screen.


I don't think I see the point, aside from historical reasons and printing (though that one's already been solved quite thoroughly...)

The beauty of electronic displays is that they can be tailored to your specific wants and needs. If I want to view your text, I should be able to view it at a size that's comfortable to me on whatever device I want to read it on. If I'm not reading it, and am instead waiting for updates, I should be able to scale it down and put it somewhere in my workspace that doesn't take up too much room or distract me.

To have an edict that says 12pt must always be 0.4233cm or 0.1667in (quite a ridiculous measurement) achieves nothing. What if I want to display the resource you built for display at 12pt on a 24 inch monitor? A 6 inch phone? A projector? What if I have bad vision and want it larger? Suddenly, 12pt needs to get multiplied by some arbitrary factor and it's lost all of its usefulness again.

My expedient excuse is that it's not necessary unless you're dealing with something static and physical.


The basic solution is fairly simple (and I'm surprised there's not much mention of it here): use something other than "point". In an ideal world, there would be three measures:

Points/inches/mm/etc. - used only for displaying content which must match the physical world. For the most part, this means "page preview" and when you want to display some image at "actual size".

Arcseconds/radians/etc. - used for displaying content that is intended to fill a certain field of view. The vast majority of content.

Pixels - used for fitting to pixel grids and 1:1 display, or other scenarios where scaling artifacts are not acceptable.

There is a considerable amount of fiddling to do with this, especially with regard to user-configurable scaling (which would be extremely important for arcseconds/etc., given that field of view depends on distance to the device), but the basic need for some angular measure seems obvious.
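
As a sketch of how the three measures might resolve to device pixels (the DPI, viewing-distance, and user-scale numbers below are entirely made up):

    import math

    # Everything eventually becomes device pixels, but via different inputs:
    # "pt" uses only DPI, the angular unit also uses viewing distance and a
    # user scale factor, and "px" is passed through untouched.
    def resolve_to_px(value, unit, dpi, viewing_distance_in, user_scale=1.0):
        if unit == "pt":        # physical: match the real world
            return value / 72.0 * dpi
        if unit == "arcmin":    # angular: fill a field of view
            physical_in = 2 * viewing_distance_in * math.tan(math.radians(value / 60) / 2)
            return physical_in * dpi * user_scale
        if unit == "px":        # device: pixel-exact, no scaling
            return value
        raise ValueError(f"unknown unit: {unit}")

    # The same document on two hypothetical devices:
    for name, dpi, dist in [("laptop", 130, 20), ("TV", 40, 120)]:
        print(name,
              round(resolve_to_px(12, "pt", dpi, dist)),      # actual-size preview
              round(resolve_to_px(24, "arcmin", dpi, dist)),  # body text
              resolve_to_px(1, "px", dpi, dist))               # hairline rule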


I don't think anybody's saying you shouldn't be able to scale anything on your screen - just that "100%" should be the same size on anything.

100% should not necessarily be the default. Fit page width to screen/window seems like the proper default.


Good point. Viewing distance needs to be taken into account. There is no reason this can't be solved, though. How about instead of [m] we start thinking about the angle at which an object falls on our retinas? Instead of pt or mm we could define fonts and distances in °.


This will lead to more harm than good. Sure, designers can use it to their advantage to create great pages, but some people who don't account for this will make their websites so their 12mm font looks great on their 4000px monitor but everything looks fuzzy on my 1024 px screen. And oh god the scrolling when I try to use my ipod touch to browse the site.


that's what they do already. i'm totally missing your point. How does this not happen with px? and with em and % being also based on px...


For your reading pleasure, here is an (imho totally flawed) argument by a Mozilla guy who thinks otherwise:

http://robert.ocallahan.org/2010/01/css-absolute-length-unit...


Shouldn't we just switch to physical measurements like millimeters, inches, etc., and expect the operating system to use whatever metric it chooses (pt, px, em, etc.) to make the displayed size appropriate?

I think the problem with this is that developers (me included) have enjoyed using resolution to change the physical size of components in order to gain screen real estate. Perhaps shifting to a zoom/scale setting would be a better approach than tying ourselves to resolution.

Humorously, it would be kinda like a print guy measuring font size in fibers and the size of the type changing based on fiber density. :-P Then again, they probably do that.


No. I want to be able to make slides that work on projectors. The only unit able to handle this would be "angle based on standard viewing distance", which will require markedly different "actual lengths" for different devices.


It has been pointed out before that the CSS `px` is such a unit: http://inamidst.com/stuff/notes/csspx

(Although I seem to recall the discussion here was skeptical of the idea.)


A point is not necessarily 1/72 inches. TeX, for example, uses 1/72.27 by default. See http://www.oberonplace.com/dtp/fonts/point.htm for details on the different definitions of points. Even once you've settled on a definition of point as a unit of length, there's still the problem of defining what a "12 pt font" means; does it refer to the cap height, the length of an em-dash, or something else?


For the web, users must be able to override font size independently of layout (all fonts, not just some of them).

The zoom-text-only feature in Firefox makes it much better than Chrome for me (even IE has a really poor man's implementation of this).


"Font-hinting becomes less necessary at 200 PPI and higher."

Umm, no. Font hinting is still important at 300 DPI, so it's also still important at 200 PPI.


i was fighting with the same thing the other day... then i realized, i do not look at my 24" monitor at the same distance as i look at my 4" phone screen.

So, which size should the font be if i'm designing for both those screens? pt sure isn't the answer.

this being only tangentially related to the topic :) so, back on topic, yes, i wholeheartedly agree that pt should mean what it means. it's just absurd that it's not. and you are not even accounting for TVs, which cut random portions of the displayed image for no reason, making it even harder to calculate the real dpi.


Ironically there is a straightforward answer: arc distance. What angle does the font (or pixel, or whatever you're trying to scale) subtend from the position of your eye? This always works. It's sometimes hard to define for some devices (e.g. TVs, which might be used at widely varying distances), but even then the world has come up with standard conventions (e.g. assume you're 10' from your 40" TV, assume your phone is about 12" away, assume the controls in your car are at a 30" arm's length...).

And, of course, it's not implemented anywhere. So we all suffer with "dpi" and "px".
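
To put numbers on the idea anyway (using the conventional distances above; the half-degree target is an arbitrary illustration, not a standard):

    import math

    # The same visual angle maps to very different physical and point sizes
    # at the conventional viewing distances mentioned above.
    TARGET_DEG = 0.5   # arbitrary example target

    for device, distance_in in [("40-inch TV at 10 ft", 120),
                                ("phone at 12 in", 12),
                                ("car controls at 30 in", 30)]:
        size_in = 2 * distance_in * math.tan(math.radians(TARGET_DEG) / 2)
        print(f"{device}: {size_in:.2f} in ({size_in * 72:.0f} pt)")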


According to CSS 2.1, px _should_ be defined in terms of an angle: http://inamidst.com/stuff/notes/csspx.
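
The spec's "reference pixel" is the visual angle of one pixel on a 96 dpi device at a nominal arm's length of 28 inches, which works out to roughly 0.0213 degrees; a quick computation:

    import math

    # CSS 2.1 reference pixel: the visual angle of one pixel on a 96 dpi
    # display viewed from a nominal arm's length of 28 inches.
    px_in = 1 / 96.0
    arm_length_in = 28.0
    angle_deg = math.degrees(2 * math.atan((px_in / 2) / arm_length_in))
    print(f"reference pixel ~= {angle_deg:.4f} degrees "
          f"({angle_deg * 3600:.0f} arcseconds)")   # ~0.0213 deg, ~77 arcsec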


That's fun. Obviously that's not how it works, 1px is 1 physical pixel everywhere I've ever seen. Think of how much would break if something decided to do this "right" ...


On retina displays it is two pixels.


I haven't paid too much attention to this but shouldn't it be a square of four pixels? That would be true double resolution, right?

Or is there some pixel trickery like those olpc displays? [http://wiki.laptop.org/go/Display]


px is a measure of length, not area.

A 12 px high font is 24 px high on a retina display.

A 12 px by 12 px box is 144 px^2, or 12 x 12 x 2 x 2 = 576 px^2 on retina.


This works for head-mounted/heads-up displays, too, making it somewhat more future-proof.


Thanks for giving me yet something else to curse browsers for.


I don't think points is a good fit for that. A point has a defined length, 1/72 of an inch. If you don't want your text to be 12/72 of an inch, don't specify the size in points.


What? That is not the case with any browser. Nor with any software as the article points out.

Would be awesome if it was...


Since the invention of desktop publishing, PostScript points have been 72 to the inch. Resolution-independent rendering was taken into consideration in the 80's. If a "point" is not 1/72 of an inch then it's rendered wrong. https://en.wikipedia.org/wiki/Point_%28typography%29


That's kind of what the article is pointing out...


>So, which size should the font be if i'm designing for both those screens? pt sure isn't the answer.

I think "small", "medium", "large", etc. would be the best choice. Those by definition should always be the correct size.


Huh? I sit at a different distance from my monitor than my phone is from my face. I buy high resolution monitors, not because I want the "OMG the circle is a smoother circle" sense of a Retina display but because I want greater density. If we're nitpicking about "pt" in particular, fine. But if we're saying we should design UI elements and text specifically so that it's always the same size, count me completely out.



