Once upon a time, pixels and points were happily married. In the days of the Mac Plus and, a bit later, 17” 1024x768 CRTs, developers could expect a 10 pixel line would be 10 points long.
That’s no longer true. The 9 pixel fonts that were readable in 2000 are now impossibly small, but the 9 point fonts of 2000 should be as readable as ever (assuming well-behaved software and same-aged users!).
This seems rather esoteric, but many of us still carry habits and assumptions from that old world. I just wrote up a refresher for my own use, so I’m happy to share :-).
First, a quick refresher for non-specialists (see [1] for definitions):
- A pixel is an attribute of the computer screen. Higher resolution screens have smaller pixels.
- A point is a unit of length with a laughable history. The current “point” is the “DTP” point popularized by Warnock/PostScript – 1/72 of the anglo-saxon “inch” [2].
- Pixels and points correspond when a screen has 72 pixels per inch (PPI [3]). The original Mac had a 72 PPI screen, and so did the old 17” 1024x768 CRT. On those screens, a 9 pixel font looked like a 9 point font. Some UI standards may still be based on 72 PPI screens, where 9 points = 9 pixels (see the sketch below).
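In code, that correspondence is just a ratio. A minimal sketch (Python, function names mine):

```python
# A DTP point is 1/72 inch, so converting is a simple ratio.

def points_to_pixels(points, ppi):
    """Size in points -> size in device pixels on a screen with the given PPI."""
    return points * ppi / 72.0

def pixels_to_points(pixels, ppi):
    """Size in device pixels -> size in points on a screen with the given PPI."""
    return pixels * 72.0 / ppi

# On a 72 PPI screen the two units coincide: 9 points = 9 pixels.
assert points_to_pixels(9, 72) == 9.0
assert pixels_to_points(9, 72) == 9.0
```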
Modern screens have much higher resolutions, and thus more than 72 PPI. Apple’s 30” Cinema Display is 100 PPI and the 27” iMac is 108 PPI. The iPad is 132 PPI, the iPhone is 163 PPI, the Droid is 265 PPI, and the next generation iPhone is rumored to be 330 PPI (so HD video might fit in the phone).
A 9 pixel font that was readable at 1024x768 on a 17” CRT would be about ¼ the size on the iPhone 4. Obviously, it would be unreadable. Of course that wouldn’t happen, right? Developers would specify everything in points, and apps and the OS would translate to pixels properly.
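The arithmetic behind that “¼”: physical size is just pixels divided by PPI. A quick check against the rumored 330 PPI figure:

```python
# Physical size in inches = pixels / PPI.
glyph_px = 9

for name, ppi in [("17-inch CRT", 72), ("next-gen iPhone (rumored)", 330)]:
    print(f"{name}: {glyph_px / ppi:.3f} inches")

# 17-inch CRT: 0.125 inches
# next-gen iPhone (rumored): 0.027 inches  -- roughly 1/4 the linear size
```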
In practice, sizes are often specified in pixels anyway – often for good reasons [4]. OS X 10.6 applications, for example, render fonts too small on a 108 PPI display. XP is similar. Windows 7 and OS X were both supposed to be resolution independent, but it didn’t seem to take [4].
Things are different on new age computers. The Droid, iPad and iPhone expect and respect points, not pixels. That’s good; if they didn’t, documents would be unreadable on those high-PPI devices.
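I’m glossing over how those platforms actually do it (iOS has logical “points” with a per-device scale factor, Android has density-independent pixels), but the basic idea looks roughly like this sketch – the scale factors here are illustrative, not official specs:

```python
# Apps ask for sizes in points (or Android's density-independent
# pixels); the OS multiplies by a per-device scale factor to get
# device pixels.

SCALE = {
    "original iPhone": 1.0,  # baseline screen
    "iPhone 4": 2.0,         # same physical size, four times the pixels
}

def size_in_pixels(points, device):
    return points * SCALE[device]

# A 9 point label keeps its physical size; the denser screen just
# gets more pixels to render it with:
print(size_in_pixels(9, "original iPhone"))  # 9.0
print(size_in_pixels(9, "iPhone 4"))         # 18.0
```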
If we’re lucky, a few years from now, only OS designers will need to know the difference between pixels and points …
[1] http://stackoverflow.com/questions/604203/twips-pixels-and-points-oh-my
PIXEL: the smallest dot you can draw on a computer screen.
POINT: 996 points are equivalent to 35 centimeters, or one point equals 0.01383 inches. This means about 72.3 points to the inch; we in electronic printing use 72 points per inch.
1 point (Truchet) = 0.188 mm (obsolete today)
1 point (Didot) = 0.376 mm = 1/72 of a French royal inch (27.07 mm)
1 point (ATA) = 0.3514598 mm = 0.013837 inch
1 point (TeX) = 0.3514598035 mm = 1/72.27 inch
1 point (PostScript) = 0.3527777778 mm = 1/72 inch
1 point (l’Imprimerie nationale, IN) = 0.4 mm
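For the curious, converting between those variants is just a ratio of the millimetre values above – a toy sketch:

```python
# Millimetres per point for the variants listed above.
MM_PER_POINT = {
    "Didot": 0.376,
    "ATA": 0.3514598,
    "TeX": 0.3514598035,
    "PostScript": 0.3527777778,
}

def convert(value, src, dst):
    """Convert a length from one point system to another via millimetres."""
    return value * MM_PER_POINT[src] / MM_PER_POINT[dst]

# 12 Didot points expressed in PostScript (DTP) points:
print(round(convert(12, "Didot", "PostScript"), 2))  # 12.79
```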
[2] 72 has a LOT of divisors. That’s probably why it was chosen.
[3] PPI is pixels per inch, not points per inch. Unfortunate ambiguity there.
[4] I’m simplifying a topic that’s really beyond my ken. Fonts are the easy part of resolution independence. The real problem is all the raster images that are part of a “modern” UI. If you want your font to scale along with its nice “tab” background, it has to be specified in pixels, not points. Maybe one day SVG 6.0 will take care of the rest of the problems … (see the toy sketch below).
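A toy illustration of that font/bitmap mismatch (all numbers made up):

```python
# A font specified in points grows (in pixels) with screen density,
# but a raster background is a fixed number of pixels, so the two
# drift apart on denser screens.

TAB_BITMAP_HEIGHT_PX = 24  # fixed-size raster asset

def font_height_px(points, ppi):
    return points * ppi / 72.0

for ppi in (72, 108):
    print(f"{ppi} PPI: {font_height_px(18, ppi):.0f} px font in a "
          f"{TAB_BITMAP_HEIGHT_PX} px tab")

# 72 PPI:  18 px font in a 24 px tab -- fits
# 108 PPI: 27 px font in a 24 px tab -- overflows
```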