Solving the Linux DPI Puzzle

by Billy Biggs <vektor@dumbterm.net>, Sun Dec 5 2004

Linux applications use the DPI reported by the X server when converting a font's point size to pixels: "Sans 10" renders at a smaller pixel size if your X server is configured for 75 DPI than if it is configured for 100 DPI.
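
The conversion itself is simple arithmetic: a point is 1/72 of an inch, so the pixel size is roughly point size * DPI / 72 (the final result also depends on rounding and hinting). For example:

     $ echo "10 * 75 / 72" | bc -l    # "Sans 10" at 75 DPI: about 10 pixels
     10.41666666666666666666
     $ echo "10 * 96 / 72" | bc -l    # "Sans 10" at 96 DPI: about 13 pixels
     13.33333333333333333333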

How the XFree86 and Xorg servers calculate DPI

The DPI of the X server is determined in the following manner:

  1. The -dpi command line option has highest priority.
  2. If this is not used, the DisplaySize setting in the X config file is used to derive the DPI, given the screen resolution (see the example after this list).
  3. If no DisplaySize is given, the monitor size values from DDC are used to derive the DPI, given the screen resolution.
  4. If DDC does not specify a size, 75 DPI is used by default.
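
For example, the DisplaySize entry from step 2 goes in the Monitor section of the X config file. A sketch with illustrative values (a 1280x1024 screen declared as 338x270 mm works out to roughly 96 DPI):

     Section "Monitor"
         Identifier  "Monitor0"
         # Physical screen size in millimetres; these numbers are examples only.
         DisplaySize 338 270
     EndSection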

You can check what DPI your X server is set to by running xdpyinfo | grep resolution in a terminal window.
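
For example (the values reported will differ from machine to machine):

     $ xdpyinfo | grep resolution
       resolution:    96x96 dots per inch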

How X applications choose DPI

For traditional core X fonts, the DPI was effectively 75 or 100, depending on whether the 75dpi or 100dpi bitmap font directory was listed first in your X font path.
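
For example, a typical XFree86 font path lists both bitmap font directories in the Files section of the X config file, and their order decides which size wins (the paths below are the common XFree86 defaults and may differ on your distribution):

     Section "Files"
         # 100dpi listed first: core fonts resolve to the 100 DPI bitmaps.
         FontPath "/usr/X11R6/lib/X11/fonts/100dpi/"
         FontPath "/usr/X11R6/lib/X11/fonts/75dpi/"
     EndSection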

Xft/fontconfig added a new DPI parameter as an X resource (Xft.dpi). If set, this value is used by both Qt 3 and GTK+ 2 when rendering. If it is not set, Xft, GTK+ 2 and Qt 3 all fall back to the DPI reported by the X display.
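
You can inspect or override this resource by hand with xrdb; a quick sketch (applications started afterwards will pick up the new value):

     $ xrdb -query | grep Xft.dpi          # show the current value, if any
     Xft.dpi:        96
     $ echo "Xft.dpi: 96" | xrdb -merge    # set it for the running session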

GNOME also offers an interface to set the Xft DPI. If gnome-settings-daemon is running, it advertises the DPI value set in the gnome-font-properties dialog to GNOME applications via XSETTINGS, and also sets the Xft.dpi X resource. Any Qt application started after gnome-settings-daemon will therefore use the Xft.dpi value configured by GNOME.
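
On GNOME 2 this value is stored in GConf; assuming the /desktop/gnome/font_rendering/dpi key used by gnome-settings-daemon, it can be read or changed from a terminal as well:

     $ gconftool-2 --get /desktop/gnome/font_rendering/dpi
     96
     $ gconftool-2 --type float --set /desktop/gnome/font_rendering/dpi 96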

Problem: choosing default font sizes

Having a standardized DPI is important for choosing good default font sizes.

Windows machines use the DPI value as a way of globally changing font size. Windows XP defaults to 96 DPI. Changing to large font mode increases the DPI to 120. Users can also specify a custom DPI value. The default application font on Windows is "Tahoma 8".

MacOS X standardizes on 72 DPI, which means that, at the same point size, fonts are smaller on the Mac than on Windows. The default font on my MacOS X laptop is "Lucida Grande 13".

GTK+ uses a default application font of "Sans 10". This size seems to have been chosen assuming a screen configured for 96 DPI.

DPI in practice

The DPI used on a Linux desktop is determined as follows:

  1. If gnome-settings-daemon is running, it defaults the Xft DPI to 96, and all GTK+ and Qt applications will use this value. Your fonts will appear as intended.
  2. Otherwise, some distributions launch X with "-dpi 100". Fonts will appear as intended (see the example after this list).
  3. Otherwise, if your monitor announces a size via DDC, X derives its DPI from those values. The reported sizes are unreliable, and regardless, this is not a good way to determine font sizes. The result is fonts which are usually either too big or too small.
  4. Otherwise, your X server falls back to 75 DPI, and your fonts are all too small.
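
Case 2 is easy to reproduce by hand, since the -dpi option overrides both DisplaySize and DDC (96 is used here only as an example):

     $ startx -- -dpi 96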

In one weekend supporting tvtime and Eclipse on IRC, I saw the following DPI values from various users, all of whom were using the default X setup from their distribution: 75x75, 85x80, 100x100, 117x115, and 133x133.

Proposal

I strongly believe that Linux desktops should pick a standard default DPI value for fonts, thereby standardizing the pixel sizes of fonts across Linux desktops. Applications and desktop systems cannot reliably choose default font sizes without this. My proposal is to standardize this value in Xft.dpi, so that it does not affect the DPI reported by the X server for applications which may find that information useful.

The proposal is as follows:

  1. Decide on 96 DPI as the default, since this is already quite popular and matches all default GNOME desktops.
  2. Distributions should set the X resource "Xft.dpi" in an Xresources file that is loaded (via xrdb) for all X sessions.

Here is the code for inclusion in the Xresources file:

     Xft.dpi: 96.0
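
Distributions typically load such a file with xrdb from their X session startup scripts; the exact path varies, so the one below is only an example:

     # e.g. from an Xsession script; the path is illustrative
     xrdb -merge /etc/X11/Xresources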

Should the X server DPI match the Xft DPI?

There are conflicting opinions about whether a screen DPI value is useful at all. Configuring only Xft to a default of 96.0 is much less controversial, as it applies only to fonts.

Why is a fixed default DPI better than an autodetected one?

An obvious criticism of this proposal is that it picks a single, arbitrary DPI value, which seems to go against the whole concept of "dots per inch". A DPI value calculated from the physical monitor size is intuitively appealing.

The reasons why I believe a fixed DPI is a good idea are as follows:

  1. User interfaces are designed in terms of pixels. Having a standard default fixed DPI makes it possible for application and desktop developers to choose reasonable default fonts that will be readable even at low DPI.
  2. Font hinting is tuned for certain popular pixel sizes, so changing the DPI can affect the appearance of text, not just its size.
  3. DPI applies well to printing, but not well to the screen. If I project my laptop display on a screen for a presentation, the theoretical DPI has clearly changed, but I do not want all of my fonts to suddenly change with it. DPI values for computer screens are simply convention and not meaningful.
  4. Other operating systems like Windows and MacOS choose arbitrary DPI values rather than auto-calculating them.

References

  1. fontconfig bug 2014
  2. DPI on Windows systems
  3. Visual effect of hinting at different DPIs
  4. My original proposal