GNOME Bugzilla – Bug 378338
Deal with X servers which misreport the screen's dimensions
Last modified: 2009-05-14 17:43:17 UTC
If your X server does not report the screen's physical dimensions correctly, you get unusably huge or tiny fonts, because the DPI value is computed from meaningless data. To test this, put "DisplaySize 16 16" in your xorg.conf. This happens for a good number of exotic monitors and laptops.

The attached patch does the following:

1. In gnome-settings-daemon, we first check whether the user's GConf setting for DPI is actually set. If so, we use it --- it means that the user has configured it by hand. Otherwise, we compute the DPI from what the X server reports. We then check that the computed value is within a reasonable range (50-500 DPI); if it is not, we use a fallback value of 96 DPI.

2. In gnome-font-properties, we do the same to decide how to fill the DPI spin button initially.

This gives users usable desktops even if their X server is misconfigured in that respect.
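For reference, a minimal sketch of the sanitization described above (not the actual patch; the helper name is invented, and the constants simply mirror the 50-500 range and 96 fallback mentioned in this bug):

  #include <gdk/gdk.h>

  #define DPI_FALLBACK              96.0
  #define DPI_LOW_REASONABLE_VALUE  50.0
  #define DPI_HIGH_REASONABLE_VALUE 500.0

  /* Illustrative only: compute the DPI the X server implies and reject
   * nonsense values such as the ones a "DisplaySize 16 16" line produces. */
  static double
  sanitized_dpi_from_x_server (GdkScreen *screen)
  {
    int    width_px = gdk_screen_get_width (screen);
    int    width_mm = gdk_screen_get_width_mm (screen);
    double dpi;

    if (width_mm <= 0)                  /* some servers report 0 mm */
      return DPI_FALLBACK;

    dpi = width_px / (width_mm / 25.4); /* 25.4 mm per inch */

    if (dpi < DPI_LOW_REASONABLE_VALUE || dpi > DPI_HIGH_REASONABLE_VALUE)
      return DPI_FALLBACK;              /* e.g. DisplaySize 16 16 lands here */

    return dpi;
  }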
Created attachment 77046 [details] [review] gnome-control-center-378338-sanitize-dpi.diff
One problem I can see with the patch is that gdk_screen_get_width_mm()/gdk_screen_get_height_mm() can return 0 (we can't handle this properly at the moment - see bug 398568 - but that's no reason to continue being sloppy ;-)). Otherwise +1 from me.
*** Bug 398568 has been marked as a duplicate of this bug. ***
(In reply to comment #2)
> One problem I can see with the patch is that
> gdk_screen_get_width_mm()/gdk_screen_get_height_mm() can return 0 (we can't
> handle this properly at the moment - see bug 398568 - but that's no reason to
> continue being sloppy ;-)).

Eeek, good point :)  Big fat division by zero.  I'll take care of this.

I'll also do the checks whenever we get a "changed" notification for the DPI key in GConf. This can happen:

1. The user runs "gconftool-2 --unset /desktop/gnome/font_rendering/dpi" because he wants to use the DPI reported by the server.

2. gnome-settings-daemon gets a notification that the key changed. Since the key is not set in the user's GConf data, g-s-d gets the value from the GConf schema defaults (i.e. 96 DPI).

3. The user gets 96 DPI for his session, which is not what he wants.

4. Upon logout/login, the user gets the X server's DPI as he expects.

This last bug is https://bugzilla.novell.com/show_bug.cgi?id=240246, by the way.
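A hedged sketch of what re-checking on that GConf notification could look like (the GConf calls are real API; the recompute helper is a made-up stand-in, not the actual g-s-d code):

  #include <gconf/gconf-client.h>

  #define FONT_DPI_KEY "/desktop/gnome/font_rendering/dpi"

  /* Hypothetical stand-in for whatever gnome-settings-daemon uses to
   * recompute the sanitized DPI and push the new Xft/DPI XSETTINGS value. */
  extern void recompute_and_apply_dpi (void);

  static void
  dpi_key_changed_cb (GConfClient *client,
                      guint        cnxn_id,
                      GConfEntry  *entry,
                      gpointer     user_data)
  {
    /* Re-run the whole GConf-or-server decision, so unsetting the key
     * takes effect without a logout/login. */
    recompute_and_apply_dpi ();
  }

  static void
  watch_dpi_key (GConfClient *client)
  {
    gconf_client_add_dir (client, "/desktop/gnome/font_rendering",
                          GCONF_CLIENT_PRELOAD_NONE, NULL);
    gconf_client_notify_add (client, FONT_DPI_KEY,
                             dpi_key_changed_cb, NULL, NULL, NULL);
  }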
(In reply to comment #4)
> This last bug is https://bugzilla.novell.com/show_bug.cgi?id=240246, by the
> way.

It looks like outsiders are not invited to this party...
Created attachment 81623 [details] [review] Updated This fixes the division-by-zero problem. No changes were needed to refresh automatically when you unset the GConf key; I think I was using a bad patch on my testing machine...
Hm, Federico, are you aware of bug 104341?
(In reply to comment #7)
> Hm, Federico, are you aware of bug 104341?

Oh, thanks for the link! I'll mark it as a duplicate of this one.
*** Bug 104341 has been marked as a duplicate of this bug. ***
In my experience the DPI you calculate from gdk_screen_get_width (screen) and gdk_screen_get_width_mm (screen) does not necessarily match the DPI the font server is using; I found that this snippet of code we use in AbiWord works much better:

  FcPattern *pattern = FcPatternCreate ();
  if (pattern)
    {
      double dpi;

      /* Let Xft fill in its defaults (including the DPI) for this screen */
      XftDefaultSubstitute (GDK_SCREEN_XDISPLAY (gScreen), iScreen, pattern);

      if (FcResultMatch == FcPatternGetDouble (pattern, FC_DPI, 0, &dpi))
        {
          /* do whatever with the dpi */
        }

      FcPatternDestroy (pattern);
    }
Federico, any reason you close a bug with useful discussion as a duplicate of your new bug rather than the other way around?
Interestingly, I'd done exactly the same thing (although with fewer sanity checks) to our fork of settings-daemon. Yay for synergy!
Created attachment 81643 [details] GNOME Font Greeter

You ask LazyWeb for a tool (http://primates.ximian.com/~federico/news-2007-01.html#font-sizes). LazyWeb gives you the tool.

Things left to do:

- really do the first-run test
- merge it into some existing GNOME package
Actually running something on first login is a bad idea imho -- but a sane font chooser is a good idea.
Created attachment 81645 [details] Screenshot of my thing.
While I very much appreciate the effort to make fonts not suck, I think that Federico's proposal misses the point. Font sizes have a meaning and trying to hide the numbers from the user only because they are wrong is not a solution. We should instead make sure that the numbers are meaningful. I would want to tell people that I am using a 12 point font on my desktop. With the proposed approach I couldn't do that. I would have to resort to saying "I have set up my font sizes to the third row in the font size chooser.". So please fix the X server and please improve the font selection dialog. But don't try to hide font size information from our users. Thanks for considering this.
Would a compromise be to have the visual display of font size, but also put the real font size next to it? This way users can pick a size visually, and also will start to learn what "12pt" means.
Entirely agree with comment #16; users generally know what 12pt means, and those who do not will have to learn it sooner or later (I am tired of gnome assuming a user IQ of < 50). Ross's proposal makes a lot of sense to me; it combines the best of the two approaches.
Created attachment 81650 [details] Updated GNOME Font Greeter Took inspiration from Aaron's mockup.
Created attachment 81651 [details] Updated Screenshot of the Greeter
Lots of discussion there: http://abock.org/2007/02/01/seriously-lets-make-fonts-not-suck/. Shall we create a separate bug for the greeter thing?
Created attachment 81709 [details] mockup for a font settings dialog I did a quick mockup for a font settings dialog with pygtk/glade. Any suggestions?
Created attachment 81711 [details] mockup for a font settings dialog with display of font size Another attempt. This time the font size is displayed.
(In reply to comment #21)
> Shall we create a separate bug for the greeter thing?

Please do, and take the font settings dialogs with you...
*** Bug 402745 has been marked as a duplicate of this bug. ***
(In reply to comment #24)
> (In reply to comment #21)
> > Shall we create a separate bug for the greeter thing?
>
> Please do, and take the font settings dialogs with you...

Done: Bug 403414
Let's get the patch into 2.19 as early as possible so we have time to find out whether improvements such as that mentioned by Tomas are necessary.
(In reply to comment #27)
> Let's get the patch into 2.19 as early as possible so we have time to find out
> whether improvements such as that mentioned by Tomas are necessary.

Thanks! I'll commit it.
Committed to trunk.

2007-03-21  Federico Mena Quintero  <federico@novell.com>

        Fix the gnome-settings-daemon part of
        https://bugzilla.novell.com/show_bug.cgi?id=217790 and
        http://bugzilla.gnome.org/show_bug.cgi?id=378338: try to figure out
        the DPI value from the X server or the user's GConf settings.
        Should also fix https://bugzilla.novell.com/show_bug.cgi?id=240246.

        * gnome-settings-daemon/gnome-settings-xsettings.c
        (gnome_xft_settings_get): Call get_dpi_from_gconf_or_server() to
        figure out a reasonable DPI value; don't unconditionally get it
        from GConf.
        (get_dpi_from_gconf_or_server): New function.  If the user has ever
        set the /desktop/gnome/font_rendering/dpi value in GConf, we use
        its value.  Otherwise, we ask the X server.  We constrain the X
        server's response to a range of reasonable DPI values, since some
        servers lie about the screen's physical dimensions --- the user
        would get unusably huge or tiny fonts otherwise.

        * capplets/font/main.c (dpi_load): First, see if the DPI value is
        actually set in GConf.  If it is, it means that the user has
        changed it at least once.  In that case, just use the value.
        Otherwise, find the value from the X server in a similar way to
        what we do in gnome-settings-daemon.
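Roughly, the "has the user ever set the key" check can be sketched like this (an approximation of the commit, not a copy of it; it assumes gconf_client_get_without_default() is what distinguishes a user-set value from the schema default, and reuses the sanitization helper from the sketch in the description):

  #include <gconf/gconf-client.h>
  #include <gdk/gdk.h>

  #define FONT_DPI_KEY "/desktop/gnome/font_rendering/dpi"

  /* From the sketch in the bug description above. */
  static double sanitized_dpi_from_x_server (GdkScreen *screen);

  static double
  get_dpi_from_gconf_or_server (GConfClient *client, GdkScreen *screen)
  {
    GConfValue *value;
    double      dpi;

    /* Returns NULL unless the user actually set the key; the schema
     * default (96) is deliberately ignored here. */
    value = gconf_client_get_without_default (client, FONT_DPI_KEY, NULL);
    if (value != NULL)
      {
        dpi = gconf_value_get_float (value);
        gconf_value_free (value);
        return dpi;
      }

    return sanitized_dpi_from_x_server (screen);
  }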
Federico, could we let the user specify "always use display's DPI"? I do try to make sure my xorg.conf's have my display sizes, and I set the /desktop/gnome/font_rendering/dpi key upon login (I use both local and remote logins). Perhaps a DPI value of 0 could be interpreted as "ignore GConf DPI, use X server instead". This will allow people with leftover GConf DPI values (people upgrading from 2.18.x or earlier) to take advantage of the new feature, without resorting to gconftool-2 CLI stuff.
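If that were implemented, the sentinel could slot into the lookup as a tiny variation (purely hypothetical, nothing here is committed code; it reuses the helpers from the sketches above):

  /* Hypothetical variant of get_dpi_from_gconf_or_server(): a stored value
   * of 0 means "ignore GConf and always use the (sanitized) X server DPI". */
  static double
  get_dpi_with_zero_sentinel (GConfClient *client, GdkScreen *screen)
  {
    GConfValue *value;

    value = gconf_client_get_without_default (client, FONT_DPI_KEY, NULL);
    if (value != NULL)
      {
        double dpi = gconf_value_get_float (value);

        gconf_value_free (value);
        if (dpi > 0)
          return dpi;
        /* dpi == 0: fall through to the server value */
      }

    return sanitized_dpi_from_x_server (screen);
  }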
This patch makes vte based applications absolutely unusable when a custom dpi is set with gconf-editor (just try to set it and run gnome-terminal to reproduce it). (Yes, I want to use 96 dpi instead of the autodetected 107 dpi, because fonts are way too big with such a setting.)
Fryderyk, what happens if you set your gnome-terminal font size to something smaller? Could you supply a screenshot (or window shot) to demonstrate why this would be ineffective?
Created attachment 94872 [details] GNOME terminal window with autodetected DPI
Created attachment 94873 [details] GNOME terminal window after forcing dpi key in gconf-editor
Created attachment 94874 [details] New GNOME terminal window after DPI is forced
Created attachment 94875 [details] GEdit window for demonstrating autodetected font size

Font in the screenshot: DejaVu Sans 8
Created attachment 94876 [details] GEdit with forced 96 DPI

This screenshot shows my preferred font size. That's what I mean by "way too big". Font in the screenshot: DejaVu Sans 8
OK, setting the DPI in the advanced font settings solves the issue for me, so you can safely ignore my previous comments.
I'm way late at noticing this discussion, but I feel I should follow up here.

Servers misreporting DPI was only one of a number of factors in deciding to hardcode DPI at 96, not the primary reason (as claimed in http://www.gnome.org/~federico/news-2007-01.html#font-sizes).

Factors:

* Preferred font size is very much a factor of device usage. Using a fixed linear size will give you a much too big font size on a handheld device. Laptops and desktops have different viewing distances too.

* As long as pixels are visible, people tend to choose viewing distance based on screen resolution. They'll sit farther away from a lower resolution device. 1400x1050 laptops probably pass that threshold, so they need a little bigger fonts than 1024x768 laptops. But I don't think they need 40% bigger fonts.

* On adjustable resolution devices (CRTs), adjusting the font size based on resolution is perverse ... people drop resolutions to get bigger images and fonts, and then the fonts get smaller.

* Using a reported DPI below around 75dpi gives unreadable fonts even if the display is reporting correctly, because there are just not enough pixels.

* On a projector, you definitely don't want to use 75dpi (the X default); I'm not sure that you want to use the reported DDC values from the projector either ... it's going to be more or less random. (You *definitely* don't want to use the real physical resolution of the screen. One pixel fonts? :-)

* We don't have scalable graphics in most places in our desktop; we definitely don't scale graphics with font size. So DPIs out of the, say, 90-120 range are going to upset things.

And, yes:

* DDC hasn't always been reliable. This isn't just a question of "my monitor doesn't ever DDC"; there are problems with monitors intermittently not DDC'ing. And it's very surprising to users if they reboot and their fonts are all the wrong size, and they fix them, then reboot again and they are wrong again.

I'm not sure I believe that we'll see DPI go up much beyond the 120 or so range ... it's just not that big an improvement compared to the dpi^2 cost in number of pixels. But if we ever get to the world of 200dpi monitors, then clearly the old approach of hardcoding 96dpi wouldn't work. So what approach would work in that world?

I think we could come up with a reasonable

  logical_dpi = F(device_type, device_size_pixels, device_size_inches)

relationship. And since device_type is *mostly* guessable from size (5" == handheld, 9" == tablet, 14" == laptop, 19" == desktop monitor), that might even be reducible to:

  logical_dpi = F(device_size_pixels, device_size_inches)

But it's *not*

                  device_size_pixels
  logical_dpi = --------------------
                  device_size_inches

So, to improve on the 96dpi hardcoding, the correct direction is:

A) Figure out the appropriate F() heuristics to make a reasonable guess at a good font size.

B) Redo gnome-font-properties to look more like Federico's mockup; guessing wrong for the DPI and then sticking the control into an Advanced tab is *distinctly* unfriendly. (This does raise the question of what the right way is to let the user resize individual fonts ... you might want larger or smaller terminal fonts based on their design.)

C) Start really working on making the other bits of our user interface scale properly with the fonts.
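For illustration only, the F() relationship sketched above could be read as a clamped, damped mapping rather than the raw quotient; none of these thresholds come from the comment, they are invented for the example:

  /* Purely illustrative heuristic, not a proposal from this thread: use the
   * physical density inside a plausible band, clamp below it, and damp it
   * above it, instead of using pixels/inches directly. */
  static double
  logical_dpi_heuristic (double device_size_pixels, double device_size_inches)
  {
    double physical_dpi;

    if (device_size_inches <= 0)
      return 96.0;                        /* nothing usable was reported */

    physical_dpi = device_size_pixels / device_size_inches;

    if (physical_dpi < 96.0)
      return 96.0;                        /* too few pixels; never go below 96 */

    if (physical_dpi <= 150.0)
      return physical_dpi;                /* plausible band: use it as-is */

    /* Denser screens: follow the density, but only half-way, so a 200 dpi
     * panel does not double every font. */
    return 150.0 + (physical_dpi - 150.0) * 0.5;
  }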
I'd suggest that at the very minimum the following change should be made to gnome-settings-xsettings.c:

-#define DPI_LOW_REASONABLE_VALUE 50
+#define DPI_LOW_REASONABLE_VALUE 96

I know of no circumstances in which you want to use a logical font DPI of less than 96.
(In reply to comment #39)
> I'm way late at noticing this discussion, but I feel I should follow
> up here.
> Servers misreporting DPI was only one of a number of factors in deciding
> to hardcode DPI at 96, not the primary reason.

And all those reasons have been refuted before.

> Factors:
>
> * Preferred font size is very much a factor of device usage. Using a
>   fixed linear size will give you a much too big font size on a handheld
>   device. Laptops and desktops have different viewing distances too.

That's only partially true: the human viewing angle is pretty fixed, and most people position their screen so it fills it.

> * As long as pixels are visible, people tend to choose viewing distance
>   based on screen resolution.

Aka "as long as the 96dpi hack is in place, and GNOME forces a fake pixel/point fixed ratio, GNOME users will have to think in pixels".

> But I don't think they need 40% bigger fonts.

That's a user preference; if the user wants a small font he just has to select a small font size in points, not change the meaning of the point unit (I want 10 inches of rope, no, that's too big, give me smaller inches <= previous GNOME logic).

And it's a PITA because point font sizes are hardcoded all over the place, but that wouldn't be the case if you'd spent some more time defining a default font size in pt and expressing the other desktop font sizes as a % of this font (a problem solved on the W3C side years ago, way before the 96dpi hack).

> * On adjustable resolution devices (CRTs), adjusting the font size based
>   on resolution is perverse ...

No.

> people drop resolutions to get bigger
> images and fonts, and then the fonts get smaller.

Because you broke the system and made font size depend on resolution. A lot of people do not want resolution to change font size, and your hack has made them miserable for years (not to mention it is incompatible with KDE, so you made GNOME font units incompatible with the units of non-GNOME fonts).

> * Using a reported DPI below around 75dpi gives unreadable fonts even
>   if the display is reporting correctly, because there are just not
>   enough pixels.

So add an I've-got-a-shitty-screen quirk. Again, a problem solved years ago on the KDE side without redefining the meaning of a point for the normal case.

> * On a projector, you definitely don't want to use 75dpi (the X default);

Why do you think you want to use 96dpi instead?

> I'm not sure that you want to use the reported DDC values
> from the projector either ...

Most of the time all you need is to assume the projector's dimensions (not DPI) are the same as those of a screen with the same default resolution. But anyway, a projector zoom factor should be a user-exposed tunable, and a zoom factor is a percentage of a base level, not a "DPI" thing no normal user understands the meaning of.

> it's going to be more or less random.

Not if done properly.

> (You *definitely* don't want to use the real physical resolution of
> the screen. One pixel fonts? :-)
>
> * We don't have scalable graphics in most places in our desktop; we
>   definitely don't scale graphics with font size.

That's another thing that needs to be fixed.

> So DPIs out of the,
> say, 90-120 range are going to upset things.

So you've hidden the problem, more apps were written with bad pixel assumptions, and now that there are 120+ dpi screens on the market, things are glaringly broken.

> And, yes:
>
> * DDC hasn't always been reliable.
It's always been more reliable than you tell us; it's getting more reliable with time (and DDC is now used at the kernel level), and the solution to X misdetecting the screen size has always been to help users tell X what the screen size is, instead of building on screen size misreporting (users know what a screen size is, they can measure it; when you ask them for a DPI value they wonder WTF you've been drinking).

> I'm not sure I believe that we'll see DPI go up much beyond the 120 or
> so range ...

OpenMoko and OLPC have higher dpi screens today, a lot of hardware that was available years ago is already past the 120dpi mark (mostly laptops), and even Microsoft removed the dpi-to-pixel fixing in Vista (that's what you get for blindly copying stupid MS hacks; even MS fixes its stuff over time).

There is a limit to the physical size that can fit in customer places. When this limit is hit (already the case for embedded and laptops, getting there for desktops), manufacturers have no choice but to compete on pixel density (plus various factors like HD+ movies).

> then clearly the old approach of hardcoding 96dpi wouldn't work.

It never worked, but you chose not to listen to the users whose systems you broke (it's trivially broken for users with NFS or any other kind of shared $HOME who move from system to system with different hardware and screens).

> So what approach would work in that world? I think we could come up with a reasonable

Please stop thinking in dpis; everyone but you and the users you've formatted hates dpis:

1. make screen size detection reliable (the merged patch you object to + a quirk for sub-75dpi screens)
2. add some gnome config applet to set DisplaySize in xorg if DDC misdetected it (using hal magic, with a notifier to tell the user we think intervention is needed)
3. add a gnome config applet to set the zoom level of projectors (there is no way to always get it right no matter what the heuristic is, so let users select it instead of having them fight your broken heuristics)
4. have a gnome setting with "default UI font size in pt" so the user can tell if he likes big or small text
5. change other UI font size settings to a % of this font size
6. fix apps over time so they scale their other settings, like bitmaps, depending on this %

And you're done: you've got a UI that asks users things in units they understand, works for everyone (not just the laptop-user-that-likes-small-fonts you're obsessed with), is compatible with non-GNOME apps, and will still work on high-dpi screens instead of falling over.
Nicholas: I think you misunderstand several things.

>> * As long as pixels are visible, people tend to choose viewing distance
>> based on screen resolution.
>
> Aka "as long as the 96dpi hack is in place, and GNOME forces a fake
> pixel/point fixed ratio, GNOME users will have to think in pixels"

Or it could simply be that a user chooses a resolution high enough, and a long enough viewing distance, that the pixels are no longer visible.

> point font sizes are hardcoded all over the place

I have yet to find a text size in a GNOME app that I cannot adjust. *Initial* point sizes may be pre-set in user preferences. If a GNOME app has a hard-coded font size in its controls, it should be reported as a bug on that app.

> (a problem solved on the W3C side years ago, way before the 96dpi hack)

Except that even today, too many sites (linux.com is an example) use *pixel* sizes in their text CSS. The problem won't be solved until font pixel sizes are deprecated and then invalidated. It needs to be points and percents, not pixels. See, even web designers get caught in that rut.

>> * DDC hasn't always been reliable.
>
> It's always been more reliable than you tell us

But one of FOSS's strengths is that it can make old hardware useful again. My mother uses Fedora Core 3 with a 14 inch monitor that she got after someone else's upgrade. It does not support DDC, and her X server must be told directly in the config (by me) what her screen size is. It certainly won't run Vista, and Vista would flag her system as an "unsupported configuration," but we aren't Vista here. We are trying to give users a choice, including what hardware they use. And some users may choose old, non-DDC monitors.

> Again, a problem solved years ago on the KDE side without redefining the
> meaning of a point for the normal case.

Is switching screen resolution a "normal case"? If I change my laptop's resolution in KDE from 1024x768 to 640x480, the fonts become very big, instantly. They *could* be re-drawn correctly, using an adjusted DPI, but they are not. Again, problem not solved.

> add some gnome config applet to set DisplaySize in xorg if DDC misdetected it

Not possible if the desktop session is via remote login. The server config won't be accessible to remote apps, even for root. The best place to override DPI in this case is in the user's font rendering configuration on the client side.

> add a gnome config applet to set the zoom level of projectors

Why do we care about projectors? The whole point of projection is to make an inch a whole lot bigger than an inch, so even the students in the back of the lecture hall can read the text on the wall behind the lecturer. Owen's point about projectors is that 1 inch or 1 cm has no meaning, except when the screen and the projector are on fixed mounts. Move the projector twice as far back, and you've just cut the real DPI in half, in exchange for a 4x larger screen.

> 4. have a gnome setting with "default UI font size in pt" so the user can
>    tell if he likes big or small text

Point size has no meaning until monitor DPI is properly set.

> 5. change other UI font size settings to a % of this font size

I want my buttons to be 14pt DejaVu Sans, my icon names to be "glisp", and my window titles to be 11pt Utopia Oblique. How will that fit into your simplification?

> 6. fix apps over time so they scale their other settings, like
>    bitmaps, depending on this %

On which percent?

---------------------------------------
(I wrote the following before Nicholas' comment came through. Apologies for redundant phrasing.)
Maybe we should ask the user "how long is an inch" or "how long is a centimeter", and use a slider (or 2, for both dimensions) to deduce the correct resolution. The sliders can have ranges from 75 to 150. Each can have an associated line with tick-marks, 4 to the inch or one for each cm, and as the user moves a slider, the tick-marks re-draw. This way, the user can put a ruler directly on the screen to match 2 real inches or 5 real centimeters to what GNOME is trying to present in the UI. Once the correct size is in place, the user could click "Apply" to effect the DPI into the desktop apps.

Including a big toggle button to magnify the in-window text (for poor vision, or for badly-detected DPI values) might also be a good idea.

If/when the user changes screen resolution with gnome-display-properties, once a resolution is tested and accepted, we can suggest the user re-adjust the DPI setting (with an option to launch the above-described app), in order to keep the font rendering consistent.

If the user doesn't like some font sizes (too small, or too big), the place to change this is in gnome-font-properties, not gnome-display-properties or Ctrl-KP+ and Ctrl-KP-.

I hope a good UI designer understands what I'm trying to express. My strength is web UI design; I suck at non-web GUI's.
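The calibration such a slider would perform boils down to one division (a sketch; the function and variable names are invented for this example, not part of any proposed patch):

  /* Sketch: the user drags the slider until the on-screen tick marks line up
   * with a real ruler held against the glass; the matched span gives the DPI. */
  static double
  dpi_from_ruler_match (double tickmark_span_pixels, double matched_length_inches)
  {
    if (matched_length_inches <= 0.0)
      return 96.0;                 /* nothing matched yet: keep the default */

    return tickmark_span_pixels / matched_length_inches;
  }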
(In reply to comment #41)
> Nicholas: I think you misunderstand several things.

I don't :) but I may not have been crystal-clear.

> >> * As long as pixels are visible, people tend to choose viewing distance
> >> based on screen resolution.
> >
> > Aka "as long as the 96dpi hack is in place, and GNOME forces a fake
> > pixel/point fixed ratio, GNOME users will have to think in pixels"
>
> Or it could simply be that a user chooses a resolution high enough, and a long
> enough viewing distance, that the pixels are no longer visible.

Either way, the old "96dpi everywhere" only works for screens that have approximately this resolution. It forces users to downscale resolution to something GNOME can handle (because it has made GNOME developers expect 96dpi), when correct font rendering wants the highest possible pixel density available (cf. all the autohinter freetype bugs, or the fact that CJK users are forced to use bitmap fonts because, as long as pixel density does not grow, scaling CJK glyphs properly is not possible).

> > point font sizes are hardcoded all over the place
>
> I have yet to find a text size in a GNOME app that I cannot adjust. *Initial*
> point sizes may be pre-set in user preferences.

Yes, I meant pre-sets, sorry. If there were only a single point pre-set and the others were expressed as a % of this pre-set, changing GUI text size would not be so annoying to users.

> The problem won't be solved until font pixel sizes are
> deprecated and then invalidated. It needs to be points and percents, not
> pixels. See, even web designers get caught in that rut.

And if the patch Owen objects to is reverted, that won't happen, because GNOME points won't be real points but derived from hardware pixels. Applying the patch leads to fixing all the pixel application assumptions. Reverting the patch hides the problem for the most common hardware today.

> >> * DDC hasn't always been reliable.
> >
> > It's always been more reliable than you tell us
>
> But one of FOSS's strengths is that it can make old hardware useful again.

You need to understand that "DDC" there means "physical screen size auto-detection via DDC in xorg (moving to kernel space lately)". The correct solution to failing autodetection has always been to help the user tell xorg the actual physical screen size (which fixes all apps BTW, not just GNOME-aware ones). This works on any hardware, even very old hardware. And it's user-friendly, because screen size is expressed in units the user knows, and the user knows what his screen size is (unlike DPI, which is systematically misinterpreted by users because what is DPI in real life?). This could have been done years ago, and it can be done even better nowadays with hal, dbus and policykit.

Owen's past solution has been instead to fake a number computed from this size, and store it in a gconf key only GNOME apps use (and that needs to be done for every user of a given system because it's not shared at the xorg level).

> My
> mother uses Fedora Core 3 with a 14 inch monitor that she got after someone
> else's upgrade. It does not support DDC, and her X server must be told directly
> in the config (by me) what her screen size is.

And a config app to help you do so would have helped you more than faking DPI at the GNOME level. The correct workflow would be:

1. user plugs in a non-DDC screen, or a screen we don't trust the DDC of
2. system drives it to some standard safe resolution assuming some safe standard dimensions
3. a hal notification is propagated to the DE screen configuration applet of the first user with sufficient admin rights that logs in (policykit)
4. this application shows the user a dialog like "System could not autodetect screen X size, please enter it:" (display a big white X on the screen so the user knows what we're talking about)

     Aspect ratio:            < > 4:3   < > 16:10 (widescreen)   < > other
     Use standard size:       <- 15" [17"] ->
       (or <- [17" widescreen] 21" widescreen -> if the user selected widescreen
        before, or nothing if he selected other)
     Use standard resolution: <- resolution list ->  (fine tuning hidden behind a >)
     Screen physical dimensions:  Width: < xxx mm > × Height: < yyy mm >
     Screen resolution:           Width: < xxx px > × Height: < yyy px > at < zz Hz >
     [ ] Always use these settings for this screen

5. the user selects the values he wants; they're tested in the usual way, then saved in the xorg settings

> Is switching screen resolution a "normal case"?

For every laptop that can be docked, and for every LAN where $HOME is nfs-shared, having the physical screen (with its associated resolution capabilities) change regularly is the normal case. OLPC has a dual-mode dual-resolution screen.

> > add some gnome config applet to set DisplaySize in xorg if DDC misdetected it
>
> Not possible if the desktop session is via remote login.

In a remote login you don't need to set the remote xorg settings, just your local ones.

> > add a gnome config applet to set the zoom level of projectors
>
> Why do we care about projectors?

Because the same projector won't always be at the same distance from the projected surface (they're not all wall-mounted, the table you put the projector on is not always placed at the same distance, etc), so any time you use a projector you need to ask the user how his projector is set up. The correct workflow would be:

1. user plugs in a projector
2. system drives it to max resolution
3. a hal notification is propagated to the current DE projector config application
4. this application uses built-in heuristics to guess what the equivalent physical screen is likely to be
5. it shows the user a dialog like "New projector detected, treat it as:"

     <- 17" widescreen  [21" widescreen]  22" widescreen  32" widescreen ->
     Zoom: < 100% >
     < > clone display to the projector
     < > add it as a separate screen
     [ ] Always use these settings for this projector

   (heuristics-chosen equivalent physical screen selected in the first list; the user can choose another equivalent on the list or create a non-standard equivalent by changing the zoom value, 100% = heuristic-chosen equivalent)
6. the user selects the value he wants in this dialog and the settings are propagated to xorg

And now you have a correct user-perceived fake physical projector size, and all the normal font settings can apply.

> Owen's point
> about projectors is that 1 inch or 1 cm has no meaning,

We all understand that; what we disagree on is Owen's solution to this problem.

> > 4. have a gnome setting with "default UI font size in pt" so the user can
> >    tell if he likes big or small text
>
> Point size has no meaning until monitor DPI is properly set.

Correction: point size has no meaning until the monitor's physical size (or, in the projector case, equivalent physical size) is set properly at the xorg level and the DPI is computed from it.

> > 5. change other UI font size settings to a % of this font size
>
> I want my buttons to be 14pt DejaVu Sans, my icon names to be "glisp", and my
> window titles to be 11pt Utopia Oblique. How will that fit into your
> simplification?

In my simplification you'd select "default gui text size xx pt" and then set "button font = DejaVu, at YYY% of default text size; window title = Utopia Oblique, at ZZZ% of default text size" (or you could allow both percentages and pt values, but pt values should never be used as the default, since they force the casual user to change every setting when all he wants to say is "I want small text" or "I want big text").

> > 6. fix apps over time so they scale their other settings, like
> >    bitmaps, depending on this %
>
> On which percent?

Bitmaps and other decorations should be scaled to follow the relevant font setting (usually the default text size; sometimes another font setting may be more relevant).

> Maybe we should ask the user "how long is an inch" or "how long is a
> centimeter",

That does not work, because you have two different setting levels: hardware-specific settings (physical size of the screen, equivalent physical size of the projector) and user preferences (the young hacker with good eyes who's ready to ruin them by using insanely small text sizes, the normal user who wants bigger text). The fundamental flaw in the old GNOME DPI forcing and in your proposal is to conflate both in a single setting, when they should be considered separately. Conflating both means you can't dissociate hardware changes (new monitor plugged in, resolution change, etc) from user preferences (big or small text), and things break big time when one changes but not the other.
Nicolas: I don't want to argue this with you point by point, and especially not in Bugzilla comments. (Maybe over a beer sometime...)

But the central idea I'd like to emphasize is that users (except for perhaps a few typographers and graphical designers) do *not* think about font sizes in terms of inches or centimeters. They don't think, hmm, I want 6 lines of text per inch on the screen, let me pick a 12pt font. If they are at all familiar with font sizes, they think in terms of reading comfort ... you are going to have to move in and squint a bit to read an 8pt font, a 10pt font is on the small side, a 12pt is comfortably large, 18pt is a title font.

So, physical DPI is just not a relevant concept here.

Now, I'm quite willing to admit that a hardcoded pixels/points scale factor of 96/72 does not work in all situations; it is an approximation that works well over most displays used over the last few years. What I don't see is that using the physical DPI as that scale factor is an improvement. It is neither correct nor robust.
(In reply to comment #43)
> Nicolas: I don't want to argue this with you point by point, and especially
> not in Bugzilla comments. (Maybe over a beer sometime...)

Agreed on the beer :)

> But the central idea I'd like to emphasize is that users (except for perhaps
> a few typographers and graphical designers) do *not* think about font sizes
> in terms of inches or centimeters.

But they don't think in pixels or pseudo-points either. They think "I want text in the size I'm used to", look at a system they like where this size is indicated (usually in points, because that's the main font size unit) and report this value in their GNOME app settings.

Faking DPI at the GNOME level has created a situation where a GNOME pt is not the rest of the world's pt, and is not only GNOME- but hardware-dependent, so reporting values breaks (a 12pt GNOME physical font size is not the same as a 12pt non-GNOME physical font size, a 12pt GNOME physical font size on one system is not a 12pt physical font size on another system; there is no physical size stability, and users HATE it when physical font sizes change all over the place and the stupid GNOME apps show them it's STILL 12pt, only not the same one).

> If they are at all familiar with font sizes, they think in terms of reading
> comfort ... you are going to have to move in and squint a bit to read an 8pt
> font, a 10pt font is on the small side, a 12pt is comfortably large,
> 18pt is a title font.

That should be the convention everyone shares, but forcing 96dpi has created countless users who have no idea whether 12pt is big or small, because the old way of forcing 96dpi will produce either a big or small physical size depending on the hardware, and users assume that if 12pt is small/big on their system with GNOME defaults, that's the same for everyone. And then they launch a non-GNOME app or change hardware and the physical font size CHANGES and they have to rethink all their assumptions (users hate that).

Normal users have no idea they're supposed to set your fake DPI so 12pt is comfortably large. Why should they? As you wrote, they're not typographers. The huge majority of users who complain GNOME has removed the 96dpi hack in fact do *not* expect 12pt to be comfortably large, because you've induced them to think otherwise by faking units for years.

> So, physical DPI is just not a relevant concept here.

You'll note I never used DPI in my examples. I used physical screen size - a measure users understand and can input if needed; I used real points - a measure users can report from non-GNOME apps, books or other systems; I used a percentage of the preferred text size in real points - again something users can understand and that uses standard units. I never used physical or fake DPI, because that's totally irrelevant to normal users. You could ask users to input a fromboz value in bumzams to select text size; that would be about as relevant and useful to them. DPI is even less relevant to users than points; exposing DPI to users and helping them fake it at the GNOME but not the system level has produced countless hopelessly confused users.

> Now, I'm quite willing to admit that a hardcoded pixels/points scale factor of
> 96/72 does not work in all situations; it is an approximation that works well
> over most displays used over the last few years. What I don't see is that using
> the physical DPI as that scale factor is an improvement. It is neither correct
> nor robust.

Correct physical screen size and the resulting *internal* DPI computations give you:

1. a baseline that is shared with other non-GNOME apps (stable system-wide) and stable against hardware changes. That's a huge thing. It means users can trust this baseline and find stable long-term font settings. It means they know 12pt is comfortably big and will stay comfortably big over time.

2. text settings dialogs in units users understand and can therefore fill in correctly.

Now you can argue that physical screen size is not 100% correct because users do not position screens at exactly the same distance, but you know what, they don't position paper or books at exactly the same distance either, and physical points work just fine for paper text.
(In reply to comment #39)
> relationship. And since device_type is *mostly* guessable from size
> (5" == handheld, 9" == tablet, 14" == laptop, 19" == desktop monitor),
> that might even be reducible to:
>
>   logical_dpi = F(device_size_pixels, device_size_inches)

This is interesting. You may be able to guess the average viewing distance for each of those cases.

BUT! Owen, you know the font system much better than I do. Why do we need the DPI value at all? Does Freetype or something need to know the DPI to know how to do hinting, or is it some meaningless value that has been propagated through the API over the years, finally appearing in the user interface?

> I'd suggest that at the very minimum the following change should be
> made to gnome-settings-xsettings.c:
>
> -#define DPI_LOW_REASONABLE_VALUE 50
> +#define DPI_LOW_REASONABLE_VALUE 96

Sure. Feel free to make that change (I'm not the control-center maintainer, so ask them first).

About projectors: I give talks very frequently in small and big auditoriums, and I know what it is like. You don't want a fiddly "zoom factor" setting. You want to bring up the font capplet, shout "CAN EVERYBODY IN THE BACK READ THIS?" on the microphone, and make the fonts bigger until they say "YES".
(In reply to comment #45)
> Does Freetype or something need to know the DPI to know how to
> do hinting, or is it some meaningless value that has been propagated through
> the API over the years, finally appearing in the user interface?

Sub-pixel rendering won't work properly on a flatscreen unless FT knows *exactly* where the glyph outline sits on the pixels. When it works, it works very well (even for my weak eyes); when it doesn't work, it's uuuuugly.
> Why do we need the DPI
> value at all? Does Freetype or something need to know the DPI to know how to
> do hinting, or is it some meaningless value that has been propagated through
> the API over the years, finally appearing in the user interface?

The "logical dpi" is a scale factor that converts the size specified by the user to a number of pixels. I really don't like the term "<foo> DPI" for this number at all.

This number definitely needs to exist internally. But the design of the font capplet was to avoid exposing it to the user. The advanced dialog was *all* stuff that users shouldn't touch. The idea was that you'd just change the font sizes on the front tab to be the size you wanted, and other font sizes would be based off those sizes.

Difficulties with this approach are: the need to tweak five different font sizes individually (this is definitely a big pain if you need to make the font sizes bigger for a projector); the fact that people *do* use point sizes in applications rather than scaling off the system font size.

So, an overall scale factor setting ("logical dpi") may be a more effective user interface for letting the user deal with the interrelated factors of:

- Physical device resolution
- Display distance
- Individual visual acuity.
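To make the scale-factor reading concrete, this is the conversion it denotes (a trivial sketch; the function name is just for illustration):

  /* 1 point = 1/72 inch, so the "logical dpi" is exactly the factor that
   * turns a point size into a pixel size:
   *   12 pt at  96 "dpi" -> 12 *  96 / 72 = 16 px
   *   12 pt at 120 "dpi" -> 12 * 120 / 72 = 20 px */
  static double
  points_to_pixels (double point_size, double logical_dpi)
  {
    return point_size * logical_dpi / 72.0;
  }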
(In reply to comment #45)
> Does Freetype or something need to know the DPI to know how to
> do hinting,

The value known as DPI (the pixel size/physical size ratio) is used by Freetype to convert font sizes in standard units to values in pixels (dependent on the local hardware). Freetype needs to know this because the hinting process (auto-hinting or not) consists of distorting glyphs so they fit in the hardware pixel grid and are sharp.

What Owen calls "physical dpi" is the normal pixel resolution/screen physical size computation xorg performs by default. When you fake DPI by declaring a false pixel size/physical size ratio, freetype will still make the font rendering fall on the pixel grid, but the physical font size won't correspond to the original unit the user requested. So every UI font unit is now meaningless. Also, users get used to changing screen resolution to change text size, and as a result lower resolutions than the hardware is capable of will be used. This hurts font rendering big time, as the higher the resolution, the sharper the fonts are.

> About projectors: I give talks very frequently in small and big auditoriums,
> and I know what it is like. You don't want a fiddly "zoom factor" setting.
> You want to bring up the font capplet, shout "CAN EVERYBODY IN THE BACK READ
> THIS?" on the microphone, and make the fonts bigger until they say "YES".

And what's the difference between text zooming and what you do? You have a tunable: you push it up, text is bigger; you push it down, text is smaller. Except that right now you're like a kid who uses his feet instead of brakes to stop his bicycle — you achieve the desired effect, but you ruin shoes in the process.
(In reply to comment #39)
> Factors:
>
> * Preferred font size is very much a factor of device usage. Using a
>   fixed linear size will give you a much too big font size on a handheld
>   device. Laptops and desktops have different viewing distances too.

Just remember that when we talk about settings, there are default settings and then there's user override. There's no way the default setting will answer everyone's needs, but with careful statistics-gathering we can hit the tip of the bell curve. Regardless of the default, the user should be able to change these settings, and the questions of "pixel size" vs. "displacement size" have their share of issues. Fortunately, for the setting that's closer to what users think of (displacement), the issues are starting to resolve themselves.

> * As long as pixels are visible, people tend to choose viewing distance
>   based on screen resolution. They'll sit farther away from a lower
>   resolution device. 1400x1050 laptops probably pass that threshold,
>   so they need a little bigger fonts than 1024x768 laptops. But I don't
>   think they need 40% bigger fonts.
>
> * On adjustable resolution devices (CRTs), adjusting the font size based
>   on resolution is perverse ... people drop resolutions to get bigger
>   images and fonts, and then the fonts get smaller.

I'm sorry, Owen. My take is that people want the highest resolution they can get, balanced with the highest refresh rate. If they want larger fonts, then they WANT to adjust font sizes and should be able to adjust font sizes, not screen resolution. Same goes for icon sizes.

> * Using a reported DPI below around 75dpi gives unreadable fonts even
>   if the display is reporting correctly, because there are just not
>   enough pixels.

That is, if the DEFAULT font size is configured in pixels. If it were set in length (which is how people are normally likely to think, except they were forced to think in pixels because of computers), then it should adjust quite nicely with correct DPI reporting. But as I mentioned on fedora-devel, a couple of factors still have to be configurable, like distance from the screen and screen height. But we'll get there. Many of the arguments made were caused specifically by faulty DPI settings ... either because they were hardcoded or they were reported wrong by the server.

> * We don't have scalable graphics in most places in our desktop; we
>   definitely don't scale graphics with font size. So DPIs out of the,
>   say, 90-120 range are going to upset things.

It's the ideal, though, and there are moves to make it so (SVG, for one). Still, if a hack has to be made to fix this, then whatever hack it is shouldn't affect font settings, which are starting to come out nicely.

Hopefully, the day will come when the user doesn't even have to see the numeric values. Just scrolling the mouse-wheel up and down and seeing the result on the desktop in real time will be the way to go. Then if the hardware gets better and resolution can be upped without sacrificing refresh rate, the fonts and icon sizes will remain just the way the user wants them.
What is the status of this bug? It is marked fixed, and hasn't had any activity in over a year, yet GNOME 2.24.1 (at least in Ubuntu 8.10) still seems to default to 96 DPI regardless of my physical screen DPI. Is this Ubuntu-specific? I would very much like to see this bug resolved, as currently only one of my 4 displays runs at 96 DPI. My laptop is at 147 dpi, and my netbook is 134. I agree with all the arguments made by Nicolas Mailhot above, and hope this will get fixed soon!
I can confirm this. The bug is not fixed in GNOME 2.24.1 (Ubuntu 8.10).