GNOME Bugzilla – Bug 577876
Can't see the difference between a virtual resolution and a regular one
Last modified: 2009-04-07 20:01:25 UTC
Please describe the problem:
When I set up a dual-monitor configuration, I can't enable it because Display Preferences treats 3040x1050 as a regular resolution, when in fact it's a virtual one and the resolution on each monitor is under the maximum resolution limit.

Steps to reproduce:
Use a card from Intel or ATI with xf86-video-intel, ati, or radeonhd, and try to set up two monitors at high resolutions; in my case, one monitor at 1366x768 and the other at 1680x1050.

Actual results:
It tries to apply the configuration but can't, because it thinks my video card can't handle a resolution over 1920x1600, as it would behave if I set the resolution for a single monitor.

Expected results:
A message similar to this one: "required virtual size does not fit available size: requested=(3040, 1050), minimum=(320, 200), maximum=(1920, 1600)"

Does this happen every time? Yes

Other information:
I've found a workaround: define a custom "Virtual" line in xorg.conf. It's not a very user-friendly solution, though.
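For reference, the workaround looks roughly like this (a sketch only; the section layout and the "Screen0" identifier are placeholders that need to match your existing xorg.conf, and 3040x1050 is the requested size from the error message above):

```
Section "Screen"
    Identifier "Screen0"
    SubSection "Display"
        # Reserve a framebuffer large enough for both
        # monitors placed side by side.
        Virtual 3040 1050
    EndSubSection
EndSection
```

After editing xorg.conf, the X server must be restarted for the new Virtual size to take effect.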
Oh, yes, sorry, that message is not very clear. To use a dual-monitor setup, X needs to allocate a framebuffer that is big enough for both monitors. This is what one configures with the "Virtual" line in xorg.conf, as you found out, and it is what GNOME checks when it shows the "required virtual size..." message.

Unfortunately, the current versions of the X drivers don't let you change the Virtual size at runtime; it needs to be preconfigured in xorg.conf. Ubuntu has a custom patch to tweak xorg.conf from the Display Properties capplet, but we haven't integrated it into mainstream GNOME yet (it's on my to-do list). Expect this to work in the near future; sorry for the trouble.

For now I'll close the bug as NOTABUG, since GNOME is indeed checking for the required virtual size.
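To make the check concrete, here is a minimal sketch (not GNOME's actual code) of the arithmetic behind the "required virtual size does not fit available size" message for a side-by-side layout. The monitor sizes are illustrative: 1360x768 and 1680x1050 are hypothetical panel modes chosen so the total matches the requested=(3040, 1050) in the report.

```python
def required_virtual_size(monitors):
    """Framebuffer needed to place monitors side by side:
    widths add up, height is the tallest monitor."""
    width = sum(w for w, _ in monitors)
    height = max(h for _, h in monitors)
    return width, height

def fits(requested, minimum, maximum):
    """Mimic the bounds check behind the error message."""
    return (minimum[0] <= requested[0] <= maximum[0]
            and minimum[1] <= requested[1] <= maximum[1])

monitors = [(1360, 768), (1680, 1050)]      # hypothetical panel modes
requested = required_virtual_size(monitors)  # (3040, 1050)
# The server's limits from the error message:
print(fits(requested, (320, 200), (1920, 1600)))  # False: 3040 > 1920
```

A single 1680x1050 monitor passes the same check, which is why the limit only bites once the two framebuffers are combined.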