GNOME Bugzilla – Bug 595839
cannot create multihead setup w/ intel card and openGL window manager
Last modified: 2013-06-13 10:03:42 UTC
This is something that has been annoying me for quite a while now. I'm not sure whose fault it is: X, the open-source Intel drivers, RandR, gnome-settings-daemon, or gnome-display-properties. If I run either Compiz or Mutter instead of Metacity, trying to use GNOME's display properties applet to create an extended desktop (or simply hitting the "autodetect monitors" button) results in a borked-up screen that doesn't show anything. It works fine with plain Metacity. I don't quite know where or what to look for; I searched GNOME's Bugzilla and Launchpad but couldn't find this problem reported. The reason I am reporting this as "severity: major" is that GNOME Shell will use a compositing window manager (Mutter), so the multihead feature will be broken.
Bump, could I at least have an indication on what is the faulty component? Surely there are some gnome-shell hackers using Intel cards and plugging into projectors once in a while?
Possibly, but I wouldn't expect them to read this here... Since it works with Metacity, g-s-d is an unlikely culprit too.
Reassigning to mutter, then. Not sure it is the culprit, but probably this will get this bug moving forward a bit.
Unfortunately I don't have links to the appropriate bugs/comments, but I think I can contribute from memory. What's going on is a hardware limit in the Intel i945 video chip: a maximum texture size of 2048x2048, or something close to that. Attempting to exceed that limit crashes Xorg. The current version of the gnome-display-properties applet automatically detects a newly plugged-in external screen and attempts to configure it as a horizontally extended desktop. Since the total width of the two screens is generally going to exceed 2048 pixels, this results in an Xorg crash whenever a compositing-heavy window manager (e.g. Compiz) is running; it doesn't cause a crash with Metacity.

Apparently there is no easy way to fix the Intel video driver, but there are several possible workarounds via gnome-display-properties which could reduce the severity of this issue. Some of these workarounds would enhance its functionality at the same time:

* Keep a database of the external screens which have been connected, and their settings (resolution, refresh rate, offset, and primary vs. secondary display). If a newly connected screen is found in that database, use those stored settings instead of the defaults. Then if the user configures an external display successfully once (under Metacity, for example) he will not see these crashes any more, and every time he connects that display it'll come back the way he likes it, saving time and hassle.

* Check the chipset and don't attempt a horizontally spanned desktop if the chipset can't handle it. Optimally, check whether compositing is in use and the sizes of the connected displays, and allow horizontal spanning if it's safe; if compositing is in use, warn the user and suggest that he can get horizontal spanning by turning off "desktop effects".

* Change the default behavior to vertical spanning instead of horizontal. Since the majority of laptop displays are 800 pixels high and the majority of external LCD panels are 1024 pixels high, this will usually work.
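The "check before spanning" workaround above could be sketched roughly like this: ask the X server for its maximum framebuffer size (xrandr reports it on its header line) and only extend the desktop sideways if the combined width fits. This is just a sketch under assumptions; the output names (LVDS1, VGA1) and the resolutions are hypothetical examples, not what the tool actually does:

```shell
#!/bin/sh
# Sketch: refuse a side-by-side layout that exceeds the server's limit.
# The xrandr header line looks like:
#   Screen 0: minimum 320 x 200, current 1024 x 600, maximum 2048 x 2048
max_w=$(xrandr 2>/dev/null | sed -n 's/.*maximum \([0-9]*\) x [0-9]*.*/\1/p')
max_w=${max_w:-2048}   # assume the i945-style limit if xrandr is unavailable

internal_w=1024        # hypothetical netbook panel width
external_w=1280        # hypothetical projector width
total_w=$((internal_w + external_w))

if [ "$total_w" -le "$max_w" ]; then
    echo "side-by-side fits ($total_w <= $max_w): extending"
    # xrandr --output LVDS1 --auto --output VGA1 --auto --right-of LVDS1
else
    echo "side-by-side needs $total_w but the limit is $max_w: cloning instead"
    # xrandr --output VGA1 --auto --same-as LVDS1
fi
```

The same check could gate the autodetect path in gnome-display-properties, falling back to clone mode (which never widens the framebuffer) instead of crashing the server.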
My primary concern is GNOME 3.0, specifically GNOME Shell. I tried again today with a git checkout, and the shell completely blows up if you plug an external screen into my netbook and run gnome-display-properties. The shell itself disappears, leaving only the wallpaper and no panel. As a user, you have no way to fix this except unplugging the monitor and killing Xorg. If the hardware really can't handle a combined resolution above a certain size, then gnome-display-properties' monitor autodetection should not (stupidly) always try to force all monitors on at full resolution; let the user configure the resolutions first, and check whether the user confirms that "it works" within X seconds of applying them. What makes this so infuriating is that there is currently no way to avoid the crash, because the autodetection feature blows things up before you have a chance to do anything. This is a showstopper for anyone who will be doing talks or presentations with Intel hardware (or at least the common i915 stuff).
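For what it's worth, when the screen goes blank it may be possible to recover without killing Xorg by switching to a text VT (e.g. Ctrl+Alt+F2) and turning the external output back off. This is only a hedged sketch: "VGA1" and DISPLAY=:0 are assumptions, and `xrandr -q` would show the real output names on a given machine:

```shell
#!/bin/sh
# Sketch of a VT-side recovery: disable the output that was just enabled.
# Printed rather than executed here, since the output name is a guess.
cmd="xrandr --output VGA1 --off"
echo "would run: DISPLAY=:0 $cmd"
# To actually run it from the VT:
# DISPLAY=:0 $cmd
```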
I'd love to provide relevant info... if someone can tell me what kind of log files or hardware info to get.
Thanks for the bug report. This particular bug has already been reported into our bug tracking system, but please feel free to report any further bugs you find. *** This bug has been marked as a duplicate of bug 646280 ***