GNOME Bugzilla – Bug 534246
screen resolution settings don't work
Last modified: 2008-05-24 12:23:11 UTC
Please describe the problem:
Hi, I have an NVidia GeForce 8600GS and a vsync problem: in videos, and even across the whole system (when I move a window, I see horizontal lines that show vertical desynchronisation). The problem occurs with or without compiz (i.e. with metacity too).

Exploring the problem, I discovered that nvidia-settings and the GNOME screen resolution settings do not show the same screen frequency:
- nvidia-settings shows 60Hz
- the GNOME screen resolution settings show 50Hz (with no other option, and the screen name is "unknown")

Running glxgears with vsync enabled in nvidia-settings shows 60 fps, so I think the real frequency is 60Hz. I think this could be the source of my vsync problem (apparently many people have this problem on Ubuntu; several reports come from people with an NVidia card, and one person who tested on Mandriva reported having the problem only on Ubuntu). So my bug report concerns the wrong screen frequency shown in the GNOME screen resolution settings.

In addition, I would like to configure the screen resolution and frequency entirely in xorg.conf, disabling the GNOME settings (until this works). Could you help me do that? I tried to save the settings to xorg.conf from nvidia-settings:

Section "Screen"
    Identifier   "Screen0"
    Device       "Videocard0"
    Monitor      "Monitor0"
    DefaultDepth 24
    Option       "TwinView"  "0"
    Option       "metamodes" "1680x1050_60 +0+0"
EndSection

But after a reboot this does not resolve the problem.

Steps to reproduce:
1. Use an NVidia graphics card
2. Check the frequency in nvidia-settings
3. Check the frequency in the GNOME screen resolution settings
4. Compare

Actual results:
The screen frequency shown in the GNOME settings is wrong

Expected results:
I expect GNOME to display the correct frequency

Does this happen every time?
yes

Other information:
I will attach 2 screenshots
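One way to cross-check what X's RandR extension reports, independently of the GNOME dialog, is to run `xrandr -q` in a terminal and look at the refresh columns. As a sketch, here is a small Python helper that extracts the refresh rates from the old RandR 1.1 table format; the sample output below is hypothetical, modeled on this bug (only a single 50 Hz rate offered for the native mode):

```python
import re

def refresh_rates(xrandr_output):
    """Map each 'WIDTHxHEIGHT' mode in RandR 1.1-style `xrandr -q`
    output to the list of refresh rates (Hz) offered for it."""
    rates = {}
    for line in xrandr_output.splitlines():
        # Rows look like: '*0   1680 x 1050   ( 433mm x 271mm )  *50'
        m = re.match(r"\s*\*?\s*\d+\s+(\d+)\s*x\s*(\d+)\s*\(.*?\)\s*(.*)", line)
        if m:
            w, h, rest = m.groups()
            rates[f"{w}x{h}"] = [float(r) for r in
                                 re.findall(r"\d+(?:\.\d+)?", rest)]
    return rates

# Hypothetical `xrandr -q` output resembling this bug.
sample = """\
 SZ:    Pixels          Physical       Refresh
*0   1680 x 1050   ( 433mm x 271mm )  *50
 1   1280 x 1024   ( 433mm x 271mm )   60  50
"""
print(refresh_rates(sample))
# {'1680x1050': [50.0], '1280x1024': [60.0, 50.0]}
```

If `xrandr -q` only offers 50 Hz for your native mode while nvidia-settings claims 60 Hz, the mismatch is already present at the X/driver level, before GNOME is involved.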
Created attachment 111294 [details] nvidia-settings
Created attachment 111295 [details] gnome screen resolution settings
Adding the line Option "DynamicTwinView" "false" to the Device section of xorg.conf fixes the list of available frequencies... but does not resolve the problem (see the attached screenshot). So I don't think the problem comes from GNOME...
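For reference, the Device section would then look something like the sketch below; "Videocard0" matches the identifier used in the Screen section quoted earlier, and the driver name assumes the proprietary nvidia driver:

```
Section "Device"
    Identifier "Videocard0"
    Driver     "nvidia"
    # With DynamicTwinView disabled, the driver exposes the real
    # mode refresh rates through RandR instead of a single fake rate.
    Option     "DynamicTwinView" "false"
EndSection
```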
Created attachment 111338 [details] screenshot of the bug with compiz
Created attachment 111342 [details] screenshot of the bug without compiz
The GNOME settings dialog gets the available frequencies from the X RandR extension, which in turn (I suppose) gets them from the graphics driver. In your case it looks like either the NVidia driver is reporting wrong numbers to X, or X is doing the wrong thing with the numbers it gets. But then again, NVidia's TwinView is proprietary and closed-source, so nobody except NVidia knows what it's doing anyway...