Bug 534246 - screen resolution settings doesn't work
Status: RESOLVED NOTGNOME
Product: gnome-control-center
Classification: Core
Component: Display
Version: 2.22.x
Hardware: Other
OS: All
Priority: Normal
Severity: normal
Target Milestone: ---
Assigned To: Control-Center Maintainers
QA Contact: Control-Center Maintainers
Depends on:
Blocks:
Reported: 2008-05-21 20:12 UTC by rom
Modified: 2008-05-24 12:23 UTC
See Also:
GNOME target: ---
GNOME version: 2.21/2.22


Attachments:
nvidia-settings (107.58 KB, image/png), 2008-05-21 20:12 UTC, rom
gnome screen resolution settings (29.89 KB, image/png), 2008-05-21 20:12 UTC, rom
screenshot of the bug with compiz (228.67 KB, image/png), 2008-05-22 14:05 UTC, rom
screenshot of the bug without compiz (112.90 KB, image/png), 2008-05-22 14:34 UTC, rom

Description rom 2008-05-21 20:12:01 UTC
Please describe the problem:
Hi,

I have an NVidia GeForce 8600GS and a vsync problem, both in videos and in the whole system (when I move a window, I see horizontal tearing lines, which indicates vertical desynchronization). I have the problem both with and without compiz (i.e. also with metacity).

Exploring the problem, I discovered that nvidia-settings and the GNOME screen resolution settings do not show the same screen frequency:
- nvidia-settings shows 60 Hz
- the GNOME screen resolution settings show 50 Hz (with no other option available), and the screen name is shown as "unknown"

Running glxgears with vsync enabled in nvidia-settings shows 60 fps, so I think the real refresh rate is 60 Hz.
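For reference, a rough way to cross-check the effective refresh rate from a terminal, assuming the NVidia driver honors the __GL_SYNC_TO_VBLANK environment variable to force vsync for a single run:

# Force vsync for this run; with vsync active, the FPS reported by
# glxgears should settle near the real refresh rate of the display.
__GL_SYNC_TO_VBLANK=1 glxgears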

I think this could be the source of my vsync problem. Apparently many people have this problem on Ubuntu: several of the reports come from people with an NVidia card, and one person who tested on Mandriva reported having the problem only on Ubuntu.

So my bug report concerns the incorrect refresh rate shown in the GNOME screen resolution settings.

In addition, I would like to configure the screen resolution and frequency entirely in xorg.conf, bypassing the GNOME settings (until this works). Could you help me do that? I tried to save the settings to xorg.conf from nvidia-settings:
Section "Screen"
    Identifier     "Screen0"
    Device         "Videocard0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "TwinView" "0"
    Option         "metamodes" "1680x1050_60 +0+0"
EndSection

But after a reboot, this does not resolve the problem.

Steps to reproduce:
1. Use an NVidia graphics card
2. Check the frequency reported by nvidia-settings
3. Check the frequency reported by the GNOME screen resolution settings
4. Compare the two values (a command-line version of this comparison is sketched below)
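A command-line sketch of the same comparison, assuming the driver exposes a RefreshRate attribute to nvidia-settings (attribute names may vary between driver versions):

# What the NVidia driver itself reports
nvidia-settings -q RefreshRate

# What the X RandR extension (and therefore the GNOME dialog) reports
xrandr -q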


Actual results:
The screen frequency shown in the GNOME settings is wrong (50 Hz instead of 60 Hz)

Expected results:
I expect GNOME to display the correct frequency

Does this happen every time?
yes

Other information:
I will attach 2 screenshots
Comment 1 rom 2008-05-21 20:12:31 UTC
Created attachment 111294 [details]
nvidia-settings
Comment 2 rom 2008-05-21 20:12:51 UTC
Created attachment 111295 [details]
gnome screen resolution settings
Comment 3 rom 2008-05-22 14:05:13 UTC
Adding the line:
Option        "DynamicTwinView" "false"

to the Device section of xorg.conf fixes the list of available frequencies...

But it does not resolve the vsync problem (see the attached screenshot).
So I don't think the problem comes from GNOME...
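For reference, a sketch of what the relevant Device section might look like with this change; the Identifier and Driver values are assumptions based on the Screen section quoted above and may differ on other setups:

Section "Device"
    Identifier     "Videocard0"
    Driver         "nvidia"
    # Assumption: with DynamicTwinView disabled, the driver exposes the
    # real refresh rates through RandR instead of placeholder values.
    Option         "DynamicTwinView" "false"
EndSection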
Comment 4 rom 2008-05-22 14:05:46 UTC
Created attachment 111338 [details]
screenshot of the bug with compiz
Comment 5 rom 2008-05-22 14:34:08 UTC
Created attachment 111342 [details]
screenshot of the bug without compiz
Comment 6 Jens Granseuer 2008-05-24 12:23:11 UTC
The GNOME settings dialog gets the available frequencies from the X RandR extension which, in turn, (I suppose) gets them from the graphics driver. In your case it looks like either the NVidia driver is reporting wrong numbers to X, or X is doing the wrong thing with the numbers it gets. But then again, NVidia's TwinView is proprietary and closed-source, so nobody except NVidia knows what it is doing anyway...
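A quick way to see what RandR itself exposes (and therefore what the GNOME dialog has to work with) is to query it from a terminal; a sketch, assuming the xrandr tool is installed:

# Detailed dump of every output and mode as reported through RandR.
# If the refresh rates are already wrong here, the problem lies below
# GNOME, in the driver or in the X server, not in the settings dialog.
xrandr --verbose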