GNOME Bugzilla – Bug 342850
gnome-screensaver doesn't restore gamma value after blanking screen
Last modified: 2008-09-02 20:37:58 UTC
Please describe the problem:
After the screen is recovered, the gamma value is reset to 1 instead of being restored to the value that was in use before the screen was blanked.

Steps to reproduce:
1. Set the gamma correction using "xgamma -gamma <value>"
2. Wait for the screensaver to blank the screen, then wake it up

Actual results:
The gamma correction is set back to 1

Expected results:
The gamma correction should be set to the same value as before blanking the screen

Does this happen every time?
Yes, happens every time

Other information:
Maybe it doesn't reset it to 1 but to whatever is set in the global X config, but you should be able to reproduce it anyway.
We read the gamma value when gnome-screensaver starts and use that as the nominal value. So if you execute your xgamma call before gnome-screensaver starts, this should work as expected.
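For reference, here is a minimal sketch of reading the current gamma through the XF86VidMode extension, which is the code path gs-fade.c builds against when HAVE_XF86VMODE_GAMMA is defined; the exact calls inside gnome-screensaver may differ.

/* Minimal sketch: read the gamma currently in effect via XF86VidMode.
 * Build: cc gamma-read.c -o gamma-read -lX11 -lXxf86vm */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

int
main (void)
{
        Display          *dpy = XOpenDisplay (NULL);
        XF86VidModeGamma  gamma;

        if (dpy == NULL)
                return 1;

        if (XF86VidModeGetGamma (dpy, DefaultScreen (dpy), &gamma))
                printf ("R=%.2f G=%.2f B=%.2f\n",
                        gamma.red, gamma.green, gamma.blue);

        XCloseDisplay (dpy);
        return 0;
}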
It is as you wrote. If I kill gnome-screensaver, set the gamma, and then relaunch it, it keeps the gamma. However, I still think it would be better to always use the value that is in effect just before the screensaver is activated, not the value it was set to when the process was started.
I can confirm this problem. There is no gamma set in my xorg.conf (I'm using Ubuntu), and xgamma shows 1.0. If I set it to e.g. 1.6, it is reset when the screensaver blanks, but the xgamma setting remains at 1.6 even though the display looks like 1.0. If I reissue the command "xgamma -gamma 1.6", it takes effect again, even though the stored value hasn't changed.

The same is true for the nVidia X config: if I set it to 1.6, the screensaver seems to reset it to 1, but simply opening the nVidia panel (and not touching anything) sets the gamma right again, since the stored value never changed. This didn't happen with the far superior xscreensaver as far as I know.

This is a definite bug. Expected behavior: gnome-screensaver observes the current gamma, not what was set in xorg.conf when the system started. It shouldn't change the gamma at all.
Yes, it is a bug. It is a matter of when the baseline gamma value is determined. Want to try to fix it?
*** Bug 469516 has been marked as a duplicate of this bug. ***
As I said in bug 469516, this isn't just a matter of saving the gamma value, but also the gamma LUTs.
Created attachment 96700 [details] [review]
Obtain gamma info before each fade, release it when the fade is reset

In this patch, gamma_info_init is now called in gs_fade_start, and gamma_info_free is called at the end of gs_fade_reset. There are some other minor changes to make this work. It seems to work really well.

The biggest concern I have is that since the gamma info is now changed more often, it may be more susceptible to problems if it is accessed by more than one thread at a time. Nothing else in gs-fade.c has been made thread-safe, so it must not be necessary, but I can't know for sure.
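To illustrate the idea (not the patch itself), here is a self-contained sketch of the capture-before-fade / restore-on-reset pattern: snapshot whatever gamma is in effect immediately before fading, and put the snapshot back when the fade is reset. In gs-fade.c this is wrapped in gamma_info_init and gamma_info_free; the fade loop is reduced to a single step here.

/* Sketch of the pattern the patch introduces.  The real code lives in
 * gs-fade.c and takes a GSFade object; this is a standalone stand-in.
 * Build: cc fade-sketch.c -o fade-sketch -lX11 -lXxf86vm */
#include <unistd.h>
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

int
main (void)
{
        Display          *dpy = XOpenDisplay (NULL);
        int               screen;
        XF86VidModeGamma  saved, faded;

        if (dpy == NULL)
                return 1;
        screen = DefaultScreen (dpy);

        /* "gamma_info_init": capture the gamma in effect *now*,
         * immediately before fading, not at process startup */
        if (!XF86VidModeGetGamma (dpy, screen, &saved))
                return 1;

        /* fade out: a real fade steps this down gradually */
        faded.red   = saved.red   * 0.5f;
        faded.green = saved.green * 0.5f;
        faded.blue  = saved.blue  * 0.5f;
        XF86VidModeSetGamma (dpy, screen, &faded);
        XFlush (dpy);
        sleep (2);

        /* "gs_fade_reset": restore the snapshot, then discard it so the
         * next fade takes a fresh one */
        XF86VidModeSetGamma (dpy, screen, &saved);
        XFlush (dpy);

        XCloseDisplay (dpy);
        return 0;
}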
Everyone who has a calibrated screen would appreciate it if you would save the LUT as well as the RGB gamma values.
(In reply to comment #8)
> Everyone who has a calibrated screen would appreciate it if you would save
> the LUT as well as the RGB gamma values.

gamma_info_init already saves them, and gs_fade_reset restores them, provided that the VidModeExtension is version 2.1 or greater. Testing on my system (Ubuntu Gutsy beta) shows that the gamma ramps are being used instead of the RGB values.
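A sketch of the ramp path described above: query the extension version, and if it is 2.1 or later, save and restore the full per-channel lookup tables rather than the three gamma floats. The Xlib calls are real; the surrounding structure is illustrative, and error handling is minimal.

/* Save and restore the gamma ramps (LUTs) when XF86VidMode >= 2.1.
 * Build: cc ramp-sketch.c -o ramp-sketch -lX11 -lXxf86vm */
#include <stdlib.h>
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

int
main (void)
{
        Display        *dpy = XOpenDisplay (NULL);
        int             screen, major, minor, size;
        unsigned short *r, *g, *b;

        if (dpy == NULL)
                return 1;
        screen = DefaultScreen (dpy);

        if (!XF86VidModeQueryVersion (dpy, &major, &minor))
                return 1;
        /* Ramps need VidMode >= 2.1; otherwise fall back to RGB gamma */
        if (major < 2 || (major == 2 && minor < 1))
                return 1;

        XF86VidModeGetGammaRampSize (dpy, screen, &size);
        r = malloc (size * sizeof (unsigned short));
        g = malloc (size * sizeof (unsigned short));
        b = malloc (size * sizeof (unsigned short));
        if (r == NULL || g == NULL || b == NULL)
                return 1;

        /* Save the full per-channel lookup tables ... */
        XF86VidModeGetGammaRamp (dpy, screen, size, r, g, b);

        /* ... and later restore them verbatim on fade reset */
        XF86VidModeSetGammaRamp (dpy, screen, size, r, g, b);

        free (r); free (g); free (b);
        XCloseDisplay (dpy);
        return 0;
}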
Ah, great, thanks. I was going on anecdotal evidence from GNOME 2.18, where the LUT was set after gnome-screensaver started.
See also https://bugs.launchpad.net/gnome-screensaver/+bug/33214

It would be nice (though probably out of scope for this bug) if it were possible to fully disable the gamma trickery without recompiling. That would probably mean introducing a preference that could be set with gconf-editor. (However, the above patch will probably fix this for me; I haven't tried it yet.)
I just tried using the nvidia control panel (nvidia-settings) to adjust the brightness, contrast, and gamma settings. These were preserved by gnome-screensaver with the patch. I'd try this with ATI but I don't have a card available at the moment.
I am using a 24" screen calibrated with xcalib (which uses the XVidMode extension; mine is version 2.2) and noticed this annoying issue. The attached patch fixes it and my calibration stays the same.
The attached patch fixed the enemy-territory brightness problem under Gentoo with gnome-screensaver-2.18.2. Thanks for fixing this!
Created attachment 98485 [details] [review]
Fix compile error in last patch

I'm glad I could help. There was a compile error in the last patch if HAVE_XF86VMODE_GAMMA wasn't defined, but the new patch takes care of that. I still can't determine whether this has to be made thread-safe, but if someone more knowledgeable than me decides that it does, I can fix it easily enough.
It also works with gnome-screensaver-2.20 :-)
I can confirm that the latest patch fixes the issue. I wrote the exact same patch independently and only discovered this bug just before submitting a new bug with my patch :)
Any news on whether the proposed patch will be applied to trunk?
It seems that the current patch still doesn't retrieve the gamma value from the current setting, but uses the one that was in effect when gnome-screensaver was launched, so it resets the gamma if it was changed dynamically in the running server. At least it's much better than the current status. (Thanks to leio for explaining this to me in http://bugs.gentoo.org/show_bug.cgi?id=201019#c3)
That is incorrect. Notice the new calls to gamma_info_init and gamma_info_free: they ensure that the gamma info is obtained immediately before each fade and freed once the fade is done. What leio describes is the behavior of gnome-screensaver without this patch. Since I wouldn't have submitted a patch that does nothing, maybe something is making it ineffective in Gentoo, or maybe something has changed in gnome-screensaver; I haven't tested this in months.

The only real concern I still have is thread safety, since I don't know enough about the underlying system to tell whether more than one thread will access gamma_info simultaneously. Does anyone think I should protect gamma_info with a mutex?
Yes, the patch didn't get applied because I failed to inherit our patching functions, so applying the patch failed. This has since been fixed, as pacho pointed out to me, and it works as you describe it should, John. It's working much better than before; I'd say this is well worth applying upstream after a review of the concerns you cite regarding thread safety and such.
Okay, I've learned a few things that make me think this patch is ready. Even though two different classes try to start and stop the fading (GSMonitor and GSManager both call gs_fade_async and gs_fade_reset), these calls happen in the same thread, at least on my system. Also, the fade-out uses g_timeout_add, which causes fade_out_timer to be called repeatedly, but always in the same thread. Since it's all in the same thread, everything should be safe.
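A small GLib example of that last point: a g_timeout_add source is dispatched by whichever thread runs the main loop, so repeated calls to a callback like fade_out_timer all happen in that one thread and need no locking. This is illustrative only; fade_out_timer itself lives in gs-fade.c.

/* Shows that g_timeout_add callbacks run in the main-loop thread.
 * Build: cc loop.c `pkg-config --cflags --libs glib-2.0` */
#include <glib.h>

static gboolean
fade_step (gpointer data)
{
        int *remaining = data;

        /* always prints the same thread id: the main-loop thread */
        g_print ("fade step %d in thread %p\n",
                 *remaining, (void *) g_thread_self ());

        (*remaining)--;
        return *remaining > 0;  /* TRUE keeps the timer firing */
}

static gboolean
quit_cb (gpointer data)
{
        g_main_loop_quit (data);
        return FALSE;           /* one-shot */
}

int
main (void)
{
        GMainLoop *loop  = g_main_loop_new (NULL, FALSE);
        int        steps = 5;

        g_timeout_add (100, fade_step, &steps);  /* like the fade timer */
        g_timeout_add (1000, quit_cb, loop);

        g_main_loop_run (loop);
        g_main_loop_unref (loop);
        return 0;
}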
Please fix this at least for gnome-screensaver-2.24. Thanks
Committed to trunk. Thanks! Sorry for the long delay.
*** Bug 527029 has been marked as a duplicate of this bug. ***
*** Bug 549895 has been marked as a duplicate of this bug. ***