GNOME Bugzilla – Bug 768976
Draw monitor content to individual framebuffer
Last modified: 2017-03-16 02:35:06 UTC
This is a placeholder bug for commits related to multi-DPI rendering and other work that relies on drawing monitor content into per-monitor framebuffers.
For reference, the following commits have landed: https://git.gnome.org/browse/mutter/log/?qt=range&q=b281f9566dfeda758eef99478876f1977c31c463..e891a8b6282eb8617422d4c351d731c47edf373f
Created attachment 332083 [details] [review] x11/nested: Only paint monitor stage views when enabled Only paint the per-monitor stage views when enabled; otherwise bad things happen.
Created attachment 332084 [details] [review] ClutterStageView: Initialize viewport/projection as dirty Initially the viewport and projection are not calculated and should thus be marked as dirty.
Created attachment 332085 [details] [review] MetaRendererX11: Allocate offscreen framebuffers up front Allocate the offscreen stage view framebuffers up front; otherwise they may get allocated after the viewport calculated by the stage is set, which would cause the viewport to be incorrect until recalculated.
Created attachment 332086 [details] [review] MetaRendererView: Fix GObject parent Set ClutterStageViewCogl as parent of MetaRendererView, since that is the actual parent.
Comment on attachment 332085 [details] [review] MetaRendererX11: Allocate offscreen framebuffers up front Makes sense
Comment on attachment 332086 [details] [review] MetaRendererView: Fix GObject parent Oops :)
Comment on attachment 332084 [details] [review] ClutterStageView: Initialize viewport/projection as dirty Why not use TRUE? Makes complete sense to me, though.
Comment on attachment 332083 [details] [review] x11/nested: Only paint monitor stage views when enabled Looks good IMO
(In reply to Carlos Garnacho from comment #8)
> Comment on attachment 332084 [details] [review] [review]
> ClutterStageView: Initialize viewport/projection as dirty
>
> Why not use TRUE? Makes complete sense to me, though.

Because they are guints, not gbooleans.
Comment on attachment 332084 [details] [review] ClutterStageView: Initialize viewport/projection as dirty But TRUE has enough 1-ness to set it :). You would clearly want a !! check if this were assigned from a wider variable (which, by the way, I see missing in clutter_stage_view_set_dirty*()), because the bit(s) that make it != 0 might not be where you expect. Hardcoded as it is here, though, I think TRUE improves readability. "1" sounds like "counting something" to me, whereas we actually treat these variables as packed booleans. Anyway, I won't bikeshed about this; the patch looks essentially right.
(In reply to Carlos Garnacho from comment #11)
> Comment on attachment 332084 [details] [review] [review]
> ClutterStageView: Initialize viewport/projection as dirty
>
> But TRUE has enough 1-ness to set it :). You would clearly want a !! check
> if this were assigned from a wider variable (which, by the way, I see
> missing in clutter_stage_view_set_dirty*()), because the bit(s) that make
> it != 0 might not be where you expect. Hardcoded as it is here, though, I
> think TRUE improves readability. "1" sounds like "counting something" to
> me, whereas we actually treat these variables as packed booleans.
>
> Anyway, I won't bikeshed about this; the patch looks essentially right.

Changed to TRUE.

Attachment 332083 [details] pushed as cc4a65f - x11/nested: Only paint monitor stage views when enabled
Attachment 332084 [details] pushed as adcd0fe - ClutterStageView: Initialize viewport/projection as dirty
Attachment 332085 [details] pushed as 9b4e869 - MetaRendererX11: Allocate offscreen framebuffers up front
Attachment 332086 [details] pushed as 6894563 - MetaRendererView: Fix GObject parent
We draw monitor content to individual framebuffers now, so closing this bug.
Sorry to post here, but I searched the whole internet for a solution and failed. I see this is closed, so how do I now set a different DPI for different monitors as a user? I have a 12" 2160x1440 monitor and a 24" 1920x1200 monitor. Using 1 or 2 in GNOME Tweak Tool makes things on one monitor huge or things on the other too small. Thanks
Determination of the monitor scales is currently done automatically, and your 12" monitor probably doesn't reach the threshold for getting scale = 2. There is no way to manually configure individual monitor scales yet.
A few more questions, as the internet is still empty ;-)

* Can you confirm GNOME 3.23.91 has these changes?
* IIUC, the effect of this is the same as changing scaling-factor (but per monitor). What about changing text-scaling-factor per monitor? Is it related? Is it planned? (This would be especially useful for DPI around 144.)
* Is the threshold used by the new implementation still taken from HIDPI_LIMIT in meta-monitor-manager-kms.c, as per https://git.gnome.org/browse/mutter/commit/?id=78d85256994edc99ed1880ea209586994de52b61 ? Otherwise, can you point to the code? Any chance this threshold will be taken from GSettings one day? Or is some kind of manual configuration on the roadmap?

Thanks
(In reply to edwardoo from comment #16)
> A few more questions, as the internet is still empty ;-)
>
> * Can you confirm GNOME 3.23.91 has these changes?

3.22 already draws on individual per-monitor framebuffers. The patches allowing scaling those framebuffers have not landed yet.

> * IIUC, the effect of this is the same as changing scaling-factor (but
> per monitor). What about changing text-scaling-factor per monitor? Is it
> related? Is it planned? (This would be especially useful for DPI around 144.)

Text scaling has nothing to do with mutter. For "kind of high" DPI, fractional scaling might help, but it has its own side effects.

> * Is the threshold used by the new implementation still taken from
> HIDPI_LIMIT in meta-monitor-manager-kms.c, as per
> https://git.gnome.org/browse/mutter/commit/?id=78d85256994edc99ed1880ea209586994de52b61 ?
> Otherwise, can you point to the code? Any chance this threshold will be
> taken from GSettings one day? Or is some kind of manual configuration on
> the roadmap?

The new API will allow setting the scale manually. The same HIDPI_LIMIT is used for the automatically calculated scale.