Bug 517344 - Add BLUR filter on RGBA windows
Status: RESOLVED WONTFIX
Product: metacity
Classification: Other
Component: Iain's compositor
Version: 2.21.x
Hardware: Other
OS: All
Importance: Normal enhancement
Target Milestone: ---
Assigned To: Metacity compositor maintainers
QA Contact: Metacity compositor maintainers
Depends on:
Blocks:
 
 
Reported: 2008-02-18 23:44 UTC by Andrea Cimitan
Modified: 2015-12-02 18:06 UTC
See Also:
GNOME target: ---
GNOME version: 2.23/2.24
Description Andrea Cimitan 2008-02-18 23:44:19 UTC
I know this could be very hard to implement (XRender != OpenGL :) ), but, as Iain already said, it's one of the most important features for improving usability with transparency.

I've found this code from Mirco Müller (MacSlow); maybe it could be useful for drawing blur with XRender.
http://macslow.thepimp.net/projects/xrender-gradients.tar.bz2
Comment 1 Andrea Cimitan 2008-03-28 16:02:38 UTC
Just to help: Carl Worth (who is also working on adding blur filters to Cairo) told me that there is an API directly in XRender for adding blur filters to transparency.

Maybe it's simpler than it seems.
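For context, the XRender API being referred to is most likely the generic convolution filter that can be set on a source Picture with XRenderSetPictureFilter. A minimal C sketch of building the parameter array that filter expects (the fixed-point helper mirrors Xrender's XDoubleToFixed so the snippet stands alone without the X headers; the kernel values used below are just an example):

```c
/* 16.16 fixed point, matching Xrender's XFixed / XDoubleToFixed. */
typedef int xfixed_t;

static xfixed_t double_to_fixed(double d)
{
    return (xfixed_t)(d * 65536.0);
}

/* Fill `params` (size w*h + 2) in the layout the XRender "convolution"
 * filter expects: params[0] = kernel width, params[1] = kernel height,
 * followed by the w*h weights, normalized so they sum to 1. */
static void build_convolution_params(const double *kernel, int w, int h,
                                     xfixed_t *params)
{
    double sum = 0.0;
    for (int i = 0; i < w * h; i++)
        sum += kernel[i];

    params[0] = double_to_fixed((double)w);
    params[1] = double_to_fixed((double)h);
    for (int i = 0; i < w * h; i++)
        params[i + 2] = double_to_fixed(kernel[i] / sum);
}
```

With a real display connection, the array would then be passed as `XRenderSetPictureFilter(dpy, picture, FilterConvolution, params, w * h + 2)` before compositing, which is the operation the rest of this thread measures as slow.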
Comment 2 iain 2008-03-29 11:49:14 UTC
XRender's blur is incredibly slow,
and very complicated, and not very good.
Someone wrote a better system for doing effects in XRender about 4 years ago, but keithp rejected the whole idea and the author didn't respond to my emails about it.
Comment 3 Andrea Cimitan 2008-03-29 13:41:25 UTC
I raised the same objection (slowness) with Carl, and he told me I was wrong.
Maybe you should talk to him directly.

Are there other ways?
What is the status of the KDE4 KWin XRender engine?
Comment 4 iain 2008-03-29 16:14:15 UTC
Well, I discussed it with a top X.org hacker, and it went something like this:

me> What are the chances that XRender's gaussian blur works and is fast?
top xorg hacker> slim and none.

MacSlow's demo is incredibly slow on my system. I don't see any other way than XRender.
Comment 5 iain 2008-03-29 16:15:15 UTC
I mean, feel free to try, or maybe I will, but it's not something I rate as high priority, and it seems like a lot of work for little payoff...
Comment 6 Andrea Cimitan 2008-03-29 18:28:20 UTC
It allows the best visual impact with transparency, so I wouldn't call it "minor". It also greatly improves usability.
Here is a screenshot of blur effects in my RGBA GTK engine:
http://www.cimitan.com/blog/wp-content/rgba-murrine-170208.png

Yes, speed is a big issue; we should try it and see what happens. There may also be other filters similar to Gaussian, or even faster algorithms.
Maybe Google it? I found an algorithm from some KDE folks, IIRC... Also, which code does KWin 4 use for its blur effect (if it is implemented)?
I've seen that the exposé/miniature effect is implemented and works fine.
Comment 7 iain 2008-03-29 19:27:08 UTC
I understand the theoretical usability advantages, but if it kills the system then that's not really a very good usability feature.

We can only use whatever blur algorithms the XRender drivers give us, which gives us a very slow and bad gaussian blur algorithm. As far as I know KWin doesn't do blur behind windows, except possibly in their OpenGL backend. Expose/miniature effects have nothing to do with this.

Sorry to be negative, but I have investigated this a bit and the future does not look very promising at the moment.
Comment 8 Andrea Cimitan 2008-03-29 19:37:18 UTC
Wouldn't it be possible to use a GL backend for the blur effect? Or does that conflict with XRender?

Yes, XRender shadowing works and is quite stable, but from my point of view there isn't much of a future in it. We have shadows, but just shadows, maybe some slow fading effects, or a slow exposé-like effect.

I don't want to go off-topic, but Compiz has shown that OpenGL is the right choice (and also the fastest).
Comment 9 iain 2008-03-29 19:50:20 UTC
Yes, it is either OpenGL or XRender.
And yes, OpenGL is the future.
But unless you're going to buy everyone a video card that can support it,
XRender is here to stay.
Comment 10 Andrea Cimitan 2008-03-29 20:18:18 UTC
To finish the off-topic discussion:
I don't agree with this; at least, it seems wasteful to spend weeks and months of coding on a compositing backend of the past. To be clear, I'm not saying we should remove the XRender implementation, but that we shouldn't invest too much in it, since we are losing coding time :)

So, we have nice shadows and transparency: the essentials. Now let's move on and look to the future.

For three reasons:
1) Most cards (or at least the great majority of them) support AIGLX: all ATI cards from the Radeon 7500 up, maybe all NVIDIA cards, and Intel works. New drivers are coming (Gallium3D), and more support may come for other, older cards too. Nowadays nearly every computer you can buy supports Compiz (they have new cards). So we are developing for an abandoned compositing framework (by 2009, maybe everyone will have an AIGLX-capable video card).
2) Users with a video card that works with OpenGL use Compiz (so the more time passes, the fewer users we will have; I'm expecting fewer Metacity users in 2009).
3) Developing blur, exposé, and alt-tab switching effects is pointless, because users who can't run Compiz (of course) have old video cards that can't support those effects for performance reasons. (So we are developing new features for what, newer cards that support OpenGL? Sounds like a waste of time.)

So why not leave XRender in the code as-is (quite good and stable, probably the best XRender-based compositing WM I've seen) and start a new OpenGL backend, enabled when the video card supports it (which most of them do)?

And leave XRender as a fallback.

Otherwise it sounds like wasted time.
Comment 11 Andrea Cimitan 2008-03-30 00:37:38 UTC
What about that algorithm?
http://zrusin.blogspot.com/2006/07/more-blurring.html
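For reference, the linked post is about fast blur approximations that avoid a full Gaussian convolution. One widely used trick in that family (a sketch of the general idea, not the exact algorithm from the post) is the iterated box blur: averaging each pixel with its neighbors over a fixed radius, and repeating the pass about three times, closely approximates a Gaussian. A minimal single-row pass in C:

```c
/* One box-blur pass over a single row of 8-bit samples:
 * out[i] = mean of in[i-r .. i+r], skipping samples past the edges.
 * Applying this horizontally then vertically, and repeating ~3 times,
 * approximates a Gaussian blur. (This naive version is O(n*r) per row;
 * a sliding-window sum makes each pass O(n).) */
static void box_blur_row(const unsigned char *in, unsigned char *out,
                         int n, int r)
{
    for (int i = 0; i < n; i++) {
        int sum = 0, count = 0;
        for (int j = i - r; j <= i + r; j++) {
            if (j < 0 || j >= n)
                continue;
            sum += in[j];
            count++;
        }
        out[i] = (unsigned char)(sum / count);
    }
}
```

As comment 12 below notes, though, a fast CPU-side algorithm like this only helps if it can be run on the image data directly; it does not speed up the filter built into the XRender server.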
Comment 12 iain 2008-03-30 01:28:58 UTC
I'm aware of that algorithm, but it's not part of XRender.
The only thing that can work on XRender images (without a major slowdown) is XRender itself. It doesn't matter if you have the best algorithm in the world; it's useless to us unless it is part of XRender.

Feel free to work on an OpenGL backend if you want. I (and many others) would suggest that you use Clutter (http://www.clutter-project.org) for this.

I may do it sometime,
but it's not a high priority for me at the moment.
Comment 13 Andrea Cimitan 2008-03-30 01:38:08 UTC
No, I can't do it; I'm already working on a lot of other projects :)

Someday a Metacity hacker will do it and we will be happy again :)

I just said that, from my point of view, since you've reached great stability (I tested KWin 4 five minutes ago and it sucks), it's a waste of time to keep adding extra features through XRender, and it's better to start development of the OpenGL backend.

I also hope Clutter will be added as an official GTK canvas; that would be good motivation to start a "Cluttercity" :)
Comment 14 Carl Worth 2008-03-31 14:22:40 UTC
"We can't use XRender blur because it is slow" is a statement that's really just setting up a chicken and egg problem.

If XRender drivers don't have fast blur implementations yet, that's definitely not helped by the fact that applications aren't using them. So, if you would like this effect in your application, I do recommend you code it up and then put the ball directly into the Xorg hackers' court---"Here, this effect would be really cool, but it's ugly and slow---please fix it."

At that point, (and admittedly, before that, but it's often hard for the X hackers to get excited about features that aren't getting used), two things could happen:

1. X drivers could implement hardware-accelerated blur

2. The X server software could receive some attention with a more efficient algorithm, (which obviously could use any software algorithm you've found).

And of course, anybody interested could help out with either of these things.

Happy hacking,

-Carl

PS. Discussion of X server features will definitely get more attention on the xorg list rather than in some non-X bugzilla entry. I only found this because Cimi pointed me directly to it.
Comment 15 iain 2008-03-31 15:00:13 UTC
Carl, I totally understand your chicken-and-egg problem, but if I spend time on this and it's completely unusable, then I'll get a zillion bug reports telling me that, a million requests for an option to turn it off, and 10 people calling me an idiot.

And the last time I tried the XRender blur stuff, it wasn't just slightly slow; it brought my computer to its knees trying to render one window with some blurring in it.

I'm not really discussing X server features here, I'm just commenting on them in relation to this bug.
Comment 16 Gilles Dartiguelongue 2008-05-05 22:23:18 UTC
In reply to comment #10:

No, users (especially Linux users) are not all equipped with the latest and greatest chipset out there. Users with old shitty i8xx, Radeon < R300, and pre-GeForce NVIDIA chips are not so uncommon, and also think of all the users with SiS or S3 cards whose 2D performance is so poor that 3D isn't even thinkable.

Hardware performance aside, some people (like me) _cannot_ run Compiz on their setup due to limitations in the X stack. As soon as you get past the 2048x2048 limit, you're doomed to go without the Compiz bling, while XRender _works_.

IMHO, even if XRender is not the future, it's still _nice_ for all the users who don't get the chance to upgrade their computer on a whim. Just my 2 cents.
Comment 17 Emmanuele Bassi (:ebassi) 2015-12-02 18:06:17 UTC
I guess this can be closed. Blurring with XRender/Cairo is still not recommended, even if convolution filters have gotten slightly better in the past 8 years; and in general, blurring is not a great thing to use.