GNOME Bugzilla – Bug 517344
Add BLUR filter on RGBA windows
Last modified: 2015-12-02 18:06:17 UTC
I know this could be very hard to implement (XRender != OpenGL :) ) but, as Iain already said, it's one of the features that could most improve the usability of transparency. I've found this code from Mirco Müller; maybe it could be useful for drawing blur with XRender. http://macslow.thepimp.net/projects/xrender-gradients.tar.bz2
Just to help: Carl Worth (who is also working on adding blur filters to cairo) told me that there's some kind of API directly in XRender for applying blur filters to the transparency. Maybe it's simpler than it seems.
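For reference, the API Carl mentions is presumably XRender's convolution filter: a client can attach an arbitrary kernel to a Picture with XRenderSetPictureFilter() and FilterConvolution, and subsequent composites sample the source through that kernel. A minimal sketch using a 3x3 Gaussian kernel follows; whether the server implements this in hardware, or quickly at all, is exactly the performance question raised below.

#include <X11/Xlib.h>
#include <X11/extensions/Xrender.h>

/* Attach a 3x3 Gaussian convolution filter to a Picture.  The params
 * array is the kernel width and height followed by the kernel values,
 * all encoded as XFixed. */
static void
set_gaussian_filter (Display *dpy, Picture pict)
{
    static const double kernel[9] = {
        1/16.0, 2/16.0, 1/16.0,
        2/16.0, 4/16.0, 2/16.0,
        1/16.0, 2/16.0, 1/16.0
    };
    XFixed params[2 + 9];
    int i;

    params[0] = XDoubleToFixed (3);   /* kernel width  */
    params[1] = XDoubleToFixed (3);   /* kernel height */
    for (i = 0; i < 9; i++)
        params[i + 2] = XDoubleToFixed (kernel[i]);

    /* Every later XRenderComposite() that uses pict as a source is
     * run through this kernel by the server. */
    XRenderSetPictureFilter (dpy, pict, FilterConvolution, params, 2 + 9);
}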
XRender's blur is incredibly slow, very complicated, and not very good. There was someone who had written a better system for doing effects in XRender about 4 years ago, but keithp rejected the whole idea, and the guy didn't respond to my emails about it.
I raised the same objection (slowness) with Carl, and he told me I was wrong. Maybe you should talk directly with him. Are there other ways? What is the status of the KDE4 KWin XRender engine?
Well, I discussed it with a top Xorg hacker, and it went something like this:

me> What are the chances that XRender's gaussian blur works and is fast?
top xorg hacker> Slim and none.

MacSlow's demo is incredibly slow on my system. I don't see any other way than XRender.
I mean, feel free to try, or maybe I will, but it's not something I rate as a high priority, and it seems like a lot of work for little payoff...
It allows the best visual impact with transparency, so I wouldn't call it "minor"; it also greatly improves usability. Here is a screenshot with blur effects from my RGBA GTK engine: http://www.cimitan.com/blog/wp-content/rgba-murrine-170208.png Yes, speed is a big issue; we should try it and see what happens. There may also be other filters similar to Gaussian, or even faster algorithms. Maybe Google can help? I have found an algorithm from some KDE guys, IIRC... Also, which code does KWin4 use for its blur effect (if it is implemented)? I've seen that the exposé/miniature effect is implemented and works fine.
I understand the theoretical usability advantages, but if it kills the system then that's not really a very good usability feature. We can only use whatever blur algorithms the XRender drivers give us, which means a very slow and bad gaussian blur. As far as I know, KWin doesn't do blur behind windows, except possibly in its OpenGL backend. Exposé/miniature effects have nothing to do with this. Sorry to be negative, but I have investigated this a bit, and the future does not look very promising at the moment.
Wouldn't it be possible to use a GL backend for the blur effect, or does that conflict with XRender? Yes, XRender shadowing works and is quite stable, but from my point of view there isn't much of a future behind it. We have shadows, but just shadows, plus maybe some slow fading effects or a slow exposé-like effect. I don't want to go off-topic, but compiz has shown that OpenGL is the right choice (and also the fastest).
Yes, it is either OpenGL or XRender. And yes, OpenGL is the future. But unless you're going to get everyone a video card that can support it, XRender is here to stay.
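For comparison, this is why blur is considered cheap on the OpenGL side: a separable Gaussian maps onto two fragment-shader passes on the GPU. A minimal sketch of the horizontal pass, with the GLSL source kept as a C string; the uniform names and the 5-tap kernel are illustrative assumptions, not code from any existing backend.

/* Horizontal pass of a separable 5-tap Gaussian blur; the weights are
 * the binomial kernel 1 4 6 4 1 divided by 16.  A second pass with the
 * offsets applied vertically completes the blur.  "texel_w" is assumed
 * to be set to 1.0 / texture_width by the caller. */
static const char *blur_h_frag_src =
    "uniform sampler2D tex;\n"
    "uniform float texel_w;\n"
    "void main (void)\n"
    "{\n"
    "    vec2 uv = gl_TexCoord[0].st;\n"
    "    vec4 c  = texture2D (tex, uv) * 0.375;\n"
    "    c += texture2D (tex, uv + vec2 (texel_w, 0.0)) * 0.25;\n"
    "    c += texture2D (tex, uv - vec2 (texel_w, 0.0)) * 0.25;\n"
    "    c += texture2D (tex, uv + vec2 (2.0 * texel_w, 0.0)) * 0.0625;\n"
    "    c += texture2D (tex, uv - vec2 (2.0 * texel_w, 0.0)) * 0.0625;\n"
    "    gl_FragColor = c;\n"
    "}\n";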
To finish the off-topic discussion: I do not agree with this, or at least it seems pointless to lose weeks and months of coding on a compositing backend of the past. To be clear, I'm not saying we should remove the XRender implementation, just that we should not pay too much attention to it, since we are actually losing coding time :) So, we have nice shadows and transparency: the essentials. Now let's move on and look at the future, for three reasons:

1) Most cards (or at least the great majority of them) support AIGLX: all ATI >= 7500, maybe all NVIDIA, and Intel works. New drivers are coming (Gallium3D), and maybe support is coming for other older cards too. Nowadays practically every computer you can buy supports compiz (they have new cards), so we are developing an abandoned compositing framework (by 2009 maybe everyone will have an AIGLX-capable video card).

2) Users with a video card that works with OpenGL use compiz, so the more time passes, the fewer users we will have; I'm expecting fewer metacity users in 2009.

3) Developing blur, exposé, and alt-tab-switching effects here is useless, because users who can't run compiz have (of course) old video cards which can't support those effects for performance reasons. (So we are developing new features for what, newer cards that support OpenGL? Sounds like a waste of time.)

So why not leave XRender in the code as is (quite good and stable, probably the best XRender-based compositing WM I've seen), start a new OpenGL backend which is enabled when the video card supports it (most of them do), and keep XRender as a fallback? Otherwise it sounds like wasted time.
What about that algorithm? http://zrusin.blogspot.com/2006/07/more-blurring.html
I'm aware of that algorithm, but it's not part of XRender. The only thing that can work on XRender images (without a major, major slowdown) is XRender itself. It doesn't matter if you have the best algorithm in the world; it's useless to us unless it is part of XRender. Feel free to work on an OpenGL backend if you want. I (and many others) would suggest that you use Clutter (http://www.clutter-project.org) for this. I may do it sometime, but it's not a high priority for me at this moment in time.
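As a point of reference on the pure-software side, fast CPU blurs of the kind linked above generally avoid a full convolution: repeated box blurs converge on a Gaussian (three passes in each direction are visually close), and a running sum makes each pass O(1) per pixel regardless of radius. Below is a sketch of one horizontal pass over an 8-bit channel; it is illustrative only, not the exact algorithm from the linked post.

#include <stdint.h>

/* One horizontal box-blur pass over an 8-bit single-channel image.
 * A sliding-window sum keeps the per-pixel cost constant no matter
 * how large the radius is. */
static void
box_blur_h (uint8_t *dst, const uint8_t *src,
            int width, int height, int radius)
{
    int span = 2 * radius + 1;

    for (int y = 0; y < height; y++) {
        const uint8_t *row = src + y * width;
        uint8_t *out = dst + y * width;
        int sum = 0;

        /* Prime the sliding window, clamping reads at the image edge. */
        for (int i = -radius; i <= radius; i++) {
            int x = i < 0 ? 0 : (i >= width ? width - 1 : i);
            sum += row[x];
        }

        for (int x = 0; x < width; x++) {
            out[x] = (uint8_t) (sum / span);

            /* Slide the window one pixel to the right. */
            int add = x + radius + 1;
            int del = x - radius;
            sum += row[add >= width ? width - 1 : add];
            sum -= row[del < 0 ? 0 : del];
        }
    }
}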
No, I won't do it; I'm actually busy with a lot of other projects :) Someday a metacity hacker will do it and we will be happy again :) I just said that, from my point of view, since you have reached great stability (I tested KWin4 five minutes ago and it sucks), it's a waste of time to keep adding extra features through XRender, and it's better to start development of the OpenGL backend. I also hope Clutter will be added as an official GTK canvas; that would be a good motivation to start a Cluttercity :)
"We can't use XRender blur because it is slow" is a statement that's really just setting up a chicken and egg problem. If XRender drivers don't have fast blur implementations yet, that's definitely not helped by the fact that applications aren't using them. So, if you would like this effect in your application, I do recommend you code it up and then put the ball directly into the Xorg hackers' court---"Here, this effect would be really cool, but it's ugly and slow---please fix it." At that point, (and admittedly, before that, but it's often hard for the X hackers to get excited about features that aren't getting used), two things could happen: 1. X drivers could implement hardware-accelerated blur 2. The X server software could receive some attention with a more efficient algorithm, (which obviously could use any software algorithm you've found). And of course, anybody interested could help out with either of these things. Happy hacking, -Carl PS. Discussion of X server features will definitely get more attention on the xorg list rather than in some non-X bugzilla entry. I only found this because Cimi pointed me directly to it.
Carl, I totally understand your chicken-and-egg problem, but if I spend time on this and it's completely unusable, then I'll get a zillion bug reports telling me that, a million requests for an option to turn it off, and 10 people calling me an idiot. And the last time I tried the XRender blur stuff, it wasn't just slightly slow: it brought my computer to its knees trying to render one window with some blurring in it. I'm not really discussing X server features here, I'm just commenting on them in relation to this bug.
In reply to comment #10: no, users (especially Linux users) are not all equipped with the latest and greatest chipset out there. Users with old shitty i8xx, Radeon < r300, and pre-GeForce NVIDIA hardware are not so uncommon, and also think of all the users with SiS or S3 cards whose 2D performance is so poor that 3D isn't even thinkable. Now, leaving the hardware performance issue aside, some people (like me) _cannot_ run compiz on their setup due to limitations in the X stack: as soon as you get past the 2048x2048 limitation, you're doomed to not being able to use the compiz bling, while XRender _works_. IMHO, even if XRender is not the future, it's still _nice_ for all the users who don't have the chance to upgrade their computer on a whim. Just my 2 cents.
I guess this can be closed. Blurring with XRender/cairo is still not recommended, even though convolution filters have gotten slightly better over the past 8 years; and in general, blurring is not a great effect to use.