GNOME Bugzilla – Bug 728426
Implement the wl_touch interface
Last modified: 2014-04-23 03:57:25 UTC
For input, the wayland backend only implements the wl_pointer and wl_keyboard interfaces, which leaves touch events unhandled. I'm attaching 2 patches that: 1) decouple core devices from the seat capabilities by adding slave devices in between; this is certainly closer to what x11 does (plus GTK+ rather expects core devices to always come in pointer/keyboard pairs). 2) implement wl_touch, translating events into GdkEventTouch events. One aspect I'm a bit dubious about is pointer emulation: on x11 the first touch gets to emulate pointer events, but in the wayland protocol pointer and touch are completely decoupled. I've replicated the x11 behavior in GDK, but having the pointer move and emit events independently brings some confusion. It should be better, I guess, once everything gets GDK_TOUCH_MASK, either directly or indirectly. In the longer term it would probably be better to stick to the facts here and just have all touch events as !emulating_pointer on wayland.
Created attachment 274611 [details] [review] wayland: Separate master devices from seat capabilities The master pointer/keyboard pair should never disappear or be inconsistent. The seat capabilities are now reflected through slave devices, which may come and go freely as the seat capabilities change. This also enables adding further capabilities to handle e.g. touch.
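To illustrate the idea in this patch, here is a minimal standalone sketch (not the patch itself): the master pointer/keyboard pair is fixed, while slave devices simply track the capability bits the seat advertises. The Seat struct and CAP_* constants are stand-ins for illustration; the real backend uses GdkDevice objects and the protocol's WL_SEAT_CAPABILITY_* values, whose bit assignments are mirrored here.

```c
#include <assert.h>
#include <stdint.h>

/* Bitmask values matching wl_seat.capability in the Wayland protocol. */
enum {
  CAP_POINTER  = 1,
  CAP_KEYBOARD = 2,
  CAP_TOUCH    = 4
};

typedef struct {
  /* The master pointer/keyboard pair always exists, regardless of
   * what the seat currently advertises. */
  int has_master_pointer;
  int has_master_keyboard;
  /* Slave devices come and go with the advertised capabilities. */
  int slave_pointer;
  int slave_keyboard;
  int slave_touch;
} Seat;

/* Sketch of the capability handler: only the slaves track the
 * capability bits; the master pair is left untouched. */
static void
seat_handle_capabilities (Seat *seat, uint32_t caps)
{
  seat->slave_pointer  = (caps & CAP_POINTER) != 0;
  seat->slave_keyboard = (caps & CAP_KEYBOARD) != 0;
  seat->slave_touch    = (caps & CAP_TOUCH) != 0;
}
```

With this split, a seat that drops its pointer capability only removes the slave pointer; GTK+ still sees a consistent master pointer/keyboard pair.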
Created attachment 274612 [details] [review] wayland: handle the wl_touch interface The events are routed through a new slave device with type GDK_SOURCE_TOUCHSCREEN; minimal tracking of touches is done to keep per-touch state.
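As a rough sketch of the per-touch tracking this patch does (the patch keeps GdkWaylandTouchData structs in a GHashTable keyed by the wl_touch id; this standalone version uses a fixed array instead, and the function names mirror the wl_touch.down/up handlers rather than the actual GDK symbols):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Simplified stand-in for the per-touch state kept for each touch point. */
typedef struct {
  int32_t id;
  double  x, y;
  int     emulates_pointer; /* only the first active touch drives the pointer */
  int     in_use;
} TouchData;

#define MAX_TOUCHES 16
static TouchData touches[MAX_TOUCHES];

static size_t
active_touches (void)
{
  size_t n = 0;
  for (size_t i = 0; i < MAX_TOUCHES; i++)
    if (touches[i].in_use)
      n++;
  return n;
}

/* Mirrors wl_touch.down: allocate state for a new touch point; the
 * first active touch gets the pointer-emulation flag. */
static TouchData *
touch_down (int32_t id, double x, double y)
{
  for (size_t i = 0; i < MAX_TOUCHES; i++)
    if (!touches[i].in_use)
      {
        touches[i].in_use = 1;
        touches[i].id = id;
        touches[i].x = x;
        touches[i].y = y;
        touches[i].emulates_pointer = (active_touches () == 1);
        return &touches[i];
      }
  return NULL; /* too many concurrent touches for this sketch */
}

/* Mirrors wl_touch.up: drop the state for the released touch point. */
static void
touch_up (int32_t id)
{
  for (size_t i = 0; i < MAX_TOUCHES; i++)
    if (touches[i].in_use && touches[i].id == id)
      touches[i].in_use = 0;
}
```

The emulates_pointer flag is the same "first touch wins" logic discussed in the review below; the real code computes it from the size of the touches hash table.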
Review of attachment 274611 [details] [review]: Looks good to me
Review of attachment 274612 [details] [review]:

::: gdk/wayland/gdkdevice-wayland.c
@@ +1251,3 @@
+  touch = g_new0 (GdkWaylandTouchData, 1);
+  touch->id = id;
+  touch->window = wl_surface_get_user_data(surface);

Missing space here

@@ +1252,3 @@
+  touch->id = id;
+  touch->window = wl_surface_get_user_data(surface);
+  touch->emulates_pointer = (g_hash_table_size (device->touches) == 0);

Maybe this flag would be more descriptive as 'first_touch' or 'initial_touch'?
I agree that it would be nice to get out of the emulating business in gdk, and instead handle touch events in event controllers everywhere. The sticky point is compatibility with existing 3rd party widgets, of course. gtk4 ?
(In reply to comment #5)
> I agree that it would be nice to get out of the emulating business in gdk, and
> instead handle touch events in event controllers everywhere. The sticky point
> is compatibility with existing 3rd party widgets, of course. gtk4 ?

Indeed, I checked before attaching that this works with the gestures branch, and as far as event controllers go, that's handled just fine. The big drawback is backwards compat, as you say. After all, I think it is positive to have a "single touch" simplification, be it through emulated pointer events or through full/mixed usage of event controllers towards gtk4 (which are already touch/pointer agnostic). I think the sore point is that GDK doesn't lock on emulated events, ignoring real pointer events if those happen, so "emulating pointer" doesn't imply "I control the pointer cursor". FWIW this also happens on x11 to some extent (if you e.g. touch the touchscreen and move the pointer with some other device while the touch is down), but it gets "corrected" as soon as the touchscreen receives an event; on wayland it's just more glaring...
Attachment 274611 [details] pushed as af8d6e6 - wayland: Separate master devices from seat capabilities
Attachment 274612 [details] pushed as 1a2a5a4 - wayland: handle the wl_touch interface