Combined gestures screen and panel

What do you want to achieve?

Hi! I want to implement horizontal gestures (left and right) on the entire screen to switch screens, and vertical gestures (up and down) only on the green panel (see image) to call functions and change the widgets displayed within that panel.

What have you tried so far?

I defined horizontal gestures for the screen and vertical gestures for the panel, but if the panel has the LV_OBJ_FLAG_GESTURE_BUBBLE flag set, the panel's vertical gestures don't work. If I clear that flag, the vertical gestures work, but horizontal gestures no longer work inside the green panel; they only work outside of it.
I worked around this by clearing the LV_OBJ_FLAG_GESTURE_BUBBLE flag on the green panel and also adding horizontal gesture events to it that change the screen, but that doesn't seem like the best solution to me, since I'm duplicating the same events on both the screen and the panel.

If the LV_OBJ_FLAG_GESTURE_BUBBLE flag is set, shouldn't the gesture be handled by the panel first and then propagated to the screen?


Screenshot or video


  • SquareLine Studio version: v1.3.4
  • Operating system: Windows 11
  • Target hardware: ESP32

Gestures are propagated to the screen by default, as described in the LVGL documentation: Input devices — LVGL documentation
So a widget's own gesture events only run when the 'Gesture bubble' flag is removed in SquareLine Studio. The panel widget in your example sits in front of the screen, so left-right gestures are detected on it rather than on the screen. The solution you mention above, adding a screen-change event to the panel's left-right gestures as well, is therefore the way to go. (If it causes too much code redundancy, you can reduce that by using a 'CALL_FUNCTION' action instead of a 'CHANGE_SCREEN' action, and letting the called function switch the screens from your code.)
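To illustrate the CALL_FUNCTION idea, here is a minimal sketch. The LVGL specifics (callback registration, reading the gesture direction, loading screens) are shown only in comments, since they depend on your generated ui files; the compilable part below just models the dispatch. All names (`change_screen`, `panel_gesture`, `current_screen`, `ui_GreenPanel`) are hypothetical. The point is that both the screen's and the panel's horizontal gestures route into one shared helper, so the screen-switching logic is written exactly once.

```c
/* In LVGL (v8.x) the callbacks would be registered roughly like this:
 *
 *   lv_obj_clear_flag(ui_GreenPanel, LV_OBJ_FLAG_GESTURE_BUBBLE);
 *   lv_obj_add_event_cb(ui_GreenPanel, panel_gesture_cb, LV_EVENT_GESTURE, NULL);
 *   lv_obj_add_event_cb(ui_Screen1,    screen_gesture_cb, LV_EVENT_GESTURE, NULL);
 *
 * and each callback would read the swipe direction with
 *   lv_dir_t dir = lv_indev_get_gesture_dir(lv_indev_get_act());
 */

typedef enum { DIR_LEFT, DIR_RIGHT, DIR_TOP, DIR_BOTTOM } dir_t;

static int current_screen = 0;          /* index of the loaded screen      */
static const char *panel_widget = "A";  /* widget currently in the panel   */

/* Shared helper: the single place that switches screens.
 * In a real project this would call lv_scr_load()/lv_scr_load_anim(). */
static void change_screen(dir_t d)
{
    if (d == DIR_LEFT)  current_screen++;   /* swipe left  -> next screen  */
    if (d == DIR_RIGHT) current_screen--;   /* swipe right -> prev screen  */
}

/* Screen's gesture handler: only horizontal swipes matter here. */
static void screen_gesture(dir_t d)
{
    if (d == DIR_LEFT || d == DIR_RIGHT) change_screen(d);
}

/* Panel's gesture handler (bubble flag cleared): vertical swipes are
 * handled locally, horizontal ones are forwarded to the shared helper,
 * so the screen-change logic is not duplicated. */
static void panel_gesture(dir_t d)
{
    switch (d) {
    case DIR_TOP:    panel_widget = "B"; break;   /* show next widget   */
    case DIR_BOTTOM: panel_widget = "A"; break;   /* show prev widget   */
    default:         change_screen(d);   break;   /* forward horizontal */
    }
}
```

With this layout, SquareLine Studio only needs a CALL_FUNCTION action on each gesture; the duplicated CHANGE_SCREEN actions disappear.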
