Use physical buttons to select and control GUI widgets.
What have you tried so far?
Using gesture behavior to simulate a button click.
Screenshot or video
None.
Others
**SquareLine Studio version:** 1.3
**Operating system:** Windows 10
**Target hardware:** STM32F4xx
Suggestions
SquareLine Studio is clearly the best GUI design tool for touch screens, but it is not yet friendly enough for applications that use physical buttons to control the GUI.
How can multiple widgets within a single screen be selected in sequence? And if that is possible, how do you define the order in which the widgets are selected?
If this is not possible in SquareLine Studio 1.3, is there any plan to implement it? Many low-cost applications use physical buttons to interact with users instead of an expensive touch screen.
I actually think this raises the issue of TabIndex, both in LVGL widgets and in SquareLine Studio. TabIndex, for those unaware, defines the index of an item used when tabbing backwards and forwards through widgets.
This does not work as expected in SquareLine Studio (try tabbing through properties, then using Shift+Tab a few times to go back). Separately (but what the OP hinted at), there is no obvious feature surfaced from LVGL for controlling widget ordering when cycling through widgets with a button or action.
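At the LVGL level, the ordering the OP is asking about is normally expressed through an `lv_group_t`: focus moves through objects in the order they were added to the group. A minimal sketch, assuming LVGL v8 APIs (the widget pointers and the keypad input device are placeholders for your own objects):

```c
#include "lvgl.h"

/* Focus (tab) order in LVGL is simply the order in which objects
 * are added to a group. btn1..btn3 and keypad_indev are assumed to
 * have been created elsewhere. */
void ui_setup_focus_order(lv_obj_t *btn1, lv_obj_t *btn2, lv_obj_t *btn3,
                          lv_indev_t *keypad_indev)
{
    lv_group_t *g = lv_group_create();

    /* Add order defines the LV_KEY_NEXT / LV_KEY_PREV order. */
    lv_group_add_obj(g, btn1);
    lv_group_add_obj(g, btn2);
    lv_group_add_obj(g, btn3);

    /* Bind the group to a keypad/encoder input device so key events
     * move focus through the group. */
    lv_indev_set_group(keypad_indev, g);
}
```

There is no per-widget TabIndex property; re-ordering means adding the objects to the group in the desired sequence.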
I've been using physical buttons to control screens and widgets in SquareLine, as I also don't have any touch screens. Could you describe exactly what you're trying to control/achieve? I might be able to help out.
Thank you very much for your response. I am wondering how you attach physical-button actions to operations on GUI widgets within SquareLine Studio.
For instance, take a simple UI in SquareLine Studio with several button widgets, and three physical buttons (left, right, enter) used to select (focus) those button widgets; only when a button widget is focused can you press enter to activate it. How can this be realized?
That's not what I am asking for, dude! I already did that in the way you described.
Looks like SquareLine Studio currently doesn't provide `lv_indev` functionality.
Hopefully, popular kinds of input devices can be added to SquareLine Studio without extra coding, so that a touchless UI could be demonstrated and tested with a keyboard.
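Until something like that is built in, the usual workaround is to register a keypad input device on the LVGL side yourself. A sketch, assuming LVGL v8; the `board_btn_*` GPIO helpers are hypothetical stand-ins for your own button reads:

```c
#include "lvgl.h"

/* Hypothetical board-specific helpers; replace with your GPIO reads. */
extern bool board_btn_left_pressed(void);
extern bool board_btn_right_pressed(void);
extern bool board_btn_enter_pressed(void);

/* Map the three physical buttons to LVGL navigation keys. */
static void keypad_read_cb(lv_indev_drv_t *drv, lv_indev_data_t *data)
{
    if (board_btn_left_pressed()) {
        data->key   = LV_KEY_PREV;   /* move focus backwards */
        data->state = LV_INDEV_STATE_PRESSED;
    } else if (board_btn_right_pressed()) {
        data->key   = LV_KEY_NEXT;   /* move focus forwards */
        data->state = LV_INDEV_STATE_PRESSED;
    } else if (board_btn_enter_pressed()) {
        data->key   = LV_KEY_ENTER;  /* activate the focused widget */
        data->state = LV_INDEV_STATE_PRESSED;
    } else {
        data->state = LV_INDEV_STATE_RELEASED;
    }
}

lv_indev_t *register_button_keypad(void)
{
    static lv_indev_drv_t drv;
    lv_indev_drv_init(&drv);
    drv.type    = LV_INDEV_TYPE_KEYPAD;
    drv.read_cb = keypad_read_cb;
    return lv_indev_drv_register(&drv);
}
```

The returned `lv_indev_t *` then needs to be attached to a group (via `lv_indev_set_group`) containing the widgets you want to navigate.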
More precisely: SQS currently doesn't provide any hardware back-end functions.
SQS is a GUI front-end creator, and as far as I can tell that won't change.
But yes, maybe a button abstraction will be added, as in TouchGFX for example.
In TouchGFX I can add a key event, and in the simulator it works with the PC keyboard: if I assign the key 'a', I can press 'a'… But the real hardware layer isn't generated.
OK, gotcha. Another painful thing is adding a bunch of widgets to a group one by one in order to navigate them; even if you put all those widgets inside a panel, a panel is just a simple object, so it cannot be navigated… LOL…
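One way to avoid adding every widget by hand is to walk the panel's children in code and add each one to the group. A sketch, assuming LVGL v8 (the panel is just a container, so its children, not the panel itself, go into the group):

```c
#include "lvgl.h"

/* Add every direct child of a panel (container) to a group.
 * The child order, i.e. creation order, becomes the navigation order. */
static void group_add_panel_children(lv_group_t *g, lv_obj_t *panel)
{
    uint32_t n = lv_obj_get_child_cnt(panel);
    for (uint32_t i = 0; i < n; i++) {
        lv_obj_t *child = lv_obj_get_child(panel, i);
        lv_group_add_obj(g, child);
    }
}
```

This only covers direct children; nested containers would need a recursive variant.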
I'm not 100% sure what you're trying to achieve, but it sounds like you're talking about manipulating flags, which is normally done in SQS by "pressing" the widget or button. It is a lot more work, but you could assign physical buttons to control the boolean state of the various flags associated with the widgets. Maybe?
Yes. Not only "pressing", but also navigating and selecting. I now know that this is currently unavailable in SQS; it can only be achieved by coding…
I asked this question because I wasn't sure whether SQS could do this, and the GUI for my application is very complicated: multiple screens, many widgets… very painful…