GestUI: A Model-driven Method and Tool for Including Gesture-based Interaction in User Interfaces
DOI: https://doi.org/10.7250/csimq.2016-6.05

Keywords: Model-driven architecture, gesture-based interaction, multi-stroke gestures, information systems, gesture-based user interface

Abstract
Among the technological advances in touch-based devices, gesture-based interaction has become a prevalent feature in many application domains, and information systems are starting to explore this type of interaction. As a result, gesture specifications are currently hard-coded by developers at the source code level, which hinders their reusability and portability. Similarly, defining new gestures that reflect user requirements is a complex process. This paper describes a model-driven approach to including gesture-based interaction in desktop information systems. It incorporates a tool prototype that captures user-sketched multi-stroke gestures, transforms them into a model, and automatically generates both the gesture catalogue for gesture-based interaction technologies and the gesture-based user interface source code. We demonstrate our approach in several applications ranging from CASE tools to form-based information systems.
Published: 29.04.2016
Section: Articles
License
Copyright (c) 2016 Otto Parra, Sergio España, Oscar Pastor (Author)

This work is licensed under a Creative Commons Attribution 4.0 International License.
How to Cite
Parra, O., España, S., & Pastor, O. (2016). GestUI: A Model-driven Method and Tool for Including Gesture-based Interaction in User Interfaces. Complex Systems Informatics and Modeling Quarterly, 6, 73-92. https://doi.org/10.7250/csimq.2016-6.05