GestUI: A Model-driven Method and Tool for Including Gesture-based Interaction in User Interfaces

Otto Parra, Sergio España, Oscar Pastor

Abstract


Among the technological advances in touch-based devices, gesture-based interaction has become a prevalent feature in many application domains, and information systems are starting to explore this type of interaction. At present, gesture specifications are hard-coded by developers at the source-code level, which hinders their reusability and portability. Similarly, defining new gestures that reflect user requirements is a complex process. This paper describes a model-driven approach to including gesture-based interaction in desktop information systems. It incorporates a tool prototype that captures user-sketched multi-stroke gestures and transforms them into a model, automatically generating the gesture catalogue for gesture-based interaction technologies and the source code of the gesture-based user interface. We demonstrate our approach in several applications, ranging from CASE tools to form-based information systems.
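To make the contrast with hard-coded gestures concrete, the sketch below is a purely illustrative, hypothetical Java example (not the GestUI tool's actual model, metamodel, or generated code): it represents a multi-stroke gesture as data and matches it against a named gesture catalogue instead of embedding the gesture in event-handling code. The class names and the naive matching logic are assumptions made for illustration only.

```java
import java.awt.geom.Point2D;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * Illustrative sketch only: a multi-stroke gesture stored as an ordered list
 * of strokes (each stroke a list of sampled points), plus a catalogue that
 * maps gesture names to templates. This is not the GestUI model or API.
 */
public class GestureCatalogueSketch {

    /** A multi-stroke gesture: each stroke is the point sequence of one trace. */
    static class Gesture {
        final String name;
        final List<List<Point2D>> strokes = new ArrayList<>();
        Gesture(String name) { this.name = name; }

        /** Flatten all strokes into one point sequence for a naive comparison. */
        List<Point2D> flatten() {
            List<Point2D> all = new ArrayList<>();
            for (List<Point2D> s : strokes) all.addAll(s);
            return all;
        }
    }

    /** Very naive template matcher: average point-to-point distance by index. */
    static double distance(Gesture a, Gesture b) {
        List<Point2D> pa = a.flatten(), pb = b.flatten();
        int n = Math.min(pa.size(), pb.size());
        if (n == 0) return Double.MAX_VALUE;
        double sum = 0;
        for (int i = 0; i < n; i++) sum += pa.get(i).distance(pb.get(i));
        return sum / n;
    }

    public static void main(String[] args) {
        // Catalogue of gesture templates, keyed by a (hypothetical) action name.
        Map<String, Gesture> catalogue = new HashMap<>();

        Gesture deleteTemplate = new Gesture("delete");
        deleteTemplate.strokes.add(List.of(new Point2D.Double(0, 0), new Point2D.Double(10, 10)));
        deleteTemplate.strokes.add(List.of(new Point2D.Double(10, 0), new Point2D.Double(0, 10)));
        catalogue.put(deleteTemplate.name, deleteTemplate);

        // A gesture sketched by the user at run time (hard-coded here for brevity).
        Gesture input = new Gesture("unknown");
        input.strokes.add(List.of(new Point2D.Double(1, 1), new Point2D.Double(9, 9)));
        input.strokes.add(List.of(new Point2D.Double(9, 1), new Point2D.Double(1, 9)));

        // Pick the closest template in the catalogue.
        String best = null;
        double bestDist = Double.MAX_VALUE;
        for (Map.Entry<String, Gesture> e : catalogue.entrySet()) {
            double d = distance(input, e.getValue());
            if (d < bestDist) { bestDist = d; best = e.getKey(); }
        }
        System.out.println("Recognized gesture: " + best + " (distance " + bestDist + ")");
    }
}
```

Because the catalogue is plain data rather than code, templates sketched by users could in principle be reused across interfaces, which is the kind of portability the abstract argues hard-coded gesture specifications lack.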

Keywords:

Model-driven architecture; gesture-based interaction; multi-stroke gestures; information systems; gesture-based user interface


DOI: 10.7250/csimq.2016-6.05



Copyright (c) 2016 Complex Systems Informatics and Modeling Quarterly