Tailoring user interfaces to include gesture-based interaction with gestUI


Abstract:

The development of custom gesture-based user interfaces requires software engineers to be skilled in the use of the tools and languages needed to implement them. gestUI, a model-driven method, can help them in this task by supporting the definition of custom gestures and the inclusion of gesture-based interaction in existing user interfaces. Until now, gestUI has used the same gesture catalogue for all software users, with gestures that could not be subsequently redefined. In this paper, we extend gestUI by including a user profile in the metamodel, which permits individual users to define custom gestures and to include gesture-based interaction in user interfaces. Using tailoring mechanisms, each user can redefine their custom gestures at runtime. Although both features are supported by models, the gestUI tool hides this technical complexity from users. We validated these gestUI features through a technical action research study in an industrial context. The results showed that the features were perceived as both useful and easy to use for defining and redefining custom gestures and including them in a user interface.
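The extension described in the abstract centres on associating a gesture catalogue with a user profile so that each user can define and later redefine gestures at runtime. The sketch below is purely illustrative and is not taken from the paper or the gestUI tool; all class and method names (Gesture, UserProfile, defineGesture, redefineGesture) are hypothetical and only meant to convey the per-user catalogue idea.

```java
// Illustrative sketch only: a per-user gesture catalogue in which each user
// profile owns its own custom gestures and can redefine them at runtime.
// All names are hypothetical; this is not the gestUI metamodel or API.
import java.util.HashMap;
import java.util.Map;

class Gesture {
    final String name;        // e.g. "save", "delete"
    String strokeDefinition;  // serialized description of the gesture's strokes

    Gesture(String name, String strokeDefinition) {
        this.name = name;
        this.strokeDefinition = strokeDefinition;
    }
}

class UserProfile {
    final String userId;
    // Each user keeps an individual catalogue, keyed by gesture name.
    private final Map<String, Gesture> catalogue = new HashMap<>();

    UserProfile(String userId) {
        this.userId = userId;
    }

    // Define a new custom gesture for this user.
    void defineGesture(String name, String strokeDefinition) {
        catalogue.put(name, new Gesture(name, strokeDefinition));
    }

    // Redefine an existing gesture at runtime (or define it if absent).
    void redefineGesture(String name, String newStrokeDefinition) {
        Gesture g = catalogue.get(name);
        if (g != null) {
            g.strokeDefinition = newStrokeDefinition;
        } else {
            defineGesture(name, newStrokeDefinition);
        }
    }

    Gesture lookup(String name) {
        return catalogue.get(name);
    }
}

public class GestureCatalogueDemo {
    public static void main(String[] args) {
        UserProfile alice = new UserProfile("alice");
        alice.defineGesture("save", "stroke:S");        // initial definition
        alice.redefineGesture("save", "stroke:circle"); // tailored at runtime
        System.out.println(alice.lookup("save").strokeDefinition);
    }
}
```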

Year of publication:

2016

Keywords:

  • Custom gesture
  • Technical action research
  • Gesture-based interaction
  • Human-computer Interaction
  • Model-driven development
  • User interface

Source:

RRAAE
Google
Scopus

Document type:

Conference Object

Status:

Restricted access

Knowledge areas:

  • User interface

Subject areas:

  • Special computer methods