Recognizing a sketched task model by multi-stroke gesture recognition


Abstract:

taskSketch is an Eclipse-based task model editor that enables designers to sketch a task model and have it recognized through multi-stroke gesture recognition of pre-defined shapes and relations. The tool supports three levels of definition (LoF, LoD, LoC) and is linked to other models through model mapping. taskSketch is flexible enough to accommodate variations of the task model notation by defining new shapes and relations and by performing template and pattern matching. To demonstrate this capability, we introduce three concepts: the level of fidelity, the level of detail, and the level of criticism. This process is generalizable to other, similar models involved in model-based user interface design.
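
The abstract does not detail the recognition algorithm. As an illustration of how multi-stroke gestures can be matched against pre-defined shape templates, the following is a minimal sketch in the spirit of point-cloud recognizers such as $P; it is not the taskSketch implementation. The resampling size, the two example templates ("task", "container"), and all function names are hypothetical.

```python
# Illustrative sketch only (assumed approach, not the taskSketch code):
# resample each multi-stroke gesture into a fixed-size point cloud,
# normalize it, and pick the stored template with the lowest greedy
# cloud-matching cost.
import math
from typing import List, Tuple

Point = Tuple[float, float]
N_POINTS = 32  # assumed number of resampled points per gesture

def flatten(strokes: List[List[Point]]) -> List[Point]:
    """Merge all strokes into one point cloud (stroke order becomes irrelevant)."""
    return [p for stroke in strokes for p in stroke]

def resample(points: List[Point], n: int = N_POINTS) -> List[Point]:
    """Resample a point sequence to n evenly spaced points along its path."""
    path_len = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = path_len / (n - 1) if path_len > 0 else 1.0
    resampled, acc = [points[0]], 0.0
    pts = list(points)
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval and d > 0:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)  # continue measuring from the inserted point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(resampled) < n:  # guard against rounding shortfall
        resampled.append(pts[-1])
    return resampled[:n]

def normalize(points: List[Point]) -> List[Point]:
    """Scale to a unit bounding box and translate the centroid to the origin."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    w, h = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
    scaled = [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in points]
    cx = sum(p[0] for p in scaled) / len(scaled)
    cy = sum(p[1] for p in scaled) / len(scaled)
    return [(x - cx, y - cy) for x, y in scaled]

def cloud_distance(a: List[Point], b: List[Point]) -> float:
    """Greedy one-to-one matching cost between two equally sized point clouds."""
    unmatched = list(range(len(b)))
    total = 0.0
    for i, p in enumerate(a):
        j = min(unmatched, key=lambda k: math.dist(p, b[k]))
        unmatched.remove(j)
        weight = 1 - i / len(a)  # earlier matches weigh more
        total += weight * math.dist(p, b[j])
    return total

def recognize(strokes: List[List[Point]], templates: dict) -> str:
    """Return the name of the stored template closest to the sketched gesture."""
    candidate = normalize(resample(flatten(strokes)))
    return min(templates, key=lambda name: cloud_distance(candidate, templates[name]))

# Hypothetical templates for two task-model shapes: an ellipse-like "task"
# and a box-like "container".
templates = {
    "task": normalize(resample([(math.cos(t / 10), math.sin(t / 10) * 0.5)
                                for t in range(0, 63)])),
    "container": normalize(resample([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)])),
}
# A square sketched as four separate strokes should match "container".
print(recognize([[(0, 0), (1, 0)], [(1, 0), (1, 1)],
                 [(1, 1), (0, 1)], [(0, 1), (0, 0)]], templates))
```

A production recognizer would also handle rotation invariance, multiple matching passes, and rejection thresholds; this sketch only shows the template-matching idea the abstract refers to.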

Year of publication:

2020

Keywords:

  • Gesture recognition
  • Task modeling
  • Sketching
  • Design tools and techniques

Source:

Scopus

Document type:

Conference Object

Status:

Restricted access

Knowledge areas:

  • Machine learning
  • Computer science

Subject areas:

  • Computer science