User-defined interface gestures: Dataset and analysis
Abstract:
We present a video-based gesture dataset and a methodology for annotating video-based gesture datasets. Our dataset consists of user-defined gestures generated by 18 participants from a previous investigation of gesture memorability. We design and use a crowd-sourced classification task to annotate the videos. The results are made available through a web-based visualization that allows researchers and designers to explore the dataset. Finally, we perform an additional descriptive analysis and quantitative modeling exercise that provides further insights into the results of the original study. To facilitate the use of the presented methodology by other researchers, we share the data, the source of the human intelligence tasks for crowdsourcing, a new taxonomy that integrates previous work, and the source code of the visualization tool.
Year of publication:
2014
Keywords:
- Gesture datasets
- Gesture design
- User-defined gestures
- Gesture analysis methodology
- Gesture memorability
- Gesture elicitation
- Gesture annotation
Source:
Scopus
Document type:
Conference Object
Status:
Restricted access
Knowledge areas:
- Computer vision
- Computer science
Dewey subject areas:
- Special computer methods
- Computer science
- Operations of libraries and archives
Sustainable Development Goals:
- SDG 17: Partnerships for the goals
- SDG 4: Quality education
- SDG 9: Industry, innovation and infrastructure