Smart home: Multimodal interaction for control of home devices


Abstract:

This paper presents a novel multimodal interaction system that offers three modalities of human interaction for the supervision and control of a home automation system, aimed at people with limited physical mobility. The first is a brain-computer interface (BCI) based on surface electroencephalographic (EEG) electrodes using a noninvasive NeuroSky wearable; in this case, the BCI is controlled by the user's eye blinks. The second is a voice recognition system using spoken commands within a dialogue system. The third is a configurable touch screen on a mobile device. The three interaction modalities are interchangeable according to user needs. The multimodal interaction system allows the control of home devices and appliances through a home gateway implemented on a resource-limited embedded system, which is responsible for applying the user commands detected by the multimodal interface to the corresponding home devices. The command set is configurable and extensible, and can be adapted to the needs and abilities of different users. For this research, a prototype of the system was developed to verify the interaction modalities. The system provided adequate and comprehensible operation for users with different profiles. Our initial tests show that multimodal control is valid for users with limited physical mobility.
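
To make the described architecture concrete, the following is a minimal, hypothetical Python sketch (not the authors' implementation) of how a home gateway could map events from the three interchangeable modalities onto a shared, configurable command set. All class names, command names, and the blink/phrase mappings are illustrative assumptions; the paper does not specify these details.

```python
# Hypothetical sketch: a gateway that maps events from three interchangeable
# input modalities (eye blink via BCI, voice command, touch) onto a shared,
# user-configurable command set. Names and mappings are illustrative only.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Command:
    """A device action the gateway knows how to apply."""
    name: str
    action: Callable[[], None]


class HomeGateway:
    """Lightweight dispatcher: any modality emits a command name,
    and the gateway applies it to the corresponding home device."""

    def __init__(self) -> None:
        self._commands: Dict[str, Command] = {}

    def register(self, name: str, action: Callable[[], None]) -> None:
        # The command set is extensible: new device actions can be added
        # without touching the interaction front ends.
        self._commands[name] = Command(name, action)

    def dispatch(self, name: str) -> None:
        cmd = self._commands.get(name)
        if cmd is None:
            print(f"[gateway] unknown command: {name!r}")
            return
        cmd.action()


# --- Modality front ends (placeholders for BCI, speech and touch input) ---

def blink_to_command(blink_count: int) -> str:
    """Map a counted sequence of eye blinks to a command name
    (an assumed encoding; the paper's actual mapping is not given)."""
    return {2: "lights_on", 3: "lights_off"}.get(blink_count, "noop")


def voice_to_command(utterance: str) -> str:
    """Map a recognized spoken phrase to a command name."""
    return {"turn on the lights": "lights_on",
            "turn off the lights": "lights_off"}.get(utterance.lower(), "noop")


if __name__ == "__main__":
    gateway = HomeGateway()
    gateway.register("lights_on", lambda: print("[device] living-room lights ON"))
    gateway.register("lights_off", lambda: print("[device] living-room lights OFF"))
    gateway.register("noop", lambda: None)

    # Any of the three modalities can drive the same command set.
    gateway.dispatch(blink_to_command(2))                      # BCI / eye blink
    gateway.dispatch(voice_to_command("turn off the lights"))  # speech
    gateway.dispatch("lights_on")                              # touch-screen button
```

In this sketch the front ends only produce command names, so swapping one modality for another (as the paper describes) leaves the gateway and device bindings unchanged.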

Year of publication:

2019

Keywords:

  • Voice assistant
  • Multimodal Interaction
  • Brain control interaction
  • Eye blink
  • Touch interface

Source:

  • Scopus
  • Google

Document type:

Conference Object

Status:

Restricted access

Knowledge areas:

  • Computer science

Subject areas:

  • Special computer methods
  • Computer science
  • Applied physics