Multi-sensor fusion framework for indoor-outdoor localization of limited-resource mobile robots
Abstract:
This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) while achieving performance comparable to traditional methods. The event is triggered when the estimation error covariance exceeds a predefined limit, reflecting the need for global information. The proposed experimental platforms are based on the LEGO Mindstorms NXT and consist of a differential-drive mobile robot navigating indoors with a zenithal camera as the global sensor, and an Ackermann-steering mobile robot navigating outdoors with an SBG Systems GPS accessed through an IGEP board that also serves as a data logger. The IMU in both robots is built from the NXT motor encoders together with one gyroscope, one compass, and two accelerometers from HiTechnic, placed according to a particle-based dynamic model of the robots. The tests performed confirm the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long-walk test in both indoor and outdoor environments. © 2013 by the authors; licensee MDPI, Basel, Switzerland.
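A minimal sketch of the event-triggered fusion idea described in the abstract: the prediction step runs at the IMU rate, and the global-sensor correction is applied only when the estimation error covariance exceeds a threshold. The 1-D constant-velocity model, noise values, threshold, and all names below are illustrative assumptions, not taken from the paper.

```python
# Event-based Kalman filter sketch (illustrative, not the authors' implementation).
import numpy as np

dt = 0.05                               # IMU sampling period [s] (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [position; velocity]
B = np.array([[0.5 * dt**2], [dt]])     # control input: acceleration from the IMU
H = np.array([[1.0, 0.0]])              # global sensor measures position only
Q = 0.01 * np.eye(2)                    # process noise covariance (assumed)
R = np.array([[0.5]])                   # global sensor noise covariance (assumed)
P_LIMIT = 0.2                           # covariance trace limit that fires the event

x = np.zeros((2, 1))                    # state estimate
P = np.eye(2)                           # estimation error covariance

def predict(accel):
    """IMU-driven prediction, executed every sample."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def correct(z):
    """Global-sensor correction, executed only when the event fires."""
    global x, P
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

# Simulated run: the global measurement (camera or GPS stand-in) is requested,
# and bandwidth spent, only when trace(P) exceeds P_LIMIT.
rng = np.random.default_rng(0)
events = 0
for k in range(200):
    predict(accel=0.1 + 0.05 * rng.standard_normal())
    if np.trace(P) > P_LIMIT:                           # event condition
        z = H @ x + 0.1 * rng.standard_normal((1, 1))   # simulated global reading
        correct(z)
        events += 1
print(f"global updates used: {events} of 200 steps")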
Year of publication:
2013
Keywords:
- Pose estimation
- Mobile robots
- Inertial sensors
- Sensor fusion
- Robot sensing systems
- Kalman filtering
- Event-based systems
- Dynamic model
- Global positioning systems
- Embedded systems
Source:

Document type:
Article
Status:
Open access
Knowledge areas:
- Robotics
Subject areas:
- Computer science
- Reading and use of other information media
- Other branches of engineering