Visual and Inertial Data-Based Virtual Localization for Urban Combat


Abstract:

This research presents a position and orientation estimation system based on computer vision algorithms and inertial data obtained from the inertial measurement unit built into a smartphone. The implemented system performs position and orientation estimation in real time. An Android application was developed to capture images of the environment and execute the computer vision algorithms. During implementation, the Harris, Shi-Tomasi, FAST, and SIFT feature point detectors were tested in order to find the detector that yields a system optimized enough to run on the processor of an embedded system such as a smartphone. To compute the displacement of the camera attached to a mobile agent, the optical flow method was implemented. In addition, data from the smartphone's built-in gyroscope was used to estimate the agent's orientation. The system includes a simulation of the estimated motion within a three-dimensional environment that runs on a computer. The position and orientation data are sent from the smartphone to the computer wirelessly over a Wi-Fi connection. The three-dimensional environment is a digital version of the central block of the Universidad de las Fuerzas Armadas ESPE, where the tests of the implemented system were carried out.
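As a minimal, hypothetical illustration of the pipeline described in the abstract (not the authors' Android implementation), the sketch below uses OpenCV's Shi-Tomasi detector (cv2.goodFeaturesToTrack) and pyramidal Lucas-Kanade optical flow (cv2.calcOpticalFlowPyrLK) to estimate the inter-frame displacement of a camera from two consecutive grayscale frames; the detector choice, window size, and all other parameter values are assumptions made for the example.

```python
# Sketch only: Shi-Tomasi corners + Lucas-Kanade optical flow for
# inter-frame camera displacement, assuming OpenCV (cv2) is available.
import cv2
import numpy as np

def estimate_displacement(prev_gray, curr_gray):
    # Detect Shi-Tomasi feature points in the previous frame.
    prev_pts = cv2.goodFeaturesToTrack(
        prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=10)
    if prev_pts is None:
        return np.zeros(2)

    # Track the points into the current frame with pyramidal Lucas-Kanade.
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None,
        winSize=(21, 21), maxLevel=3)

    # Keep only the points that were tracked successfully.
    good_prev = prev_pts[status.flatten() == 1]
    good_curr = curr_pts[status.flatten() == 1]
    if len(good_prev) == 0:
        return np.zeros(2)

    # Median per-point motion as a robust estimate of the camera's
    # displacement in the image plane (in pixels); converting to metric
    # units would require camera calibration and scale information.
    return np.median((good_curr - good_prev).reshape(-1, 2), axis=0)
```

In a setup like the one described, this per-frame displacement would be accumulated over time for position, while orientation would come from integrating the gyroscope readings, before both are streamed to the simulation computer over Wi-Fi.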

Publication year:

2020

Keywords:

  • Shi-Tomasi
  • Android
  • Harris
  • Military strategy
  • Augmented reality
  • Optical flow
  • Warlike simulator

Source:

Scopus
Google

Document type:

Conference Object

Status:

Restricted access

Knowledge areas:

  • Computer simulation
  • Computer vision

Subject areas:

  • Military science