Looking for modular MapReduce solutions for working with BigData: Aspect-oriented Hadoop


Abstract:

In the search for modular MapReduce solutions, the main goal of this work is to apply Hadoop and AspectJ to the definition of Aspect-Combine functions. MapReduce is a computing approach for working with large volumes of data (BigData) in a distributed environment, with high levels of abstraction and the ordered use of Map and Reduce functions: the first maps, or identifies, the relevant data, and the second summarizes the data and produces the final results. In a MapReduce system, Mapper and Reducer nodes implement the Map and Reduce functions, respectively. Hadoop is a free implementation of MapReduce that also allows the definition of Combine functions; however, the execution of Combine is not guaranteed in Hadoop. This problem motivated this work. As a result, a greater degree of modularization is reached from a theoretical point of view, and, from a practical point of view, performance also improves.
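To illustrate the problem the abstract describes, below is a minimal sketch of the canonical word-count job written against Hadoop's standard Java MapReduce API; the class names are illustrative and are not taken from the paper. The Combiner registered with job.setCombinerClass is only an optimization hint: the framework may invoke it zero, one, or several times on map-side output, so its execution is not guaranteed, which is precisely the behavior the paper's AspectJ-based Aspect-Combine approach aims to modularize.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map: emit (word, 1) for every token in the input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce: sum the partial counts for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    // The combiner is only a hint: Hadoop may run it zero or more
    // times per map task, so its execution is not guaranteed.
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Because summation is associative and commutative, the same class can safely serve as both Combiner and Reducer here; a function without those properties cannot be used as a Combiner, since Hadoop decides at run time how often, if at all, to apply it.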

Year of publication:

2018

Keywords:

  • Big data applications
  • Information processing
  • Aspect-oriented
  • Modular software architecture

Source:

Scopus

Document type:

Article

Status:

Restricted access

Knowledge areas:

  • Big data
  • Software

Subject areas:

  • Computer programming, programs, data, security