Generalized Spectral Dimensionality Reduction Based on Kernel Representations and Principal Component Analysis


Abstract:

Very often, multivariate data analysis problems require dimensionality reduction (DR) stages to either improve analysis performance or represent the data in an intelligible fashion. Traditionally, DR techniques have been developed under different frameworks and settings, which makes their comparison a non-trivial task. In this sense, generalized DR approaches are of great interest, as they make it possible both to enhance DR techniques and to compare them in a proper and fair manner. This work introduces a generalized spectral dimensionality reduction (GSDR) approach able to represent spectral DR techniques and enhance their representation ability. To do so, GSDR exploits kernel-based representations as an initial nonlinear transformation that yields a new space. This new space is then used as the input for a feature extraction stage based on principal component analysis. As a remarkable experimental result, GSDR outperforms the conventional implementations of well-known spectral DR techniques (namely, classical multidimensional scaling and Laplacian eigenmaps) in terms of the scaled version of the average agreement rate. Additionally, relevant insights and theoretical developments are provided to understand the effect of data structure preservation at the local and global levels.
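
The two-stage pipeline described in the abstract (a kernel-based nonlinear mapping followed by PCA-based feature extraction) can be illustrated with a minimal sketch. The RBF kernel, the synthetic Swiss-roll data, and the scikit-learn utilities below are illustrative assumptions only and do not reproduce the authors' exact kernels or settings.

```python
# Minimal, hypothetical sketch of a kernel-representation + PCA pipeline:
# (1) build a kernel-based representation of the data, (2) extract a
# low-dimensional embedding from it with PCA. Kernel choice and parameters
# are assumptions for illustration, not the paper's configuration.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.preprocessing import KernelCenterer
from sklearn.decomposition import PCA

X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

# Stage 1: kernel-based representation (initial nonlinear transformation).
K = rbf_kernel(X, X, gamma=0.1)         # N x N similarity matrix
K = KernelCenterer().fit_transform(K)   # center the kernel matrix

# Stage 2: PCA-based feature extraction on the kernel representation,
# treating each row of K as the new feature vector of its sample.
Y = PCA(n_components=2).fit_transform(K)
print(Y.shape)  # (1000, 2) low-dimensional embedding
```

Under this reading, different spectral DR methods correspond to different kernel matrices fed into the same PCA stage, which is what makes the comparison across techniques uniform.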

Publication year:

2021

Keywords:

  • Spectral methods
  • Principal Component Analysis
  • Dimensionality reduction
  • Kernel representations

Source:

Scopus

Document type:

Conference Object

Status:

Restricted access

Knowledge areas:

  • Machine learning
  • Mathematical optimization

Subject areas:

  • Computer science