
Probabilistic nonlinear dimensionality reduction through Gaussian process latent variable models: An overview / M. Bodini - In: Computer-Aided Developments: Electronics and Communication / [edited by] A.K. Sinha, J. Pradeep Darsy. - First edition. - Boca Raton : CRC Press, 2019 Sep 30. - ISBN 9780429340710. - pp. 77-89. Presented at the 1st Conference on Computer-Aided Developments in Electronics and Communication (CADEC-2019), Amaravati, 2019. DOI: 10.1201/9780429340710-10.

Probabilistic nonlinear dimensionality reduction through Gaussian process latent variable models: An overview

M. Bodini
First author
2019

Abstract

From an algorithmic complexity point of view, machine learning methods scale and generalize better when they use a few key features; using many features is computationally expensive and can lead to overfitting. High-dimensional data is often counterintuitive to perceive and process, yet observed data commonly lies in a representation of greater dimensionality than it requires. This gives rise to dimensionality reduction, a sub-field of machine learning whose goal is to find a descriptive low-dimensional representation of data. This review explores one way to perform dimensionality reduction, provided by a class of Latent Variable Models (LVMs). In particular, the aim is to establish the technical foundations required for understanding the Gaussian Process Latent Variable Model (GP-LVM), a probabilistic nonlinear dimensionality reduction model. The review is organized as follows: after an introduction to the problem of dimensionality reduction and to LVMs, Principal Component Analysis (PCA) is recalled, and its probabilistic equivalent, which contributes to the derivation of GP-LVM, is reviewed. Then GP-LVM is introduced, and a remarkable extension of it, the Bayesian Gaussian Process Latent Variable Model (BGP-LVM), is briefly described. Finally, the main advantages of using GP-LVM are summarized.
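The PCA baseline that the abstract recalls can be sketched in a few lines: center the data, form the sample covariance, and project onto the top-q eigenvectors. This is a minimal illustrative sketch on synthetic data (the data, dimensions N=100, D=5, and latent size q=2 are assumptions for illustration, not values from the chapter):

```python
import numpy as np

# Minimal PCA sketch: project D-dimensional observations onto
# their top-q principal axes (synthetic data, for illustration only).
rng = np.random.default_rng(0)
Y = rng.normal(size=(100, 5))          # N=100 points in D=5 dimensions (assumed)
Yc = Y - Y.mean(axis=0)                # center the data
C = Yc.T @ Yc / len(Yc)                # sample covariance matrix (D x D)
eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
q = 2                                  # target latent dimensionality (assumed)
W = eigvecs[:, -q:][:, ::-1]           # top-q principal directions, largest first
X = Yc @ W                             # low-dimensional representation (N x q)
print(X.shape)                         # (100, 2)
```

Probabilistic PCA, and in turn GP-LVM, reinterpret this projection within a latent-variable model, which is the route the review follows.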
Dimensionality reduction; Gaussian processes; Latent variable models
Field INFO-01/A - Computer Science
30 Sep 2019
Book Part (author)
Files in this record:
10.1201:9780429340710-10.pdf — Publisher's version/PDF, Adobe PDF, 1.67 MB. Restricted access; no license. Copy available on request.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/1216662
Citations
  • Scopus: not available
  • OpenAlex: 2