We extend to more general contrast functions a method to speed up kurtosis-based FastICA in the presence of information redundancy, i.e., for large samples. It consists in randomly decimating the data set as much as possible while preserving the quality of the reconstructed signals. By analyzing the kurtosis estimator, we derive the maximum reduction rate that guarantees a narrow confidence interval for this estimator at a high confidence level. This rate depends on a parameter beta, easily computed a priori by combining the fourth and eighth norms of the observations. We generalize the pruning method to FastICA based on nonpolynomial contrast functions, using the same parameter beta to validate it for these functions as well. Extensive simulations have been carried out on different sets of real-world signals using the best-performing contrast functions. They show that the pruning technique is remarkably robust with respect to the choice of the function. In fact, the sample-size reduction is very high, preserves the quality of the decomposition, and markedly speeds up FastICA for all the optimization functions considered. On the other hand, the simulations also show that decimating the data beyond the rate fixed by beta compromises the decomposition ability of FastICA, thus confirming the reliability of the parameter beta.

Extending Mixture Random Pruning to Nonpolynomial Contrast Functions in FastICA / S. Gaito, G. Grossi - In: ISSPIT 2007 : 2007 IEEE International Symposium on Signal Processing and Information Technology, December 15-18, 2007, Cairo, Egypt. - Piscataway : IEEE Computer Society, 2007. - ISBN 9781424418350. - pp. 334-338 (( IEEE International Symposium on Signal Processing and Information Technology, held in Cairo, Egypt, in 2007 [10.1109/ISSPIT.2007.4458101].

Extending Mixture Random Pruning to Nonpolynomial Contrast Functions in FastICA

S. Gaito (first author); G. Grossi (last author)
2007

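The abstract describes the pruning idea (randomly decimate a large sample while the kurtosis estimate stays reliable) but does not give the closed form of beta. The following minimal Python sketch only illustrates that idea: it decimates a large super-Gaussian sample (a Laplace distribution is assumed here as a stand-in for an observed mixture) and compares the kurtosis estimate on the pruned set with the full-sample value; it also computes the fourth- and eighth-moment quantities that beta combines. The keep ratio of 0.1 is an arbitrary illustrative choice, and the paper's actual beta formula is deliberately not reproduced.

```python
import numpy as np

def sample_kurtosis(x):
    # Excess-kurtosis contrast used by kurtosis-based FastICA:
    # kurt(x) = E[x^4] - 3 for zero-mean, unit-variance data.
    x = (x - x.mean()) / x.std()
    return np.mean(x**4) - 3.0

def random_prune(x, keep_ratio, rng):
    # Randomly decimate the sample, keeping a fraction `keep_ratio`.
    n_keep = max(1, int(len(x) * keep_ratio))
    idx = rng.choice(len(x), size=n_keep, replace=False)
    return x[idx]

rng = np.random.default_rng(0)
# Large super-Gaussian sample (Laplace) standing in for an observation.
x = rng.laplace(size=200_000)

# Moments entering the paper's parameter beta
# (the exact combination is not given in the abstract).
m4 = np.mean(x**4)   # fourth-power moment
m8 = np.mean(x**8)   # eighth-power moment

full = sample_kurtosis(x)
pruned = sample_kurtosis(random_prune(x, keep_ratio=0.1, rng=rng))
print(f"kurtosis: full={full:.3f}, pruned={pruned:.3f}, "
      f"|diff|={abs(full - pruned):.3f}")
```

For a strongly super-Gaussian source such as this one, the pruned estimate remains clearly positive, so the sign-based separation decisions in FastICA are unaffected by the decimation; the paper's contribution is the a-priori bound on how far this decimation can be pushed.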
Academic field: Settore INF/01 - Informatica (Computer Science)
2007
Book Part (author)
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/41931
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science (ISI): 0