We extend to more general contrast functions a method for speeding up kurtosis-based FastICA in the presence of information redundancy, i.e., for large samples. The method consists in randomly decimating the data set as much as possible while preserving the quality of the reconstructed signals. By analyzing the kurtosis estimator, we determine the maximum reduction rate that guarantees a narrow confidence interval for the estimator at a high confidence level. This rate depends on a parameter beta, easily computed a priori by combining the fourth and the eighth norms of the observations. We generalize the pruning method to FastICA based on nonpolynomial contrast functions, using the same parameter beta in order to validate it for such functions as well. Extensive simulations on several sets of real-world signals, using the best-performing contrast functions, show that the pruning technique is remarkably robust with respect to the choice of the function. Indeed, the sample-size reduction is very high, preserves the quality of the decomposition, and dramatically speeds up FastICA for all the optimization functions considered. Conversely, the simulations also show that decimating the data beyond the rate fixed by beta compromises the decomposition ability of FastICA, thus validating the reliability of the parameter beta.
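The abstract states that beta combines the fourth and eighth norms of the observations to bound how far the sample can be decimated while keeping the kurtosis estimate inside a narrow confidence interval. The exact definition of beta is not given here; as a hedged illustration of why those two moments matter, the sketch below uses the standard fact that the variance of the sample mean of x^4 (the moment driving the kurtosis estimate) is (E[x^8] - E[x^4]^2)/N, and applies a Chebyshev bound to get a minimum sample size. The function name and the eps/delta parameters are our own, not the paper's.

```python
import numpy as np

def min_sample_size(x, eps=0.05, delta=0.01):
    """Chebyshev-style lower bound on the number of samples N needed so
    that the sample mean of x**4 stays within eps of its expectation with
    probability at least 1 - delta.

    Illustrative only: the paper's parameter beta also combines the fourth
    and eighth norms of the observations, but its precise formula is not
    reproduced here.
    """
    m4 = np.mean(x ** 4)      # fourth sample moment (fourth norm^4)
    m8 = np.mean(x ** 8)      # eighth sample moment (eighth norm^8)
    var4 = m8 - m4 ** 2       # variance of the random variable x**4
    # Chebyshev: P(|mean - E| >= eps) <= var4 / (N * eps^2) <= delta
    return int(np.ceil(var4 / (delta * eps ** 2)))
```

A heavier-tailed signal inflates the eighth moment relative to the fourth, so the bound demands more samples, matching the intuition that such signals tolerate less decimation.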
|Title:||Extending Mixture Random Pruning to Nonpolynomial Contrast Functions in FastICA|
|Internal authors:||GROSSI, GIULIANO (Last)|
GAITO, SABRINA TIZIANA (First)
|Scientific disciplinary sector:||INF/01 - Computer Science|
|Publication date:||2007|
|Digital Object Identifier (DOI):||10.1109/ISSPIT.2007.4458101|
|Type:||Book Part (author)|
|Appears in publication types:||03 - Book chapter|