Independent component analysis / Aapo Hyvärinen, Juha Karhunen, Erkki Oja
Author | Hyvärinen, Aapo
Publication/distribution/printing | New York : J. Wiley, c2001
Physical description | 1 online resource (505 p.)
Discipline | 519.5/35
Other authors (Persons) |
Karhunen, Juha
Oja, Erkki
Series | Adaptive and learning systems for signal processing, communications, and control
Topical subject |
Multivariate analysis
Principal components analysis
ISBN |
1-280-26480-2
9786610264803
0-470-30861-3
0-471-46419-8
0-471-22131-7
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Contents; Preface; 1 Introduction; 1.1 Linear representation of multivariate data; 1.1.1 The general statistical setting; 1.1.2 Dimension reduction methods; 1.1.3 Independence as a guiding principle; 1.2 Blind source separation; 1.2.1 Observing mixtures of unknown signals; 1.2.2 Source separation based on independence; 1.3 Independent component analysis; 1.3.1 Definition; 1.3.2 Applications; 1.3.3 How to find the independent components; 1.4 History of ICA; Part I: MATHEMATICAL PRELIMINARIES; 2 Random Vectors and Independence; 2.1 Probability distributions and densities
2.2 Expectations and moments; 2.3 Uncorrelatedness and independence; 2.4 Conditional densities and Bayes' rule; 2.5 The multivariate gaussian density; 2.6 Density of a transformation; 2.7 Higher-order statistics; 2.8 Stochastic processes *; 2.9 Concluding remarks and references; Problems; 3 Gradients and Optimization Methods; 3.1 Vector and matrix gradients; 3.2 Learning rules for unconstrained optimization; 3.3 Learning rules for constrained optimization; 3.4 Concluding remarks and references; Problems; 4 Estimation Theory; 4.1 Basic concepts; 4.2 Properties of estimators; 4.3 Method of moments; 4.4 Least-squares estimation; 4.5 Maximum likelihood method; 4.6 Bayesian estimation *; 4.7 Concluding remarks and references; Problems; 5 Information Theory; 5.1 Entropy; 5.2 Mutual information; 5.3 Maximum entropy; 5.4 Negentropy; 5.5 Approximation of entropy by cumulants; 5.6 Approximation of entropy by nonpolynomial functions; 5.7 Concluding remarks and references; Problems; Appendix proofs; 6 Principal Component Analysis and Whitening; 6.1 Principal components; 6.2 PCA by on-line learning; 6.3 Factor analysis; 6.4 Whitening; 6.5 Orthogonalization; 6.6 Concluding remarks and references; Problems; Part II: BASIC INDEPENDENT COMPONENT ANALYSIS; 7 What is Independent Component Analysis?; 7.1 Motivation; 7.2 Definition of independent component analysis; 7.3 Illustration of ICA; 7.4 ICA is stronger than whitening; 7.5 Why gaussian variables are forbidden; 7.6 Concluding remarks and references; Problems; 8 ICA by Maximization of Nongaussianity; 8.1 "Nongaussian is independent"; 8.2 Measuring nongaussianity by kurtosis; 8.3 Measuring nongaussianity by negentropy; 8.4 Estimating several independent components; 8.5 ICA and projection pursuit; 8.6 Concluding remarks and references; Problems; Appendix proofs; 9 ICA by Maximum Likelihood Estimation; 9.1 The likelihood of the ICA model; 9.2 Algorithms for maximum likelihood estimation; 9.3 The infomax principle; 9.4 Examples; 9.5 Concluding remarks and references; Problems; Appendix proofs; 10 ICA by Minimization of Mutual Information; 10.1 Defining ICA by mutual information; 10.2 Mutual information and nongaussianity; 10.3 Mutual information and likelihood; 10.4 Algorithms for minimization of mutual information; 10.5 Examples; 10.6 Concluding remarks and references; Problems; 11 ICA by Tensorial Methods
Record no. | UNINA-9910143176003321
Held at: Univ. Federico II
Independent component analysis [electronic resource] / Aapo Hyvärinen, Juha Karhunen, Erkki Oja
Author | Hyvärinen, Aapo
Publication/distribution/printing | New York : J. Wiley, c2001
Physical description | 1 online resource (505 p.)
Discipline |
519.5
519.5/35
519.535
Other authors (Persons) |
Karhunen, Juha
Oja, Erkki
Series | Adaptive and learning systems for signal processing, communications, and control
Topical subject |
Multivariate analysis
Principal components analysis
ISBN |
1-280-26480-2
9786610264803
0-470-30861-3
0-471-46419-8
0-471-22131-7
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Contents; Preface; 1 Introduction; 1.1 Linear representation of multivariate data; 1.1.1 The general statistical setting; 1.1.2 Dimension reduction methods; 1.1.3 Independence as a guiding principle; 1.2 Blind source separation; 1.2.1 Observing mixtures of unknown signals; 1.2.2 Source separation based on independence; 1.3 Independent component analysis; 1.3.1 Definition; 1.3.2 Applications; 1.3.3 How to find the independent components; 1.4 History of ICA; Part I: MATHEMATICAL PRELIMINARIES; 2 Random Vectors and Independence; 2.1 Probability distributions and densities
2.2 Expectations and moments; 2.3 Uncorrelatedness and independence; 2.4 Conditional densities and Bayes' rule; 2.5 The multivariate gaussian density; 2.6 Density of a transformation; 2.7 Higher-order statistics; 2.8 Stochastic processes *; 2.9 Concluding remarks and references; Problems; 3 Gradients and Optimization Methods; 3.1 Vector and matrix gradients; 3.2 Learning rules for unconstrained optimization; 3.3 Learning rules for constrained optimization; 3.4 Concluding remarks and references; Problems; 4 Estimation Theory; 4.1 Basic concepts; 4.2 Properties of estimators; 4.3 Method of moments; 4.4 Least-squares estimation; 4.5 Maximum likelihood method; 4.6 Bayesian estimation *; 4.7 Concluding remarks and references; Problems; 5 Information Theory; 5.1 Entropy; 5.2 Mutual information; 5.3 Maximum entropy; 5.4 Negentropy; 5.5 Approximation of entropy by cumulants; 5.6 Approximation of entropy by nonpolynomial functions; 5.7 Concluding remarks and references; Problems; Appendix proofs; 6 Principal Component Analysis and Whitening; 6.1 Principal components; 6.2 PCA by on-line learning; 6.3 Factor analysis; 6.4 Whitening; 6.5 Orthogonalization; 6.6 Concluding remarks and references; Problems; Part II: BASIC INDEPENDENT COMPONENT ANALYSIS; 7 What is Independent Component Analysis?; 7.1 Motivation; 7.2 Definition of independent component analysis; 7.3 Illustration of ICA; 7.4 ICA is stronger than whitening; 7.5 Why gaussian variables are forbidden; 7.6 Concluding remarks and references; Problems; 8 ICA by Maximization of Nongaussianity; 8.1 "Nongaussian is independent"; 8.2 Measuring nongaussianity by kurtosis; 8.3 Measuring nongaussianity by negentropy; 8.4 Estimating several independent components; 8.5 ICA and projection pursuit; 8.6 Concluding remarks and references; Problems; Appendix proofs; 9 ICA by Maximum Likelihood Estimation; 9.1 The likelihood of the ICA model; 9.2 Algorithms for maximum likelihood estimation; 9.3 The infomax principle; 9.4 Examples; 9.5 Concluding remarks and references; Problems; Appendix proofs; 10 ICA by Minimization of Mutual Information; 10.1 Defining ICA by mutual information; 10.2 Mutual information and nongaussianity; 10.3 Mutual information and likelihood; 10.4 Algorithms for minimization of mutual information; 10.5 Examples; 10.6 Concluding remarks and references; Problems; 11 ICA by Tensorial Methods
Record no. | UNISA-996201887503316
Held at: Univ. di Salerno