LEADER 05456nam 2200661Ia 450
001 9910143176003321
005 20200520144314.0
010 $a1-280-26480-2
010 $a9786610264803
010 $a0-470-30861-3
010 $a0-471-46419-8
010 $a0-471-22131-7
035 $a(CKB)111087027125454
035 $a(EBL)215162
035 $a(SSID)ssj0000080442
035 $a(PQKBManifestationID)11110574
035 $a(PQKBTitleCode)TC0000080442
035 $a(PQKBWorkID)10095249
035 $a(PQKB)10049759
035 $a(MiAaPQ)EBC215162
035 $a(OCoLC)85820929
035 $a(EXLCZ)99111087027125454
100 $a20010412d2001 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 10$aIndependent component analysis /$fAapo Hyvarinen, Juha Karhunen, Erkki Oja
210 $aNew York $cJ. Wiley$dc2001
215 $a1 online resource (505 p.)
225 1 $aAdaptive and learning systems for signal processing, communications, and control
300 $aDescription based upon print version of record.
311 $a0-471-40540-X
320 $aIncludes bibliographical references (p. 449-475) and index.
327 $aContents; Preface; 1 Introduction; 1.1 Linear representation of multivariate data; 1.1.1 The general statistical setting; 1.1.2 Dimension reduction methods; 1.1.3 Independence as a guiding principle; 1.2 Blind source separation; 1.2.1 Observing mixtures of unknown signals; 1.2.2 Source separation based on independence; 1.3 Independent component analysis; 1.3.1 Definition; 1.3.2 Applications; 1.3.3 How to find the independent components; 1.4 History of ICA; Part I: MATHEMATICAL PRELIMINARIES; 2 Random Vectors and Independence; 2.1 Probability distributions and densities
327 $a2.2 Expectations and moments; 2.3 Uncorrelatedness and independence; 2.4 Conditional densities and Bayes' rule; 2.5 The multivariate gaussian density; 2.6 Density of a transformation; 2.7 Higher-order statistics; 2.8 Stochastic processes *; 2.9 Concluding remarks and references; Problems; 3 Gradients and Optimization Methods; 3.1 Vector and matrix gradients; 3.2 Learning rules for unconstrained optimization; 3.3 Learning rules for constrained optimization; 3.4 Concluding remarks and references; Problems; 4 Estimation Theory; 4.1 Basic concepts; 4.2 Properties of estimators
327 $a4.3 Method of moments; 4.4 Least-squares estimation; 4.5 Maximum likelihood method; 4.6 Bayesian estimation *; 4.7 Concluding remarks and references; Problems; 5 Information Theory; 5.1 Entropy; 5.2 Mutual information; 5.3 Maximum entropy; 5.4 Negentropy; 5.5 Approximation of entropy by cumulants; 5.6 Approximation of entropy by nonpolynomial functions; 5.7 Concluding remarks and references; Problems; Appendix proofs; 6 Principal Component Analysis and Whitening; 6.1 Principal components; 6.2 PCA by on-line learning; 6.3 Factor analysis; 6.4 Whitening; 6.5 Orthogonalization
327 $a6.6 Concluding remarks and references; Problems; Part II: BASIC INDEPENDENT COMPONENT ANALYSIS; 7 What is Independent Component Analysis?; 7.1 Motivation; 7.2 Definition of independent component analysis; 7.3 Illustration of ICA; 7.4 ICA is stronger than whitening; 7.5 Why gaussian variables are forbidden; 7.6 Concluding remarks and references; Problems; 8 ICA by Maximization of Nongaussianity; 8.1 "Nongaussian is independent"; 8.2 Measuring nongaussianity by kurtosis; 8.3 Measuring nongaussianity by negentropy; 8.4 Estimating several independent components; 8.5 ICA and projection pursuit
327 $a8.6 Concluding remarks and references; Problems; Appendix proofs; 9 ICA by Maximum Likelihood Estimation; 9.1 The likelihood of the ICA model; 9.2 Algorithms for maximum likelihood estimation; 9.3 The infomax principle; 9.4 Examples; 9.5 Concluding remarks and references; Problems; Appendix proofs; 10 ICA by Minimization of Mutual Information; 10.1 Defining ICA by mutual information; 10.2 Mutual information and nongaussianity; 10.3 Mutual information and likelihood; 10.4 Algorithms for minimization of mutual information; 10.5 Examples; 10.6 Concluding remarks and references; Problems
327 $a11 ICA by Tensorial Methods
330 $aA comprehensive introduction to ICA for students and practitioners. Independent Component Analysis (ICA) is one of the most exciting new topics in fields such as neural networks, advanced statistics, and signal processing. This is the first book to provide a comprehensive introduction to this new technique complete with the fundamental mathematical background needed to understand and utilize it. It offers a general overview of the basics of ICA, important solutions and algorithms, and in-depth coverage of new applications in image processing, telecommunications, audio signal processing, and
410 0$aAdaptive and learning systems for signal processing, communications, and control.
606 $aMultivariate analysis
606 $aPrincipal components analysis
615 0$aMultivariate analysis.
615 0$aPrincipal components analysis.
676 $a519.5/35
700 $aHyvarinen$b Aapo$00
701 $aKarhunen$b Juha$067632
701 $aOja$b Erkki$067633
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910143176003321
996 $aIndependent component analysis$942343
997 $aUNINA