Record 1 of 2 — VAN0056466 (Polo VAN / SOL; record created 2006-11-22, last updated 2022-12-02; cataloging rules RICA)

Title: Modifiche al Codice di procedura civile : D.L. 5 maggio 1948, n. 483 modificato con Legge 14-7-1950, n. 581 e D.P.R. 17-10-1950, n. 857 recante le disposizioni di coordinamento, di attuazione e transitorie
Translated title: Amendments to the Code of Civil Procedure: Decree-Law No. 483 of 5 May 1948, as amended by Law No. 581 of 14 July 1950 and D.P.R. No. 857 of 17 October 1950, containing the coordinating, implementing, and transitional provisions
Publication: S. Maria C. V. : E. Schiano, stampa 1950
Physical description: LII, 74 p. ; 21 cm
Language: Italian
Subject: Codice di procedura civile (VANC001940)
Place: Santa Maria Capua Vetere (VANL000591), Italia (VANV001942)
Publisher (authority form): Schiano (VANV110676)
Related names: Repubblica italiana (VANV057868); Regno d'Italia <1861-1946> (VANV057870); Italia <Regno ; 1861-1946> (VANV057871); Italia <Repubblica ; 1946- > (VANV057873)
Holdings: Biblioteca del Dipartimento di Giurisprudenza (IT-CE0105), shelfmark CONS XVI.A.55, inventory FP 29669 (2006-11-22), Fondo Raffaele Papa
Related record: Modifiche al Codice di procedura civile, 1424497 (UNISOB)

Record 2 of 2 — 9911020098903321 (UNINA; record created 2001-04-12, last updated 2020-05-20)

Title: Independent component analysis / Aapo Hyvarinen, Juha Karhunen, Erkki Oja
Publication: New York : J. Wiley, c2001
Physical description: 1 online resource (505 p.)
Series: Adaptive and learning systems for signal processing, communications, and control
Language: English
ISBN: 9786610264803; 9781280264801 (1280264802); 9780470308615 (0470308613); 9780471464198 (0471464198); 9780471221319 (0471221317)
Print version ISBN: 9780471405405 (047140540X)
Other identifiers: (CKB)111087027125454; (EBL)215162; (SSID)ssj0000080442; (PQKBManifestationID)11110574; (PQKBTitleCode)TC0000080442; (PQKBWorkID)10095249; (PQKB)10049759; (MiAaPQ)EBC215162; (OCoLC)85820929; (Perlego)2762399; (EXLCZ)99111087027125454
Notes: Description based upon print version of record. Includes bibliographical references (p. 449-475) and index.
Contents:
  Preface
  1 Introduction: 1.1 Linear representation of multivariate data (1.1.1 The general statistical setting; 1.1.2 Dimension reduction methods; 1.1.3 Independence as a guiding principle); 1.2 Blind source separation (1.2.1 Observing mixtures of unknown signals; 1.2.2 Source separation based on independence); 1.3 Independent component analysis (1.3.1 Definition; 1.3.2 Applications; 1.3.3 How to find the independent components); 1.4 History of ICA
  Part I: MATHEMATICAL PRELIMINARIES
  2 Random Vectors and Independence: 2.1 Probability distributions and densities; 2.2 Expectations and moments; 2.3 Uncorrelatedness and independence; 2.4 Conditional densities and Bayes' rule; 2.5 The multivariate gaussian density; 2.6 Density of a transformation; 2.7 Higher-order statistics; 2.8 Stochastic processes *; 2.9 Concluding remarks and references; Problems
  3 Gradients and Optimization Methods: 3.1 Vector and matrix gradients; 3.2 Learning rules for unconstrained optimization; 3.3 Learning rules for constrained optimization; 3.4 Concluding remarks and references; Problems
  4 Estimation Theory: 4.1 Basic concepts; 4.2 Properties of estimators; 4.3 Method of moments; 4.4 Least-squares estimation; 4.5 Maximum likelihood method; 4.6 Bayesian estimation *; 4.7 Concluding remarks and references; Problems
  5 Information Theory: 5.1 Entropy; 5.2 Mutual information; 5.3 Maximum entropy; 5.4 Negentropy; 5.5 Approximation of entropy by cumulants; 5.6 Approximation of entropy by nonpolynomial functions; 5.7 Concluding remarks and references; Problems; Appendix proofs
  6 Principal Component Analysis and Whitening: 6.1 Principal components; 6.2 PCA by on-line learning; 6.3 Factor analysis; 6.4 Whitening; 6.5 Orthogonalization; 6.6 Concluding remarks and references; Problems
  Part II: BASIC INDEPENDENT COMPONENT ANALYSIS
  7 What is Independent Component Analysis?: 7.1 Motivation; 7.2 Definition of independent component analysis; 7.3 Illustration of ICA; 7.4 ICA is stronger than whitening; 7.5 Why gaussian variables are forbidden; 7.6 Concluding remarks and references; Problems
  8 ICA by Maximization of Nongaussianity: 8.1 "Nongaussian is independent"; 8.2 Measuring nongaussianity by kurtosis; 8.3 Measuring nongaussianity by negentropy; 8.4 Estimating several independent components; 8.5 ICA and projection pursuit; 8.6 Concluding remarks and references; Problems; Appendix proofs
  9 ICA by Maximum Likelihood Estimation: 9.1 The likelihood of the ICA model; 9.2 Algorithms for maximum likelihood estimation; 9.3 The infomax principle; 9.4 Examples; 9.5 Concluding remarks and references; Problems; Appendix proofs
  10 ICA by Minimization of Mutual Information: 10.1 Defining ICA by mutual information; 10.2 Mutual information and nongaussianity; 10.3 Mutual information and likelihood; 10.4 Algorithms for minimization of mutual information; 10.5 Examples; 10.6 Concluding remarks and references; Problems
  11 ICA by Tensorial Methods
Summary: A comprehensive introduction to ICA for students and practitioners. Independent Component Analysis (ICA) is one of the most exciting new topics in fields such as neural networks, advanced statistics, and signal processing. This is the first book to provide a comprehensive introduction to this new technique complete with the fundamental mathematical background needed to understand and utilize it. It offers a general overview of the basics of ICA, important solutions and algorithms, and in-depth coverage of new applications in image processing, telecommunications, audio signal processing, and …
Subjects: Multivariate analysis; Principal components analysis
Dewey classification: 519.5/35
Authors: Hyvarinen, Aapo (352634); Karhunen, Juha (67632); Oja, Erkki (67633)
Cataloging source: MiAaPQ
Material type: BOOK
Linked title record: Independent component analysis, 42343 (UNINA)
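The contents note above outlines the book's core pipeline: whiten the observed mixtures (Chapter 6, Principal Component Analysis and Whitening) and then estimate the independent components by maximizing nongaussianity with a fixed-point iteration (Chapter 8). The sketch below illustrates that idea on synthetic data; it is a minimal NumPy illustration written for this record, not code from the book, and the toy signals, the tanh contrast function, the iteration count, and all variable names are assumptions chosen for the example.

```python
# Minimal sketch: whitening + a FastICA-style fixed-point update.
# Illustrative only; signals, contrast function, and names are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Two independent, nongaussian sources: a sine wave and a square wave.
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * np.pi * t), np.sign(np.sin(3 * np.pi * t))]

# Observe linear mixtures X = S A^T with an unknown mixing matrix A.
A = np.array([[1.0, 0.5], [0.4, 1.2]])
X = S @ A.T

# Step 1: center and whiten the data (unit covariance).
X -= X.mean(axis=0)
cov = np.cov(X, rowvar=False)
d, E = np.linalg.eigh(cov)
Z = X @ E @ np.diag(1.0 / np.sqrt(d)) @ E.T   # whitened observations

# Step 2: fixed-point iteration maximizing nongaussianity with g(u) = tanh(u),
# followed by symmetric decorrelation of the unmixing matrix W.
n = Z.shape[1]
W = rng.standard_normal((n, n))
for _ in range(200):
    U = Z @ W.T                                # current estimated components
    G, Gp = np.tanh(U), 1.0 - np.tanh(U) ** 2
    W_new = (G.T @ Z) / len(Z) - np.diag(Gp.mean(axis=0)) @ W
    # Symmetric orthogonalization: W <- (W W^T)^(-1/2) W
    d2, E2 = np.linalg.eigh(W_new @ W_new.T)
    W = E2 @ np.diag(1.0 / np.sqrt(d2)) @ E2.T @ W_new

S_est = Z @ W.T   # recovered sources
```

On this two-source example the columns of S_est match the original sine and square waves up to sign and ordering, which is the standard indeterminacy of the ICA model.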