LEADER 00885cam0-22003131i-450-
001 990000209690403321
005 20060725144054.0
035 $a000020969
035 $aFED01000020969
035 $a(Aleph)000020969FED01
035 $a000020969
100 $a20020821d1920----km-y0itay50------ba
101 0 $aita
105 $ay-------001yy
200 1 $aCorso di statistica grafica$fAlfredo De Nora
210 $aNapoli$cR. Pironti$d1920
215 $av.$d25 cm
300 $aIn testa al front.: R. Scuola superiore politecnica di Napoli
610 0 $aStatica grafica
676 $a624.171 2
700 1$aDe Nora,$bAlfredo$04567
801 0$aIT$bUNINA$gRICA$2UNIMARC
901 $aBK
912 $a990000209690403321
952 $a13 L 32 28$b3950$fFINBC
952 $a13 L 32 27$b3949$fFINBC
959 $aFINBC
996 $aCorso di statistica grafica$9123408
997 $aUNINA

LEADER 05576nam 2200745Ia 450
001 9910828712603321
005 20240313225027.0
010 $a9781118573662
010 $a1118573668
010 $a9781118573617
010 $a1118573617
010 $a9781118573655
010 $a111857365X
035 $a(CKB)2670000000360075
035 $a(EBL)1204740
035 $a(OCoLC)850164833
035 $a(SSID)ssj0000887839
035 $a(PQKBManifestationID)11459813
035 $a(PQKBTitleCode)TC0000887839
035 $a(PQKBWorkID)10846595
035 $a(PQKB)10920269
035 $a(MiAaPQ)EBC1204740
035 $a(DLC) 2013012679
035 $a(Au-PeEL)EBL1204740
035 $a(CaPaEBR)ebr10716695
035 $a(PPN)191470961
035 $a(Perlego)1001604
035 $a(EXLCZ)992670000000360075
100 $a20130322d2013 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 10$aHigh-dimensional covariance estimation /$fMohsen Pourahmadi
205 $a1st ed.
210 $aHoboken, NJ$cWiley$dc2013
215 $a1 online resource (208 p.)
225 0 $aWiley series in probability and statistics
300 $aDescription based upon print version of record.
311 08$a9781118034293
311 08$a1118034295
320 $aIncludes bibliographical references and index.
327 $aHIGH-DIMENSIONAL COVARIANCE ESTIMATION; CONTENTS; PREFACE; I MOTIVATION AND THE BASICS; 1 INTRODUCTION; 1.1 Least Squares and Regularized Regression; 1.2 Lasso: Survival of the Bigger; 1.3 Thresholding the Sample Covariance Matrix; 1.4 Sparse PCA and Regression; 1.5 Graphical Models: Nodewise Regression; 1.6 Cholesky Decomposition and Regression; 1.7 The Bigger Picture: Latent Factor Models; 1.8 Further Reading; 2 DATA, SPARSITY, AND REGULARIZATION; 2.1 Data Matrix: Examples; 2.2 Shrinking the Sample Covariance Matrix; 2.3 Distribution of the Sample Eigenvalues
327 $a2.4 Regularizing Covariances Like a Mean; 2.5 The Lasso Regression; 2.6 Lasso: Variable Selection and Prediction; 2.7 Lasso: Degrees of Freedom and BIC; 2.8 Some Alternatives to the Lasso Penalty; 3 COVARIANCE MATRICES; 3.1 Definition and Basic Properties; 3.2 The Spectral Decomposition; 3.3 Structured Covariance Matrices; 3.4 Functions of a Covariance Matrix; 3.5 PCA: The Maximum Variance Property; 3.6 Modified Cholesky Decomposition; 3.7 Latent Factor Models; 3.8 GLM for Covariance Matrices; 3.9 GLM via the Cholesky Decomposition; 3.10 GLM for Incomplete Longitudinal Data
327 $a3.10.1 The Incoherency Problem in Incomplete Longitudinal Data; 3.10.2 The Incomplete Data and The EM Algorithm; 3.11 A Data Example: Fruit Fly Mortality Rate; 3.12 Simulating Random Correlation Matrices; 3.13 Bayesian Analysis of Covariance Matrices; II COVARIANCE ESTIMATION: REGULARIZATION; 4 REGULARIZING THE EIGENSTRUCTURE; 4.1 Shrinking the Eigenvalues; 4.2 Regularizing The Eigenvectors; 4.3 A Duality between PCA and SVD; 4.4 Implementing Sparse PCA: A Data Example; 4.5 Sparse Singular Value Decomposition (SSVD); 4.6 Consistency of PCA; 4.7 Principal Subspace Estimation; 4.8 Further Reading
327 $a5 SPARSE GAUSSIAN GRAPHICAL MODELS; 5.1 Covariance Selection Models: Two Examples; 5.2 Regression Interpretation of Entries of Σ-1; 5.3 Penalized Likelihood and Graphical Lasso; 5.4 Penalized Quasi-Likelihood Formulation; 5.5 Penalizing the Cholesky Factor; 5.6 Consistency and Sparsistency; 5.7 Joint Graphical Models; 5.8 Further Reading; 6 BANDING, TAPERING, AND THRESHOLDING; 6.1 Banding the Sample Covariance Matrix; 6.2 Tapering the Sample Covariance Matrix; 6.3 Thresholding the Sample Covariance Matrix; 6.4 Low-Rank Plus Sparse Covariance Matrices; 6.5 Further Reading
327 $a7 MULTIVARIATE REGRESSION: ACCOUNTING FOR CORRELATION; 7.1 Multivariate Regression and LS Estimators; 7.2 Reduced Rank Regressions (RRR); 7.3 Regularized Estimation of B; 7.4 Joint Regularization of (B, Ω); 7.5 Implementing MRCE: Data Examples; 7.5.1 Intraday Electricity Prices; 7.5.2 Predicting Asset Returns; 7.6 Further Reading; BIBLIOGRAPHY; INDEX; WILEY SERIES IN PROBABILITY AND STATISTICS
330 $a"Focusing on methodology and computation more than on theorems and proofs, this book provides computationally feasible and statistically efficient methods for estimating sparse and large covariance matrices of high-dimensional data. Extensive in breadth and scope, it features ample applications to a number of applied areas, including business and economics, computer science, engineering, and financial mathematics; recognizes the important and significant contributions of longitudinal and spatial data; and includes various computer codes in R throughout the text and on an author-maintained web site"--$cProvided by publisher.
330 $a"The aim of this book is to provide computationally feasible and statistically efficient methods for estimating sparse and large covariance matrices of high-dimensional data"--$cProvided by publisher.
410 0$aWiley Series in Probability and Statistics
606 $aAnalysis of covariance
606 $aMultivariate analysis
615 0$aAnalysis of covariance.
615 0$aMultivariate analysis.
676 $a519.5/38
686 $aMAT029020$2bisacsh
700 $aPourahmadi$bMohsen$0496526
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910828712603321
996 $aHigh-dimensional covariance estimation$94036715
997 $aUNINA