LEADER 03448nam 22005775 450
001 9911031669703321
005 20251001130526.0
010 $a3-031-95785-7
024 7 $a10.1007/978-3-031-95785-7
035 $a(MiAaPQ)EBC32323333
035 $a(Au-PeEL)EBL32323333
035 $a(CKB)41528493200041
035 $a(OCoLC)1543044799
035 $a(DE-He213)978-3-031-95785-7
035 $a(EXLCZ)9941528493200041
100 $a20251001d2025 u| 0
101 0 $aeng
135 $aurcnu||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aLinear Dimensionality Reduction /$fby Alain Franc
205 $a1st ed. 2025.
210 1$aCham :$cSpringer Nature Switzerland :$cImprint: Springer,$d2025.
215 $a1 online resource (281 pages)
225 1 $aLecture Notes in Statistics,$x2197-7186 ;$v228
311 08$a3-031-95784-9
327 $a1. Introduction -- 2. Principal Component Analysis (PCA) -- 3. Complements on PCA -- 4. PCA with Metrics on Rows and Columns -- 5. Correspondence Analysis -- 6. PCA with Instrumental Variables -- 7. Canonical Correlation Analysis -- 8. Multiple Canonical Correlation Analysis -- 9. Multidimensional Scaling.
330 $aThis book provides an overview of some classical linear methods in multivariate data analysis. This is an old domain, well established since the 1960s and recently refreshed as a key step in statistical learning. It can be presented as part of statistical learning, or as dimensionality reduction with a geometric flavor. The two approaches are tightly linked: it is easier to learn patterns from data in low-dimensional spaces than in high-dimensional ones. The book shows how a diversity of methods and tools boil down to a single core method, PCA computed with the SVD, so that efforts to optimize code for analyzing massive data sets (such as distributed-memory and task-based programming) or to improve the efficiency of algorithms (such as randomized SVD) can focus on this shared core method and benefit all the methods. The book is aimed at graduate students and researchers working on massive data who have encountered the usefulness of linear dimensionality reduction and are looking for a recipe to implement it. It has been written in the view that the best guarantee of a proper understanding and use of a method is to study in detail the calculations involved in implementing it. With an emphasis on the numerical processing of massive data, it covers the main methods of dimensionality reduction, from their linear algebra foundations to the implementation of the calculations. The requisite elements of linear and multilinear algebra, statistics, and randomized algorithms are presented in an appendix.
410 0$aLecture Notes in Statistics,$x2197-7186 ;$v228
606 $aMultivariate analysis
606 $aBig data
606 $aMachine learning
606 $aMultivariate Analysis
606 $aBig Data
606 $aStatistical Learning
615 0$aMultivariate analysis.
615 0$aBig data.
615 0$aMachine learning.
615 14$aMultivariate Analysis.
615 24$aBig Data.
615 24$aStatistical Learning.
676 $a519.535
700 $aFranc$b Alain$01364666
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9911031669703321
996 $aLinear Dimensionality Reduction$94443083
997 $aUNINA
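The abstract's central claim, that the methods covered all boil down to PCA computed via the SVD, can be illustrated with a minimal sketch. This is not code from the book: the function name and the toy data are my own, and it shows only the core step (center the data, take the SVD, keep the leading singular vectors as principal axes).

```python
import numpy as np

def pca_via_svd(X, k):
    """Project the rows of X onto the top-k principal axes using the SVD.

    A minimal sketch of the core method described above: center the
    data matrix, take its SVD, and keep the k leading right singular
    vectors as principal axes.
    """
    Xc = X - X.mean(axis=0)                 # center each column
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :k] * s[:k]               # principal components, shape (n, k)
    axes = Vt[:k]                           # principal axes, shape (k, p)
    return scores, axes

# Toy usage: 5 points in 3 dimensions reduced to 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
scores, axes = pca_via_svd(X, 2)
```

Because the singular values come out in decreasing order, the first score column always carries at least as much variance as the second, which is what makes a single SVD routine a shared computational core for the whole family of methods.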