LEADER 03961nam 22006255 450
001 9910366589703321
005 20251022194905.0
010 $a3-030-17076-4
024 7 $a10.1007/978-3-030-17076-9
035 $a(CKB)4100000008424440
035 $a(DE-He213)978-3-030-17076-9
035 $a(MiAaPQ)EBC5789425
035 $a(PPN)258875615
035 $a(EXLCZ)994100000008424440
100 $a20190612d2020 u| 0
101 0 $aeng
135 $aurnn|008mamaa
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aMathematical Theories of Machine Learning - Theory and Applications /$fby Bin Shi, S. S. Iyengar
205 $a1st ed. 2020.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2020.
215 $a1 online resource (XXI, 133 p. 25 illus., 24 illus. in color.)
311 08$a3-030-17075-6
327 $aChapter 1. Introduction -- Chapter 2. General Framework of Mathematics -- Chapter 3. Problem Formulation -- Chapter 4. Development of Novel Techniques of CoCoSSC Method -- Chapter 5. Further Discussions of the Proposed Method -- Chapter 6. Related Work on Geometry of Non-Convex Programs -- Chapter 7. Gradient Descent Converges to Minimizers -- Chapter 8. A Conservation Law Method Based on Optimization -- Chapter 9. Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations -- Chapter 10. Online Discovery for Stable and Grouping Causalities in Multi-Variate Time Series -- Chapter 11. Conclusion.
330 $aThis book studies mathematical theories of machine learning. The first part of the book explores the optimality and adaptivity of choosing step sizes of gradient descent for escaping strict saddle points in non-convex optimization problems. In the second part, the authors propose algorithms to find local minima in non-convex optimization and to obtain global minima to some degree, based on Newton's second law without friction. In the third part, the authors study the problem of subspace clustering with noisy and missing data, a problem well motivated by practical applications in which data are subject to stochastic Gaussian noise and/or incomplete, with uniformly missing entries. In the last part, the authors introduce a novel VAR model with Elastic-Net regularization and its equivalent Bayesian model, allowing for both stable sparsity and group selection. Provides a thorough look into the variety of mathematical theories of machine learning. Presented in four parts, allowing readers to easily navigate the complex theories. Includes extensive empirical studies on both synthetic and real-world time series data.
606 $aTelecommunication
606 $aComputational intelligence
606 $aData mining
606 $aInformation storage and retrieval systems
606 $aQuantitative research
606 $aCommunications Engineering, Networks
606 $aComputational Intelligence
606 $aData Mining and Knowledge Discovery
606 $aInformation Storage and Retrieval
606 $aData Analysis and Big Data
615 0$aTelecommunication.
615 0$aComputational intelligence.
615 0$aData mining.
615 0$aInformation storage and retrieval systems.
615 0$aQuantitative research.
615 14$aCommunications Engineering, Networks.
615 24$aComputational Intelligence.
615 24$aData Mining and Knowledge Discovery.
615 24$aInformation Storage and Retrieval.
615 24$aData Analysis and Big Data.
676 $a621.382
676 $a006.310151
700 $aShi$b Bin$4aut$4http://id.loc.gov/vocabulary/relators/aut$01061990
702 $aIyengar$b S. S.$4aut$4http://id.loc.gov/vocabulary/relators/aut
906 $aBOOK
912 $a9910366589703321
996 $aMathematical Theories of Machine Learning - Theory and Applications$92521739
997 $aUNINA