Record 1 (print; Università del Salento, LE013):
  Author:         Alperin, J. L.
  Title:          Local representation theory : modular representations as an introduction to the local representation theory of finite groups / J. L. Alperin
  Imprint:        Cambridge : Cambridge University Press, 1986
  Description:    x, 178 p. ; 23 cm.
  Series:         Cambridge studies in advanced mathematics ; 11
  Subjects:       Finite groups; Modular representations of groups
  Language:       English
  ISBN:           0521306604
  Classification: Dewey 512.2; AMS 20C20; LC QA171.A545
  Shelfmark:      LE013 20C ALP 11 (1986)

Record 2 (online resource; UNINA):
  Authors:        Shi, Bin; Iyengar, S. S.
  Title:          Mathematical Theories of Machine Learning - Theory and Applications / by Bin Shi, S. S. Iyengar
  Edition:        1st ed. 2020
  Imprint:        Cham : Springer International Publishing : Imprint: Springer, 2020
  Description:    1 online resource (XXI, 133 p., 25 illus., 24 illus. in color)
  ISBN:           3-030-17076-4 (online); 3-030-17075-6 (print)
  DOI:            10.1007/978-3-030-17076-9
  Language:       English
  Contents:       Chapter 1. Introduction -- Chapter 2. General Framework of Mathematics -- Chapter 3. Problem Formulation -- Chapter 4. Development of Novel Techniques of CoCoSSC Method -- Chapter 5. Further Discussions of the Proposed Method -- Chapter 6. Related Work on Geometry of Non-Convex Programs -- Chapter 7. Gradient Descent Converges to Minimizers -- Chapter 8. A Conservation Law Method Based on Optimization -- Chapter 9. Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations -- Chapter 10. Online Discovery for Stable and Grouping Causalities in Multi-Variate Time Series -- Chapter 11. Conclusion
  Abstract:       This book studies mathematical theories of machine learning.
The first part of the book explores the optimality and adaptivity of step-size choices for gradient descent when escaping strict saddle points in non-convex optimization problems. In the second part, the authors propose algorithms that find local minima in non-convex optimization and, to some degree, obtain global minima, based on Newton's second law without friction. In the third part, the authors study subspace clustering with noisy and missing data, a problem well motivated by practical applications: data subject to stochastic Gaussian noise and/or incomplete data with uniformly missing entries. In the last part, the authors introduce a novel VAR model with Elastic-Net regularization and its equivalent Bayesian model, which allow for both stable sparsity and group selection.

  Key features:
  - Provides a thorough look into the variety of mathematical theories of machine learning
  - Presented in four parts, allowing readers to easily navigate the complex theories
  - Includes extensive empirical studies on both synthetic and real-world time-series data

  Subjects:       Telecommunication; Computational intelligence; Data mining; Information storage and retrieval systems; Quantitative research
  Springer subject categories: Communications Engineering, Networks; Computational Intelligence; Data Mining and Knowledge Discovery; Information Storage and Retrieval; Data Analysis and Big Data
  Classification: Dewey 621.382; 006.310151
  Added entries:  Shi, Bin (author, http://id.loc.gov/vocabulary/relators/aut); Iyengar, S. S. (author, http://id.loc.gov/vocabulary/relators/aut)
  Type:           BOOK (UNINA)
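The theme of the book's first part, gradient descent escaping strict saddle points, can be illustrated with a toy sketch. This is a hypothetical example, not code from the book: the function f(x, y) = x² − y², the step size, and the starting point are all illustrative assumptions.

```python
# Minimal sketch (assumed example, not from the book): gradient descent on
# f(x, y) = x**2 - y**2, which has a strict saddle point at the origin.
# A small perturbation along the unstable y-direction lets the iterates
# escape the saddle instead of converging to it.

def grad_f(x, y):
    """Gradient of f(x, y) = x**2 - y**2."""
    return 2.0 * x, -2.0 * y

def gradient_descent(x, y, eta=0.1, steps=50):
    """Run fixed-step gradient descent from (x, y)."""
    for _ in range(steps):
        gx, gy = grad_f(x, y)
        x -= eta * gx
        y -= eta * gy
    return x, y

# Start near the saddle, slightly perturbed in the escape direction y.
x, y = gradient_descent(1.0, 1e-3)
# The x-coordinate contracts toward 0 (factor 1 - 2*eta per step), while
# |y| grows (factor 1 + 2*eta per step): the iterates leave the saddle.
```

With eta = 0.1, x shrinks by a factor of 0.8 per step while |y| grows by a factor of 1.2 per step, so after 50 steps the iterate is far from the saddle; the book's first part concerns how such step-size choices affect this escape behavior.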