LEADER 03954nam 2200469 450
001 9910799493903321
005 20240119114117.0
010 $a981-9938-41-4
035 $a(CKB)29449601300041
035 $a(MiAaPQ)EBC31051649
035 $a(Au-PeEL)EBL31051649
035 $a(MiAaPQ)EBC31031964
035 $a(Au-PeEL)EBL31031964
035 $a(EXLCZ)9929449601300041
100 $a20240119d2023 uy 0
101 0 $aeng
135 $aur|||||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aWAIC and WBIC with Python Stan$e100 Exercises for Building Logic /$fJoe Suzuki
205 $aFirst edition.
210 1$aSingapore :$cSpringer Nature Singapore Pte Ltd,$d[2023]
210 4$d©2023
215 $a1 online resource (249 pages)
311 08$a9789819938407
320 $aIncludes bibliographical references and index.
327 $aIntro -- Preface: Sumio Watanabe-Spreading the Wonder of Bayesian Theory -- One-Point Advice for Those Who Struggle with Math -- Features of This Series -- Contents -- 1 Overview of Watanabe's Bayes -- 1.1 Frequentist Statistics -- 1.2 Bayesian Statistics -- 1.3 Asymptotic Normality of the Posterior Distribution -- 1.4 Model Selection -- 1.5 Why are WAIC and WBIC Bayesian Statistics? -- 1.6 What is "Regularity" -- 1.7 Why is Algebraic Geometry Necessary for Understanding WAIC and WBIC? -- 1.8 Hironaka's Desingularization, Nothing to Fear -- 1.9 What is the Meaning of Algebraic Geometry's ? in Bayesian Statistics? -- 2 Introduction to Watanabe Bayesian Theory -- 2.1 Prior Distribution, Posterior Distribution, and Predictive Distribution -- 2.2 True Distribution and Statistical Model -- 2.3 Toward a Generalization Without Assuming Regularity -- 2.4 Exponential Family -- 3 MCMC and Stan -- 3.1 MCMC and Metropolis-Hastings Method -- 3.2 Hamiltonian Monte Carlo Method -- 3.3 Stan in Practice -- 3.3.1 Binomial Distribution -- 3.3.2 Normal Distribution -- 3.3.3 Simple Linear Regression -- 3.3.4 Multiple Regression -- 3.3.5 Mixture of Normal Distributions -- 4 Mathematical Preparation -- 4.1 Elementary Mathematics -- 4.1.1 Matrices and Eigenvalues -- 4.1.2 Open Sets, Closed Sets, and Compact Sets -- 4.1.3 Mean Value Theorem and Taylor Expansion -- 4.2 Analytic Functions -- 4.3 Law of Large Numbers and Central Limit Theorem -- 4.3.1 Random Variables -- 4.3.2 Order Notation -- 4.3.3 Law of Large Numbers -- 4.3.4 Central Limit Theorem -- 4.4 Fisher Information Matrix -- 5 Regular Statistical Models -- 5.1 Empirical Process -- 5.2 Asymptotic Normality of the Posterior Distribution -- 5.3 Generalization Loss and Empirical Loss -- 6 Information Criteria -- 6.1 Model Selection Based on Information Criteria -- 6.2 AIC and TIC -- 6.3 WAIC.
327 $a6.4 Free Energy, BIC, and WBIC -- 7 Algebraic Geometry -- 7.1 Algebraic Sets and Analytical Sets -- 7.2 Manifold -- 7.3 Singular Points and Their Resolution -- 7.4 Hironaka's Theorem -- 7.5 Local Coordinates in Watanabe Bayesian Theory -- 8 The Essence of WAIC -- 8.1 Formula of State Density -- 8.2 Generalization of the Posterior Distribution -- 8.3 Properties of WAIC -- 8.4 Equivalence with Cross-Validation-Like Methods -- 9 WBIC and Its Application to Machine Learning -- 9.1 Properties of WBIC -- 9.2 Calculation of the Learning Coefficient -- 9.3 Application to Deep Learning -- 9.4 Application to Gaussian Mixture Models -- 9.5 Non-informative Prior Distribution -- References -- Index.
606 $aBayesian statistical decision theory
606 $aLogic, Symbolic and mathematical
615 0$aBayesian statistical decision theory.
615 0$aLogic, Symbolic and mathematical.
676 $a519.542
700 $aSuzuki$bJoe$0846228
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910799493903321
996 $aWAIC and WBIC with Python Stan$93872434
997 $aUNINA