LEADER 05608nam 22007094a 450
001 9910831039703321
005 20201005145315.0
010 $a1-281-28447-5
010 $a9786611284473
010 $a0-470-19161-9
010 $a0-470-19160-0
035 $a(CKB)1000000000389787
035 $a(EBL)335712
035 $a(OCoLC)476150239
035 $a(SSID)ssj0000145608
035 $a(PQKBManifestationID)11157394
035 $a(PQKBTitleCode)TC0000145608
035 $a(PQKBWorkID)10156308
035 $a(PQKB)10810435
035 $a(MiAaPQ)EBC335712
035 $a(PPN)196871379
035 $a(EXLCZ)991000000000389787
100 $a20070427d2008 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 14$aThe EM algorithm and extensions$b[electronic resource] /$fGeoffrey J. McLachlan, Thriyambakam Krishnan
205 $a2nd ed.
210 $aHoboken, N.J. $cWiley-Interscience$dc2008
215 $a1 online resource (399 p.)
225 1 $aWiley series in probability and statistics
300 $aDescription based upon print version of record.
311 $a0-471-20170-7
320 $aIncludes bibliographical references (p. 311-337) and indexes.
327 $aThe EM Algorithm and Extensions; CONTENTS; PREFACE TO THE SECOND EDITION; PREFACE TO THE FIRST EDITION; LIST OF EXAMPLES; 1 GENERAL INTRODUCTION; 1.1 Introduction; 1.2 Maximum Likelihood Estimation; 1.3 Newton-Type Methods; 1.3.1 Introduction; 1.3.2 Newton-Raphson Method; 1.3.3 Quasi-Newton Methods; 1.3.4 Modified Newton Methods; 1.4 Introductory Examples; 1.4.1 Introduction; 1.4.2 Example 1.1: A Multinomial Example; 1.4.3 Example 1.2: Estimation of Mixing Proportions; 1.5 Formulation of the EM Algorithm; 1.5.1 EM Algorithm; 1.5.2 Example 1.3: Censored Exponentially Distributed Survival Times
327 $a1.5.3 E- and M-Steps for the Regular Exponential Family; 1.5.4 Example 1.4: Censored Exponentially Distributed Survival Times (Example 1.3 Continued); 1.5.5 Generalized EM Algorithm; 1.5.6 GEM Algorithm Based on One Newton-Raphson Step; 1.5.7 EM Gradient Algorithm; 1.5.8 EM Mapping; 1.6 EM Algorithm for MAP and MPL Estimation; 1.6.1 Maximum a Posteriori Estimation; 1.6.2 Example 1.5: A Multinomial Example (Example 1.1 Continued); 1.6.3 Maximum Penalized Estimation; 1.7 Brief Summary of the Properties of the EM Algorithm; 1.8 History of the EM Algorithm; 1.8.1 Early EM History
327 $a1.8.2 Work Before Dempster, Laird, and Rubin (1977); 1.8.3 EM Examples and Applications Since Dempster, Laird, and Rubin (1977); 1.8.4 Two Interpretations of EM; 1.8.5 Developments in EM Theory, Methodology, and Applications; 1.9 Overview of the Book; 1.10 Notations; 2 EXAMPLES OF THE EM ALGORITHM; 2.1 Introduction; 2.2 Multivariate Data with Missing Values; 2.2.1 Example 2.1: Bivariate Normal Data with Missing Values; 2.2.2 Numerical Illustration; 2.2.3 Multivariate Data: Buck's Method; 2.3 Least Squares with Missing Data; 2.3.1 Healy-Westmacott Procedure
327 $a2.3.2 Example 2.2: Linear Regression with Missing Dependent Values; 2.3.3 Example 2.3: Missing Values in a Latin Square Design; 2.3.4 Healy-Westmacott Procedure as an EM Algorithm; 2.4 Example 2.4: Multinomial with Complex Cell Structure; 2.5 Example 2.5: Analysis of PET and SPECT Data; 2.6 Example 2.6: Multivariate t-Distribution (Known D.F.); 2.6.1 ML Estimation of Multivariate t-Distribution; 2.6.2 Numerical Example: Stack Loss Data; 2.7 Finite Normal Mixtures; 2.7.1 Example 2.7: Univariate Component Densities; 2.7.2 Example 2.8: Multivariate Component Densities
327 $a2.7.3 Numerical Example: Red Blood Cell Volume Data; 2.8 Example 2.9: Grouped and Truncated Data; 2.8.1 Introduction; 2.8.2 Specification of Complete Data; 2.8.3 E-Step; 2.8.4 M-Step; 2.8.5 Confirmation of Incomplete-Data Score Statistic; 2.8.6 M-Step for Grouped Normal Data; 2.8.7 Numerical Example: Grouped Log Normal Data; 2.9 Example 2.10: A Hidden Markov AR(1) Model; 3 BASIC THEORY OF THE EM ALGORITHM; 3.1 Introduction; 3.2 Monotonicity of the EM Algorithm; 3.3 Monotonicity of a Generalized EM Algorithm; 3.4 Convergence of an EM Sequence to a Stationary Value; 3.4.1 Introduction
327 $a3.4.2 Regularity Conditions of Wu (1983)
330 $aThe only single-source--now completely updated and revised--to offer a unified treatment of the theory, methodology, and applications of the EM algorithm. Complete with updates that capture developments from the past decade, The EM Algorithm and Extensions, Second Edition successfully provides a basic understanding of the EM algorithm by describing its inception, implementation, and applicability in numerous statistical contexts. In conjunction with the fundamentals of the topic, the authors discuss convergence issues and computation of standard errors, and, in addition, unveil many parallels
410 0$aWiley series in probability and statistics.
606 $aExpectation-maximization algorithms
606 $aEstimation theory
606 $aMissing observations (Statistics)
615 0$aExpectation-maximization algorithms.
615 0$aEstimation theory.
615 0$aMissing observations (Statistics)
676 $a519.5
676 $a519.5/44
676 $a519.544
700 $aMcLachlan$b Geoffrey J.$f1946-$027687
701 $aKrishnan$b T$g(Thriyambakam),$f1938-$0253437
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910831039703321
996 $aEM algorithm and extensions$9104008
997 $aUNINA