
Record no.

UNINA9910877999303321

Author

McLachlan, Geoffrey J. <1946->

Title

The EM algorithm and extensions / Geoffrey J. McLachlan, Thriyambakam Krishnan

Publication/distribution

Hoboken, N.J. : Wiley-Interscience, c2008

ISBN

1-281-28447-5

9786611284473

0-470-19161-9

0-470-19160-0

Edition

[2nd ed.]

Physical description

1 online resource (399 p.)

Series

Wiley series in probability and statistics

Other authors (persons)

Krishnan, T. <1938-> (Thriyambakam)

Classification (Dewey)

519.5/44

Subjects

Expectation-maximization algorithms

Estimation theory

Missing observations (Statistics)

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

General notes

Description based upon print version of record.

Bibliography note

Includes bibliographical references (p. 311-337) and indexes.

Contents note

The EM Algorithm and Extensions; CONTENTS; PREFACE TO THE SECOND EDITION; PREFACE TO THE FIRST EDITION; LIST OF EXAMPLES; 1 GENERAL INTRODUCTION; 1.1 Introduction; 1.2 Maximum Likelihood Estimation; 1.3 Newton-Type Methods; 1.3.1 Introduction; 1.3.2 Newton-Raphson Method; 1.3.3 Quasi-Newton Methods; 1.3.4 Modified Newton Methods; 1.4 Introductory Examples; 1.4.1 Introduction; 1.4.2 Example 1.1: A Multinomial Example; 1.4.3 Example 1.2: Estimation of Mixing Proportions; 1.5 Formulation of the EM Algorithm; 1.5.1 EM Algorithm; 1.5.2 Example 1.3: Censored Exponentially Distributed Survival Times

1.5.3 E- and M-Steps for the Regular Exponential Family; 1.5.4 Example 1.4: Censored Exponentially Distributed Survival Times (Example 1.3 Continued); 1.5.5 Generalized EM Algorithm; 1.5.6 GEM Algorithm Based on One Newton-Raphson Step; 1.5.7 EM Gradient Algorithm; 1.5.8 EM Mapping; 1.6 EM Algorithm for MAP and MPL Estimation; 1.6.1 Maximum a Posteriori Estimation; 1.6.2 Example 1.5: A Multinomial Example (Example 1.1 Continued); 1.6.3 Maximum Penalized Estimation; 1.7 Brief Summary of the Properties of the EM Algorithm; 1.8 History of the EM Algorithm; 1.8.1 Early EM History

1.8.2 Work Before Dempster, Laird, and Rubin (1977); 1.8.3 EM Examples and Applications Since Dempster, Laird, and Rubin (1977); 1.8.4 Two Interpretations of EM; 1.8.5 Developments in EM Theory, Methodology, and Applications; 1.9 Overview of the Book; 1.10 Notations; 2 EXAMPLES OF THE EM ALGORITHM; 2.1 Introduction; 2.2 Multivariate Data with Missing Values; 2.2.1 Example 2.1: Bivariate Normal Data with Missing Values; 2.2.2 Numerical Illustration; 2.2.3 Multivariate Data: Buck's Method; 2.3 Least Squares with Missing Data; 2.3.1 Healy-Westmacott Procedure

2.3.2 Example 2.2: Linear Regression with Missing Dependent Values; 2.3.3 Example 2.3: Missing Values in a Latin Square Design; 2.3.4 Healy-Westmacott Procedure as an EM Algorithm; 2.4 Example 2.4: Multinomial with Complex Cell Structure; 2.5 Example 2.5: Analysis of PET and SPECT Data; 2.6 Example 2.6: Multivariate t-Distribution (Known D.F.); 2.6.1 ML Estimation of Multivariate t-Distribution; 2.6.2 Numerical Example: Stack Loss Data; 2.7 Finite Normal Mixtures; 2.7.1 Example 2.7: Univariate Component Densities; 2.7.2 Example 2.8: Multivariate Component Densities

2.7.3 Numerical Example: Red Blood Cell Volume Data; 2.8 Example 2.9: Grouped and Truncated Data; 2.8.1 Introduction; 2.8.2 Specification of Complete Data; 2.8.3 E-Step; 2.8.4 M-Step; 2.8.5 Confirmation of Incomplete-Data Score Statistic; 2.8.6 M-Step for Grouped Normal Data; 2.8.7 Numerical Example: Grouped Log Normal Data; 2.9 Example 2.10: A Hidden Markov AR(1) Model; 3 BASIC THEORY OF THE EM ALGORITHM; 3.1 Introduction; 3.2 Monotonicity of the EM Algorithm; 3.3 Monotonicity of a Generalized EM Algorithm; 3.4 Convergence of an EM Sequence to a Stationary Value; 3.4.1 Introduction

3.4.2 Regularity Conditions of Wu (1983)
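As a rough illustration of the E- and M-steps listed in the contents (a minimal sketch, not taken from the book): the setup below mirrors Example 1.2, estimating the mixing proportion of a two-component normal mixture with known, unit-variance component densities. The means (0 and 4), sample size, and iteration count are illustrative assumptions.

```python
import math
import random

def normal_pdf(x, mu):
    # standard normal density with mean mu, standard deviation 1
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def em_mixing_proportion(data, mu1=0.0, mu2=4.0, n_iter=200):
    """Estimate pi in the mixture pi*N(mu1,1) + (1-pi)*N(mu2,1)."""
    pi = 0.5  # initial guess
    for _ in range(n_iter):
        # E-step: posterior probability that each point came from component 1
        resp = [pi * normal_pdf(x, mu1) /
                (pi * normal_pdf(x, mu1) + (1 - pi) * normal_pdf(x, mu2))
                for x in data]
        # M-step: update pi as the average responsibility
        pi = sum(resp) / len(resp)
    return pi

# simulated data: 30% from N(0,1), 70% from N(4,1)
random.seed(0)
data = [random.gauss(0, 1) if random.random() < 0.3 else random.gauss(4, 1)
        for _ in range(2000)]
print(em_mixing_proportion(data))  # converges to roughly 0.3
```

Each iteration provably does not decrease the observed-data likelihood, which is the monotonicity property covered in Chapter 3 of the contents.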

Summary/abstract

The only single source, now completely updated and revised, to offer a unified treatment of the theory, methodology, and applications of the EM algorithm. Complete with updates that capture developments from the past decade, The EM Algorithm and Extensions, Second Edition successfully provides a basic understanding of the EM algorithm by describing its inception, implementation, and applicability in numerous statistical contexts. In conjunction with the fundamentals of the topic, the authors discuss convergence issues and computation of standard errors, and, in addition, unveil many parallels