The EM algorithm and extensions [electronic resource] / Geoffrey J. McLachlan, Thriyambakam Krishnan
Author McLachlan, Geoffrey J. <1946->
Edition [2nd ed.]
Publication/distribution Hoboken, N.J. : Wiley-Interscience, c2008
Physical description 1 online resource (399 p.)
Discipline 519.5
519.5/44
519.544
Other authors (Persons) Krishnan, T. <1938-> (Thriyambakam)
Series Wiley series in probability and statistics
Topical subject Expectation-maximization algorithms
Estimation theory
Missing observations (Statistics)
ISBN 1-281-28447-5
9786611284473
0-470-19161-9
0-470-19160-0
Format Print material
Bibliographic level Monograph
Language of publication eng
Contents note The EM Algorithm and Extensions; CONTENTS; PREFACE TO THE SECOND EDITION; PREFACE TO THE FIRST EDITION; LIST OF EXAMPLES; 1 GENERAL INTRODUCTION; 1.1 Introduction; 1.2 Maximum Likelihood Estimation; 1.3 Newton-Type Methods; 1.3.1 Introduction; 1.3.2 Newton-Raphson Method; 1.3.3 Quasi-Newton Methods; 1.3.4 Modified Newton Methods; 1.4 Introductory Examples; 1.4.1 Introduction; 1.4.2 Example 1.1: A Multinomial Example; 1.4.3 Example 1.2: Estimation of Mixing Proportions; 1.5 Formulation of the EM Algorithm; 1.5.1 EM Algorithm; 1.5.2 Example 1.3: Censored Exponentially Distributed Survival Times
1.5.3 E- and M-Steps for the Regular Exponential Family; 1.5.4 Example 1.4: Censored Exponentially Distributed Survival Times (Example 1.3 Continued); 1.5.5 Generalized EM Algorithm; 1.5.6 GEM Algorithm Based on One Newton-Raphson Step; 1.5.7 EM Gradient Algorithm; 1.5.8 EM Mapping; 1.6 EM Algorithm for MAP and MPL Estimation; 1.6.1 Maximum a Posteriori Estimation; 1.6.2 Example 1.5: A Multinomial Example (Example 1.1 Continued); 1.6.3 Maximum Penalized Estimation; 1.7 Brief Summary of the Properties of the EM Algorithm; 1.8 History of the EM Algorithm; 1.8.1 Early EM History
1.8.2 Work Before Dempster, Laird, and Rubin (1977); 1.8.3 EM Examples and Applications Since Dempster, Laird, and Rubin (1977); 1.8.4 Two Interpretations of EM; 1.8.5 Developments in EM Theory, Methodology, and Applications; 1.9 Overview of the Book; 1.10 Notations; 2 EXAMPLES OF THE EM ALGORITHM; 2.1 Introduction; 2.2 Multivariate Data with Missing Values; 2.2.1 Example 2.1: Bivariate Normal Data with Missing Values; 2.2.2 Numerical Illustration; 2.2.3 Multivariate Data: Buck's Method; 2.3 Least Squares with Missing Data; 2.3.1 Healy-Westmacott Procedure
2.3.2 Example 2.2: Linear Regression with Missing Dependent Values; 2.3.3 Example 2.3: Missing Values in a Latin Square Design; 2.3.4 Healy-Westmacott Procedure as an EM Algorithm; 2.4 Example 2.4: Multinomial with Complex Cell Structure; 2.5 Example 2.5: Analysis of PET and SPECT Data; 2.6 Example 2.6: Multivariate t-Distribution (Known D.F.); 2.6.1 ML Estimation of Multivariate t-Distribution; 2.6.2 Numerical Example: Stack Loss Data; 2.7 Finite Normal Mixtures; 2.7.1 Example 2.7: Univariate Component Densities; 2.7.2 Example 2.8: Multivariate Component Densities
2.7.3 Numerical Example: Red Blood Cell Volume Data; 2.8 Example 2.9: Grouped and Truncated Data; 2.8.1 Introduction; 2.8.2 Specification of Complete Data; 2.8.3 E-Step; 2.8.4 M-Step; 2.8.5 Confirmation of Incomplete-Data Score Statistic; 2.8.6 M-Step for Grouped Normal Data; 2.8.7 Numerical Example: Grouped Log Normal Data; 2.9 Example 2.10: A Hidden Markov AR(1) Model; 3 BASIC THEORY OF THE EM ALGORITHM; 3.1 Introduction; 3.2 Monotonicity of the EM Algorithm; 3.3 Monotonicity of a Generalized EM Algorithm; 3.4 Convergence of an EM Sequence to a Stationary Value; 3.4.1 Introduction
3.4.2 Regularity Conditions of Wu (1983)
Record Nr. UNINA-9910145008603321
Located at: Univ. Federico II
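The contents note above lists "Example 1.2: Estimation of Mixing Proportions" among the book's introductory EM examples. As a minimal sketch of that idea (not the book's own code; the setup here assumes a two-component Gaussian mixture with known component means and known common variance, all names invented for illustration):

```python
import math
import random

def em_mixing_proportion(data, mu1, mu2, sigma, pi0=0.5, iters=50):
    """EM for the mixing proportion pi of a two-component Gaussian
    mixture with known means mu1, mu2 and common known sigma."""
    def phi(x, mu):
        # Normal density with mean mu and standard deviation sigma
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    pi = pi0
    for _ in range(iters):
        # E-step: posterior probability each point came from component 1
        resp = [pi * phi(x, mu1) / (pi * phi(x, mu1) + (1 - pi) * phi(x, mu2))
                for x in data]
        # M-step: the new mixing proportion is the mean responsibility
        pi = sum(resp) / len(resp)
    return pi

random.seed(0)
# Simulate: 70% of points from N(0, 1), 30% from N(4, 1)
data = [random.gauss(0, 1) if random.random() < 0.7 else random.gauss(4, 1)
        for _ in range(2000)]
pi_hat = em_mixing_proportion(data, mu1=0.0, mu2=4.0, sigma=1.0)
print(round(pi_hat, 2))  # should land near the simulated 70% share
```

Each iteration provably does not decrease the observed-data likelihood, which is the monotonicity property treated in Chapter 3 of the book.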
The EM algorithm and extensions [electronic resource] / Geoffrey J. McLachlan, Thriyambakam Krishnan
Author McLachlan, Geoffrey J. <1946->
Edition [2nd ed.]
Publication/distribution Hoboken, N.J. : Wiley-Interscience, c2008
Physical description 1 online resource (399 p.)
Discipline 519.5
519.5/44
519.544
Other authors (Persons) Krishnan, T. <1938-> (Thriyambakam)
Series Wiley series in probability and statistics
Topical subject Expectation-maximization algorithms
Estimation theory
Missing observations (Statistics)
ISBN 1-281-28447-5
9786611284473
0-470-19161-9
0-470-19160-0
Format Print material
Bibliographic level Monograph
Language of publication eng
Contents note The EM Algorithm and Extensions; CONTENTS; PREFACE TO THE SECOND EDITION; PREFACE TO THE FIRST EDITION; LIST OF EXAMPLES; 1 GENERAL INTRODUCTION; 1.1 Introduction; 1.2 Maximum Likelihood Estimation; 1.3 Newton-Type Methods; 1.3.1 Introduction; 1.3.2 Newton-Raphson Method; 1.3.3 Quasi-Newton Methods; 1.3.4 Modified Newton Methods; 1.4 Introductory Examples; 1.4.1 Introduction; 1.4.2 Example 1.1: A Multinomial Example; 1.4.3 Example 1.2: Estimation of Mixing Proportions; 1.5 Formulation of the EM Algorithm; 1.5.1 EM Algorithm; 1.5.2 Example 1.3: Censored Exponentially Distributed Survival Times
1.5.3 E- and M-Steps for the Regular Exponential Family; 1.5.4 Example 1.4: Censored Exponentially Distributed Survival Times (Example 1.3 Continued); 1.5.5 Generalized EM Algorithm; 1.5.6 GEM Algorithm Based on One Newton-Raphson Step; 1.5.7 EM Gradient Algorithm; 1.5.8 EM Mapping; 1.6 EM Algorithm for MAP and MPL Estimation; 1.6.1 Maximum a Posteriori Estimation; 1.6.2 Example 1.5: A Multinomial Example (Example 1.1 Continued); 1.6.3 Maximum Penalized Estimation; 1.7 Brief Summary of the Properties of the EM Algorithm; 1.8 History of the EM Algorithm; 1.8.1 Early EM History
1.8.2 Work Before Dempster, Laird, and Rubin (1977); 1.8.3 EM Examples and Applications Since Dempster, Laird, and Rubin (1977); 1.8.4 Two Interpretations of EM; 1.8.5 Developments in EM Theory, Methodology, and Applications; 1.9 Overview of the Book; 1.10 Notations; 2 EXAMPLES OF THE EM ALGORITHM; 2.1 Introduction; 2.2 Multivariate Data with Missing Values; 2.2.1 Example 2.1: Bivariate Normal Data with Missing Values; 2.2.2 Numerical Illustration; 2.2.3 Multivariate Data: Buck's Method; 2.3 Least Squares with Missing Data; 2.3.1 Healy-Westmacott Procedure
2.3.2 Example 2.2: Linear Regression with Missing Dependent Values; 2.3.3 Example 2.3: Missing Values in a Latin Square Design; 2.3.4 Healy-Westmacott Procedure as an EM Algorithm; 2.4 Example 2.4: Multinomial with Complex Cell Structure; 2.5 Example 2.5: Analysis of PET and SPECT Data; 2.6 Example 2.6: Multivariate t-Distribution (Known D.F.); 2.6.1 ML Estimation of Multivariate t-Distribution; 2.6.2 Numerical Example: Stack Loss Data; 2.7 Finite Normal Mixtures; 2.7.1 Example 2.7: Univariate Component Densities; 2.7.2 Example 2.8: Multivariate Component Densities
2.7.3 Numerical Example: Red Blood Cell Volume Data; 2.8 Example 2.9: Grouped and Truncated Data; 2.8.1 Introduction; 2.8.2 Specification of Complete Data; 2.8.3 E-Step; 2.8.4 M-Step; 2.8.5 Confirmation of Incomplete-Data Score Statistic; 2.8.6 M-Step for Grouped Normal Data; 2.8.7 Numerical Example: Grouped Log Normal Data; 2.9 Example 2.10: A Hidden Markov AR(1) Model; 3 BASIC THEORY OF THE EM ALGORITHM; 3.1 Introduction; 3.2 Monotonicity of the EM Algorithm; 3.3 Monotonicity of a Generalized EM Algorithm; 3.4 Convergence of an EM Sequence to a Stationary Value; 3.4.1 Introduction
3.4.2 Regularity Conditions of Wu (1983)
Record Nr. UNINA-9910831039703321
Located at: Univ. Federico II
The EM algorithm and extensions / Geoffrey J. McLachlan, Thriyambakam Krishnan
Author McLachlan, Geoffrey J. <1946->
Edition [2nd ed.]
Publication/distribution Hoboken, N.J. : Wiley-Interscience, c2008
Physical description 1 online resource (399 p.)
Discipline 519.5/44
Other authors (Persons) Krishnan, T. <1938-> (Thriyambakam)
Series Wiley series in probability and statistics
Topical subject Expectation-maximization algorithms
Estimation theory
Missing observations (Statistics)
ISBN 1-281-28447-5
9786611284473
0-470-19161-9
0-470-19160-0
Format Print material
Bibliographic level Monograph
Language of publication eng
Contents note The EM Algorithm and Extensions; CONTENTS; PREFACE TO THE SECOND EDITION; PREFACE TO THE FIRST EDITION; LIST OF EXAMPLES; 1 GENERAL INTRODUCTION; 1.1 Introduction; 1.2 Maximum Likelihood Estimation; 1.3 Newton-Type Methods; 1.3.1 Introduction; 1.3.2 Newton-Raphson Method; 1.3.3 Quasi-Newton Methods; 1.3.4 Modified Newton Methods; 1.4 Introductory Examples; 1.4.1 Introduction; 1.4.2 Example 1.1: A Multinomial Example; 1.4.3 Example 1.2: Estimation of Mixing Proportions; 1.5 Formulation of the EM Algorithm; 1.5.1 EM Algorithm; 1.5.2 Example 1.3: Censored Exponentially Distributed Survival Times
1.5.3 E- and M-Steps for the Regular Exponential Family; 1.5.4 Example 1.4: Censored Exponentially Distributed Survival Times (Example 1.3 Continued); 1.5.5 Generalized EM Algorithm; 1.5.6 GEM Algorithm Based on One Newton-Raphson Step; 1.5.7 EM Gradient Algorithm; 1.5.8 EM Mapping; 1.6 EM Algorithm for MAP and MPL Estimation; 1.6.1 Maximum a Posteriori Estimation; 1.6.2 Example 1.5: A Multinomial Example (Example 1.1 Continued); 1.6.3 Maximum Penalized Estimation; 1.7 Brief Summary of the Properties of the EM Algorithm; 1.8 History of the EM Algorithm; 1.8.1 Early EM History
1.8.2 Work Before Dempster, Laird, and Rubin (1977); 1.8.3 EM Examples and Applications Since Dempster, Laird, and Rubin (1977); 1.8.4 Two Interpretations of EM; 1.8.5 Developments in EM Theory, Methodology, and Applications; 1.9 Overview of the Book; 1.10 Notations; 2 EXAMPLES OF THE EM ALGORITHM; 2.1 Introduction; 2.2 Multivariate Data with Missing Values; 2.2.1 Example 2.1: Bivariate Normal Data with Missing Values; 2.2.2 Numerical Illustration; 2.2.3 Multivariate Data: Buck's Method; 2.3 Least Squares with Missing Data; 2.3.1 Healy-Westmacott Procedure
2.3.2 Example 2.2: Linear Regression with Missing Dependent Values; 2.3.3 Example 2.3: Missing Values in a Latin Square Design; 2.3.4 Healy-Westmacott Procedure as an EM Algorithm; 2.4 Example 2.4: Multinomial with Complex Cell Structure; 2.5 Example 2.5: Analysis of PET and SPECT Data; 2.6 Example 2.6: Multivariate t-Distribution (Known D.F.); 2.6.1 ML Estimation of Multivariate t-Distribution; 2.6.2 Numerical Example: Stack Loss Data; 2.7 Finite Normal Mixtures; 2.7.1 Example 2.7: Univariate Component Densities; 2.7.2 Example 2.8: Multivariate Component Densities
2.7.3 Numerical Example: Red Blood Cell Volume Data; 2.8 Example 2.9: Grouped and Truncated Data; 2.8.1 Introduction; 2.8.2 Specification of Complete Data; 2.8.3 E-Step; 2.8.4 M-Step; 2.8.5 Confirmation of Incomplete-Data Score Statistic; 2.8.6 M-Step for Grouped Normal Data; 2.8.7 Numerical Example: Grouped Log Normal Data; 2.9 Example 2.10: A Hidden Markov AR(1) Model; 3 BASIC THEORY OF THE EM ALGORITHM; 3.1 Introduction; 3.2 Monotonicity of the EM Algorithm; 3.3 Monotonicity of a Generalized EM Algorithm; 3.4 Convergence of an EM Sequence to a Stationary Value; 3.4.1 Introduction
3.4.2 Regularity Conditions of Wu (1983)
Record Nr. UNINA-9910877999303321
Located at: Univ. Federico II