LEADER 06328oam 2200589Mu 450
001 9910838292903321
005 20240508221719.0
010 $a0-429-50886-7
010 $a0-429-50824-7
010 $a0-429-05591-9
035 $a(CKB)4100000007432424
035 $a(MiAaPQ)EBC5628929
035 $a(OCoLC)1082196423
035 $a(OCoLC-P)1082196423
035 $a(FlBoTFG)9780429055911
035 $a(EXLCZ)994100000007432424
100 $a20190112d2018 uy 0
101 0 $aeng
135 $aurcnu---unuuu
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 00$aHandbook of Mixture Analysis /$fedited by Sylvia Frühwirth-Schnatter, Gilles Celeux, Christian P. Robert
205 $a1st ed.
210 $aMilton $cChapman and Hall/CRC$d2018
215 $a1 online resource (522 pages)
225 1 $aChapman & Hall/CRC handbooks of modern statistical methods
300 $aDescription based upon print version of record.
300 $a5.4.1 Known number of components
311 $a1-4987-6381-2
320 $aIncludes bibliographical references and index.
327 $aCover; Half Title; Title Page; Copyright Page; Table of Contents; Preface; Editors; Contributors; List of Symbols; I: Foundations and Methods; 1: Introduction to Finite Mixtures; 1.1 Introduction and Motivation; 1.1.1 Basic formulation; 1.1.2 Likelihood; 1.1.3 Latent allocation variables; 1.1.4 A little history; 1.2 Generalizations; 1.2.1 Infinite mixtures; 1.2.2 Continuous mixtures; 1.2.3 Finite mixtures with nonparametric components; 1.2.4 Covariates and mixtures of experts; 1.2.5 Hidden Markov models; 1.2.6 Spatial mixtures; 1.3 Some Technical Concerns; 1.3.1 Identifiability
327 $a1.3.2 Label switching; 1.4 Inference; 1.4.1 Frequentist inference, and the role of EM; 1.4.2 Bayesian inference, and the role of MCMC; 1.4.3 Variable number of components; 1.4.4 Modes versus components; 1.4.5 Clustering and classification; 1.5 Concluding Remarks; Bibliography; 2: EM Methods for Finite Mixtures; 2.1 Introduction; 2.2 The EM Algorithm; 2.2.1 Description of EM for finite mixtures; 2.2.2 EM as an alternating-maximization algorithm; 2.3 Convergence and Behavior of EM; 2.4 Cousin Algorithms of EM; 2.4.1 Stochastic versions of the EM algorithm; 2.4.2 The Classification EM algorithm
327 $a2.5 Accelerating the EM Algorithm; 2.6 Initializing the EM Algorithm; 2.6.1 Random initialization; 2.6.2 Hierarchical initialization; 2.6.3 Recursive initialization; 2.7 Avoiding Spurious Local Maximizers; 2.8 Concluding Remarks; Bibliography; 3: An Expansive View of EM Algorithms; 3.1 Introduction; 3.2 The Product-of-Sums Formulation; 3.2.1 Iterative algorithms and the ascent property; 3.2.2 Creating a minorizing surrogate function; 3.3 Likelihood as a Product of Sums; 3.4 Non-standard Examples of EM Algorithms; 3.4.1 Modes of a density; 3.4.2 Gradient maxima; 3.4.3 Two-step EM
327 $a3.5 Stopping Rules for EM Algorithms; 3.6 Concluding Remarks; Bibliography; 4: Bayesian Mixture Models: Theory and Methods; 4.1 Introduction; 4.2 Bayesian Mixtures: From Priors to Posteriors; 4.2.1 Models and representations; 4.2.2 Impact of the prior distribution; 4.2.2.1 Conjugate priors; 4.2.2.2 Improper and non-informative priors; 4.2.2.3 Data-dependent priors; 4.2.2.4 Priors for overfitted mixtures; 4.3 Asymptotic Properties of the Posterior Distribution in the Finite Case; 4.3.1 Posterior concentration around the marginal density; 4.3.2 Recovering the parameters in the well-behaved case
327 $a4.3.3 Boundary parameters: overfitted mixtures; 4.3.4 Asymptotic behaviour of posterior estimates of the number of components; 4.4 Concluding Remarks; Bibliography; 5: Computational Solutions for Bayesian Inference in Mixture Models; 5.1 Introduction; 5.2 Algorithms for Posterior Sampling; 5.2.1 A computational problem? Which computational problem?; 5.2.2 Gibbs sampling; 5.2.3 Metropolis-Hastings schemes; 5.2.4 Reversible jump MCMC; 5.2.5 Sequential Monte Carlo; 5.2.6 Nested sampling; 5.3 Bayesian Inference in the Model-Based Clustering Context; 5.4 Simulation Studies
330 $aMixture models have been around for over 150 years, and they are found in many branches of statistical modelling as a versatile and multifaceted tool. They can be applied to a wide range of data: univariate or multivariate, continuous or categorical, cross-sectional, time series, networks, and much more. Mixture analysis is a very active research topic in statistics and machine learning, with new developments in methodology and applications taking place all the time. The Handbook of Mixture Analysis is a very timely publication, presenting a broad overview of the methods and applications of this important field of research. It covers a wide array of topics, including the EM algorithm, Bayesian mixture models, model-based clustering, high-dimensional data, hidden Markov models, and applications in finance, genomics, and astronomy. Features: Provides a comprehensive overview of the methods and applications of mixture modelling and analysis; Divided into three parts: Foundations and Methods, Mixture Modelling and Extensions, and Selected Applications; Contains many worked examples using real data, together with computational implementation, to illustrate the methods described; Includes contributions from the leading researchers in the field. The Handbook of Mixture Analysis is targeted at graduate students and young researchers new to the field. It will also be an important reference for anyone working in this field, whether they are developing new methodology or applying the models to real scientific problems.
410 0$aChapman & Hall/CRC handbooks of modern statistical methods.
606 $aMixture distributions (Probability theory)
606 $aDistribution (Probability theory)
615 0$aMixture distributions (Probability theory)
615 0$aDistribution (Probability theory)
676 $a519.24
701 $aFrühwirth-Schnatter$b Sylvia$f1959-$0614435
701 $aCeleux$b Gilles$0104881
701 $aRobert$b Christian P.$f1961-$055943
801 0$bOCoLC-P
801 1$bOCoLC-P
906 $aBOOK
912 $a9910838292903321
996 $aHandbook of Mixture Analysis$94132578
997 $aUNINA