LEADER 01795nlm0 22005051i 450
001 990009250300403321
010 $a9783540730866
035 $a000925030
035 $aFED01000925030
035 $a(Aleph)000925030FED01
035 $a000925030
100 $a20100926d2007----km-y0itay50------ba
101 0 $aeng
102 $aDE
135 $adrnn-008mamaa
200 1 $aTowards Mechanized Mathematical Assistants$bRisorsa elettronica$e14th Symposium, Calculemus 2007, 6th International Conference, MKM 2007, Hagenberg, Austria, June 27-30, 2007. Proceedings$fedited by Manuel Kauers, Manfred Kerber, Robert Miner, Wolfgang Windsteiger
210 $aBerlin ; Heidelberg$cSpringer$d2007
225 1 $aLecture Notes in Computer Science$x0302-9743$v4573
230 $aDocumento elettronico
336 $aTesto
337 $aFormato html, pdf
702 1$aKauers,$bManuel
702 1$aKerber,$bManfred
702 1$aMiner,$bRobert
702 1$aWindsteiger,$bWolfgang
801 0$aIT$bUNINA$gREICAT$2UNIMARC
856 4 $zFull text per gli utenti Federico II$uhttp://dx.doi.org/10.1007/978-3-540-73086-6
901 $aEB
912 $a990009250300403321
961 $aArtificial intelligence
961 $aArtificial Intelligence (incl. Robotics)
961 $aComputer Communication Networks
961 $aComputer Communication Networks
961 $aComputer science
961 $aComputer Science
961 $aData mining
961 $aData Mining and Knowledge Discovery
961 $aDatabase management
961 $aDatabase Management
961 $aInformation systems
961 $aInformation Systems Applications (incl.Internet)
961 $aMathematics
961 $aMathematics, general
996 $aTowards Mechanized Mathematical Assistants$9772177
997 $aUNINA

LEADER 01248nam--2200409---450-
001 990001614020203316
005 20050125134251.0
035 $a000161402
035 $aUSA01000161402
035 $a(ALEPH)000161402USA01
035 $a000161402
100 $a20040428d1983----km-y0itay0103----ba
101 0 $aita
102 $aIT
105 $a||||||||001yy
200 1 $aElvis Presley$fAlbert Goldman$gtraduzione di Annalisa Baldassarini Rancati
210 $aMilano$cMondadori$d1983
215 $aV, 589 p.$d18 cm
225 2 $aOscar documenti$v121
410 0$12001$aOscar documenti$v121
454 1$12001
461 1$1001-------$12001
600 1 $aPresley,$bElvis$xBiografia
676 $a927.9
700 1$aGOLDMAN,$bAlbert$0524596
702 1$aBALDASSARINI RANCATI,$bAnnalisa
801 0$aIT$bsalbc$gISBD
912 $a990001614020203316
951 $aXIII.3.D. 39 (Varie coll 381/121)$b18609 L.M.$cVarie coll
951 $aXIII.3.D. 39a (Varie coll 381/121 bis)$b19760 L.M.$cVarie coll
959 $aBK
969 $aUMA
979 $aSIAV6$b10$c20040428$lUSA01$h1359
979 $aSIAV6$b10$c20040428$lUSA01$h1400
979 $aCOPAT4$b90$c20050125$lUSA01$h1342
996 $aElvis Presley$9944661
997 $aUNISA

LEADER 09937nam 2200697 a 450
001 9910208838003321
005 20240514071355.0
010 $a9786613177698
010 $a9781283177696
010 $a1283177692
010 $a9781119970590
010 $a1119970598
010 $a9781119970583
010 $a111997058X
010 $a9781119973706
010 $a1119973708
035 $a(CKB)4330000000000588
035 $a(MiAaPQ)EBC819225
035 $a(Au-PeEL)EBL819225
035 $a(CaPaEBR)ebr10483308
035 $a(CaONFJC)MIL317769
035 $a(OCoLC)739118488
035 $a(Perlego)1011506
035 $a(EXLCZ)994330000000000588
100 $a20110328d2011 uy 0
101 0 $aeng
135 $aurcn|||||||||
181 $2rdacontent
182 $2rdamedia
183 $2rdacarrier
200 10$aLatent variable models and factor analysis $ea unified approach /$fDavid Bartholomew, Martin Knott, Irini Moustaki
205 $a3rd ed.
210 $aHoboken, N.J. $cWiley$d2011
215 $axiii, 277 p. $cill
225 1 $aWiley series in probability and statistics
311 08$a9780470971925
311 08$a0470971924
320 $aIncludes bibliographical references and indexes.
327 $aIntro -- Latent Variable Models and Factor Analysis -- Contents -- Preface -- Acknowledgements -- 1 Basic ideas and examples -- 1.1 The statistical problem -- 1.2 The basic idea -- 1.3 Two examples -- 1.3.1 Binary manifest variables and a single binary latent variable -- 1.3.2 A model based on normal distributions -- 1.4 A broader theoretical view -- 1.5 Illustration of an alternative approach -- 1.6 An overview of special cases -- 1.7 Principal components -- 1.8 The historical context -- 1.9 Closely related fields in statistics -- 2 The general linear latent variable model -- 2.1 Introduction -- 2.2 The model -- 2.3 Some properties of the model -- 2.4 A special case -- 2.5 The sufficiency principle -- 2.6 Principal special cases -- 2.7 Latent variable models with non-linear terms -- 2.8 Fitting the models -- 2.9 Fitting by maximum likelihood -- 2.10 Fitting by Bayesian methods -- 2.11 Rotation -- 2.12 Interpretation -- 2.13 Sampling error of parameter estimates -- 2.14 The prior distribution -- 2.15 Posterior analysis -- 2.16 A further note on the prior -- 2.17 Psychometric inference -- 3 The normal linear factor model -- 3.1 The model -- 3.2 Some distributional properties -- 3.3 Constraints on the model -- 3.4 Maximum likelihood estimation -- 3.5 Maximum likelihood estimation by the E-M algorithm -- 3.6 Sampling variation of estimators -- 3.7 Goodness of fit and choice of q -- 3.7.1 Model selection criteria -- 3.8 Fitting without normality assumptions: least squares methods -- 3.9 Other methods of fitting -- 3.10 Approximate methods for estimating -- 3.11 Goodness of fit and choice of q for least squares methods -- 3.12 Further estimation issues -- 3.12.1 Consistency -- 3.12.2 Scale-invariant estimation -- 3.12.3 Heywood cases -- 3.13 Rotation and related matters -- 3.13.1 Orthogonal rotation -- 3.13.2 Oblique rotation -- 3.13.3 Related matters. 
327 $a3.14 Posterior analysis: the normal case -- 3.15 Posterior analysis: least squares -- 3.16 Posterior analysis: a reliability approach -- 3.17 Examples -- 4 Binary data: latent trait models -- 4.1 Preliminaries -- 4.2 The logit/normal model -- 4.3 The probit/normal model -- 4.4 The equivalence of the response function and underlying variable approaches -- 4.5 Fitting the logit/normal model: the E-M algorithm -- 4.5.1 Fitting the probit/normal model -- 4.5.2 Other methods for approximating the integral -- 4.6 Sampling properties of the maximum likelihood estimators -- 4.7 Approximate maximum likelihood estimators -- 4.8 Generalised least squares methods -- 4.9 Goodness of fit -- 4.10 Posterior analysis -- 4.11 Fitting the logit/normal and probit/normal models: Markov chain Monte Carlo -- 4.11.1 Gibbs sampling -- 4.11.2 Metropolis-Hastings -- 4.11.3 Choosing prior distributions -- 4.11.4 Convergence diagnostics in MCMC -- 4.12 Divergence of the estimation algorithm -- 4.13 Examples -- 5 Polytomous data: latent trait models -- 5.1 Introduction -- 5.2 A response function model based on the sufficiency principle -- 5.3 Parameter interpretation -- 5.4 Rotation -- 5.5 Maximum likelihood estimation of the polytomous logit model -- 5.6 An approximation to the likelihood -- 5.6.1 One factor -- 5.6.2 More than one factor -- 5.7 Binary data as a special case -- 5.8 Ordering of categories -- 5.8.1 A response function model for ordinal variables -- 5.8.2 Maximum likelihood estimation of the model with ordinal variables -- 5.8.3 The partial credit model -- 5.8.4 An underlying variable model -- 5.9 An alternative underlying variable model -- 5.10 Posterior analysis -- 5.11 Further observations -- 5.12 Examples of the analysis of polytomous data using the logit model -- 6 Latent class models -- 6.1 Introduction. 
327 $a6.2 The latent class model with binary manifest variables -- 6.3 The latent class model for binary data as a latent trait model -- 6.4 K latent classes within the GLLVM -- 6.5 Maximum likelihood estimation -- 6.6 Standard errors -- 6.7 Posterior analysis of the latent class model with binary manifest variables -- 6.8 Goodness of fit -- 6.9 Examples for binary data -- 6.10 Latent class models with unordered polytomous manifest variables -- 6.11 Latent class models with ordered polytomous manifest variables -- 6.12 Maximum likelihood estimation -- 6.12.1 Allocation of individuals to latent classes -- 6.13 Examples for unordered polytomous data -- 6.14 Identifiability -- 6.15 Starting values -- 6.16 Latent class models with metrical manifest variables -- 6.16.1 Maximum likelihood estimation -- 6.16.2 Other methods -- 6.16.3 Allocation to categories -- 6.17 Models with ordered latent classes -- 6.18 Hybrid models -- 6.18.1 Hybrid model with binary manifest variables -- 6.18.2 Maximum likelihood estimation -- 7 Models and methods for manifest variables of mixed type -- 7.1 Introduction -- 7.2 Principal results -- 7.3 Other members of the exponential family -- 7.3.1 The binomial distribution -- 7.3.2 The Poisson distribution -- 7.3.3 The gamma distribution -- 7.4 Maximum likelihood estimation -- 7.4.1 Bernoulli manifest variables -- 7.4.2 Normal manifest variables -- 7.4.3 A general E-M approach to solving the likelihood equations -- 7.4.4 Interpretation of latent variables -- 7.5 Sampling properties and goodness of fit -- 7.6 Mixed latent class models -- 7.7 Posterior analysis -- 7.8 Examples -- 7.9 Ordered categorical variables and other generalisations -- 8 Relationships between latent variables -- 8.1 Scope -- 8.2 Correlated latent variables -- 8.3 Procrustes methods -- 8.4 Sources of prior knowledge -- 8.5 Linear structural relations models. 
327 $a8.6 The LISREL model -- 8.6.1 The structural model -- 8.6.2 The measurement model -- 8.6.3 The model as a whole -- 8.7 Adequacy of a structural equation model -- 8.8 Structural relationships in a general setting -- 8.9 Generalisations of the LISREL model -- 8.10 Examples of models which are indistinguishable -- 8.11 Implications for analysis -- 9 Related techniques for investigating dependency -- 9.1 Introduction -- 9.2 Principal components analysis -- 9.2.1 A distributional treatment -- 9.2.2 A sample-based treatment -- 9.2.3 Unordered categorical data -- 9.2.4 Ordered categorical data -- 9.3 An alternative to the normal factor model -- 9.4 Replacing latent variables by linear functions of the manifest variables -- 9.5 Estimation of correlations and regressions between latent variables -- 9.6 Q-Methodology -- 9.7 Concluding reflections on the role of latent variables in statistical modelling -- Software appendix -- References -- Author index -- Subject index.
330 8 $aLatent Variable Models and Factor Analysis provides a comprehensive and unified approach to factor analysis and latent variable modeling from a statistical perspective. This book presents a general framework to enable the derivation of the commonly used models, along with updated numerical examples. The nature and interpretation of latent variables are also introduced, along with related techniques for investigating dependency. This book: * Provides a unified approach showing how such apparently diverse methods as Latent Class Analysis and Factor Analysis are actually members of the same family. * Presents new material on ordered manifest variables, MCMC methods and non-linear models, as well as a new chapter on related techniques for investigating dependency. * Includes new sections on structural equation models (SEM) and Markov Chain Monte Carlo methods for parameter estimation, along with new illustrative examples.
* Looks at recent developments on goodness-of-fit test statistics and on non-linear models and models with mixed latent variables, both categorical and continuous. No prior acquaintance with latent variable modelling is presupposed, but a broad understanding of statistical theory will make it easier to see the approach in its proper perspective. Applied statisticians, psychometricians, medical statisticians, biostatisticians, economists and social science researchers will benefit from this book.
410 0$aWiley series in probability and statistics.
606 $aLatent variables
606 $aLatent structure analysis
606 $aFactor analysis
615 0$aLatent variables.
615 0$aLatent structure analysis.
615 0$aFactor analysis.
676 $a519.5/35
700 $aBartholomew$b David J$0102048
701 $aKnott$b M$g(Martin)$0997127
701 $aMoustaki$b Irini$0522145
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910208838003321
996 $aLatent variable models and factor analysis$92286641
997 $aUNINA