Record ID: 9910346856403321 (latest transaction 2025-02-03)

Title: New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
Author: Pardo Llorente, Leandro
Published: Basel, Switzerland: MDPI - Multidisciplinary Digital Publishing Institute, 2019
Extent: 1 electronic resource (344 p.)
Language: English

ISBN: 978-3-03897-937-1 (3-03897-937-6)
Related ISBN: 978-3-03897-936-4 (3-03897-936-8)
DOI: 10.3390/books978-3-03897-937-1
Other identifiers: (CKB)4920000000095103; (oapen)https://directory.doabooks.org/handle/20.500.12854/54566; (ScCtBLL)1e5b9e1e-a13c-4e29-a5d6-b43fa658909a; (OCoLC)1118520400; (oapen)doab54566; (EXLCZ)994920000000095103

Abstract: This book presents new and original research in statistical information theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for a range of statistical problems, with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald statistics, likelihood ratio statistics, and Rao score statistics, share several optimal asymptotic properties, but they are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that even a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. In particular, the book presents a robust version of the classical Wald test, based on minimum divergence estimators, for testing simple and composite null hypotheses in general parametric models (a standard form of this construction is sketched after the keywords below).

Keywords: mixture index of fit; Kullback-Leibler distance; relative error estimation; minimum divergence inference; Neyman-Pearson test; influence function; consistency; thematic quality assessment; asymptotic normality; Hellinger distance; nonparametric test; Bernstein-von Mises theorem; maximum composite likelihood estimator; 2-alternating capacities; efficiency; corrupted data; statistical distance; robustness; log-linear models; representation formula; goodness-of-fit; general linear model; Wald-type test statistics; Hölder divergence; divergence; logarithmic super divergence; information geometry; sparse; robust estimation; relative entropy; minimum disparity methods; MM algorithm; local-polynomial regression; association models; total variation; Bayesian nonparametric; ordinal classification variables; Wald test statistic; Wald-type test; composite hypotheses; compressed data; hypothesis testing; Bayesian semi-parametric; single index model; indoor localization; composite minimum density power divergence estimator; quasi-likelihood; Chernoff-Stein lemma; composite likelihood; asymptotic property; Bregman divergence; robust testing; misspecified hypothesis and alternative; least-favorable hypotheses; location-scale family; correlation models; minimum penalized φ-divergence estimator; non-quadratic distance; robust; semiparametric model; divergence-based testing; measurement errors; bootstrap distribution estimator; generalized Rényi entropy; minimum divergence methods; generalized linear model; φ-divergence; Bregman information; iterated limits; centroid; model assessment; divergence measure; model check; two-sample test; Wald statistic

Holdings: ScCtBLL; UNINA (BOOK)
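Sketch of the construction named in the abstract, in conventional notation from the minimum divergence literature rather than anything quoted from this record (the tuning parameter $\alpha$, the estimator $\hat{\theta}_\alpha$, and the covariance matrix $\Sigma_\alpha$ are illustrative choices): for a parametric model density $f_\theta$ and data-generating density $g$, the density power divergence of Basu et al. is

\[
  d_\alpha(g, f_\theta) \;=\; \int \left\{ f_\theta^{\,1+\alpha}(x)
    \;-\; \left(1 + \tfrac{1}{\alpha}\right) g(x)\, f_\theta^{\,\alpha}(x)
    \;+\; \tfrac{1}{\alpha}\, g^{\,1+\alpha}(x) \right\} dx,
  \qquad \alpha > 0,
\]

which recovers the Kullback-Leibler divergence in the limit $\alpha \to 0$. Writing $\hat{\theta}_\alpha$ for the minimum density power divergence estimator (the minimizer of $d_\alpha$ over the model), a Wald-type statistic for the simple null hypothesis $H_0 \colon \theta = \theta_0$ based on a sample of size $n$ is

\[
  W_n(\alpha) \;=\; n \left(\hat{\theta}_\alpha - \theta_0\right)^{\!\top}
    \Sigma_\alpha^{-1}(\theta_0) \left(\hat{\theta}_\alpha - \theta_0\right),
\]

where $\Sigma_\alpha(\theta)$ denotes the asymptotic covariance matrix of $\hat{\theta}_\alpha$. Under $H_0$, $W_n(\alpha)$ is asymptotically chi-squared with $\dim(\theta)$ degrees of freedom, exactly as in the classical case ($\alpha = 0$, where $\hat{\theta}_\alpha$ reduces to the maximum likelihood estimator); for $\alpha > 0$ the estimator has a bounded influence function, which is the source of the robustness to outliers that the abstract emphasizes.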