LEADER 04708nam 2201189z- 450
001 9910346856403321
005 20231214133336.0
010 $a3-03897-937-6
035 $a(CKB)4920000000095103
035 $a(oapen)https://directory.doabooks.org/handle/20.500.12854/54566
035 $a(EXLCZ)994920000000095103
100 $a20202102d2019 |y 0
101 0 $aeng
135 $aurmn|---annan
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aNew Developments in Statistical Information Theory Based on Entropy and Divergence Measures
210 $cMDPI - Multidisciplinary Digital Publishing Institute$d2019
215 $a1 electronic resource (344 p.)
311 $a3-03897-936-8
330 $aThis book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from a theoretical and applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics, based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics and Rao's score statistics, share several optimum asymptotic properties, but are highly non-robust in cases of model misspecification under the presence of outlying observations. It is well known that a small deviation from the underlying assumptions on the model can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators.
610 $amixture index of fit
610 $aKullback-Leibler distance
610 $arelative error estimation
610 $aminimum divergence inference
610 $aNeyman Pearson test
610 $ainfluence function
610 $aconsistency
610 $athematic quality assessment
610 $aasymptotic normality
610 $aHellinger distance
610 $anonparametric test
610 $aBernstein von Mises theorem
610 $amaximum composite likelihood estimator
610 $a2-alternating capacities
610 $aefficiency
610 $acorrupted data
610 $astatistical distance
610 $arobustness
610 $alog-linear models
610 $arepresentation formula
610 $agoodness-of-fit
610 $ageneral linear model
610 $aWald-type test statistics
610 $aHölder divergence
610 $adivergence
610 $alogarithmic super divergence
610 $ainformation geometry
610 $asparse
610 $arobust estimation
610 $arelative entropy
610 $aminimum disparity methods
610 $aMM algorithm
610 $alocal-polynomial regression
610 $aassociation models
610 $atotal variation
610 $aBayesian nonparametric
610 $aordinal classification variables
610 $aWald test statistic
610 $aWald-type test
610 $acomposite hypotheses
610 $acompressed data
610 $ahypothesis testing
610 $aBayesian semi-parametric
610 $asingle index model
610 $aindoor localization
610 $acomposite minimum density power divergence estimator
610 $aquasi-likelihood
610 $aChernoff Stein lemma
610 $acomposite likelihood
610 $aasymptotic property
610 $aBregman divergence
610 $arobust testing
610 $amisspecified hypothesis and alternative
610 $aleast-favorable hypotheses
610 $alocation-scale family
610 $acorrelation models
610 $aminimum penalized φ-divergence estimator
610 $anon-quadratic distance
610 $arobust
610 $asemiparametric model
610 $adivergence based testing
610 $ameasurement errors
610 $abootstrap distribution estimator
610 $ageneralized Rényi entropy
610 $aminimum divergence methods
610 $ageneralized linear model
610 $aφ-divergence
610 $aBregman information
610 $aiterated limits
610 $acentroid
610 $amodel assessment
610 $adivergence measure
610 $amodel check
610 $atwo-sample test
610 $aWald statistic
700 $aPardo$b Leandro$4auth$0499080
906 $aBOOK
912 $a9910346856403321
996 $aNew Developments in Statistical Information Theory Based on Entropy and Divergence Measures$93037378
997 $aUNINA