
New Developments in Statistical Information Theory Based on Entropy and Divergence Measures




Author: Pardo, Leandro
Title: New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
Publication: MDPI - Multidisciplinary Digital Publishing Institute, 2019
Physical description: 1 electronic resource (344 p.)
Uncontrolled subject: mixture index of fit
Kullback-Leibler distance
relative error estimation
minimum divergence inference
Neyman-Pearson test
influence function
consistency
thematic quality assessment
asymptotic normality
Hellinger distance
nonparametric test
Bernstein-von Mises theorem
maximum composite likelihood estimator
2-alternating capacities
efficiency
corrupted data
statistical distance
robustness
log-linear models
representation formula
goodness-of-fit
general linear model
Wald-type test statistics
Hölder divergence
divergence
logarithmic super divergence
information geometry
sparse
robust estimation
relative entropy
minimum disparity methods
MM algorithm
local-polynomial regression
association models
total variation
Bayesian nonparametric
ordinal classification variables
Wald test statistic
Wald-type test
composite hypotheses
compressed data
hypothesis testing
Bayesian semi-parametric
single index model
indoor localization
composite minimum density power divergence estimator
quasi-likelihood
Chernoff-Stein lemma
composite likelihood
asymptotic property
Bregman divergence
robust testing
misspecified hypothesis and alternative
least-favorable hypotheses
location-scale family
correlation models
minimum penalized φ-divergence estimator
non-quadratic distance
robust
semiparametric model
divergence based testing
measurement errors
bootstrap distribution estimator
generalized Rényi entropy
minimum divergence methods
generalized linear model
φ-divergence
Bregman information
iterated limits
centroid
model assessment
divergence measure
model check
two-sample test
Wald statistic
Summary/abstract: This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics and Rao's score statistics, share several optimal asymptotic properties, but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald test, for testing simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
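As a brief illustration of the construction highlighted in the summary (a minimal sketch in our own notation, not taken from the book): for a simple null hypothesis H_0: \theta = \theta_0 in a p-dimensional parametric model, the classical Wald statistic built on the maximum likelihood estimator \hat{\theta} is

    W_n = n\,(\hat{\theta} - \theta_0)^{\top} I(\hat{\theta})\,(\hat{\theta} - \theta_0),

where I(\theta) denotes the Fisher information matrix. The Wald-type tests in this line of work instead plug in a minimum divergence estimator \hat{\theta}_{\beta} (for example, a minimum density power divergence estimator) together with the inverse of its asymptotic covariance matrix \Sigma_{\beta}:

    W_n^{*} = n\,(\hat{\theta}_{\beta} - \theta_0)^{\top} \Sigma_{\beta}(\hat{\theta}_{\beta})^{-1} (\hat{\theta}_{\beta} - \theta_0).

Under H_0 both statistics are asymptotically chi-squared with p degrees of freedom, but W_n^{*} retains its level and power far better under contamination by outlying observations.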
Authorized title: New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
ISBN: 3-03897-937-6
Format: Printed material
Bibliographic level: Monograph
Publication language: English
Record no.: 9910346856403321
Find it here: Univ. Federico II