New Developments in Statistical Information Theory Based on Entropy and Divergence Measures |
Author | Pardo Leandro |
Publisher | MDPI - Multidisciplinary Digital Publishing Institute, 2019 |
Physical description | 1 electronic resource (344 p.) |
Uncontrolled subject terms | mixture index of fit; Kullback-Leibler distance; relative error estimation; minimum divergence inference; Neyman-Pearson test; influence function; consistency; thematic quality assessment; asymptotic normality; Hellinger distance; nonparametric test; Bernstein-von Mises theorem; maximum composite likelihood estimator; 2-alternating capacities; efficiency; corrupted data; statistical distance; robustness; log-linear models; representation formula; goodness-of-fit; general linear model; Wald-type test statistics; Hölder divergence; divergence; logarithmic super divergence; information geometry; sparse; robust estimation; relative entropy; minimum disparity methods; MM algorithm; local-polynomial regression; association models; total variation; Bayesian nonparametric; ordinal classification variables; Wald test statistic; Wald-type test; composite hypotheses; compressed data; hypothesis testing; Bayesian semi-parametric; single index model; indoor localization; composite minimum density power divergence estimator; quasi-likelihood; Chernoff-Stein lemma; composite likelihood; asymptotic property; Bregman divergence; robust testing; misspecified hypothesis and alternative; least-favorable hypotheses; location-scale family; correlation models; minimum penalized φ-divergence estimator; non-quadratic distance; robust semiparametric model; divergence based testing; measurement errors; bootstrap distribution estimator; generalized Rényi entropy; minimum divergence methods; generalized linear model; φ-divergence; Bregman information; iterated limits; centroid; model assessment; divergence measure; model check; two-sample test; Wald statistic |
ISBN | 3-03897-937-6 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Record no. | UNINA-9910346856403321 |
Held at: Univ. Federico II |
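Several keywords in the record above (Kullback-Leibler distance, Hellinger distance, relative entropy) name standard discrepancy measures between probability distributions. As a minimal illustrative sketch, not taken from the book, the two best-known measures can be computed for discrete distributions as follows; the function names are hypothetical and only NumPy is assumed:

import numpy as np

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(p || q) for discrete distributions;
    # assumes q[i] > 0 wherever p[i] > 0, with 0 * log(0) taken as 0.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def hellinger_distance(p, q):
    # Hellinger distance H(p, q) = sqrt(0.5 * sum((sqrt(p) - sqrt(q))^2));
    # symmetric and bounded in [0, 1], unlike the KL divergence.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))       # approx. 0.0253
print(hellinger_distance(p, q))  # approx. 0.0799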
Robust Procedures for Estimating and Testing in the Framework of Divergence Measures |
Author | Pardo Leandro |
Publisher | Basel, Switzerland: MDPI - Multidisciplinary Digital Publishing Institute, 2021 |
Physical description | 1 electronic resource (333 p.) |
Topical subject | Research & information: general |
Uncontrolled subject terms | classification; Bayes error rate; Henze-Penrose divergence; Friedman-Rafsky test statistic; convergence rates; bias and variance trade-off; concentration bounds; minimal spanning trees; composite likelihood; composite minimum density power divergence estimators; model selection; minimum pseudodistance estimation; robustness; estimation of α; monitoring; numerical minimization; S-estimation; Tukey's biweight; integer-valued time series; one-parameter exponential family; minimum density power divergence estimator; density power divergence; robust change point test; Galton-Watson branching processes with immigration; Hellinger integrals; power divergences; Kullback-Leibler information distance/divergence; relative entropy; Rényi divergences; epidemiology; COVID-19 pandemic; Bayesian decision making; INARCH(1) model; GLM model; Bhattacharyya coefficient/distance; time series of counts; INGARCH model; SPC; CUSUM monitoring; MDPDE; contingency tables; disparity; mixed-scale data; Pearson residuals; residual adjustment function; robustness; statistical distances; Hellinger distance; large deviations; divergence measures; rare event probabilities |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Record no. | UNINA-9910557680103321 |
Held at: Univ. Federico II |
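The keyword list of this second record centers on the minimum density power divergence estimator (MDPDE), a robust alternative to maximum likelihood. Below is a minimal sketch of the idea, not taken from the book, assuming a normal location model with known scale and using SciPy for the one-dimensional minimization; the function name dpd_objective is hypothetical, and the closed-form integral term is specific to the normal model:

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def dpd_objective(mu, x, sigma=1.0, alpha=0.5):
    # Empirical density power divergence objective for the mean of a
    # N(mu, sigma^2) model with sigma known and tuning constant alpha > 0:
    #   H_n(mu) = int f_mu^(1+alpha) - (1+alpha)/alpha * mean(f_mu(x)^alpha).
    # For the normal density the integral term has the closed form below.
    int_term = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    data_term = np.mean(norm.pdf(x, loc=mu, scale=sigma) ** alpha)
    return int_term - (1 + alpha) / alpha * data_term

rng = np.random.default_rng(0)
# 95 clean observations around 0 plus 5 gross outliers around 8.
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])
res = minimize_scalar(dpd_objective, args=(x,), bounds=(-5.0, 5.0), method="bounded")
print(res.x)       # stays close to 0 despite the contamination
print(np.mean(x))  # the sample mean is pulled toward the outliers

Larger alpha buys more robustness at some efficiency cost; as alpha tends to 0 the objective recovers maximum likelihood estimation.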