Divergence Measures : Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems
Author | Sason Igal
Published/distributed/printed | Basel : MDPI - Multidisciplinary Digital Publishing Institute, 2022
Physical description | 1 electronic resource (256 p.)
Topical subject | Research & information: general; Mathematics & science
Uncontrolled subject | Bregman divergence; f-divergence; Jensen-Bregman divergence; Jensen diversity; Jensen-Shannon divergence; capacitory discrimination; Jensen-Shannon centroid; mixture family; information geometry; difference of convex (DC) programming; conditional Rényi divergence; horse betting; Kelly gambling; Rényi divergence; Rényi mutual information; relative entropy; chi-squared divergence; f-divergences; method of types; large deviations; strong data-processing inequalities; information contraction; maximal correlation; Markov chains; information inequalities; mutual information; Rényi entropy; Carlson-Levin inequality; information measures; hypothesis testing; total variation; skew-divergence; convexity; Pinsker's inequality; Bayes risk; statistical divergences; minimum divergence estimator; maximum likelihood; bootstrap; conditional limit theorem; Bahadur efficiency; α-mutual information; Augustin-Csiszár mutual information; data transmission; error exponents; dimensionality reduction; discriminant analysis; statistical inference
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Other variant titles | Divergence Measures
Record no. | UNINA-9910576871103321
Available at: Univ. Federico II
New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
Author | Pardo Leandro
Published/distributed/printed | MDPI - Multidisciplinary Digital Publishing Institute, 2019
Physical description | 1 electronic resource (344 p.)
Uncontrolled subject | mixture index of fit; Kullback-Leibler distance; relative error estimation; minimum divergence inference; Neyman-Pearson test; influence function; consistency; thematic quality assessment; asymptotic normality; Hellinger distance; nonparametric test; Bernstein-von Mises theorem; maximum composite likelihood estimator; 2-alternating capacities; efficiency; corrupted data; statistical distance; robustness; log-linear models; representation formula; goodness-of-fit; general linear model; Wald-type test statistics; Hölder divergence; divergence; logarithmic super divergence; information geometry; sparse robust estimation; relative entropy; minimum disparity methods; MM algorithm; local-polynomial regression; association models; total variation; Bayesian nonparametric; ordinal classification variables; Wald test statistic; Wald-type test; composite hypotheses; compressed data; hypothesis testing; Bayesian semi-parametric; single index model; indoor localization; composite minimum density power divergence estimator; quasi-likelihood; Chernoff-Stein lemma; composite likelihood; asymptotic property; Bregman divergence; robust testing; misspecified hypothesis and alternative; least-favorable hypotheses; location-scale family; correlation models; minimum penalized φ-divergence estimator; non-quadratic distance; robust semiparametric model; divergence-based testing; measurement errors; bootstrap distribution estimator; generalized Rényi entropy; minimum divergence methods; generalized linear model; φ-divergence; Bregman information; iterated limits; centroid; model assessment; divergence measure; model check; two-sample test; Wald statistic
ISBN | 3-03897-937-6
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record no. | UNINA-9910346856403321
Available at: Univ. Federico II