Information and Divergence Measures



Title: Information and Divergence Measures
Publication: MDPI - Multidisciplinary Digital Publishing Institute, 2023
Physical description: 1 online resource (282 p.)
Topical subject: Physics
Research & information: general
Uncontrolled terms: academic evaluation
Awad entropy
bi-logistic growth
bootstrap discrepancy comparison probability (BDCP)
clustering
concomitants
conditional independence
COVID-19 data
cross tabulations
differential geometry
discrepancy comparison probability (DCP)
divergence-based tests
divergences
double index divergence test statistic
empirical survival Jensen-Shannon divergence
epidemic waves
exponential family
extremal combinatorics
FGM family
Fisher information
Fisher-Tsallis information number
geodesic
GOS
graphs
Han's inequality
information inequalities
joint entropy
Kaniadakis logarithm
Kolmogorov-Smirnov two-sample test
Kullback-Leibler divergence
Kullback-Leibler divergence (KLD)
Lauricella D-hypergeometric series
likelihood ratio test (LRT)
LPI
minimum Rényi's pseudodistance estimators
model selection
moment condition models
multiple power series
Multivariate Cauchy distribution (MCD)
multivariate data analysis
multivariate Gaussian
p-value
passive interception systems
past entropy
polymatroid
radar waveform
rank function
Rao-type tests
Rényi's pseudodistance
residual entropy
restricted minimum Rényi's pseudodistance estimators
robustness
set function
Shannon entropy
Shearer's lemma
skew logistic distribution
statistical divergence
statistical K-means
statistical manifold
submodularity
transversality
truncated exponential family
truncated normal distributions
Tsallis divergence
Tsallis entropy
Tsallis logarithm
weighted Kaniadakis divergence
weighted Tsallis divergence
Summary/abstract: The concept of distance is important for establishing the degree of similarity or closeness between functions, populations, or distributions. Distances are therefore central to inferential statistics, including problems of estimation and hypothesis testing, as well as to modelling, with applications in regression analysis, multivariate analysis, actuarial science, portfolio optimization, survival analysis, reliability theory, and many other areas. Entropy and divergence measures are thus a constant concern for scientists, researchers, medical experts, engineers, industrial managers, computer experts, data analysts, and other professionals. This reprint focuses on recent developments in information and divergence measures, presenting new theoretical results as well as solutions to important practical problems, with case studies illustrating the broad applicability of these techniques and methods. The contributions highlight the diversity of topics in this field.
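Two of the keyword measures listed above admit compact textbook definitions, sketched here for orientation only, for discrete distributions p and q on a common alphabet; the notation is a standard assumption, not an excerpt from the reprint:

% Standard textbook definitions; discrete distributions p, q on a common
% alphabet \mathcal{X} are assumed, not quoted from the reprint itself.
\[
  H(p) = -\sum_{x \in \mathcal{X}} p(x) \log p(x) \qquad \text{(Shannon entropy)}
\]
\[
  D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)} \qquad \text{(Kullback-Leibler divergence)}
\]

The Kullback-Leibler divergence is nonnegative and vanishes exactly when p = q, which is what makes divergences usable as (generally asymmetric) measures of closeness between distributions, in the sense described in the summary.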
Authorized title: Information and Divergence Measures
Format: Printed material
Bibliographic level: Monograph
Publication language: English
Record no.: 9911053044603321
Held at: Univ. Federico II