Divergence Measures : Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems
Author: Sason, Igal
Publication/distribution/printing: Basel: MDPI - Multidisciplinary Digital Publishing Institute, 2022
Physical description: 1 online resource (256 p.)
Topical subject: Mathematics & science
Research & information: general
Uncontrolled subject: Augustin-Csiszár mutual information
Bahadur efficiency
Bayes risk
bootstrap
Bregman divergence
capacitory discrimination
Carlson-Levin inequality
chi-squared divergence
conditional limit theorem
conditional Rényi divergence
convexity
data transmission
difference of convex (DC) programming
dimensionality reduction
discriminant analysis
error exponents
f-divergence
f-divergences
horse betting
hypothesis testing
information contraction
information geometry
information inequalities
information measures
Jensen diversity
Jensen-Bregman divergence
Jensen-Shannon centroid
Jensen-Shannon divergence
Kelly gambling
large deviations
Markov chains
maximal correlation
maximum likelihood
method of types
minimum divergence estimator
mixture family
mutual information
Pinsker's inequality
relative entropy
Rényi divergence
Rényi entropy
Rényi mutual information
skew-divergence
statistical divergences
statistical inference
strong data-processing inequalities
total variation
α-mutual information
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Other variant titles: Divergence Measures
Record no.: UNINA-9910576871103321
Find it here: Univ. Federico II
Information and Divergence Measures
Publication/distribution/printing: MDPI - Multidisciplinary Digital Publishing Institute, 2023
Physical description: 1 online resource (282 p.)
Topical subject: Physics
Research & information: general
Uncontrolled subject: academic evaluation
Awad entropy
bi-logistic growth
bootstrap discrepancy comparison probability (BDCP)
clustering
concomitants
conditional independence
COVID-19 data
cross tabulations
differential geometry
discrepancy comparison probability (DCP)
divergence-based tests
divergences
double index divergence test statistic
empirical survival Jensen-Shannon divergence
epidemic waves
exponential family
extremal combinatorics
FGM family
Fisher information
Fisher-Tsallis information number
geodesic
GOS
graphs
Han's inequality
information inequalities
joint entropy
Kaniadakis logarithm
Kolmogorov-Smirnov two-sample test
Kullback-Leibler divergence
Kullback-Leibler divergence (KLD)
Lauricella D-hypergeometric series
likelihood ratio test (LRT)
LPI
minimum Rényi's pseudodistance estimators
model selection
moment condition models
multiple power series
Multivariate Cauchy distribution (MCD)
multivariate data analysis
multivariate Gaussian
p-value
passive interception systems
past entropy
polymatroid
radar waveform
rank function
Rao-type tests
Rényi's pseudodistance
residual entropy
restricted minimum Rényi's pseudodistance estimators
robustness
set function
Shannon entropy
Shearer's lemma
skew logistic distribution
statistical divergence
statistical K-means
statistical manifold
submodularity
transversality
truncated exponential family
truncated normal distributions
Tsallis divergence
Tsallis entropy
Tsallis logarithm
weighted Kaniadakis divergence
weighted Tsallis divergence
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Record no.: UNINA-9911053044603321
Find it here: Univ. Federico II