Title: Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems
Editor: Sason, Igal
Publisher: Basel: MDPI - Multidisciplinary Digital Publishing Institute, 2022
Physical description: 1 online resource (256 p.)
ISBN: 3-0365-4332-5; 3-0365-4331-7
Identifiers: (CKB)5720000000008465; (oapen)https://directory.doabooks.org/handle/20.500.12854/84568; (oapen)doab84568; (EXLCZ)995720000000008465
Language: English

Abstract: Data science, information theory, probability theory, statistical learning, and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled "Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems", includes eight original contributions and focuses on the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that readers will find interest in this Special Issue, and that it will stimulate further research in the study of the mathematical foundations and applications of divergence measures.

Subjects (BIC): Mathematics & science; Research & information: general

Keywords: Augustin-Csiszár mutual information; Bahadur efficiency; Bayes risk; bootstrap; Bregman divergence; capacitory discrimination; Carlson-Levin inequality; chi-squared divergence; conditional limit theorem; conditional Rényi divergence; convexity; data transmission; difference of convex (DC) programming; dimensionality reduction; discriminant analysis; error exponents; f-divergences; horse betting; hypothesis testing; information contraction; information geometry; information inequalities; information measures; Jensen diversity; Jensen-Bregman divergence; Jensen-Shannon centroid; Jensen-Shannon divergence; Kelly gambling; large deviations; Markov chains; maximal correlation; maximum likelihood; method of types; minimum divergence estimator; mixture family; mutual information; Pinsker's inequality; relative entropy; Rényi divergence; Rényi entropy; Rényi mutual information; skew-divergence; statistical divergences; statistical inference; strong data-processing inequalities; total variation; α-mutual information

Record type: BOOK
Record ID: 9910576871103321 (UNINA)
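The abstract names relative entropy and its generalization, the Rényi divergence, as the book's two central quantities. As a minimal illustrative sketch (not part of the catalog record, and with function names of my own choosing), both can be computed for finite discrete distributions in plain Python:

```python
import math

def relative_entropy(p, q):
    """Relative entropy (KL divergence) D(p||q) in nats, for discrete
    distributions given as aligned lists of probabilities.
    Terms with p_i = 0 contribute 0 by the usual convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1):
    D_alpha(p||q) = (1/(alpha-1)) * log( sum_i p_i^alpha * q_i^(1-alpha) ).
    It converges to D(p||q) as alpha -> 1."""
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# Example: a fair coin versus a heavily biased coin.
p = [0.5, 0.5]
q = [0.9, 0.1]
```

Both quantities are non-negative and vanish only when the two distributions coincide, which is the defining property of a divergence measure mentioned in the abstract.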