Title: Information and Divergence Measures
Publisher: MDPI - Multidisciplinary Digital Publishing Institute, 2023
Description: 1 online resource (282 p.)
ISBN: 3-0365-8387-4
Language: English
Resource type: Book (open access)
Record identifiers: 9911053044603321; (CKB)5690000000228578; (oapen)doab113909; (EXLCZ)995690000000228578; 4525164 (UNINA)

Abstract: The concept of distance is important for establishing the degree of similarity and/or closeness between functions, populations, or distributions. As a result, distances are related to inferential statistics, including problems of estimation and hypothesis testing, as well as to modelling, with applications in regression analysis, multivariate analysis, actuarial science, portfolio optimization, survival analysis, reliability theory, and many other areas. Entropy and divergence measures are therefore a central concern for scientists, researchers, medical experts, engineers, industrial managers, computer experts, data analysts, and other professionals. This reprint focuses on recent developments in information and divergence measures, presenting new theoretical results as well as solutions to important practical problems and case studies that illustrate the broad applicability of these techniques and methods. The contributions in this reprint highlight the diversity of topics in this scientific field.

Subjects (bicssc): Physics; Research & information: general

Keywords: academic evaluation; Awad entropy; bi-logistic growth; bootstrap discrepancy comparison probability (BDCP); clustering; concomitants; conditional independence; COVID-19 data; cross tabulations; differential geometry; discrepancy comparison probability (DCP); divergence-based tests; divergences; double index divergence test statistic; empirical survival Jensen-Shannon divergence; epidemic waves; exponential family; extremal combinatorics; FGM family; Fisher information; Fisher-Tsallis information number; geodesic; GOS; graphs; Han's inequality; information inequalities; joint entropy; Kaniadakis logarithm; Kolmogorov-Smirnov two-sample test; Kullback-Leibler divergence (KLD); Lauricella D-hypergeometric series; likelihood ratio test (LRT); LPI; minimum Rényi's pseudodistance estimators; model selection; moment condition models; multiple power series; Multivariate Cauchy distribution (MCD); multivariate data analysis; multivariate Gaussian; p-value; passive interception systems; past entropy; polymatroid; radar waveform; rank function; Rao-type tests; Rényi's pseudodistance; residual entropy; restricted minimum Rényi's pseudodistance estimators; robustness; set function; Shannon entropy; Shearer's lemma; skew logistic distribution; statistical divergence; statistical K-means; statistical manifold; submodularity; transversality; truncated exponential family; truncated normal distributions; Tsallis divergence; Tsallis entropy; Tsallis logarithm; weighted Kaniadakis divergence; weighted Tsallis divergence