LEADER 04532nam 2201141z- 450
001 9911053044603321
005 20230911
035 $a(CKB)5690000000228578
035 $a(oapen)doab113909
035 $a(EXLCZ)995690000000228578
100 $a20230920c2023uuuu -u- -
101 0 $aeng
135 $aurmn|---annan
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 00$aInformation and Divergence Measures
210 $cMDPI - Multidisciplinary Digital Publishing Institute$d2023
215 $a1 online resource (282 p.)
311 08$a3-0365-8387-4
330 $aThe concept of distance is important for establishing the degree of similarity and/or closeness between functions, populations, or distributions. As a result, distances are related to inferential statistics, including problems related to both estimation and hypothesis testing, as well as modelling with applications in regression analysis, multivariate analysis, actuarial science, portfolio optimization, survival analysis, reliability theory, and many other areas. Thus, entropy and divergence measures are always a central concern for scientists, researchers, medical experts, engineers, industrial managers, computer experts, data analysts, and other professionals. This reprint focuses on recent developments in information and divergence measures and presents new theoretical issues as well as solutions to important practical problems and case studies illustrating the great applicability of these innovative techniques and methods. The contributions in this reprint highlight the diversity of topics in this scientific field.
606 $aPhysics$2bicssc
606 $aResearch & information: general$2bicssc
610 $aacademic evaluation
610 $aAwad entropy
610 $abi-logistic growth
610 $abootstrap discrepancy comparison probability (BDCP)
610 $aclustering
610 $aconcomitants
610 $aconditional independence
610 $aCOVID-19 data
610 $across tabulations
610 $adifferential geometry
610 $adiscrepancy comparison probability (DCP)
610 $adivergence-based tests
610 $adivergences
610 $adouble index divergence test statistic
610 $aempirical survival Jensen-Shannon divergence
610 $aepidemic waves
610 $aexponential family
610 $aextremal combinatorics
610 $aFGM family
610 $aFisher information
610 $aFisher-Tsallis information number
610 $ageodesic
610 $aGOS
610 $agraphs
610 $aHan's inequality
610 $ainformation inequalities
610 $ajoint entropy
610 $aKaniadakis logarithm
610 $aKolmogorov-Smirnov two-sample test
610 $aKullback-Leibler divergence
610 $aKullback-Leibler divergence (KLD)
610 $aLauricella D-hypergeometric series
610 $alikelihood ratio test (LRT)
610 $aLPI
610 $aminimum Rényi's pseudodistance estimators
610 $amodel selection
610 $amoment condition models
610 $amultiple power series
610 $aMultivariate Cauchy distribution (MCD)
610 $amultivariate data analysis
610 $amultivariate Gaussian
610 $an/a
610 $ap-value
610 $apassive interception systems
610 $apast entropy
610 $apolymatroid
610 $aradar waveform
610 $arank function
610 $aRao-type tests
610 $aRényi's pseudodistance
610 $aresidual entropy
610 $arestricted minimum Rényi's pseudodistance estimators
610 $arobustness
610 $aset function
610 $aShannon entropy
610 $aShearer's lemma
610 $askew logistic distribution
610 $astatistical divergence
610 $astatistical K-means
610 $astatistical manifold
610 $asubmodularity
610 $atransversality
610 $atruncated exponential family
610 $atruncated normal distributions
610 $aTsallis divergence
610 $aTsallis entropy
610 $aTsallis logarithm
610 $aweighted Kaniadakis divergence
610 $aweighted Tsallis divergence
615 7$aPhysics
615 7$aResearch & information: general
906 $aBOOK
912 $a9911053044603321
996 $aInformation and Divergence Measures$94525164
997 $aUNINA