LEADER 03722nam 2200481 450
001 996466407803316
005 20230619211657.0
010 $a3-030-86552-5
035 $a(CKB)5470000001298866
035 $a(MiAaPQ)EBC6796420
035 $a(Au-PeEL)EBL6796420
035 $a(OCoLC)1281766617
035 $a(PPN)258298170
035 $a(EXLCZ)995470000001298866
100 $a20220723d2021 uy 0
101 0 $aeng
135 $aurcnu||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aNew foundations of information theory $elogical entropy and Shannon entropy /$fDavid Ellerman
210 1$aCham, Switzerland :$cSpringer,$d[2021]
210 4$d©2021
215 $a1 online resource (121 pages)
225 1 $aSpringerBriefs in philosophy
311 $a3-030-86551-7
327 $aIntro -- Acknowledgements -- About this book -- Contents -- About the Author -- 1 Logical Entropy -- 1.1 Introduction -- 1.2 Logical Information as the Measure of Distinctions -- 1.3 Classical Logical Probability and Logical Entropy -- 1.4 Information Algebras and Joint Distributions -- 1.5 Brief History of the Logical Entropy Formula -- References -- 2 The Relationship Between Logical Entropy and Shannon Entropy -- 2.1 Shannon Entropy -- 2.2 Logical Entropy, Not Shannon Entropy, Is a (Non-negative) Measure -- 2.3 The Dit-Bit Transform -- References -- 3 The Compound Notions for Logical and Shannon Entropies -- 3.1 Conditional Logical Entropy -- 3.2 Shannon Conditional Entropy -- 3.3 Logical Mutual Information -- 3.4 Shannon Mutual Information -- 3.5 Independent Joint Distributions -- 3.6 Cross-Entropies, Divergences, and Hamming Distance -- 3.6.1 Cross-Entropies -- 3.6.2 Divergences -- 3.6.3 Hamming Distance -- 3.7 Summary of Formulas and Dit-Bit Transforms -- References -- 4 Further Developments of Logical Entropy -- 4.1 Entropies for Multivariate Joint Distributions -- 4.2 An Example of Negative Mutual Information for Shannon Entropy -- 4.3 Entropies for Countable Probability Distributions -- 4.4 Metrical Logical Entropy = (Twice) Variance -- 4.5 Boltzmann and Shannon Entropies: A Conceptual Connection? -- 4.6 MaxEntropies for Discrete Distributions -- 4.7 The Transition to Coding Theory -- 4.8 Logical Entropy on Rooted Trees -- References -- 5 Quantum Logical Information Theory -- 5.1 Density Matrix Treatment of Logical Entropy -- 5.2 Linearizing Logical Entropy to Quantum Logical Entropy -- 5.3 Theorems About Quantum Logical Entropy -- 5.4 Quantum Logical Entropies with Density Matrices as Observables -- 5.5 The Logical Hamming Distance Between Two Partitions -- 5.6 The Quantum Logical Hamming Distance -- References -- 6 Conclusion.
327 $a6.1 Information Theory Re-founded and Re-envisioned -- 6.2 Quantum Information Theory Re-envisioned -- 6.3 What Is to Be Done? -- References -- A Basics of Partition Logic -- A.1 Subset Logic and Partition Logic -- A.2 The Lattice Operations on Partitions -- A.3 Implication and Negation in Partition Logic -- A.4 Relative Negation in Partition Logic and the Boolean Core -- A.5 Partition Tautologies -- References -- Index.
410 0$aSpringerBriefs in philosophy.
606 $aEntropy (Information theory)
606 $aEntropia (Teoria de la informació)$2thub
608 $aLlibres electrònics$2thub
615 0$aEntropy (Information theory)
615 7$aEntropia (Teoria de la informació)
676 $a003.54
700 $aEllerman$b David P.$0232092
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a996466407803316
996 $aNew foundations of information theory$92901098
997 $aUNISA