Intro
Acknowledgements
About This Book
Contents
About the Author

1 Logical Entropy
  1.1 Introduction
  1.2 Logical Information as the Measure of Distinctions
  1.3 Classical Logical Probability and Logical Entropy
  1.4 Information Algebras and Joint Distributions
  1.5 Brief History of the Logical Entropy Formula
  References

2 The Relationship Between Logical Entropy and Shannon Entropy
  2.1 Shannon Entropy
  2.2 Logical Entropy, Not Shannon Entropy, Is a (Non-negative) Measure
  2.3 The Dit-Bit Transform
  References

3 The Compound Notions for Logical and Shannon Entropies
  3.1 Conditional Logical Entropy
  3.2 Shannon Conditional Entropy
  3.3 Logical Mutual Information
  3.4 Shannon Mutual Information
  3.5 Independent Joint Distributions
  3.6 Cross-Entropies, Divergences, and Hamming Distance
    3.6.1 Cross-Entropies
    3.6.2 Divergences
    3.6.3 Hamming Distance
  3.7 Summary of Formulas and Dit-Bit Transforms
  References

4 Further Developments of Logical Entropy
  4.1 Entropies for Multivariate Joint Distributions
  4.2 An Example of Negative Mutual Information for Shannon Entropy
  4.3 Entropies for Countable Probability Distributions
  4.4 Metrical Logical Entropy = (Twice) Variance
  4.5 Boltzmann and Shannon Entropies: A Conceptual Connection?
  4.6 MaxEntropies for Discrete Distributions
  4.7 The …