LEADER 05393nam 2200697 a 450
001 9910829837203321
005 20210209180939.0
010 $a1-118-58577-1
010 $a1-280-51749-2
010 $a9786610517497
010 $a0-470-30315-8
010 $a0-471-74882-X
010 $a0-471-74881-1
035 $a(CKB)1000000000355887
035 $a(EBL)266952
035 $a(OCoLC)84733483
035 $a(SSID)ssj0000145361
035 $a(PQKBManifestationID)11147605
035 $a(PQKBTitleCode)TC0000145361
035 $a(PQKBWorkID)10156567
035 $a(PQKB)10302537
035 $a(MiAaPQ)EBC266952
035 $a(MiAaPQ)EBC4036519
035 $a(Au-PeEL)EBL4036519
035 $a(CaPaEBR)ebr11111111
035 $a(OCoLC)958543438
035 $a(PPN)175212368
035 $a(EXLCZ)991000000000355887
100 $a20050419d2006 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 10$aElements of information theory$b[electronic resource] /$fThomas M. Cover, Joy A. Thomas
205 $a2nd ed.
210 $aHoboken, N.J. $cWiley-Interscience$dc2006
215 $a1 online resource (774 p.)
300 $aDescription based upon print version of record.
311 $a0-471-24195-4
320 $aIncludes bibliographical references (p. 689-721) and index.
327 $aELEMENTS OF INFORMATION THEORY; CONTENTS; Preface to the Second Edition; Preface to the First Edition; Acknowledgments for the Second Edition; Acknowledgments for the First Edition; 1 Introduction and Preview; 1.1 Preview of the Book; 2 Entropy, Relative Entropy, and Mutual Information; 2.1 Entropy; 2.2 Joint Entropy and Conditional Entropy; 2.3 Relative Entropy and Mutual Information; 2.4 Relationship Between Entropy and Mutual Information; 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information; 2.6 Jensen's Inequality and Its Consequences
327 $a2.7 Log Sum Inequality and Its Applications; 2.8 Data-Processing Inequality; 2.9 Sufficient Statistics; 2.10 Fano's Inequality; Summary; Problems; Historical Notes; 3 Asymptotic Equipartition Property; 3.1 Asymptotic Equipartition Property Theorem; 3.2 Consequences of the AEP: Data Compression; 3.3 High-Probability Sets and the Typical Set; Summary; Problems; Historical Notes; 4 Entropy Rates of a Stochastic Process; 4.1 Markov Chains; 4.2 Entropy Rate; 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph; 4.4 Second Law of Thermodynamics; 4.5 Functions of Markov Chains; Summary
327 $aProblems; Historical Notes; 5 Data Compression; 5.1 Examples of Codes; 5.2 Kraft Inequality; 5.3 Optimal Codes; 5.4 Bounds on the Optimal Code Length; 5.5 Kraft Inequality for Uniquely Decodable Codes; 5.6 Huffman Codes; 5.7 Some Comments on Huffman Codes; 5.8 Optimality of Huffman Codes; 5.9 Shannon-Fano-Elias Coding; 5.10 Competitive Optimality of the Shannon Code; 5.11 Generation of Discrete Distributions from Fair Coins; Summary; Problems; Historical Notes; 6 Gambling and Data Compression; 6.1 The Horse Race; 6.2 Gambling and Side Information; 6.3 Dependent Horse Races and Entropy Rate
327 $a6.4 The Entropy of English; 6.5 Data Compression and Gambling; 6.6 Gambling Estimate of the Entropy of English; Summary; Problems; Historical Notes; 7 Channel Capacity; 7.1 Examples of Channel Capacity; 7.1.1 Noiseless Binary Channel; 7.1.2 Noisy Channel with Nonoverlapping Outputs; 7.1.3 Noisy Typewriter; 7.1.4 Binary Symmetric Channel; 7.1.5 Binary Erasure Channel; 7.2 Symmetric Channels; 7.3 Properties of Channel Capacity; 7.4 Preview of the Channel Coding Theorem; 7.5 Definitions; 7.6 Jointly Typical Sequences; 7.7 Channel Coding Theorem; 7.8 Zero-Error Codes
327 $a7.9 Fano's Inequality and the Converse to the Coding Theorem; 7.10 Equality in the Converse to the Channel Coding Theorem; 7.11 Hamming Codes; 7.12 Feedback Capacity; 7.13 Source-Channel Separation Theorem;
  Summary; Problems; Historical Notes; 8 Differential Entropy; 8.1 Definitions; 8.2 AEP for Continuous Random Variables; 8.3 Relation of Differential Entropy to Discrete Entropy; 8.4 Joint and Conditional Differential Entropy; 8.5 Relative Entropy and Mutual Information; 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information; Summary; Problems; Historical Notes
327 $a9 Gaussian Channel
330 $aThe latest edition of this classic is updated with new problem sets and material. The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications.
606 $aInformation theory
615 0$aInformation theory.
676 $a003.54
676 $a003/.54
700 $aCover$b T. M.$f1938-2012.$07921
701 $aThomas$b Joy A$07922
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910829837203321
996 $aElements of information theory$943423
997 $aUNINA