Elements of information theory [electronic resource] / Thomas M. Cover, Joy A. Thomas
Author | Cover, T. M. <1938-2012>
Edition | [2nd ed.]
Publication/distribution/printing | Hoboken, N.J. : Wiley-Interscience, c2006
Physical description | 1 online resource (774 p.)
Discipline | 003/.54
Other authors (Persons) | Thomas, Joy A.
Topical subject | Information theory
ISBN |
1-118-58577-1
1-280-51749-2
9786610517497
0-470-30315-8
0-471-74882-X
0-471-74881-1
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
ELEMENTS OF INFORMATION THEORY; CONTENTS; Preface to the Second Edition; Preface to the First Edition; Acknowledgments for the Second Edition; Acknowledgments for the First Edition;
1 Introduction and Preview; 1.1 Preview of the Book;
2 Entropy, Relative Entropy, and Mutual Information; 2.1 Entropy; 2.2 Joint Entropy and Conditional Entropy; 2.3 Relative Entropy and Mutual Information; 2.4 Relationship Between Entropy and Mutual Information; 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information; 2.6 Jensen's Inequality and Its Consequences; 2.7 Log Sum Inequality and Its Applications; 2.8 Data-Processing Inequality; 2.9 Sufficient Statistics; 2.10 Fano's Inequality; Summary; Problems; Historical Notes;
3 Asymptotic Equipartition Property; 3.1 Asymptotic Equipartition Property Theorem; 3.2 Consequences of the AEP: Data Compression; 3.3 High-Probability Sets and the Typical Set; Summary; Problems; Historical Notes;
4 Entropy Rates of a Stochastic Process; 4.1 Markov Chains; 4.2 Entropy Rate; 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph; 4.4 Second Law of Thermodynamics; 4.5 Functions of Markov Chains; Summary; Problems; Historical Notes;
5 Data Compression; 5.1 Examples of Codes; 5.2 Kraft Inequality; 5.3 Optimal Codes; 5.4 Bounds on the Optimal Code Length; 5.5 Kraft Inequality for Uniquely Decodable Codes; 5.6 Huffman Codes; 5.7 Some Comments on Huffman Codes; 5.8 Optimality of Huffman Codes; 5.9 Shannon-Fano-Elias Coding; 5.10 Competitive Optimality of the Shannon Code; 5.11 Generation of Discrete Distributions from Fair Coins; Summary; Problems; Historical Notes;
6 Gambling and Data Compression; 6.1 The Horse Race; 6.2 Gambling and Side Information; 6.3 Dependent Horse Races and Entropy Rate; 6.4 The Entropy of English; 6.5 Data Compression and Gambling; 6.6 Gambling Estimate of the Entropy of English; Summary; Problems; Historical Notes;
7 Channel Capacity; 7.1 Examples of Channel Capacity; 7.1.1 Noiseless Binary Channel; 7.1.2 Noisy Channel with Nonoverlapping Outputs; 7.1.3 Noisy Typewriter; 7.1.4 Binary Symmetric Channel; 7.1.5 Binary Erasure Channel; 7.2 Symmetric Channels; 7.3 Properties of Channel Capacity; 7.4 Preview of the Channel Coding Theorem; 7.5 Definitions; 7.6 Jointly Typical Sequences; 7.7 Channel Coding Theorem; 7.8 Zero-Error Codes; 7.9 Fano's Inequality and the Converse to the Coding Theorem; 7.10 Equality in the Converse to the Channel Coding Theorem; 7.11 Hamming Codes; 7.12 Feedback Capacity; 7.13 Source-Channel Separation Theorem; Summary; Problems; Historical Notes;
8 Differential Entropy; 8.1 Definitions; 8.2 AEP for Continuous Random Variables; 8.3 Relation of Differential Entropy to Discrete Entropy; 8.4 Joint and Conditional Differential Entropy; 8.5 Relative Entropy and Mutual Information; 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information; Summary; Problems; Historical Notes;
9 Gaussian Channel
Record no. | UNINA-9910143567703321
Available at: Univ. Federico II
Record no. | UNISA-996212505003316
Available at: Univ. di Salerno
Record no. | UNINA-9910829837203321
Available at: Univ. Federico II
Record no. | UNINA-9910876820703321
Available at: Univ. Federico II