Decoding reality : the universe as quantum information / Vlatko Vedral
Author | Vedral, Vlatko
Edition | [2nd ed.]
Publication/distribution | Oxford [England] ; New York : Oxford University Press, [2010]
Physical description | 1 online resource (xii, 229 pages)
Discipline | 003/.54
Series |
Oxford landmark science
Oxford scholarship online
Topical subjects |
Information theory
Quantum theory
Quantum computers
Space and time
Physics - Philosophy
ISBN |
0-19-255425-5
0-19-191724-9
0-19-255299-6
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | Creation ex nihilo : something from nothing -- Information for all seasons -- Back to basics : bits and pieces -- Digital romance : life is a four-letter word -- Murphy's Law : I knew this would happen to me -- Place your bets : in it to win it -- Social informatics : get connected or die tryin' -- Quantum schmuntum : lights, camera, action! -- Surfing the waves : hyper-fast computers -- Children of the aimless chance : randomness versus determinism -- Sand reckoning : whose information is it, anyway? -- Destruction ab toto : nothing from something.
Record no. | UNINA-9910796921903321
Available at: Univ. Federico II
Decoding reality : the universe as quantum information / Vlatko Vedral
Author | Vedral, Vlatko
Edition | [2nd ed.]
Publication/distribution | Oxford [England] ; New York : Oxford University Press, [2010]
Physical description | 1 online resource (xii, 229 pages)
Discipline | 003/.54
Series |
Oxford landmark science
Oxford scholarship online
Topical subjects |
Information theory
Quantum theory
Quantum computers
Space and time
Physics - Philosophy
ISBN |
0-19-255425-5
0-19-191724-9
0-19-255299-6
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | Creation ex nihilo : something from nothing -- Information for all seasons -- Back to basics : bits and pieces -- Digital romance : life is a four-letter word -- Murphy's Law : I knew this would happen to me -- Place your bets : in it to win it -- Social informatics : get connected or die tryin' -- Quantum schmuntum : lights, camera, action! -- Surfing the waves : hyper-fast computers -- Children of the aimless chance : randomness versus determinism -- Sand reckoning : whose information is it, anyway? -- Destruction ab toto : nothing from something.
Record no. | UNINA-9910815924603321
Available at: Univ. Federico II
Designing information [electronic resource] : human factors and common sense in information design / Joel Katz
Author | Katz, Joel <1943->
Publication/distribution | Hoboken, N.J. : John Wiley & Sons, Inc., c2012
Physical description | 1 online resource (226 p.)
Discipline | 003/.54
Topical subject | Visual communication
Genre/form subject | Electronic books.
ISBN |
1-283-57606-6
9786613888518
1-118-42009-8
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | Aspects of Information Design : The nature of information -- Qualitative Issues : Perceptions, conventions, proximity -- Quantitative Issues : Dimensionality, comparisons, numbers, scale -- Structure, Organization, Type : Hierarchy and visual grammar -- Finding Your Way? : Movement, orientation, situational geography -- Documents : Stories, inventories, notes.
Record no. | UNINA-9910462117103321
Available at: Univ. Federico II
Designing information [electronic resource] : human factors and common sense in information design / Joel Katz
Author | Katz, Joel <1943->
Publication/distribution | Hoboken, N.J. : John Wiley & Sons, Inc., c2012
Physical description | 1 online resource (226 p.)
Discipline | 003/.54
Topical subject | Visual communication
ISBN |
1-118-41686-4
1-283-57606-6
9786613888518
1-118-42009-8
Classification | DES007000
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | Aspects of Information Design : The nature of information -- Qualitative Issues : Perceptions, conventions, proximity -- Quantitative Issues : Dimensionality, comparisons, numbers, scale -- Structure, Organization, Type : Hierarchy and visual grammar -- Finding Your Way? : Movement, orientation, situational geography -- Documents : Stories, inventories, notes.
Record no. | UNINA-9910785606203321
Available at: Univ. Federico II
Designing information : human factors and common sense in information design / Joel Katz
Author | Katz, Joel <1943->
Edition | [1st ed.]
Publication/distribution | Hoboken, N.J. : John Wiley & Sons, Inc., c2012
Physical description | 1 online resource (226 p.)
Discipline | 003/.54
Topical subject | Visual communication
ISBN |
1-118-41686-4
1-283-57606-6
9786613888518
1-118-42009-8
Classification | DES007000
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | Aspects of Information Design : The nature of information -- Qualitative Issues : Perceptions, conventions, proximity -- Quantitative Issues : Dimensionality, comparisons, numbers, scale -- Structure, Organization, Type : Hierarchy and visual grammar -- Finding Your Way? : Movement, orientation, situational geography -- Documents : Stories, inventories, notes.
Record no. | UNINA-9910818835103321
Available at: Univ. Federico II
Elements of information theory [electronic resource] / Thomas M. Cover, Joy A. Thomas
Author | Cover, T. M. <1938-2012>
Edition | [2nd ed.]
Publication/distribution | Hoboken, N.J. : Wiley-Interscience, c2006
Physical description | 1 online resource (774 p.)
Discipline |
003.54
003/.54
Other authors (Persons) | Thomas, Joy A.
Topical subject | Information theory
ISBN |
1-118-58577-1
1-280-51749-2
9786610517497
0-470-30315-8
0-471-74882-X
0-471-74881-1
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | ELEMENTS OF INFORMATION THEORY; CONTENTS; Preface to the Second Edition; Preface to the First Edition; Acknowledgments for the Second Edition; Acknowledgments for the First Edition; 1 Introduction and Preview; 1.1 Preview of the Book; 2 Entropy, Relative Entropy, and Mutual Information; 2.1 Entropy; 2.2 Joint Entropy and Conditional Entropy; 2.3 Relative Entropy and Mutual Information; 2.4 Relationship Between Entropy and Mutual Information; 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information; 2.6 Jensen's Inequality and Its Consequences; 2.7 Log Sum Inequality and Its Applications; 2.8 Data-Processing Inequality; 2.9 Sufficient Statistics; 2.10 Fano's Inequality; Summary; Problems; Historical Notes; 3 Asymptotic Equipartition Property; 3.1 Asymptotic Equipartition Property Theorem; 3.2 Consequences of the AEP: Data Compression; 3.3 High-Probability Sets and the Typical Set; Summary; Problems; Historical Notes; 4 Entropy Rates of a Stochastic Process; 4.1 Markov Chains; 4.2 Entropy Rate; 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph; 4.4 Second Law of Thermodynamics; 4.5 Functions of Markov Chains; Summary; Problems; Historical Notes; 5 Data Compression; 5.1 Examples of Codes; 5.2 Kraft Inequality; 5.3 Optimal Codes; 5.4 Bounds on the Optimal Code Length; 5.5 Kraft Inequality for Uniquely Decodable Codes; 5.6 Huffman Codes; 5.7 Some Comments on Huffman Codes; 5.8 Optimality of Huffman Codes; 5.9 Shannon-Fano-Elias Coding; 5.10 Competitive Optimality of the Shannon Code; 5.11 Generation of Discrete Distributions from Fair Coins; Summary; Problems; Historical Notes; 6 Gambling and Data Compression; 6.1 The Horse Race; 6.2 Gambling and Side Information; 6.3 Dependent Horse Races and Entropy Rate; 6.4 The Entropy of English; 6.5 Data Compression and Gambling; 6.6 Gambling Estimate of the Entropy of English; Summary; Problems; Historical Notes; 7 Channel Capacity; 7.1 Examples of Channel Capacity; 7.1.1 Noiseless Binary Channel; 7.1.2 Noisy Channel with Nonoverlapping Outputs; 7.1.3 Noisy Typewriter; 7.1.4 Binary Symmetric Channel; 7.1.5 Binary Erasure Channel; 7.2 Symmetric Channels; 7.3 Properties of Channel Capacity; 7.4 Preview of the Channel Coding Theorem; 7.5 Definitions; 7.6 Jointly Typical Sequences; 7.7 Channel Coding Theorem; 7.8 Zero-Error Codes; 7.9 Fano's Inequality and the Converse to the Coding Theorem; 7.10 Equality in the Converse to the Channel Coding Theorem; 7.11 Hamming Codes; 7.12 Feedback Capacity; 7.13 Source-Channel Separation Theorem; Summary; Problems; Historical Notes; 8 Differential Entropy; 8.1 Definitions; 8.2 AEP for Continuous Random Variables; 8.3 Relation of Differential Entropy to Discrete Entropy; 8.4 Joint and Conditional Differential Entropy; 8.5 Relative Entropy and Mutual Information; 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information; Summary; Problems; Historical Notes; 9 Gaussian Channel
Record no. | UNINA-9910143567703321
Available at: Univ. Federico II
Elements of information theory [electronic resource] / Thomas M. Cover, Joy A. Thomas
Author | Cover, T. M. <1938-2012>
Edition | [2nd ed.]
Publication/distribution | Hoboken, N.J. : Wiley-Interscience, c2006
Physical description | 1 online resource (774 p.)
Discipline |
003.54
003/.54
Other authors (Persons) | Thomas, Joy A.
Topical subject | Information theory
ISBN |
1-118-58577-1
1-280-51749-2
9786610517497
0-470-30315-8
0-471-74882-X
0-471-74881-1
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | ELEMENTS OF INFORMATION THEORY; CONTENTS; Preface to the Second Edition; Preface to the First Edition; Acknowledgments for the Second Edition; Acknowledgments for the First Edition; 1 Introduction and Preview; 1.1 Preview of the Book; 2 Entropy, Relative Entropy, and Mutual Information; 2.1 Entropy; 2.2 Joint Entropy and Conditional Entropy; 2.3 Relative Entropy and Mutual Information; 2.4 Relationship Between Entropy and Mutual Information; 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information; 2.6 Jensen's Inequality and Its Consequences; 2.7 Log Sum Inequality and Its Applications; 2.8 Data-Processing Inequality; 2.9 Sufficient Statistics; 2.10 Fano's Inequality; Summary; Problems; Historical Notes; 3 Asymptotic Equipartition Property; 3.1 Asymptotic Equipartition Property Theorem; 3.2 Consequences of the AEP: Data Compression; 3.3 High-Probability Sets and the Typical Set; Summary; Problems; Historical Notes; 4 Entropy Rates of a Stochastic Process; 4.1 Markov Chains; 4.2 Entropy Rate; 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph; 4.4 Second Law of Thermodynamics; 4.5 Functions of Markov Chains; Summary; Problems; Historical Notes; 5 Data Compression; 5.1 Examples of Codes; 5.2 Kraft Inequality; 5.3 Optimal Codes; 5.4 Bounds on the Optimal Code Length; 5.5 Kraft Inequality for Uniquely Decodable Codes; 5.6 Huffman Codes; 5.7 Some Comments on Huffman Codes; 5.8 Optimality of Huffman Codes; 5.9 Shannon-Fano-Elias Coding; 5.10 Competitive Optimality of the Shannon Code; 5.11 Generation of Discrete Distributions from Fair Coins; Summary; Problems; Historical Notes; 6 Gambling and Data Compression; 6.1 The Horse Race; 6.2 Gambling and Side Information; 6.3 Dependent Horse Races and Entropy Rate; 6.4 The Entropy of English; 6.5 Data Compression and Gambling; 6.6 Gambling Estimate of the Entropy of English; Summary; Problems; Historical Notes; 7 Channel Capacity; 7.1 Examples of Channel Capacity; 7.1.1 Noiseless Binary Channel; 7.1.2 Noisy Channel with Nonoverlapping Outputs; 7.1.3 Noisy Typewriter; 7.1.4 Binary Symmetric Channel; 7.1.5 Binary Erasure Channel; 7.2 Symmetric Channels; 7.3 Properties of Channel Capacity; 7.4 Preview of the Channel Coding Theorem; 7.5 Definitions; 7.6 Jointly Typical Sequences; 7.7 Channel Coding Theorem; 7.8 Zero-Error Codes; 7.9 Fano's Inequality and the Converse to the Coding Theorem; 7.10 Equality in the Converse to the Channel Coding Theorem; 7.11 Hamming Codes; 7.12 Feedback Capacity; 7.13 Source-Channel Separation Theorem; Summary; Problems; Historical Notes; 8 Differential Entropy; 8.1 Definitions; 8.2 AEP for Continuous Random Variables; 8.3 Relation of Differential Entropy to Discrete Entropy; 8.4 Joint and Conditional Differential Entropy; 8.5 Relative Entropy and Mutual Information; 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information; Summary; Problems; Historical Notes; 9 Gaussian Channel
Record no. | UNISA-996212505003316
Available at: Univ. di Salerno
Elements of information theory [electronic resource] / Thomas M. Cover, Joy A. Thomas
Author | Cover, T. M. <1938-2012>
Edition | [2nd ed.]
Publication/distribution | Hoboken, N.J. : Wiley-Interscience, c2006
Physical description | 1 online resource (774 p.)
Discipline |
003.54
003/.54
Other authors (Persons) | Thomas, Joy A.
Topical subject | Information theory
ISBN |
1-118-58577-1
1-280-51749-2
9786610517497
0-470-30315-8
0-471-74882-X
0-471-74881-1
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | ELEMENTS OF INFORMATION THEORY; CONTENTS; Preface to the Second Edition; Preface to the First Edition; Acknowledgments for the Second Edition; Acknowledgments for the First Edition; 1 Introduction and Preview; 1.1 Preview of the Book; 2 Entropy, Relative Entropy, and Mutual Information; 2.1 Entropy; 2.2 Joint Entropy and Conditional Entropy; 2.3 Relative Entropy and Mutual Information; 2.4 Relationship Between Entropy and Mutual Information; 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information; 2.6 Jensen's Inequality and Its Consequences; 2.7 Log Sum Inequality and Its Applications; 2.8 Data-Processing Inequality; 2.9 Sufficient Statistics; 2.10 Fano's Inequality; Summary; Problems; Historical Notes; 3 Asymptotic Equipartition Property; 3.1 Asymptotic Equipartition Property Theorem; 3.2 Consequences of the AEP: Data Compression; 3.3 High-Probability Sets and the Typical Set; Summary; Problems; Historical Notes; 4 Entropy Rates of a Stochastic Process; 4.1 Markov Chains; 4.2 Entropy Rate; 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph; 4.4 Second Law of Thermodynamics; 4.5 Functions of Markov Chains; Summary; Problems; Historical Notes; 5 Data Compression; 5.1 Examples of Codes; 5.2 Kraft Inequality; 5.3 Optimal Codes; 5.4 Bounds on the Optimal Code Length; 5.5 Kraft Inequality for Uniquely Decodable Codes; 5.6 Huffman Codes; 5.7 Some Comments on Huffman Codes; 5.8 Optimality of Huffman Codes; 5.9 Shannon-Fano-Elias Coding; 5.10 Competitive Optimality of the Shannon Code; 5.11 Generation of Discrete Distributions from Fair Coins; Summary; Problems; Historical Notes; 6 Gambling and Data Compression; 6.1 The Horse Race; 6.2 Gambling and Side Information; 6.3 Dependent Horse Races and Entropy Rate; 6.4 The Entropy of English; 6.5 Data Compression and Gambling; 6.6 Gambling Estimate of the Entropy of English; Summary; Problems; Historical Notes; 7 Channel Capacity; 7.1 Examples of Channel Capacity; 7.1.1 Noiseless Binary Channel; 7.1.2 Noisy Channel with Nonoverlapping Outputs; 7.1.3 Noisy Typewriter; 7.1.4 Binary Symmetric Channel; 7.1.5 Binary Erasure Channel; 7.2 Symmetric Channels; 7.3 Properties of Channel Capacity; 7.4 Preview of the Channel Coding Theorem; 7.5 Definitions; 7.6 Jointly Typical Sequences; 7.7 Channel Coding Theorem; 7.8 Zero-Error Codes; 7.9 Fano's Inequality and the Converse to the Coding Theorem; 7.10 Equality in the Converse to the Channel Coding Theorem; 7.11 Hamming Codes; 7.12 Feedback Capacity; 7.13 Source-Channel Separation Theorem; Summary; Problems; Historical Notes; 8 Differential Entropy; 8.1 Definitions; 8.2 AEP for Continuous Random Variables; 8.3 Relation of Differential Entropy to Discrete Entropy; 8.4 Joint and Conditional Differential Entropy; 8.5 Relative Entropy and Mutual Information; 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information; Summary; Problems; Historical Notes; 9 Gaussian Channel
Record no. | UNINA-9910829837203321
Available at: Univ. Federico II
Elements of information theory / Thomas M. Cover, Joy A. Thomas
Author | Cover, T. M. <1938-2012>
Edition | [2nd ed.]
Publication/distribution | Hoboken, N.J. : Wiley-Interscience, c2006
Physical description | 1 online resource (774 p.)
Discipline | 003/.54
Other authors (Persons) | Thomas, Joy A.
Topical subject | Information theory
ISBN |
1-118-58577-1
1-280-51749-2
9786610517497
0-470-30315-8
0-471-74882-X
0-471-74881-1
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | ELEMENTS OF INFORMATION THEORY; CONTENTS; Preface to the Second Edition; Preface to the First Edition; Acknowledgments for the Second Edition; Acknowledgments for the First Edition; 1 Introduction and Preview; 1.1 Preview of the Book; 2 Entropy, Relative Entropy, and Mutual Information; 2.1 Entropy; 2.2 Joint Entropy and Conditional Entropy; 2.3 Relative Entropy and Mutual Information; 2.4 Relationship Between Entropy and Mutual Information; 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information; 2.6 Jensen's Inequality and Its Consequences; 2.7 Log Sum Inequality and Its Applications; 2.8 Data-Processing Inequality; 2.9 Sufficient Statistics; 2.10 Fano's Inequality; Summary; Problems; Historical Notes; 3 Asymptotic Equipartition Property; 3.1 Asymptotic Equipartition Property Theorem; 3.2 Consequences of the AEP: Data Compression; 3.3 High-Probability Sets and the Typical Set; Summary; Problems; Historical Notes; 4 Entropy Rates of a Stochastic Process; 4.1 Markov Chains; 4.2 Entropy Rate; 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph; 4.4 Second Law of Thermodynamics; 4.5 Functions of Markov Chains; Summary; Problems; Historical Notes; 5 Data Compression; 5.1 Examples of Codes; 5.2 Kraft Inequality; 5.3 Optimal Codes; 5.4 Bounds on the Optimal Code Length; 5.5 Kraft Inequality for Uniquely Decodable Codes; 5.6 Huffman Codes; 5.7 Some Comments on Huffman Codes; 5.8 Optimality of Huffman Codes; 5.9 Shannon-Fano-Elias Coding; 5.10 Competitive Optimality of the Shannon Code; 5.11 Generation of Discrete Distributions from Fair Coins; Summary; Problems; Historical Notes; 6 Gambling and Data Compression; 6.1 The Horse Race; 6.2 Gambling and Side Information; 6.3 Dependent Horse Races and Entropy Rate; 6.4 The Entropy of English; 6.5 Data Compression and Gambling; 6.6 Gambling Estimate of the Entropy of English; Summary; Problems; Historical Notes; 7 Channel Capacity; 7.1 Examples of Channel Capacity; 7.1.1 Noiseless Binary Channel; 7.1.2 Noisy Channel with Nonoverlapping Outputs; 7.1.3 Noisy Typewriter; 7.1.4 Binary Symmetric Channel; 7.1.5 Binary Erasure Channel; 7.2 Symmetric Channels; 7.3 Properties of Channel Capacity; 7.4 Preview of the Channel Coding Theorem; 7.5 Definitions; 7.6 Jointly Typical Sequences; 7.7 Channel Coding Theorem; 7.8 Zero-Error Codes; 7.9 Fano's Inequality and the Converse to the Coding Theorem; 7.10 Equality in the Converse to the Channel Coding Theorem; 7.11 Hamming Codes; 7.12 Feedback Capacity; 7.13 Source-Channel Separation Theorem; Summary; Problems; Historical Notes; 8 Differential Entropy; 8.1 Definitions; 8.2 AEP for Continuous Random Variables; 8.3 Relation of Differential Entropy to Discrete Entropy; 8.4 Joint and Conditional Differential Entropy; 8.5 Relative Entropy and Mutual Information; 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information; Summary; Problems; Historical Notes; 9 Gaussian Channel
Record no. | UNINA-9910876820703321
Available at: Univ. Federico II
Elements of information theory [electronic resource] / Thomas M. Cover, Joy A. Thomas
Author | Cover, T. M. <1938->
Publication/distribution | New York : Wiley, c1991
Physical description | 1 online resource (xxii, 542 p.) : ill.
Discipline | 003/.54
Other authors (Persons) | Thomas, Joy A.
Series | Wiley series in telecommunications
Topical subjects |
Information theory
Engineering & Applied Sciences
Computer Science
Genre/form subject | Electronic books
Uncontrolled subject | Information - Statistical mechanics
ISBN |
1-280-55618-8
9786610556182
0-471-20061-1
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note | Entropy, relative entropy and mutual information -- The asymptotic equipartition property -- Entropy rates of a stochastic process -- Data compression -- Gambling and data compression -- Kolmogorov complexity -- Channel capacity -- Differential entropy -- The Gaussian channel -- Maximum entropy and spectral estimation -- Information theory and statistics -- Rate distortion theory -- Network information theory -- Information theory and the stock market -- Inequalities in information theory.
Record no. | UNINA-9910145768603321
Available at: Univ. Federico II