
Record no.

UNISA996466402803316

Title

Information theory : Poincaré Seminar 2018 / Bertrand Duplantier, Vincent Rivasseau, editors

Publication/distribution

Cham, Switzerland : Birkhäuser, [2021]

©2021

ISBN

3-030-81480-7

Physical description

1 online resource (222 pages)

Series

Progress in mathematical physics ; 78

Classification

003.54

Subjects

Information theory

Information technology

Electronic books

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

Contents note

Intro -- Contents -- Foreword -- Thermodynamics and Information Theory -- 1. Introduction -- 2. Thermodynamics: A Brief Review -- 2.1. The Two Principles of Thermodynamics -- 2.2. Molecular Theory of Heat and the Framework of Statistical Mechanics -- 2.3. Brownian Motion: Equilibrium Is Dynamical -- 2.4. Universality of Brownian Motion: Feynman's Ratchet and Pawl -- 2.4.1. Application to molecular motors -- 3. Equilibrium and Non-equilibrium Dynamics -- 3.1. Markovian Dynamics -- 3.2. Connection to Thermodynamics -- 3.3. Time-Reversal Invariance and Detailed Balance -- 3.4. Physical Interpretation of Detailed Balance -- 3.5. Entropy Production in Markovian Systems -- 4. The Gallavotti-Cohen Fluctuation Theorem for Markovian Thermodynamics -- 4.1. Generalised Detailed Balance -- 4.2. Time Reversal and the Gallavotti-Cohen Symmetry -- 5. Non-equilibrium Work Identities -- 5.1. Jarzynski's Work Theorem -- 5.2. Crooks' Relation -- 6. Information Theory -- 7. Thermodynamics and Information: The Maxwell Demon -- 8. Conclusion -- Appendix A. Large Deviations and Cumulant Generating Functions -- Appendix B. Proof of the Jarzynski Formula for Hamiltonian Dynamics -- Acknowledgments -- References -- This is IT: A Primer on Shannon's Entropy and Information -- 1. Shannon's Life as a Child -- 2. A Noble Prize Laureate -- 3. Intelligence or Information? -- 4. Probabilistic, not Semantic -- 5. The Celebrated 1948 Paper -- 6. Shannon, not Weaver -- 7. Shannon, not Wiener -- 8. Shannon's Bandwagon -- 9. An Axiomatic Approach to Entropy -- 10. Units of Information -- 11. H or Eta? -- 12. No One Knows What Entropy Really Is -- 13. How Does Entropy Arise Naturally? -- 14. Shannon's Source Coding Theorem -- 15. Continuous Entropy -- 16. Change of Variable in the Entropy -- 17. Discrete vs. Continuous Entropy -- 18. Most Beautiful Equation -- 19. Entropy Power.

20. A Fundamental Information Inequality -- 21. The MaxEnt Principle -- 22. Relative Entropy or Divergence -- 23. Generalized Entropies and Divergences -- 24. How Does Relative Entropy Arise Naturally? -- 25. Chernoff Information -- 26. Fisher Information -- 27. Kolmogorov Information -- 28. Shannon's Mutual Information -- 29. Conditional Entropy or Equivocation -- 30. Knowledge Reduces Uncertainty - Mixing Increases Entropy -- 31. A Suggestive Venn Diagram -- 32. Shannon's Channel Coding Theorem -- 33. Shannon's Capacity Formula -- 34. The Entropy Power Inequality and a Saddle Point Property -- 35. MaxEnt vs. MinEnt Principles -- 36. A Simple Proof of the Entropy Power Inequality -- 37. Conclusion -- References -- Landauer's Bound and Maxwell's Demon -- 1. Introduction -- 1.1. Maxwell's Demon and Szilard's Engine -- 1.2. Landauer's Principle and Bennett's Resolution -- 2. Experimental Implementations -- 2.1. Experiments on Maxwell's Demon -- 2.1.1. The Szilard engine: work production from information -- 2.1.2. The autonomous Maxwell demon improves cooling -- 2.2. Experiments on Landauer's Principle -- 2.3. Other Experiments on the Physics of Information -- 3. Extensions to the Quantum Regime -- 3.1. Experiments on Quantum Maxwell's Demon -- 3.2. Experiments on Quantum Landauer's Principle -- 4. Applications -- Appendix A. Stochastic Thermodynamics and Information Energy Cost -- A.1. Estimate the Free Energy Difference from Work Fluctuations -- A.2. Landauer Bound and the Jarzynski Equality -- A.2.1. Experimental test of the generalized Jarzynski equality -- Appendix B. Set-up Used in the Experiment Presented in Section 2.2 -- B.1. The One-Bit Memory System -- B.2. Heat Measurements -- References -- Verification of Quantum Computation: An Overview of Existing Approaches -- 1. Introduction.

1.1. Blind Quantum Computing -- 1.1.1. Quantum one-time pad -- 1.1.2. Childs' protocol for blind computation -- 1.1.3. Universal Blind Quantum Computation (UBQC) -- 2. Prepare-and-Send Protocols -- 2.1. Quantum Authentication-Based Verification -- 2.1.1. Clifford-QAS VQC -- 2.1.2. Poly-QAS VQC -- 2.2. Trap-Based Verification -- 2.3. Verification Based on Repeated Runs -- 2.4. Summary of Prepare-and-Send Protocols -- 3. Receive-and-Measure Protocols -- 3.1. Measurement-only Verification -- 3.2. Post-hoc Verification -- 3.3. Summary of Receive-and-Measure Protocols -- 4. Entanglement-based Protocols -- 4.1. Verification Based on CHSH Rigidity -- 4.1.1. RUV protocol -- 4.1.2. GKW protocol -- 4.1.3. HPDF protocol -- 4.2. Verification Based on Self-testing Graph States -- 4.3. Post-hoc Verification -- 4.3.1. FH protocol -- 4.3.2. NV protocol -- 4.4. Summary of Entanglement-based Protocols -- 5. Outlook -- 5.1. Sub-universal Protocols -- 5.2. Fault Tolerance -- 5.3. Experiments and Implementations -- 6. Conclusions -- 7. Appendix -- 7.1. Quantum Information and Computation -- 7.1.1. Basics of quantum mechanics -- 7.1.2. Density matrices -- 7.1.3. Purification -- 7.1.4. CPTP maps -- 7.1.5. Trace distance -- 7.1.6. Quantum computation -- 7.1.7. Bloch sphere -- 7.1.8. Quantum error correction -- 7.2. Measurement-based Quantum Computation -- 7.3. Complexity Theory -- References.