Record No.

UNINA9910782417203321

Author

Applebaum David <1956->

Title

Probability and information : an integrated approach / David Applebaum [electronic resource]

Publication/distribution/printing

Cambridge : Cambridge University Press, 2008

ISBN

1-107-18810-5

0-511-65003-5

0-511-41330-0

0-511-57441-X

0-511-75526-0

0-511-41424-2

Edition

[Second edition.]

Physical description

1 online resource (xvi, 273 pages) : digital, PDF file(s)

Discipline

519.2

Subjects

Probabilities

Information theory

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

General notes

Title from publisher's bibliographic system (viewed on 05 Oct 2015).

Bibliography note

Includes bibliographical references and index.

Contents note

Cover; Half-title; Title; Copyright; Contents; Preface to the second edition; Preface to the first edition; 1 Introduction; 2 Combinatorics; 3 Sets and measures; 4 Probability; 5 Discrete random variables; 6 Information and entropy; 7 Communication; 8 Random variables with probability density functions; 9 Random vectors; 10 Markov chains and their entropy; Exploring further; Appendix 1: Proof by mathematical induction; Appendix 2: Lagrange multipliers; Appendix 3: Integration of exp(-½x²); Appendix 4: Table of probabilities associated with the standard normal distribution; Appendix 5: A rapid review of matrix algebra; Selected solutions; Index

Summary/abstract

This updated textbook is an excellent way to introduce probability and information theory to new students in mathematics, computer science, engineering, statistics, economics, or business studies. Requiring only a knowledge of basic calculus, it starts by building a clear and systematic foundation for the subject: the concept of probability is given particular attention via a simplified discussion of measures on Boolean algebras. The theoretical ideas are then applied to practical areas such as statistical inference, random walks, statistical mechanics and communications modelling. Topics covered include discrete and continuous random variables, entropy and mutual information, maximum entropy methods, the central limit theorem, and the coding and transmission of information; new to this edition is material on Markov chains and their entropy. Lots of examples and exercises are included to illustrate how to use the theory in a wide range of applications, with detailed solutions to most exercises available online for instructors.