
Record no.

UNINA9910462657803321

Author

Lee, Peter M.

Title

Bayesian statistics [electronic resource] : an introduction / Peter M. Lee

Publication/distribution

Chichester, West Sussex ; Hoboken, N.J. : Wiley, 2012

ISBN

1-280-77576-9

9786613686152

1-118-35975-5

Edition

[4th ed.]

Physical description

1 online resource (488 p.)

Discipline

519.5/42

Subjects

Bayesian statistical decision theory

Mathematical statistics

Electronic books.

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

General notes

Description based upon print version of record.

Bibliography note

Includes bibliographical references and index.

Contents note

Bayesian Statistics; Contents; Preface; Preface to the First Edition; 1 Preliminaries; 1.1 Probability and Bayes' Theorem; 1.1.1 Notation; 1.1.2 Axioms for probability; 1.1.3 'Unconditional' probability; 1.1.4 Odds; 1.1.5 Independence; 1.1.6 Some simple consequences of the axioms; Bayes' Theorem; 1.2 Examples on Bayes' Theorem; 1.2.1 The Biology of Twins; 1.2.2 A political example; 1.2.3 A warning; 1.3 Random variables; 1.3.1 Discrete random variables; 1.3.2 The binomial distribution; 1.3.3 Continuous random variables; 1.3.4 The normal distribution; 1.3.5 Mixed random variables; 1.4 Several random variables; 1.4.1 Two discrete random variables; 1.4.2 Two continuous random variables; 1.4.3 Bayes' Theorem for random variables; 1.4.4 Example; 1.4.5 One discrete variable and one continuous variable; 1.4.6 Independent random variables; 1.5 Means and variances; 1.5.1 Expectations; 1.5.2 The expectation of a sum and of a product; 1.5.3 Variance, precision and standard deviation; 1.5.4 Examples; 1.5.5 Variance of a sum; covariance and correlation; 1.5.6 Approximations to the mean and variance of a function of a random variable; 1.5.7 Conditional expectations and variances; 1.5.8 Medians and modes; 1.6 Exercises on Chapter 1

2 Bayesian inference for the normal distribution; 2.1 Nature of Bayesian inference; 2.1.1 Preliminary remarks; 2.1.2 Post is prior times likelihood; 2.1.3 Likelihood can be multiplied by any constant; 2.1.4 Sequential use of Bayes' Theorem; 2.1.5 The predictive distribution; 2.1.6 A warning; 2.2 Normal prior and likelihood; 2.2.1 Posterior from a normal prior and likelihood; 2.2.2 Example; 2.2.3 Predictive distribution; 2.2.4 The nature of the assumptions made; 2.3 Several normal observations with a normal prior; 2.3.1 Posterior distribution; 2.3.2 Example; 2.3.3 Predictive distribution; 2.3.4 Robustness; 2.4 Dominant likelihoods; 2.4.1 Improper priors; 2.4.2 Approximation of proper priors by improper priors; 2.5 Locally uniform priors; 2.5.1 Bayes' postulate; 2.5.2 Data translated likelihoods; 2.5.3 Transformation of unknown parameters; 2.6 Highest density regions; 2.6.1 Need for summaries of posterior information; 2.6.2 Relation to classical statistics; 2.7 Normal variance; 2.7.1 A suitable prior for the normal variance; 2.7.2 Reference prior for the normal variance; 2.8 HDRs for the normal variance; 2.8.1 What distribution should we be considering?; 2.8.2 Example; 2.9 The role of sufficiency; 2.9.1 Definition of sufficiency; 2.9.2 Neyman's factorization theorem; 2.9.3 Sufficiency principle; 2.9.4 Examples; 2.9.5 Order statistics and minimal sufficient statistics; 2.9.6 Examples on minimal sufficiency; 2.10 Conjugate prior distributions; 2.10.1 Definition and difficulties; 2.10.2 Examples; 2.10.3 Mixtures of conjugate densities; 2.10.4 Is your prior really conjugate?; 2.11 The exponential family; 2.11.1 Definition; 2.11.2 Examples; 2.11.3 Conjugate densities; 2.11.4 Two-parameter exponential family

Summary/abstract

Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee's book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops.
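The updating rule described in the abstract's opening sentence is Bayes' theorem; a minimal sketch in standard notation (θ for the hypothesis or parameter and x for the observed data, symbols chosen here for illustration rather than taken from the record) is:

% Posterior beliefs are proportional to prior beliefs times the likelihood.
\[
  p(\theta \mid x) \;=\; \frac{p(x \mid \theta)\, p(\theta)}{p(x)}
  \;\propto\; p(x \mid \theta)\, p(\theta).
\]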