Bayesian statistics [electronic resource] : an introduction / Peter M. Lee
Author Lee Peter M
Edition [4th ed.]
Physical description 1 online resource (488 p.)
Discipline 519.5/42
Topical subject Bayesian statistical decision theory
Mathematical statistics
Genre/form subject Electronic books.
ISBN 1-280-77576-9
9786613686152
1-118-35975-5
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Bayesian Statistics; Contents; Preface; Preface to the First Edition; 1 Preliminaries; 1.1 Probability and Bayes' Theorem; 1.1.1 Notation; 1.1.2 Axioms for probability; 1.1.3 'Unconditional' probability; 1.1.4 Odds; 1.1.5 Independence; 1.1.6 Some simple consequences of the axioms; Bayes' Theorem; 1.2 Examples on Bayes' Theorem; 1.2.1 The Biology of Twins; 1.2.2 A political example; 1.2.3 A warning; 1.3 Random variables; 1.3.1 Discrete random variables; 1.3.2 The binomial distribution; 1.3.3 Continuous random variables; 1.3.4 The normal distribution; 1.3.5 Mixed random variables
1.4 Several random variables; 1.4.1 Two discrete random variables; 1.4.2 Two continuous random variables; 1.4.3 Bayes' Theorem for random variables; 1.4.4 Example; 1.4.5 One discrete variable and one continuous variable; 1.4.6 Independent random variables; 1.5 Means and variances; 1.5.1 Expectations; 1.5.2 The expectation of a sum and of a product; 1.5.3 Variance, precision and standard deviation; 1.5.4 Examples; 1.5.5 Variance of a sum; covariance and correlation; 1.5.6 Approximations to the mean and variance of a function of a random variable; 1.5.7 Conditional expectations and variances
1.5.8 Medians and modes; 1.6 Exercises on Chapter 1; 2 Bayesian inference for the normal distribution; 2.1 Nature of Bayesian inference; 2.1.1 Preliminary remarks; 2.1.2 Post is prior times likelihood; 2.1.3 Likelihood can be multiplied by any constant; 2.1.4 Sequential use of Bayes' Theorem; 2.1.5 The predictive distribution; 2.1.6 A warning; 2.2 Normal prior and likelihood; 2.2.1 Posterior from a normal prior and likelihood; 2.2.2 Example; 2.2.3 Predictive distribution; 2.2.4 The nature of the assumptions made; 2.3 Several normal observations with a normal prior; 2.3.1 Posterior distribution
2.3.2 Example; 2.3.3 Predictive distribution; 2.3.4 Robustness; 2.4 Dominant likelihoods; 2.4.1 Improper priors; 2.4.2 Approximation of proper priors by improper priors; 2.5 Locally uniform priors; 2.5.1 Bayes' postulate; 2.5.2 Data translated likelihoods; 2.5.3 Transformation of unknown parameters; 2.6 Highest density regions; 2.6.1 Need for summaries of posterior information; 2.6.2 Relation to classical statistics; 2.7 Normal variance; 2.7.1 A suitable prior for the normal variance; 2.7.2 Reference prior for the normal variance; 2.8 HDRs for the normal variance
2.8.1 What distribution should we be considering?; 2.8.2 Example; 2.9 The role of sufficiency; 2.9.1 Definition of sufficiency; 2.9.2 Neyman's factorization theorem; 2.9.3 Sufficiency principle; 2.9.4 Examples; 2.9.5 Order statistics and minimal sufficient statistics; 2.9.6 Examples on minimal sufficiency; 2.10 Conjugate prior distributions; 2.10.1 Definition and difficulties; 2.10.2 Examples; 2.10.3 Mixtures of conjugate densities; 2.10.4 Is your prior really conjugate?; 2.11 The exponential family; 2.11.1 Definition; 2.11.2 Examples; 2.11.3 Conjugate densities
2.11.4 Two-parameter exponential family
Record No. UNINA-9910462657803321
Lee Peter M
Printed material
Find it here: Univ. Federico II
Opac: Check availability here
Bayesian statistics [electronic resource] : an introduction / Peter M. Lee
Author Lee Peter M
Edition [4th ed.]
Physical description xxiii, 462 p
Discipline 519.5/42
Topical subject Bayesian statistical decision theory
Mathematical statistics
ISBN 1-280-77576-9
9786613686152
1-118-35975-5
Format Printed material
Bibliographic level Monograph
Language of publication eng
Record No. UNINA-9910795823703321
Lee Peter M
Printed material
Find it here: Univ. Federico II
Opac: Check availability here
Bayesian statistics : an introduction / Peter M. Lee
Author Lee Peter M
Edition [4th ed.]
Physical description xxiii, 462 p
Discipline 519.5/42
Topical subject Bayesian statistical decision theory
Mathematical statistics
ISBN 1-280-77576-9
9786613686152
1-118-35975-5
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Intro -- Bayesian Statistics -- Contents -- Preface -- Preface to the First Edition -- 1 Preliminaries -- 1.1 Probability and Bayes' Theorem -- 1.1.1 Notation -- 1.1.2 Axioms for probability -- 1.1.3 'Unconditional' probability -- 1.1.4 Odds -- 1.1.5 Independence -- 1.1.6 Some simple consequences of the axioms -- Bayes' Theorem -- 1.2 Examples on Bayes' Theorem -- 1.2.1 The Biology of Twins -- 1.2.2 A political example -- 1.2.3 A warning -- 1.3 Random variables -- 1.3.1 Discrete random variables -- 1.3.2 The binomial distribution -- 1.3.3 Continuous random variables -- 1.3.4 The normal distribution -- 1.3.5 Mixed random variables -- 1.4 Several random variables -- 1.4.1 Two discrete random variables -- 1.4.2 Two continuous random variables -- 1.4.3 Bayes' Theorem for random variables -- 1.4.4 Example -- 1.4.5 One discrete variable and one continuous variable -- 1.4.6 Independent random variables -- 1.5 Means and variances -- 1.5.1 Expectations -- 1.5.2 The expectation of a sum and of a product -- 1.5.3 Variance, precision and standard deviation -- 1.5.4 Examples -- 1.5.5 Variance of a sum -- covariance and correlation -- 1.5.6 Approximations to the mean and variance of a function of a random variable -- 1.5.7 Conditional expectations and variances -- 1.5.8 Medians and modes -- 1.6 Exercises on Chapter 1 -- 2 Bayesian inference for the normal distribution -- 2.1 Nature of Bayesian inference -- 2.1.1 Preliminary remarks -- 2.1.2 Post is prior times likelihood -- 2.1.3 Likelihood can be multiplied by any constant -- 2.1.4 Sequential use of Bayes' Theorem -- 2.1.5 The predictive distribution -- 2.1.6 A warning -- 2.2 Normal prior and likelihood -- 2.2.1 Posterior from a normal prior and likelihood -- 2.2.2 Example -- 2.2.3 Predictive distribution -- 2.2.4 The nature of the assumptions made -- 2.3 Several normal observations with a normal prior.
2.3.1 Posterior distribution -- 2.3.2 Example -- 2.3.3 Predictive distribution -- 2.3.4 Robustness -- 2.4 Dominant likelihoods -- 2.4.1 Improper priors -- 2.4.2 Approximation of proper priors by improper priors -- 2.5 Locally uniform priors -- 2.5.1 Bayes' postulate -- 2.5.2 Data translated likelihoods -- 2.5.3 Transformation of unknown parameters -- 2.6 Highest density regions -- 2.6.1 Need for summaries of posterior information -- 2.6.2 Relation to classical statistics -- 2.7 Normal variance -- 2.7.1 A suitable prior for the normal variance -- 2.7.2 Reference prior for the normal variance -- 2.8 HDRs for the normal variance -- 2.8.1 What distribution should we be considering? -- 2.8.2 Example -- 2.9 The role of sufficiency -- 2.9.1 Definition of sufficiency -- 2.9.2 Neyman's factorization theorem -- 2.9.3 Sufficiency principle -- 2.9.4 Examples -- 2.9.5 Order statistics and minimal sufficient statistics -- 2.9.6 Examples on minimal sufficiency -- 2.10 Conjugate prior distributions -- 2.10.1 Definition and difficulties -- 2.10.2 Examples -- 2.10.3 Mixtures of conjugate densities -- 2.10.4 Is your prior really conjugate? -- 2.11 The exponential family -- 2.11.1 Definition -- 2.11.2 Examples -- 2.11.3 Conjugate densities -- 2.11.4 Two-parameter exponential family -- 2.12 Normal mean and variance both unknown -- 2.12.1 Formulation of the problem -- 2.12.2 Marginal distribution of the mean -- 2.12.3 Example of the posterior density for the mean -- 2.12.4 Marginal distribution of the variance -- 2.12.5 Example of the posterior density of the variance -- 2.12.6 Conditional density of the mean for given variance -- 2.13 Conjugate joint prior for the normal distribution -- 2.13.1 The form of the conjugate prior -- 2.13.2 Derivation of the posterior -- 2.13.3 Example -- 2.13.4 Concluding remarks -- 2.14 Exercises on Chapter 2.
3 Some other common distributions -- 3.1 The binomial distribution -- 3.1.1 Conjugate prior -- 3.1.2 Odds and log-odds -- 3.1.3 Highest density regions -- 3.1.4 Example -- 3.1.5 Predictive distribution -- 3.2 Reference prior for the binomial likelihood -- 3.2.1 Bayes' postulate -- 3.2.2 Haldane's prior -- 3.2.3 The arc-sine distribution -- 3.2.4 Conclusion -- 3.3 Jeffreys' rule -- 3.3.1 Fisher's information -- 3.3.2 The information from several observations -- 3.3.3 Jeffreys' prior -- 3.3.4 Examples -- 3.3.5 Warning -- 3.3.6 Several unknown parameters -- 3.3.7 Example -- 3.4 The Poisson distribution -- 3.4.1 Conjugate prior -- 3.4.2 Reference prior -- 3.4.3 Example -- 3.4.4 Predictive distribution -- 3.5 The uniform distribution -- 3.5.1 Preliminary definitions -- 3.5.2 Uniform distribution with a fixed lower endpoint -- 3.5.3 The general uniform distribution -- 3.5.4 Examples -- 3.6 Reference prior for the uniform distribution -- 3.6.1 Lower limit of the interval fixed -- 3.6.2 Example -- 3.6.3 Both limits unknown -- 3.7 The tramcar problem -- 3.7.1 The discrete uniform distribution -- 3.8 The first digit problem -- invariant priors -- 3.8.1 A prior in search of an explanation -- 3.8.2 The problem -- 3.8.3 A solution -- 3.8.4 Haar priors -- 3.9 The circular normal distribution -- 3.9.1 Distributions on the circle -- 3.9.2 Example -- 3.9.3 Construction of an HDR by numerical integration -- 3.9.4 Remarks -- 3.10 Approximations based on the likelihood -- 3.10.1 Maximum likelihood -- 3.10.2 Iterative methods -- 3.10.3 Approximation to the posterior density -- 3.10.4 Examples -- 3.10.5 Extension to more than one parameter -- 3.10.6 Example -- 3.11 Reference posterior distributions -- 3.11.1 The information provided by an experiment -- 3.11.2 Reference priors under asymptotic normality -- 3.11.3 Uniform distribution of unit length.
3.11.4 Normal mean and variance -- 3.11.5 Technical complications -- 3.12 Exercises on Chapter 3 -- 4 Hypothesis testing -- 4.1 Hypothesis testing -- 4.1.1 Introduction -- 4.1.2 Classical hypothesis testing -- 4.1.3 Difficulties with the classical approach -- 4.1.4 The Bayesian approach -- 4.1.5 Example -- 4.1.6 Comment -- 4.2 One-sided hypothesis tests -- 4.2.1 Definition -- 4.2.2 P-values -- 4.3 Lindley's method -- 4.3.1 A compromise with classical statistics -- 4.3.2 Example -- 4.3.3 Discussion -- 4.4 Point (or sharp) null hypotheses with prior information -- 4.4.1 When are point null hypotheses reasonable? -- 4.4.2 A case of nearly constant likelihood -- 4.4.3 The Bayesian method for point null hypotheses -- 4.4.4 Sufficient statistics -- 4.5 Point null hypotheses for the normal distribution -- 4.5.1 Calculation of the Bayes' factor -- 4.5.2 Numerical examples -- 4.5.3 Lindley's paradox -- 4.5.4 A bound which does not depend on the prior distribution -- 4.5.5 The case of an unknown variance -- 4.6 The Doogian philosophy -- 4.6.1 Description of the method -- 4.6.2 Numerical example -- 4.7 Exercises on Chapter 4 -- 5 Two-sample problems -- 5.1 Two-sample problems - both variances unknown -- 5.1.1 The problem of two normal samples -- 5.1.2 Paired comparisons -- 5.1.3 Example of a paired comparison problem -- 5.1.4 The case where both variances are known -- 5.1.5 Example -- 5.1.6 Non-trivial prior information -- 5.2 Variances unknown but equal -- 5.2.1 Solution using reference priors -- 5.2.2 Example -- 5.2.3 Non-trivial prior information -- 5.3 Variances unknown and unequal (Behrens-Fisher problem) -- 5.3.1 Formulation of the problem -- 5.3.2 Patil's approximation -- 5.3.3 Example -- 5.3.4 Substantial prior information -- 5.4 The Behrens-Fisher controversy -- 5.4.1 The Behrens-Fisher problem from a classical standpoint -- 5.4.2 Example.
5.4.3 The controversy -- 5.5 Inferences concerning a variance ratio -- 5.5.1 Statement of the problem -- 5.5.2 Derivation of the F distribution -- 5.5.3 Example -- 5.6 Comparison of two proportions -- the 2 × 2 table -- 5.6.1 Methods based on the log-odds ratio -- 5.6.2 Example -- 5.6.3 The inverse root-sine transformation -- 5.6.4 Other methods -- 5.7 Exercises on Chapter 5 -- 6 Correlation, regression and the analysis of variance -- 6.1 Theory of the correlation coefficient -- 6.1.1 Definitions -- 6.1.2 Approximate posterior distribution of the correlation coefficient -- 6.1.3 The hyperbolic tangent substitution -- 6.1.4 Reference prior -- 6.1.5 Incorporation of prior information -- 6.2 Examples on the use of the correlation coefficient -- 6.2.1 Use of the hyperbolic tangent transformation -- 6.2.2 Combination of several correlation coefficients -- 6.2.3 The squared correlation coefficient -- 6.3 Regression and the bivariate normal model -- 6.3.1 The model -- 6.3.2 Bivariate linear regression -- 6.3.3 Example -- 6.3.4 Case of known variance -- 6.3.5 The mean value at a given value of the explanatory variable -- 6.3.6 Prediction of observations at a given value of the explanatory variable -- 6.3.7 Continuation of the example -- 6.3.8 Multiple regression -- 6.3.9 Polynomial regression -- 6.4 Conjugate prior for the bivariate regression model -- 6.4.1 The problem of updating a regression line -- 6.4.2 Formulae for recursive construction of a regression line -- 6.4.3 Finding an appropriate prior -- 6.5 Comparison of several means - the one way model -- 6.5.1 Description of the one way layout -- 6.5.2 Integration over the nuisance parameters -- 6.5.3 Derivation of the F distribution -- 6.5.4 Relationship to the analysis of variance -- 6.5.5 Example -- 6.5.6 Relationship to a simple linear regression model -- 6.5.7 Investigation of contrasts.
6.6 The two way layout.
Record No. UNINA-9910820114203321
Lee Peter M
Printed material
Find it here: Univ. Federico II
Opac: Check availability here
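The contents notes above list section 2.2.1, "Posterior from a normal prior and likelihood". As a purely illustrative sketch of that conjugate normal-normal update (the function name and all numbers below are our own, not taken from the book):

```python
# Illustrative sketch of the conjugate normal-normal update covered in
# the book's contents (2.2.1 "Posterior from a normal prior and
# likelihood"). Numbers and naming are hypothetical.

def normal_posterior(mu0, tau0_sq, sigma_sq, data):
    """Posterior mean and variance for a normal mean, data variance known.

    Prior: mu ~ N(mu0, tau0_sq); likelihood: x_i ~ N(mu, sigma_sq).
    Precisions (inverse variances) add, and the posterior mean is the
    precision-weighted average of the prior mean and the data.
    """
    n = len(data)
    post_precision = 1.0 / tau0_sq + n / sigma_sq   # precisions add
    post_var = 1.0 / post_precision
    post_mean = post_var * (mu0 / tau0_sq + sum(data) / sigma_sq)
    return post_mean, post_var

# Example: unit-variance prior at 0, three observations of 2.0 with
# unit data variance; posterior variance 1/(1 + 3) = 0.25 and
# posterior mean 0.25 * (0 + 6) = 1.5.
mean, var = normal_posterior(mu0=0.0, tau0_sq=1.0, sigma_sq=1.0,
                             data=[2.0, 2.0, 2.0])
```

This is only a sketch of the idea the chapter heading names ("post is prior times likelihood"); the book develops the derivation and its assumptions in full.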