Bayesian statistics : an introduction / Peter M. Lee
| Autore | Lee Peter M |
| Edizione | [4th ed.] |
| Descrizione fisica | xxiii, 462 p |
| Disciplina | 519.5/42 |
| Soggetto topico | Bayesian statistical decision theory ; Mathematical statistics |
| ISBN | 9786613686152 ; 9781280775765 ; 1280775769 ; 9781118359754 ; 1118359755 |
| Classificazione | 417 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | eng |
| Nota di contenuto |
Intro -- Bayesian Statistics -- Contents -- Preface -- Preface to the First Edition -- 1 Preliminaries -- 1.1 Probability and Bayes' Theorem -- 1.1.1 Notation -- 1.1.2 Axioms for probability -- 1.1.3 'Unconditional' probability -- 1.1.4 Odds -- 1.1.5 Independence -- 1.1.6 Some simple consequences of the axioms -- Bayes' Theorem -- 1.2 Examples on Bayes' Theorem -- 1.2.1 The Biology of Twins -- 1.2.2 A political example -- 1.2.3 A warning -- 1.3 Random variables -- 1.3.1 Discrete random variables -- 1.3.2 The binomial distribution -- 1.3.3 Continuous random variables -- 1.3.4 The normal distribution -- 1.3.5 Mixed random variables -- 1.4 Several random variables -- 1.4.1 Two discrete random variables -- 1.4.2 Two continuous random variables -- 1.4.3 Bayes' Theorem for random variables -- 1.4.4 Example -- 1.4.5 One discrete variable and one continuous variable -- 1.4.6 Independent random variables -- 1.5 Means and variances -- 1.5.1 Expectations -- 1.5.2 The expectation of a sum and of a product -- 1.5.3 Variance, precision and standard deviation -- 1.5.4 Examples -- 1.5.5 Variance of a sum -- covariance and correlation -- 1.5.6 Approximations to the mean and variance of a function of a random variable -- 1.5.7 Conditional expectations and variances -- 1.5.8 Medians and modes -- 1.6 Exercises on Chapter 1 -- 2 Bayesian inference for the normal distribution -- 2.1 Nature of Bayesian inference -- 2.1.1 Preliminary remarks -- 2.1.2 Post is prior times likelihood -- 2.1.3 Likelihood can be multiplied by any constant -- 2.1.4 Sequential use of Bayes' Theorem -- 2.1.5 The predictive distribution -- 2.1.6 A warning -- 2.2 Normal prior and likelihood -- 2.2.1 Posterior from a normal prior and likelihood -- 2.2.2 Example -- 2.2.3 Predictive distribution -- 2.2.4 The nature of the assumptions made -- 2.3 Several normal observations with a normal prior.
2.3.1 Posterior distribution -- 2.3.2 Example -- 2.3.3 Predictive distribution -- 2.3.4 Robustness -- 2.4 Dominant likelihoods -- 2.4.1 Improper priors -- 2.4.2 Approximation of proper priors by improper priors -- 2.5 Locally uniform priors -- 2.5.1 Bayes' postulate -- 2.5.2 Data translated likelihoods -- 2.5.3 Transformation of unknown parameters -- 2.6 Highest density regions -- 2.6.1 Need for summaries of posterior information -- 2.6.2 Relation to classical statistics -- 2.7 Normal variance -- 2.7.1 A suitable prior for the normal variance -- 2.7.2 Reference prior for the normal variance -- 2.8 HDRs for the normal variance -- 2.8.1 What distribution should we be considering? -- 2.8.2 Example -- 2.9 The role of sufficiency -- 2.9.1 Definition of sufficiency -- 2.9.2 Neyman's factorization theorem -- 2.9.3 Sufficiency principle -- 2.9.4 Examples -- 2.9.5 Order statistics and minimal sufficient statistics -- 2.9.6 Examples on minimal sufficiency -- 2.10 Conjugate prior distributions -- 2.10.1 Definition and difficulties -- 2.10.2 Examples -- 2.10.3 Mixtures of conjugate densities -- 2.10.4 Is your prior really conjugate? -- 2.11 The exponential family -- 2.11.1 Definition -- 2.11.2 Examples -- 2.11.3 Conjugate densities -- 2.11.4 Two-parameter exponential family -- 2.12 Normal mean and variance both unknown -- 2.12.1 Formulation of the problem -- 2.12.2 Marginal distribution of the mean -- 2.12.3 Example of the posterior density for the mean -- 2.12.4 Marginal distribution of the variance -- 2.12.5 Example of the posterior density of the variance -- 2.12.6 Conditional density of the mean for given variance -- 2.13 Conjugate joint prior for the normal distribution -- 2.13.1 The form of the conjugate prior -- 2.13.2 Derivation of the posterior -- 2.13.3 Example -- 2.13.4 Concluding remarks -- 2.14 Exercises on Chapter 2. 
3 Some other common distributions -- 3.1 The binomial distribution -- 3.1.1 Conjugate prior -- 3.1.2 Odds and log-odds -- 3.1.3 Highest density regions -- 3.1.4 Example -- 3.1.5 Predictive distribution -- 3.2 Reference prior for the binomial likelihood -- 3.2.1 Bayes' postulate -- 3.2.2 Haldane's prior -- 3.2.3 The arc-sine distribution -- 3.2.4 Conclusion -- 3.3 Jeffreys' rule -- 3.3.1 Fisher's information -- 3.3.2 The information from several observations -- 3.3.3 Jeffreys' prior -- 3.3.4 Examples -- 3.3.5 Warning -- 3.3.6 Several unknown parameters -- 3.3.7 Example -- 3.4 The Poisson distribution -- 3.4.1 Conjugate prior -- 3.4.2 Reference prior -- 3.4.3 Example -- 3.4.4 Predictive distribution -- 3.5 The uniform distribution -- 3.5.1 Preliminary definitions -- 3.5.2 Uniform distribution with a fixed lower endpoint -- 3.5.3 The general uniform distribution -- 3.5.4 Examples -- 3.6 Reference prior for the uniform distribution -- 3.6.1 Lower limit of the interval fixed -- 3.6.2 Example -- 3.6.3 Both limits unknown -- 3.7 The tramcar problem -- 3.7.1 The discrete uniform distribution -- 3.8 The first digit problem -- invariant priors -- 3.8.1 A prior in search of an explanation -- 3.8.2 The problem -- 3.8.3 A solution -- 3.8.4 Haar priors -- 3.9 The circular normal distribution -- 3.9.1 Distributions on the circle -- 3.9.2 Example -- 3.9.3 Construction of an HDR by numerical integration -- 3.9.4 Remarks -- 3.10 Approximations based on the likelihood -- 3.10.1 Maximum likelihood -- 3.10.2 Iterative methods -- 3.10.3 Approximation to the posterior density -- 3.10.4 Examples -- 3.10.5 Extension to more than one parameter -- 3.10.6 Example -- 3.11 Reference posterior distributions -- 3.11.1 The information provided by an experiment -- 3.11.2 Reference priors under asymptotic normality -- 3.11.3 Uniform distribution of unit length. 3.11.4 Normal mean and variance -- 3.11.5 Technical complications -- 3.12 Exercises on Chapter 3 -- 4 Hypothesis testing -- 4.1 Hypothesis testing -- 4.1.1 Introduction -- 4.1.2 Classical hypothesis testing -- 4.1.3 Difficulties with the classical approach -- 4.1.4 The Bayesian approach -- 4.1.5 Example -- 4.1.6 Comment -- 4.2 One-sided hypothesis tests -- 4.2.1 Definition -- 4.2.2 P-values -- 4.3 Lindley's method -- 4.3.1 A compromise with classical statistics -- 4.3.2 Example -- 4.3.3 Discussion -- 4.4 Point (or sharp) null hypotheses with prior information -- 4.4.1 When are point null hypotheses reasonable? 
-- 4.4.2 A case of nearly constant likelihood -- 4.4.3 The Bayesian method for point null hypotheses -- 4.4.4 Sufficient statistics -- 4.5 Point null hypotheses for the normal distribution -- 4.5.1 Calculation of the Bayes' factor -- 4.5.2 Numerical examples -- 4.5.3 Lindley's paradox -- 4.5.4 A bound which does not depend on the prior distribution -- 4.5.5 The case of an unknown variance -- 4.6 The Doogian philosophy -- 4.6.1 Description of the method -- 4.6.2 Numerical example -- 4.7 Exercises on Chapter 4 -- 5 Two-sample problems -- 5.1 Two-sample problems - both variances unknown -- 5.1.1 The problem of two normal samples -- 5.1.2 Paired comparisons -- 5.1.3 Example of a paired comparison problem -- 5.1.4 The case where both variances are known -- 5.1.5 Example -- 5.1.6 Non-trivial prior information -- 5.2 Variances unknown but equal -- 5.2.1 Solution using reference priors -- 5.2.2 Example -- 5.2.3 Non-trivial prior information -- 5.3 Variances unknown and unequal (Behrens-Fisher problem) -- 5.3.1 Formulation of the problem -- 5.3.2 Patil's approximation -- 5.3.3 Example -- 5.3.4 Substantial prior information -- 5.4 The Behrens-Fisher controversy -- 5.4.1 The Behrens-Fisher problem from a classical standpoint -- 5.4.2 Example. 5.4.3 The controversy -- 5.5 Inferences concerning a variance ratio -- 5.5.1 Statement of the problem -- 5.5.2 Derivation of the F distribution -- 5.5.3 Example -- 5.6 Comparison of two proportions -- the 2 × 2 table -- 5.6.1 Methods based on the log-odds ratio -- 5.6.2 Example -- 5.6.3 The inverse root-sine transformation -- 5.6.4 Other methods -- 5.7 Exercises on Chapter 5 -- 6 Correlation, regression and the analysis of variance -- 6.1 Theory of the correlation coefficient -- 6.1.1 Definitions -- 6.1.2 Approximate posterior distribution of the correlation coefficient -- 6.1.3 The hyperbolic tangent substitution -- 6.1.4 Reference prior -- 6.1.5 Incorporation of prior information -- 6.2 Examples on the use of the correlation coefficient -- 6.2.1 Use of the hyperbolic tangent transformation -- 6.2.2 Combination of several correlation coefficients -- 6.2.3 The squared correlation coefficient -- 6.3 Regression and the bivariate normal model -- 6.3.1 The model -- 6.3.2 Bivariate linear regression -- 6.3.3 Example -- 6.3.4 Case of known variance -- 6.3.5 The mean value at a given value of the explanatory variable -- 6.3.6 Prediction of observations at a given value of the explanatory variable -- 6.3.7 Continuation of the example -- 6.3.8 Multiple regression -- 6.3.9 Polynomial regression -- 6.4 Conjugate prior for the bivariate regression model -- 6.4.1 The problem of updating a regression line -- 6.4.2 Formulae for recursive construction of a regression line -- 6.4.3 Finding an appropriate prior -- 6.5 Comparison of several means - the one way model -- 6.5.1 Description of the one way layout -- 6.5.2 Integration over the nuisance parameters -- 6.5.3 Derivation of the F distribution -- 6.5.4 Relationship to the analysis of variance -- 6.5.5 Example -- 6.5.6 Relationship to a simple linear regression model -- 6.5.7 Investigation of contrasts. 6.6 The two way layout. |
| Record Nr. | UNINA-9910966654003321 |
| Lo trovi qui | Univ. Federico II |
Categorical Data Analysis
| Edizione | [3rd ed.] |
| Pubbl/distr/stampa | Wiley, 2014 |
| Descrizione fisica | 1 online resource (714 p.) |
| Disciplina | 519.535 |
| Collana | Wiley Series in Probability and Statistics |
| Soggetto topico | Categories (Mathematics) ; Mathematical analysis ; Multivariate analysis |
| ISBN | 1-118-71085-1 ; 1-118-71094-0 |
| Classificazione | 417 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | eng |
| Record Nr. | UNINA-9910842600003321 |
| Lo trovi qui | Univ. Federico II |
Examples and problems in mathematical statistics / Shelemyahu Zacks
| Autore | Zacks Shelemyahu <1932-> |
| Edizione | [1st ed.] |
| Pubbl/distr/stampa | Hoboken, New Jersey : Wiley, 2014 |
| Descrizione fisica | 1 online resource (654 pages) |
| Disciplina | 519.5 |
| Collana | Wiley series in probability and statistics |
| Soggetto topico | Mathematical statistics ; Statistics |
| ISBN | 9781118605837 ; 1118605837 ; 9781118606001 ; 1118606000 |
| Classificazione | 417 ; 519.5 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | eng |
| Nota di contenuto |
Intro -- Examples and Problems in Mathematical Statistics -- Contents -- Preface -- List of Random Variables -- List of Abbreviations -- 1 Basic Probability Theory -- PART I: THEORY -- 1.1 OPERATIONS ON SETS -- 1.2 ALGEBRA AND σ-FIELDS -- 1.3 PROBABILITY SPACES -- 1.4 CONDITIONAL PROBABILITIES AND INDEPENDENCE -- 1.5 RANDOM VARIABLES AND THEIR DISTRIBUTIONS -- 1.6 THE LEBESGUE AND STIELTJES INTEGRALS -- 1.6.1 General Definition of Expected Value: The Lebesgue Integral -- 1.6.2 The Stieltjes-Riemann Integral -- 1.6.3 Mixtures of Discrete and Absolutely Continuous Distributions -- 1.6.4 Quantiles of Distributions -- 1.6.5 Transformations -- 1.7 JOINT DISTRIBUTIONS, CONDITIONAL DISTRIBUTIONS AND INDEPENDENCE -- 1.7.1 Joint Distributions -- 1.7.2 Conditional Expectations: General Definition -- 1.7.3 Independence -- 1.8 MOMENTS AND RELATED FUNCTIONALS -- 1.9 MODES OF CONVERGENCE -- 1.10 WEAK CONVERGENCE -- 1.11 LAWS OF LARGE NUMBERS -- 1.11.1 The Weak Law of Large Numbers (WLLN) -- 1.11.2 The Strong Law of Large Numbers (SLLN) -- 1.12 CENTRAL LIMIT THEOREM -- 1.13 MISCELLANEOUS RESULTS -- 1.13.1 Law of the Iterated Logarithm -- 1.13.2 Uniform Integrability -- 1.13.3 Inequalities -- 1.13.4 The Delta Method -- 1.13.5 The Symbols op and Op -- 1.13.6 The Empirical Distribution and Sample Quantiles -- PART II: EXAMPLES -- PART III: PROBLEMS -- PART IV: SOLUTIONS TO SELECTED PROBLEMS -- 2 Statistical Distributions -- PART I: THEORY -- 2.1 INTRODUCTORY REMARKS -- 2.2 FAMILIES OF DISCRETE DISTRIBUTIONS -- 2.2.1 Binomial Distributions -- 2.2.2 Hypergeometric Distributions -- 2.2.3 Poisson Distributions -- 2.2.4 Geometric, Pascal, and Negative Binomial Distributions -- 2.3 SOME FAMILIES OF CONTINUOUS DISTRIBUTIONS -- 2.3.1 Rectangular Distributions -- 2.3.2 Beta Distributions -- 2.3.3 Gamma Distributions -- 2.3.4 Weibull and Extreme Value Distributions.
2.3.5 Normal Distributions -- 2.3.6 Normal Approximations -- 2.4 TRANSFORMATIONS -- 2.4.1 One-to-One Transformations of Several Variables -- 2.4.2 Distribution of Sums -- 2.4.3 Distribution of Ratios -- 2.5 VARIANCES AND COVARIANCES OF SAMPLE MOMENTS -- 2.6 DISCRETE MULTIVARIATE DISTRIBUTIONS -- 2.6.1 The Multinomial Distribution -- 2.6.2 Multivariate Negative Binomial -- 2.6.3 Multivariate Hypergeometric Distributions -- 2.7 MULTINORMAL DISTRIBUTIONS -- 2.7.1 Basic Theory -- 2.7.2 Distribution of Subvectors and Distributions of Linear Forms -- 2.7.3 Independence of Linear Forms -- 2.8 DISTRIBUTIONS OF SYMMETRIC QUADRATIC FORMS OF NORMAL VARIABLES -- 2.9 INDEPENDENCE OF LINEAR AND QUADRATIC FORMS OF NORMAL VARIABLES -- 2.10 THE ORDER STATISTICS -- 2.11 t-DISTRIBUTIONS -- 2.12 F-DISTRIBUTIONS -- 2.13 THE DISTRIBUTION OF THE SAMPLE CORRELATION -- 2.14 EXPONENTIAL TYPE FAMILIES -- 2.15 APPROXIMATING THE DISTRIBUTION OF THE SAMPLE MEAN: EDGEWORTH AND SADDLEPOINT APPROXIMATIONS -- 2.15.1 Edgeworth Expansion -- 2.15.2 Saddlepoint Approximation -- PART II: EXAMPLES -- PART III: PROBLEMS -- PART IV: SOLUTIONS TO SELECTED PROBLEMS -- 3 Sufficient Statistics and the Information in Samples -- PART I: THEORY -- 3.1 INTRODUCTION -- 3.2 DEFINITION AND CHARACTERIZATION OF SUFFICIENT STATISTICS -- 3.2.1 Introductory Discussion -- 3.2.2 Theoretical Formulation -- 3.3 LIKELIHOOD FUNCTIONS AND MINIMAL SUFFICIENT STATISTICS -- 3.4 SUFFICIENT STATISTICS AND EXPONENTIAL TYPE FAMILIES -- 3.5 SUFFICIENCY AND COMPLETENESS -- 3.6 SUFFICIENCY AND ANCILLARITY -- 3.7 INFORMATION FUNCTIONS AND SUFFICIENCY -- 3.7.1 The Fisher Information -- 3.7.2 The Kullback-Leibler Information -- 3.8 THE FISHER INFORMATION MATRIX -- 3.9 SENSITIVITY TO CHANGES IN PARAMETERS -- 3.9.1 The Hellinger Distance -- PART II: EXAMPLES -- PART III: PROBLEMS -- PART IV: SOLUTIONS TO SELECTED PROBLEMS. 
4 Testing Statistical Hypotheses -- PART I: THEORY -- 4.1 THE GENERAL FRAMEWORK -- 4.2 THE NEYMAN-PEARSON FUNDAMENTAL LEMMA -- 4.3 TESTING ONE-SIDED COMPOSITE HYPOTHESES IN MLR MODELS -- 4.4 TESTING TWO-SIDED HYPOTHESES IN ONE-PARAMETER EXPONENTIAL FAMILIES -- 4.5 TESTING COMPOSITE HYPOTHESES WITH NUISANCE PARAMETERS-UNBIASED TESTS -- 4.6 LIKELIHOOD RATIO TESTS -- 4.6.1 Testing in Normal Regression Theory -- 4.6.2 Comparison of Normal Means: The Analysis of Variance -- 4.7 THE ANALYSIS OF CONTINGENCY TABLES -- 4.7.1 The Structure of Multi-Way Contingency Tables and the Statistical Model -- 4.7.2 Testing the Significance of Association -- 4.7.3 The Analysis of Tables -- 4.7.4 Likelihood Ratio Tests for Categorical Data -- 4.8 SEQUENTIAL TESTING OF HYPOTHESES -- 4.8.1 The Wald Sequential Probability Ratio Test -- PART II: EXAMPLES -- PART III: PROBLEMS -- PART IV: SOLUTIONS TO SELECTED PROBLEMS -- 5 Statistical Estimation -- PART I: THEORY -- 5.1 GENERAL DISCUSSION -- 5.2 UNBIASED ESTIMATORS -- 5.2.1 General Definition and Example -- 5.2.2 Minimum Variance Unbiased Estimators -- 5.2.3 The Cramér-Rao Lower Bound for the One-Parameter Case -- 5.2.4 Extension of the Cramér-Rao Inequality to Multiparameter Cases -- 5.2.5 General Inequalities of the Cramér-Rao Type -- 5.3 THE EFFICIENCY OF UNBIASED ESTIMATORS IN REGULAR CASES -- 5.4 BEST LINEAR UNBIASED AND LEAST-SQUARES ESTIMATORS -- 5.4.1 BLUEs of the Mean -- 5.4.2 Least-Squares and BLUEs in Linear Models -- 5.4.3 Best Linear Combinations of Order Statistics -- 5.5 STABILIZING THE LSE: RIDGE REGRESSIONS -- 5.6 MAXIMUM LIKELIHOOD ESTIMATORS -- 5.6.1 Definition and Examples -- 5.6.2 MLEs in Exponential Type Families -- 5.6.3 The Invariance Principle -- 5.6.4 MLE of the Parameters of Tolerance Distributions -- 5.7 EQUIVARIANT ESTIMATORS -- 5.7.1 The Structure of Equivariant Estimators. 
5.7.2 Minimum MSE Equivariant Estimators -- 5.7.3 Minimum Risk Equivariant Estimators -- 5.7.4 The Pitman Estimators -- 5.8 ESTIMATING EQUATIONS -- 5.8.1 Moment-Equations Estimators -- 5.8.2 General Theory of Estimating Functions -- 5.9 PRETEST ESTIMATORS -- 5.10 ROBUST ESTIMATION OF THE LOCATION AND SCALE PARAMETERS OF SYMMETRIC DISTRIBUTIONS -- PART II: EXAMPLES -- PART III: PROBLEMS -- PART IV: SOLUTIONS OF SELECTED PROBLEMS -- 6 Confidence and Tolerance Intervals -- PART I: THEORY -- 6.1 GENERAL INTRODUCTION -- 6.2 THE CONSTRUCTION OF CONFIDENCE INTERVALS -- 6.3 OPTIMAL CONFIDENCE INTERVALS -- 6.4 TOLERANCE INTERVALS -- 6.5 DISTRIBUTION FREE CONFIDENCE AND TOLERANCE INTERVALS -- 6.6 SIMULTANEOUS CONFIDENCE INTERVALS -- 6.7 TWO-STAGE AND SEQUENTIAL SAMPLING FOR FIXED WIDTH CONFIDENCE INTERVALS -- PART II: EXAMPLES -- PART III: PROBLEMS -- PART IV: SOLUTION TO SELECTED PROBLEMS -- 7 Large Sample Theory for Estimation and Testing -- PART I: THEORY -- 7.1 CONSISTENCY OF ESTIMATORS AND TESTS -- 7.2 CONSISTENCY OF THE MLE -- 7.3 ASYMPTOTIC NORMALITY AND EFFICIENCY OF CONSISTENT ESTIMATORS -- 7.4 SECOND-ORDER EFFICIENCY OF BAN ESTIMATORS -- 7.5 LARGE SAMPLE CONFIDENCE INTERVALS -- 7.6 EDGEWORTH AND SADDLEPOINT APPROXIMATIONS TO THE DISTRIBUTION OF THE MLE: ONE-PARAMETER CANONICAL EXPONENTIAL FAMILIES -- 7.7 LARGE SAMPLE TESTS -- 7.8 PITMAN'S ASYMPTOTIC EFFICIENCY OF TESTS -- 7.9 ASYMPTOTIC PROPERTIES OF SAMPLE QUANTILES -- PART II: EXAMPLES -- PART III: PROBLEMS -- PART IV: SOLUTION OF SELECTED PROBLEMS -- 8 Bayesian Analysis in Testing and Estimation -- PART I: THEORY -- 8.1 THE BAYESIAN FRAMEWORK -- 8.1.1 Prior, Posterior, and Predictive Distributions -- 8.1.2 Noninformative and Improper Prior Distributions -- 8.1.3 Risk Functions and Bayes Procedures -- 8.2 BAYESIAN TESTING OF HYPOTHESIS -- 8.2.1 Testing Simple Hypothesis. 8.2.2 Testing Composite Hypotheses -- 8.2.3 Bayes Sequential Testing of Hypotheses -- 8.3 BAYESIAN CREDIBILITY AND PREDICTION INTERVALS -- 8.3.1 Credibility Intervals -- 8.3.2 Prediction Intervals -- 8.4 BAYESIAN ESTIMATION -- 8.4.1 General Discussion and Examples -- 8.4.2 Hierarchical Models -- 8.4.3 The Normal Dynamic Linear Model -- 8.5 APPROXIMATION METHODS -- 8.5.1 Analytical Approximations -- 8.5.2 Numerical Approximations -- 8.6 EMPIRICAL BAYES ESTIMATORS -- PART II: EXAMPLES -- PART III: PROBLEMS -- PART IV: SOLUTIONS OF SELECTED PROBLEMS -- 9 Advanced Topics in Estimation Theory -- PART I: THEORY -- 9.1 MINIMAX ESTIMATORS -- 9.2 MINIMUM RISK EQUIVARIANT, BAYES EQUIVARIANT, AND STRUCTURAL ESTIMATORS -- 9.2.1 Formal Bayes Estimators for Invariant Priors -- 9.2.2 Equivariant Estimators Based on Structural Distributions -- 9.3 THE ADMISSIBILITY OF ESTIMATORS -- 9.3.1 Some Basic Results -- 9.3.2 The Inadmissibility of Some Commonly Used Estimators -- 9.3.3 Minimax and Admissible Estimators of the Location Parameter -- 9.3.4 The Relationship of Empirical Bayes and Stein-Type Estimators of the Location Parameter in the Normal Case -- PART II: EXAMPLES -- PART III: PROBLEMS -- PART IV: SOLUTIONS OF SELECTED PROBLEMS -- References -- Author Index -- Subject Index -- WILEY SERIES IN PROBABILITY AND STATISTICS. |
| Record Nr. | UNINA-9910973614803321 |
| Lo trovi qui | Univ. Federico II |
Excelでかんたん統計分析 [Excelデカンタントウケイブンセキ]
| Pubbl/distr/stampa | 東京 : オーム社, 2007.8 |
| Descrizione fisica | オンライン資料1件 |
| Altri autori (Persone) | 近藤宏 |
| Soggetto topico | 数理統計学 -- データ処理 |
| ISBN | 4-274-80116-0 |
| Classificazione | 417 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | jpn |
| Nota di contenuto | 表紙 -- まえがき -- 目次 -- 第1章 基本統計 -- 1-1●最も簡単な分析--順位と百分位数 -- 1-2●データの状態を知る--ヒストグラム -- 1-3●データの状態を数値で示す--基本統計量 -- (1)平均 -- (2)最小、最大、範囲 -- (3)中央値(メジアン)、最頻値(モード) -- (4)標準偏差、分散 -- (5)尖度、歪度 -- (6)標準誤差 -- (7)合計、標本数 -- 1-4●標準正規分布と偏差値 -- 第1章のまとめ -- 第2章 検定 -- 2-1●平均の検定(1) z検定--z検定:2標本による平均の検定 -- 2-2●分散の検定 F検定--F検定:2標本を使った分散の検定 -- 2-3●平均の検定(2) t検定1--t検定:等分散を仮定した2標本による検定 -- 2-4●平均の検定(3) t検定2--t検定:分散が等しくないと仮定した2標本による検定 -- 2-5●平均の検定(4) t検定3--t検定:一対の標本による平均の検定 -- 第2章のまとめ -- 第3章 分散分析 -- 3-1●一元配置の分散分析--分散分析:一元配置 -- 3-2●二元配置の分散分析(1)--分散分析:繰り返しのない二元配置 -- 3-3●二元配置の分散分析(2)--分散分析:繰り返しのある二元配置 -- 3-4●多元配置の分散分析 -- 第3章のまとめ -- 第4章 サンプリング -- 4-1●データをランダムにとる--サンプリング -- (1)単純ランダムサンプリング -- (2)2段サンプリング -- (3)層別サンプリング -- (4)集落サンプリング -- (5)系統サンプリング -- 4-2●乱数の生成--乱数発生 -- (1)正規 -- (2)ベルヌーイ -- (3)二項 -- (4)ポワソン -- (5)パターン -- (6)離散 -- 4-3●[乱数発生]の応用例 -- 第4章のまとめ -- 第5章 相関と回帰分析 -- 5-1●対応のあるデータの関係を見る(1)--共分散 -- 5-2●対応のあるデータの関係を見る(2)--相関 -- 5-3●対応のあるデータの関係を見る(3)--回帰分析 -- 5-4●多くの要因の影響を見る(1)--重回帰分析 -- 5-5●多くの要因の影響を見る(2)--数量化理論I類 -- 第5章のまとめ -- 第6章 時系列データの予測と解析 -- 6-1●時系列データの予測(1)--移動平均 -- 6-2●時系列データの予測(2)--指数平滑 -- 6-3●時系列データの解析(1)--フーリエ解析 -- 6-4●時系列データの解析(2)--逆フーリエ変換 -- 6-5●フーリエ解析の数学 -- 第6章のまとめ -- 付録 Excel[データ分析]のツール一覧 -- 索引 -- 奥付. |
| Altri titoli varianti | Excelでかんたん統計分析 : 分析ツールを使いこなそう |
| Record Nr. | UNINA-9910148963603321 |
| Lo trovi qui | Univ. Federico II |
Excelで学ぶ統計的予測 / 菅民郎著
| Pubbl/distr/stampa | 東京 : オーム社, 2014.3 |
| Descrizione fisica | オンライン資料1件 |
| Altri autori (Persone) | 菅民郎 |
| Soggetto topico | 数理統計学 -- データ処理 ; 統計 -- データ処理 |
| ISBN | 4-274-80216-7 |
| Classificazione | 417 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | jpn |
| Nota di contenuto |
表紙 -- クレジット -- まえがき -- 本書のねらい -- 本書で学ぶ内容 -- 目次 -- 第1章 はじめての予測 -- 1.1 予測とは何か -- 1.2 企業における予測 -- 1.3 予測上手はどんな人 -- 1.3.1 予測に取り組む姿勢では -- 1.3.2 予測結果に対する見方では -- 1.4 予測上手な会社は -- 1.5 予測に欠かせない規定要因関連図 -- 1.6 予測目的の明確化 -- 1.6.1 どの分野を予測するのか -- 1.6.2 定性的予測・定量的予測のいずれかを判断する -- 1.6.3 既商品・新商品いずれの予測かを判断する -- 1.6.4 予測したいデータ種別を明確にする -- 1.6.5 予測時期を明確にする -- 1.6.6 予測対象物の区分を明確にする -- 1.7 予測作業の進め方 -- 1.8 予測に用いるデータ -- 1.8.1 時系列データ、クロスセクションデータとは -- 1.8.2 時系列データとクロスセクションデータの違い -- 第2章 予測の仕方 -- 2.1 クロスセクションデータを用いた売上予測の手順と仕方 -- この節で学ぶこと -- クロスセクションデータを用いた売上予測の活用場面 -- クロスセクションデータを用いた売上予測の手順 -- クロスセクションデータを用いた売上予測の仕方 -- 2.2 時系列データを用いた売上予測の手順と仕方 -- この節で学ぶこと -- 時系列データを用いた売上予測の活用場面 -- 時系列データを用いた売上予測の手順 -- 時系列データを用いた売上予測の仕方 -- 2.3 時系列データの予測で最初にするトレンドT、Sの把握 -- この節で学ぶこと -- 変動して推移しているデータの予測は難題 -- 売上予測で最初にすることは売上のトレンドT、季節変動Sを調べること -- トレンドT -- 季節変動 -- 2.4 トレンドT(傾向線)の作成の考え方と仕方 -- この節で学ぶこと -- トレンドTを求める解析手法 -- 回帰式 -- トレンドTの値 -- 当てはまりの良さ -- 変動幅の大きい売上におけるトレンドT -- 変動幅の大きいデータにおけるトレンドTの作成の考え方 -- 2.5 トレンドT(傾向線)の作成5か条 -- この節で学ぶこと -- その1 回帰式の選択は決定係数より予測プロセスから判断すること -- その2 変動の大きいデータのトレンドTはTCデータで求めること -- その3 変動のないデータは変動を除去せずにトレンドTを算出すること -- その4 時系列推移が上下するトレンドTは用いないこと -- その5 上昇から減少に転じるTCのTは減少部分データで算出すること -- 2.6 TC及びTCI、Iの作成の考え方と方法 -- この節で学ぶこと -- TCSI、TCIとは -- 2.7 S、TCI、TC、Iを求める解析手法 -- この節で学ぶこと -- S、TCI、TC、Iを求める解析手法 -- 2.8 時系列データの予測モデル式作成のまとめ -- 第3章 予測の事例 -- 3.1 医療機器販売台数の季節性は? -- 事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.2 変動があるゴルフスコアの傾向は減少傾向にあるか? -- 事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.3 年々暑くなる地域において、今後の気温はどうなるか? -- 事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.4 増え続ける高齢者人口、今後どこまでいくか? -- 事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.5 変動幅が大きく推移する住宅販売戸数の傾向は? -- 事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.6 売上を予測するのに重要な要因を教えて! -- 事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.7 気温、イベント有無から明日のアイスクリーム仕入れ数を教えて!.
事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.8 駅前新聞スタンドの明日のスポーツ新聞売上部数は? -- 事例 -- 適用データ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.9 不況を迎えた今年、私のお店の売上額を教えて! -- 事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.10 量的・質的の両方がある販促活動からの医療機器販売台数の予測は? -- 事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.11 競合品売上、自社営業活動の変化を想定したときの売上予測? -- 事例 -- 適用データとグラフ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 3.12 どのような営業活動をすれば施設別売上を伸ばすことができるか? -- 事例 -- 適用データ -- 適用する解析手法 -- 分析 -- 分析結果 -- ソフトウェアの適用 -- 第4章 季節変動S、傾向変動Tを把握するための解析手法 -- 4.1 解析手法の種類と概要 -- この節で学ぶこと -- この章で学ぶ解析手法の種類と概要 -- 各種変動のグラフ形状 -- 4.2 月別平均法 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 季節変動指数Sの求め方 -- Sからわかること -- 季節変動調整済み系列TCIの求め方 -- TCIからわかること -- 4.3 MAT -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- MATの求め方 -- MATからわかること -- 4.4 移動平均 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 移動平均の求め方 -- 移動平均からわかること -- 項数が偶数の場合 -- 4.5 加重移動平均法 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 加重移動平均の求め方 -- 加重移動平均からわかること -- 項数が奇数の場合 -- 項数が偶数の場合 -- 加重移動平均の項数 -- 不規則変動指数Iの算出 -- 4.6 各年同月対象加重移動平均 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 各年同月対象加重移動平均の求め方 -- 各年同月対象加重移動平均からわかること -- 4.7 EPA法 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 乗法モデルと加法モデル -- EPA法の結果 -- EPA法からわかること -- EPA法の計算方法 -- 第5章 トレンドT(傾向線)を算出するための解析手法 -- 5.1 解析手法の概要と種類 -- この節で把握すること -- 回帰式 -- 曲線回帰式のグラフ形状 -- 5.2 直線回帰式 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 直線回帰式とトレンドTについて -- 直線回帰式からわかること -- 直線回帰式の求め方 -- 決定係数 -- トレンドTの予測 -- 5.3 ルート回帰式 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- ルート回帰式とトレンドTについて -- ルート回帰式からわかること -- ルート回帰式の求め方 -- 決定係数 -- 5.4 自然対数回帰式 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 自然対数回帰式とトレンドTについて -- 自然対数回帰式からわかること -- 自然対数回帰式の求め方 -- 決定係数 -- 5.5 分数回帰式 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 分数回帰式とトレンドTについて -- 分数回帰式からわかること -- 分数回帰式の求め方 -- 決定係数 -- 5.6 べき乗回帰式 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- べき乗回帰式とトレンドTについて -- べき乗回帰式からわかること -- べき乗回帰式の求め方 -- 決定係数 -- 5.7 指数回帰式 -- 解析手法の役割. 
適用できるデータ形態と時期数 -- 具体例 -- 指数回帰式とトレンドTについて -- 指数回帰式からわかること -- 指数回帰式の求め方 -- 決定係数 -- 5.8 修正指数回帰式 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 修正指数回帰式とトレンドTについて -- 修正指数回帰式からわかること -- 修正指数回帰式の求め方 -- 決定係数 -- 5.9 ロジスティック回帰式 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- ロジスティック回帰式とトレンドTについて -- ロジスティック回帰式からわかること -- ロジスティック回帰式の求め方 -- 決定係数 -- 5.10 ゴンペルツ回帰式 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- ゴンペルツ回帰式とトレンドTについて -- ゴンペルツ回帰式からわかること -- ゴンペルツ回帰式の求め方 -- 決定係数 -- 5.11 上限値K -- 上限値Kとは -- 適用できるデータ形態と時期数 -- 具体例 -- 上限値Kの求め方 -- 5.12 高次関数回帰式 -- 高次関数回帰式とは -- 第6章 相関分析 -- 6.1 相関分析の役割と相関係数の種類 -- この節で把握する内容 -- 相関分析による売上規定要因の見つけ方 -- 具体例における売上規定要因の見つけ方 -- 相関係数の種類 -- 6.2 単相関係数 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 単相関係数の求め方 -- 単相関係数はいくつ以上あれば良いか -- 6.3 時系列相関係数 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 時系列相関とは -- 時系列相関係数の求め方 -- 異なるGDPで時系列相関を算出し比較 -- 具体例における時系列相関 -- 6.4 タイムラグ相関係数 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- タイムラグ相関係数の求め方 -- 第7章 予測モデル式を作成するための解析手法 -- 7.1 解析手法の種類と概要 -- この節で把握する内容 -- 予測モデル式作成のための解析手法 -- 7.2 重回帰分析 -- 解析手法の役割 -- 適用できるデータ -- 具体例 -- 重回帰分析とは -- 重回帰分析の関係式の係数の求め方 -- 説明変数の売上貢献度 -- 売上予測を行うための説明変数の重要度 -- 標準回帰係数 -- 分析精度を示す決定係数 -- 7.3 時系列重回帰分析 -- 解析手法の役割 -- 適用できるデータ形態と時期数 -- 具体例 -- 時系列重回帰分析とは -- 時系列重回帰分析の仕方と手順 -- 7.4 数量化1類 -- 解析手法の役割 -- 適用できるデータ -- 具体例 -- 数量化1類とは -- カテゴリースコア -- 予測 -- 7.5 拡張型数量化1類 -- 解析手法の役割 -- 適用できるデータ -- 具体例 -- 拡張型数量化1類とは -- 第8章 Excelの統計解析機能 -- 8.1 Excelの演算式と関数 -- Excelの演算式 -- Excelの関数 -- 8.2 関数の入力方法 -- 8.3 式の内容の変更 -- 8.4 関数の挿入での指定方法 -- 8.5 引数、関数の変更方法 -- 引数の変更 -- 関数名の変更 -- 8.6 関数式のコピー -- 8.7 数学で用いられるExcelの関数 -- ROUND 四捨五入 -- ROUNDDOWN、ROUNDUP 切り捨て、切り上げ -- INT 整数 -- LN、LOG 対数 -- SQRT 平方根 -- EXP eのべき乗 -- ABS 絶対値 -- 8.8 絶対参照と相対参照 -- 第9章 Excelアドインソフトウェアの概要と操作方法 -- 9.1 本書で利用するソフトウェアについて -- 9.2 無料ソフトウェアのダウンロード方法 -- 無料ソフトウェアの内容 -- ソフトウェアの入手方法 -- ソフトウェア実行上の注意点 -- 9.3 ソフトウェア「EPA法」の操作方法 -- ソフトウェアの実行 -- 範囲指定 -- 9.4 市販ソフトウェア「マルチ予測」について -- 無料貸し出しソフトウェア「マルチ予測」の内容 -- ソフトウェアについて. ソフトウェアの起動方法 -- 索引 -- 奥付. |
| Altri titoli varianti | 統計的予測 : Excelで学ぶ |
| Record Nr. | UNINA-9910149149603321 |
| Lo trovi qui | Univ. Federico II |
Introductory statistics and analytics : a resampling perspective / Peter C. Bruce
| Autore | Bruce Peter C. <1953-> |
| Pubbl/distr/stampa | Hoboken, New Jersey : Wiley, 2015 |
| Descrizione fisica | 1 online resource (285 pages) |
| Disciplina | 519.5 |
| Soggetto topico | Statistics |
| ISBN | 1-118-88166-4 ; 1-118-88133-8 |
| Classificazione | 417 ; 519.5 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | eng |
| Nota di contenuto | Title Page; Copyright; Preface; Book Website; Acknowledgments; Stan Blank; Michelle Everson; Robert Hayden; Introduction; If You Can't Measure it, You Can't Manage It; Phantom Protection from Vitamin E; Statistician, Heal Thyself; Identifying Terrorists in Airports; Looking Ahead in the Book; Resampling; Big Data and Statisticians; Chapter 1: Designing and Carrying Out a Statistical Study; 1.1 A Small Example; 1.2 Is Chance Responsible? The Foundation of Hypothesis Testing; 1.3 A Major Example; 1.4 Designing an Experiment; 1.5 What to Measure-Central Location; 1.6 What to Measure-Variability; 1.7 What to Measure-Distance (Nearness); 1.8 Test Statistic; 1.9 The Data; 1.10 Variables and Their Flavors; 1.11 Examining and Displaying the Data; 1.12 Are we Sure we Made a Difference?; Appendix: Historical Note; 1.13 EXERCISES; Chapter 2: Statistical Inference; 2.1 Repeating the Experiment; 2.2 How Many Reshuffles?; 2.3 How Odd is Odd?; 2.4 Statistical and Practical Significance; 2.5 When to Use Hypothesis Tests; 2.6 Exercises; Chapter 3: Displaying and Exploring Data; 3.1 Bar Charts; 3.2 Pie Charts; 3.3 Misuse of Graphs; 3.4 Indexing; 3.5 Exercises; Chapter 4: Probability; 4.1 Mendel's Peas; 4.2 Simple Probability; 4.3 Random Variables and their Probability Distributions; 4.4 The Normal Distribution; 4.5 Exercises; Chapter 5: Relationship Between Two Categorical Variables; 5.1 Two-Way Tables; 5.2 Comparing Proportions; 5.3 More Probability; 5.4 From Conditional Probabilities to Bayesian Estimates; 5.5 Independence; 5.6 Exploratory Data Analysis (EDA); 5.7 Exercises; Chapter 6: Surveys and Sampling; 6.1 Simple Random Samples; 6.2 Margin of Error: Sampling Distribution for a Proportion; 6.3 Sampling Distribution for a Mean; 6.4 A Shortcut-The Bootstrap; 6.5 Beyond Simple Random Sampling; 6.6 Absolute Versus Relative Sample Size; 6.7 Exercises; Chapter 7: Confidence Intervals; 7.1 Point Estimates; 7.2 Interval Estimates (Confidence Intervals); 7.3 Confidence Interval for a Mean; 7.4 Formula-Based Counterparts to the Bootstrap; 7.5 Standard Error; 7.6 Confidence Intervals for a Single Proportion; 7.7 Confidence Interval for a Difference in Means; 7.8 Confidence Interval for a Difference in Proportions; 7.9 Recapping; Appendix A: More on the Bootstrap; Resampling Procedure-Parametric Bootstrap; Formulas and the Parametric Bootstrap; Appendix B: Alternative Populations; Appendix C: Binomial Formula Procedure; 7.10 Exercises; Chapter 8: Hypothesis Tests; 8.1 Review of Terminology; 8.2 A-B Tests: The Two Sample Comparison; 8.3 Comparing Two Means; 8.4 Comparing Two Proportions; 8.5 Formula-Based Alternative-t-Test for Means; 8.6 The Null and Alternative Hypotheses; 8.7 Paired Comparisons; Appendix A: Confidence Intervals Versus Hypothesis Tests; Confidence Interval; Relationship Between the Hypothesis Test and the Confidence Interval; Comment; Appendix B: Formula-Based Variations of Two-Sample Tests. |
| Record Nr. | UNINA-9910795803803321 |
| Lo trovi qui | Univ. Federico II |
Introductory statistics and analytics : a resampling perspective / Peter C. Bruce
| Autore | Bruce Peter C. <1953-> |
| Pubbl/distr/stampa | Hoboken, New Jersey : Wiley, 2015 |
| Descrizione fisica | 1 online resource (285 pages) |
| Disciplina | 519.5 |
| Soggetto topico | Statistics |
| ISBN | 1-118-88166-4 ; 1-118-88133-8 |
| Classificazione | 417 ; 519.5 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | eng |
| Nota di contenuto | Title Page; Copyright; Preface; Book Website; Acknowledgments; Stan Blank; Michelle Everson; Robert Hayden; Introduction; If You Can't Measure it, You Can't Manage It; Phantom Protection from Vitamin E; Statistician, Heal Thyself; Identifying Terrorists in Airports; Looking Ahead in the Book; Resampling; Big Data and Statisticians; Chapter 1: Designing and Carrying Out a Statistical Study; 1.1 A Small Example; 1.2 Is Chance Responsible? The Foundation of Hypothesis Testing; 1.3 A Major Example; 1.4 Designing an Experiment; 1.5 What to Measure-Central Location; 1.6 What to Measure-Variability; 1.7 What to Measure-Distance (Nearness); 1.8 Test Statistic; 1.9 The Data; 1.10 Variables and Their Flavors; 1.11 Examining and Displaying the Data; 1.12 Are we Sure we Made a Difference?; Appendix: Historical Note; 1.13 EXERCISES; Chapter 2: Statistical Inference; 2.1 Repeating the Experiment; 2.2 How Many Reshuffles?; 2.3 How Odd is Odd?; 2.4 Statistical and Practical Significance; 2.5 When to Use Hypothesis Tests; 2.6 Exercises; Chapter 3: Displaying and Exploring Data; 3.1 Bar Charts; 3.2 Pie Charts; 3.3 Misuse of Graphs; 3.4 Indexing; 3.5 Exercises; Chapter 4: Probability; 4.1 Mendel's Peas; 4.2 Simple Probability; 4.3 Random Variables and their Probability Distributions; 4.4 The Normal Distribution; 4.5 Exercises; Chapter 5: Relationship Between Two Categorical Variables; 5.1 Two-Way Tables; 5.2 Comparing Proportions; 5.3 More Probability; 5.4 From Conditional Probabilities to Bayesian Estimates; 5.5 Independence; 5.6 Exploratory Data Analysis (EDA); 5.7 Exercises; Chapter 6: Surveys and Sampling; 6.1 Simple Random Samples; 6.2 Margin of Error: Sampling Distribution for a Proportion; 6.3 Sampling Distribution for a Mean; 6.4 A Shortcut-The Bootstrap; 6.5 Beyond Simple Random Sampling; 6.6 Absolute Versus Relative Sample Size; 6.7 Exercises; Chapter 7: Confidence Intervals; 7.1 Point Estimates; 7.2 Interval Estimates (Confidence Intervals); 7.3 Confidence Interval for a Mean; 7.4 Formula-Based Counterparts to the Bootstrap; 7.5 Standard Error; 7.6 Confidence Intervals for a Single Proportion; 7.7 Confidence Interval for a Difference in Means; 7.8 Confidence Interval for a Difference in Proportions; 7.9 Recapping; Appendix A: More on the Bootstrap; Resampling Procedure-Parametric Bootstrap; Formulas and the Parametric Bootstrap; Appendix B: Alternative Populations; Appendix C: Binomial Formula Procedure; 7.10 Exercises; Chapter 8: Hypothesis Tests; 8.1 Review of Terminology; 8.2 A-B Tests: The Two Sample Comparison; 8.3 Comparing Two Means; 8.4 Comparing Two Proportions; 8.5 Formula-Based Alternative-t-Test for Means; 8.6 The Null and Alternative Hypotheses; 8.7 Paired Comparisons; Appendix A: Confidence Intervals Versus Hypothesis Tests; Confidence Interval; Relationship Between the Hypothesis Test and the Confidence Interval; Comment; Appendix B: Formula-Based Variations of Two-Sample Tests. |
| Record Nr. | UNINA-9910822322303321 |
| Lo trovi qui | Univ. Federico II |
Linear programming and resource allocation modeling / Michael J. Panik
| Autore | Panik Michael J. |
| Edizione | [1st edition] |
| Pubbl/distr/stampa | Hoboken, New Jersey : Wiley, 2019 |
| Descrizione fisica | 1 online resource (451 pages) |
| Disciplina | 519.72 |
| Collana | THEi Wiley ebooks |
| Soggetto topico | Linear programming ; Resource allocation - Mathematical models |
| ISBN | 1-119-50946-7 ; 1-119-50947-5 ; 1-119-50945-9 |
| Classificazione | 417 ; 519.7/2 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | eng |
| Nota di contenuto | Introduction -- Mathematical Foundations -- Introduction to Linear Programming -- Computational Aspects of Linear Programming -- Variations of the Standard Simplex Routine -- Duality Theory -- Linear Programming and the Theory of the Firm -- Sensitivity Analysis -- Analyzing Structural Changes -- Parametric Programming -- Parametric Programming and the Theory of the Firm -- Duality Revisited -- Simplex-Based Methods of Optimization -- Data Envelopment Analysis (DEA). |
| Record Nr. | UNINA-9910539336403321 |
| Lo trovi qui | Univ. Federico II |
Mixed models : theory and applications with R / Eugene Demidenko
| Autore | Demidenko Eugene <1948-> |
| Edizione | [2nd ed.] |
| Pubbl/distr/stampa | Hoboken : Wiley, 2013 |
| Descrizione fisica | xxvii, 717 p. : ill |
| Disciplina | 519.5/38 |
| Collana | Wiley series in probability and statistics |
| Soggetto topico | Analysis of variance |
| ISBN | 9781118593066 ; 1118593065 ; 9781118592991 ; 1118592999 ; 9781118651537 ; 1118651537 |
| Classificazione | 417 ; 519.5/38 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | eng |
| Nota di contenuto | Machine generated contents note: Preface -- Preface to the Second Edition -- R software and functions -- Data Sets -- Open Problems in Mixed Models -- 1 Introduction: Why Mixed Models? -- 1.1 Mixed effects for clustered data -- 1.2 ANOVA, variance components, and the mixed model -- 1.3 Other special cases of the mixed effects model -- 1.4 A compromise between Bayesian and frequentist approaches -- 1.5 Penalized likelihood and mixed effects -- 1.6 Healthy Akaike information criterion -- 1.7 Penalized smoothing -- 1.8 Penalized polynomial fitting -- 1.9 Restraining parameters, or what to eat -- 1.10 Ill-posed problems, Tikhonov regularization, and mixed effects -- 1.11 Computerized tomography and linear image reconstruction -- 1.12 GLMM for PET -- 1.13 Maple shape leaf analysis -- 1.14 DNA Western blot analysis -- 1.15 Where does the wind blow? -- 1.16 Software and books -- 1.17 Summary points -- 2 MLE for LME Model -- 2.1 Example: Weight versus height -- 2.2 The model and log-likelihood functions -- 2.3 Balanced random-coefficient model -- 2.4 LME model with random intercepts -- 2.5 Criterion for the MLE existence -- 2.6 Criterion for positive definiteness of matrix D -- 2.7 Preestimation bounds for variance parameters -- 2.8 Maximization algorithms -- 2.9 Derivatives of the log-likelihood function -- 2.10 Newton-Raphson algorithm -- 2.11 Fisher scoring algorithm -- 2.12 EM algorithm -- 2.13 Starting point -- 2.14 Algorithms for restricted MLE -- 2.15 Optimization on nonnegative definite matrices -- 2.16 lmeFS and lme in R -- 2.17 Appendix: Proof of the MLE existence -- 2.18 Summary points -- 3 Statistical Properties of the LME Model -- 3.1 Introduction -- 3.2 Identifiability of the LME model -- 3.3 Information matrix for variance parameters -- 3.4 Profile-likelihood confidence intervals -- 3.5 Statistical testing of the presence of random effects -- 3.6 Statistical properties of MLE -- 3.7 Estimation of random effects -- 3.8 Hypothesis and membership testing -- 3.9 Ignoring random effects -- 3.10 MINQUE for variance parameters -- 3.11 Method of moments -- 3.12 Variance least squares estimator -- 3.13 Projection on D+ space -- 3.14 Comparison of the variance parameter estimation -- 3.15 Asymptotically efficient estimation for [beta] -- 3.16 Summary points -- 4 Growth Curve Model and Generalizations -- 4.1 Linear growth curve model -- 4.2 General linear growth curve model -- 4.3 Linear model with linear covariance structure -- 4.4 Robust linear mixed effects model -- 4.5 Appendix: Derivation of the MM estimator -- 4.6 Summary points -- 5 Meta-analysis Model -- 5.1 Simple meta-analysis model -- 5.2 Meta-analysis model with covariates -- 5.3 Multivariate meta-analysis model -- 5.4 Summary points -- 6 Nonlinear Marginal Model -- 6.1 Fixed matrix of random effects -- 6.2 Varied matrix of random effects -- 6.3 Three types of nonlinear marginal models -- 6.4 Total generalized estimating equations approach -- 6.5 Summary points -- 7 Generalized Linear Mixed Models -- 7.1 Regression models for binary data -- 7.2 Binary model with subject-specific intercept -- 7.3 Logistic regression with random intercept -- 7.4 Probit model with random intercept -- 7.5 Poisson model with random intercept -- 7.6 Random intercept model: overview -- 7.7 Mixed models with multiple random effects -- 7.8 GLMM and simulation methods -- 7.9 GEE for clustered marginal GLM -- 7.10 Criteria for MLE existence for binary model -- 7.11 Summary points -- 8 Nonlinear Mixed Effects Model -- 8.1 Introduction -- 8.2 The model -- 8.3 Example: Height of girls and boys -- 8.4 Maximum likelihood estimation -- 8.5 Two-stage estimator -- 8.6 First-order approximation -- 8.7 Lindstrom-Bates estimator -- 8.8 Likelihood approximations -- 8.9 One-parameter exponential model -- 8.10 Asymptotic equivalence of the TS and LB estimators -- 8.11 Bias-corrected two-stage estimator -- 8.12 Distribution misspecification -- 8.13 Partially nonlinear marginal mixed model -- 8.14 Fixed sample likelihood approach -- 8.15 Estimation of random effects and hypothesis testing -- 8.16 Example (continued) -- 8.17 Practical recommendations -- 8.18 Appendix: Proof of theorem on equivalence -- 8.19 Summary points -- 9 Diagnostics and Influence Analysis -- 9.1 Introduction -- 9.2 Influence analysis for linear regression -- 9.3 The idea of infinitesimal influence -- 9.4 Linear regression model -- 9.5 Nonlinear regression model -- 9.6 Logistic regression for binary outcome -- 9.7 Influence of correlation structure -- 9.8 Influence of measurement error -- 9.9 Influence analysis for the LME model -- 9.10 Appendix: MLE derivative with respect to σ2 -- 9.11 Summary points -- 10 Tumor Regrowth Curves -- 10.1 Survival curves -- 10.2 Double-exponential regrowth curve -- 10.3 Exponential growth with fixed regrowth time -- 10.4 General regrowth curve -- 10.5 Double-exponential transient regrowth curve -- 10.6 Gompertz transient regrowth curve -- 10.7 Summary points -- 11 Statistical Analysis of Shape -- 11.1 Introduction -- 11.2 Statistical analysis of random triangles -- 11.3 Face recognition -- 11.4 Scale-irrelevant shape model -- 11.5 Gorilla vertebrae analysis -- 11.6 Procrustes estimation of the mean shape -- 11.7 Fourier descriptor analysis -- 11.8 Summary points -- 12 Statistical Image Analysis -- 12.1 Introduction -- 12.2 Testing for uniform lighting -- 12.3 Kolmogorov-Smirnov image comparison -- 12.4 Multinomial statistical model for images -- 12.5 Image entropy -- 12.6 Ensemble of unstructured images -- 12.7 Image alignment and registration -- 12.8 Ensemble of structured images -- 12.9 Modeling spatial correlation -- 12.10 Summary points -- 13 Appendix: Useful Facts and Formulas -- 13.1 Basic facts of asymptotic theory -- 13.2 Some formulas of matrix algebra -- 13.3 Basic facts of optimization theory -- References -- Index. |
| Record Nr. | UNINA-9910972134903321 |
| Lo trovi qui | Univ. Federico II |
Model building in mathematical programming / H. Paul Williams
| Autore | Williams H. P |
| Edizione | [5th ed.] |
| Pubbl/distr/stampa | Hoboken, N.J. : Wiley, 2013 |
| Descrizione fisica | xx, 411 p. : ill |
| Disciplina | 519.7 |
| Soggetto topico | Mathematical models ; Programming (Mathematics) |
| ISBN | 9781118506172 ; 1118506170 ; 9781299188709 ; 1299188702 ; 9781118506189 ; 1118506189 |
| Classificazione | 417 ; 519.7 |
| Formato | Materiale a stampa |
| Livello bibliografico | Monografia |
| Lingua di pubblicazione | eng |
| Nota di contenuto |
Cover -- Title Page -- Copyright -- Contents -- Preface -- Part I -- Chapter 1 Introduction -- 1.1 The concept of a model -- 1.2 Mathematical programming models -- Chapter 2 Solving mathematical programming models -- 2.1 Algorithms and packages -- 2.1.1 Reduction -- 2.1.2 Starting solutions -- 2.1.3 Simple bounding constraints -- 2.1.4 Ranged constraints -- 2.1.5 Generalized upper bounding constraints -- 2.1.6 Sensitivity analysis -- 2.2 Practical considerations -- 2.3 Decision support and expert systems -- 2.4 Constraint programming (CP) -- Chapter 3 Building linear programming models -- 3.1 The importance of linearity -- 3.2 Defining objectives -- 3.2.1 Single objectives -- 3.2.2 Multiple and conflicting objectives -- 3.2.3 Minimax objectives -- 3.2.4 Ratio objectives -- 3.2.5 Non-existent and non-optimizable objectives -- 3.3 Defining constraints -- 3.3.1 Productive capacity constraints -- 3.3.2 Raw material availabilities -- 3.3.3 Marketing demands and limitations -- 3.3.4 Material balance (continuity) constraints -- 3.3.5 Quality stipulations -- 3.3.6 Hard and soft constraints -- 3.3.7 Chance constraints -- 3.3.8 Conflicting constraints -- 3.3.9 Redundant constraints -- 3.3.10 Simple and generalized upper bounds -- 3.3.11 Unusual constraints -- 3.4 How to build a good model -- 3.4.1 Ease of understanding the model -- 3.4.2 Ease of detecting errors in the model -- 3.4.3 Ease of computing the solution -- 3.4.4 Modal formulation -- 3.4.5 Units of measurement -- 3.5 The use of modelling languages -- 3.5.1 A more natural input format -- 3.5.2 Debugging is made easier -- 3.5.3 Modification is made easier -- 3.5.4 Repetition is automated -- 3.5.5 Special purpose generators using a high level language -- 3.5.6 Matrix block building systems -- 3.5.7 Data structuring systems -- 3.5.8 Mathematical languages -- 3.5.8.1 SETs -- 3.5.8.2 DATA.
3.5.8.3 VARIABLES -- 3.5.8.4 OBJECTIVE -- 3.5.8.5 CONSTRAINTS -- Chapter 4 Structured linear programming models -- 4.1 Multiple plant, product and period models -- 4.2 Stochastic programmes -- 4.3 Decomposing a large model -- 4.3.1 The submodels -- 4.3.2 The restricted master model -- Chapter 5 Applications and special types of mathematical programming model -- 5.1 Typical applications -- 5.1.1 The petroleum industry -- 5.1.2 The chemical industry -- 5.1.3 Manufacturing industry -- 5.1.4 Transport and distribution -- 5.1.5 Finance -- 5.1.6 Agriculture -- 5.1.7 Health -- 5.1.8 Mining -- 5.1.9 Manpower planning -- 5.1.10 Food -- 5.1.11 Energy -- 5.1.12 Pulp and paper -- 5.1.13 Advertising -- 5.1.14 Defence -- 5.1.15 The supply chain -- 5.1.16 Other applications -- 5.2 Economic models -- 5.2.1 The static model -- 5.2.2 The dynamic model -- 5.2.3 Aggregation -- 5.3 Network models -- 5.3.1 The transportation problem -- 5.3.2 The assignment problem -- 5.3.3 The transhipment problem -- 5.3.4 The minimum cost flow problem -- 5.3.5 The shortest path problem -- 5.3.6 Maximum flow through a network -- 5.3.7 Critical path analysis -- 5.4 Converting linear programs to networks -- Chapter 6 Interpreting and using the solution of a linear programming model -- 6.1 Validating a model -- 6.1.1 Infeasible models -- 6.1.2 Unbounded models -- 6.1.3 Solvable models -- 6.2 Economic interpretations -- 6.2.1 The dual model -- 6.2.2 Shadow prices -- 6.2.3 Productive capacity constraints -- 6.2.4 Raw material availabilities -- 6.2.5 Marketing demands and limitations -- 6.2.6 Material balance (continuity) constraints -- 6.2.7 Quality stipulations -- 6.2.8 Reduced costs -- 6.3 Sensitivity analysis and the stability of a model -- 6.3.1 Right-hand side ranges -- 6.3.2 Objective ranges -- 6.3.3 Ranges on interior coefficients -- 6.3.4 Marginal rates of substitution. 
6.3.5 Building stable models -- 6.4 Further investigations using a model -- 6.5 Presentation of the solutions -- Chapter 7 Non-linear models -- 7.1 Typical applications -- 7.2 Local and global optima -- 7.3 Separable programming -- 7.4 Converting a problem to a separable model -- Chapter 8 Integer programming -- 8.1 Introduction -- 8.2 The applicability of integer programming -- 8.2.1 Problems with discrete inputs and outputs -- 8.2.2 Problems with logical conditions -- 8.2.3 Combinatorial problems -- 8.2.4 Non-linear problems -- 8.2.5 Network problems -- 8.3 Solving integer programming models -- 8.3.1 Cutting planes methods -- 8.3.2 Enumerative methods -- 8.3.3 Pseudo-Boolean methods -- 8.3.4 Branch and bound methods -- Chapter 9 Building integer programming models I -- 9.1 The uses of discrete variables -- 9.1.1 Indivisible (discrete) quantities -- 9.1.2 Decision variables -- 9.1.3 Indicator variables -- 9.2 Logical conditions and 0-1 variables -- 9.3 Special ordered sets of variables -- 9.4 Extra conditions applied to linear programming models -- 9.4.1 Disjunctive constraints -- 9.4.2 Non-convex regions -- 9.4.3 Limiting the number of variables in a solution -- 9.4.4 Sequentially dependent decisions -- 9.4.5 Economies of scale -- 9.4.6 Discrete capacity extensions -- 9.4.7 Maximax objectives -- 9.5 Special kinds of integer programming model -- 9.5.1 Set covering problems -- 9.5.2 Set packing problems -- 9.5.3 Set partitioning problems -- 9.5.4 The knapsack problem -- 9.5.5 The travelling salesman problem -- 9.5.6 The vehicle routing problem -- 9.5.7 The quadratic assignment problem -- 9.6 Column generation -- Chapter 10 Building integer programming models II -- 10.1 Good and bad formulations -- 10.1.1 The number of variables in an IP model -- 10.1.2 The number of constraints in an IP model -- 10.2 Simplifying an integer programming model. 
10.2.1 Tightening bounds -- 10.2.2 Simplifying a single integer constraint to another single integer constraint -- 10.2.3 Simplifying a single integer constraint to a collection of integer constraints -- 10.2.4 Simplifying collections of constraints -- 10.2.5 Discontinuous variables -- 10.2.6 An alternative formulation for disjunctive constraints -- 10.2.7 Symmetry -- 10.3 Economic information obtainable by integer programming -- 10.4 Sensitivity analysis and the stability of a model -- 10.4.1 Sensitivity analysis and integer programming -- 10.4.2 Building a stable model -- 10.5 When and how to use integer programming -- Chapter 11 The implementation of a mathematical programming system of planning -- 11.1 Acceptance and implementation -- 11.2 The unification of organizational functions -- 11.3 Centralization versus decentralization -- 11.4 The collection of data and the maintenance of a model -- Part II -- Chapter 12 The problems -- 12.1 Food manufacture 1 -- 12.2 Food manufacture 2 -- 12.3 Factory planning 1 -- 12.4 Factory planning 2 -- 12.5 Manpower planning -- 12.5.1 Recruitment -- 12.5.2 Retraining -- 12.5.3 Redundancy -- 12.5.4 Overmanning -- 12.5.5 Short-time working -- 12.6 Refinery optimisation -- 12.6.1 Distillation -- 12.6.2 Reforming -- 12.6.3 Cracking -- 12.6.4 Blending -- 12.7 Mining -- 12.8 Farm planning -- 12.9 Economic planning -- 12.10 Decentralisation -- 12.11 Curve fitting -- 12.12 Logical design -- 12.13 Market sharing -- 12.14 Opencast mining -- 12.15 Tariff rates (power generation) -- 12.16 Hydro power -- 12.17 Three-dimensional noughts and crosses -- 12.18 Optimising a constraint -- 12.19 Distribution 1 -- 12.20 Depot location (distribution 2) -- 12.21 Agricultural pricing -- 12.22 Efficiency analysis -- 12.23 Milk collection -- 12.24 Yield management -- 12.25 Car rental 1 -- 12.26 Car rental 2. 
12.27 Lost baggage distribution -- 12.28 Protein folding -- 12.29 Protein comparison -- Part III -- Chapter 13 Formulation and discussion of problems -- 13.1 Food manufacture 1 -- 13.1.1 The single-period problem -- 13.1.2 The multi-period problem -- 13.2 Food manufacture 2 -- 13.3 Factory planning 1 -- 13.3.1 The single-period problem -- 13.3.2 The multi-period problem -- 13.4 Factory planning 2 -- 13.4.1 Extra variables -- 13.4.2 Revised constraints -- 13.5 Manpower planning -- 13.5.1 Variables -- 13.5.2 Constraints -- 13.5.3 Initial conditions -- 13.6 Refinery optimization -- 13.6.1 Variables -- 13.6.2 Constraints -- 13.6.3 Objective -- 13.7 Mining -- 13.7.1 Variables -- 13.7.2 Constraints -- 13.7.3 Objective -- 13.8 Farm planning -- 13.8.1 Variables -- 13.8.2 Constraints -- 13.8.3 Objective function -- 13.9 Economic planning -- 13.9.1 Variables -- 13.9.2 Constraints -- 13.9.3 Objective function -- 13.10 Decentralization -- 13.10.1 Variables -- 13.10.2 Constraints -- 13.10.3 Objective -- 13.11 Curve fitting -- 13.12 Logical design -- 13.13 Market sharing -- 13.14 Opencast mining -- 13.15 Tariff rates (power generation) -- 13.15.1 Variables -- 13.15.2 Constraints -- 13.15.3 Objective function (to be minimized) -- 13.16 Hydro power -- 13.16.1 Variables -- 13.16.2 Constraints -- 13.16.3 Objective function (to be minimized) -- 13.17 Three-dimensional noughts and crosses -- 13.17.1 Variables -- 13.17.2 Constraints -- 13.17.3 Objective -- 13.18 Optimizing a constraint -- 13.19 Distribution 1 -- 13.19.1 Variables -- 13.19.2 Constraints -- 13.19.3 Objectives -- 13.20 Depot location (distribution 2) -- 13.21 Agricultural pricing -- 13.22 Efficiency analysis -- 13.23 Milk collection -- 13.23.1 Variables -- 13.23.2 Constraints -- 13.23.3 Objective -- 13.24 Yield management -- 13.24.1 Variables -- 13.24.2 Constraints -- 13.24.3 Objective -- 13.25 Car rental 1. 13.25.1 Indices. |
| Record Nr. | UNINA-9910961737703321 |
| Lo trovi qui | Univ. Federico II |