
Statistical Planning and Inference : Concepts and Applications




Author: Ghosh, Subir
Title: Statistical Planning and Inference : Concepts and Applications
Publication: Newark : John Wiley & Sons, Incorporated, 2026
©2026
Edition: 1st ed.
Physical description: 1 online resource (236 pages)
Classification: 001.422
Contents note: Cover -- Title Page -- Copyright -- Contents -- Preface
Chapter 1 Foundation of Experiments -- 1.1 Uncertainties in Evidences -- 1.2 Examples -- 1.2.1 The Louis Pasteur Anthrax Vaccination Experiment -- 1.2.2 The Lanarkshire Milk Experiment: Milk Tests in Lanarkshire Schools -- 1.3 Replication, Randomization, Blocking, and Blinding -- 1.3.1 Replication -- 1.3.2 Randomization -- 1.3.3 Blocking -- 1.3.4 Blinding -- 1.4 Figuring It Out! -- Questions and Answers -- Bibliography
Chapter 2 Completely Randomized Design -- 2.1 An Example -- 2.2 Analyses Using R and SAS -- 2.3 Figuring It Out! -- Bibliography
Chapter 3 Randomized Complete Block Design -- 3.1 Fixed Effects Model -- 3.2 Binomial Model for Signs -- 3.3 Randomization Model -- 3.4 Mixed Effects Model -- 3.5 General Mixed Effects Model -- 3.6 The REML Variance Components Estimates -- 3.7 BLUEs and BLUPs -- 3.7.1 The Conditional Model -- 3.7.2 The Unconditional Model -- 3.7.3 Computation-The Conditional Model -- 3.7.4 Computation-The Unconditional Model -- 3.8 Figuring It Out! -- Bibliography
Chapter 4 Randomized Incomplete Block Design -- 4.1 Model M1: Fixed‐Effects Model -- 4.2 Model M2: Mixed‐Effects Model -- 4.3 Research Questions -- 4.4 Figuring It Out! -- 4.5 Definitions -- Exercises -- Bibliography
Chapter 5 Error Rates -- 5.1 Definitions of Error Rates -- 5.2 Single‐Stage Methods -- 5.3 A Multistage Method -- 5.3.1 Benjamini and Hochberg Method -- 5.4 Figuring It Out -- Questions -- Bibliography
Chapter 6 Nutrition Experiment -- 6.1 Figuring It Out! -- Bibliography
Chapter 7 The Pearson Dependence -- 7.1 Bivariate Normal Distribution -- 7.2 Estimation of Unknown Parameters -- 7.2.1 The Unconditional Model -- 7.2.2 The Conditional Model -- 7.2.3 Test of Significance -- 7.3 A Bayesian Estimation -- 7.4 Exercises -- Bibliography
Chapter 8 The Multivariate Dependence -- 8.1 The Multivariate Normal Distribution -- 8.2 Inference -- 8.3 Partial Dependence -- 8.4 Exercises -- Bibliography
Chapter 9 The Conditional Mean Dependence -- 9.1 LS Estimation -- 9.2 Ridge Estimation -- 9.2.1 A Bayesian Estimation -- 9.3 Dependence of Ridge Estimator on the Tuning Parameter -- 9.4 LASSO Estimation -- 9.5 Dependence of LASSO Estimators on the Tuning Parameter -- Bibliography
Chapter 10 More Parameters Than Observations -- 10.1 Learning by Doing-Exercises -- Exercises -- Bibliography
Chapter 11 Eigenvalues, Eigenvectors, and Applications -- 11.1 Eigenvalues and Eigenvectors -- 11.2 Second‐Order Response Surface -- Exercises -- Bibliography
Chapter 12 Covariance Estimation -- 12.1 Model 1 -- 12.1.1 Characterization of the Covariance Matrix and Its Estimators -- 12.1.2 Likelihood Function -- 12.1.3 Properties -- 12.2 Model 2 -- 12.2.1 Characterization of the Covariance Matrix and Its Estimators -- 12.3 Model 3 -- 12.4 Model 4 -- 12.5 Model 5 -- 12.6 Exercises -- Bibliography
Chapter 13 Discriminant Analysis -- 13.1 Learning from the Univariate Data-Two Normal Populations with Equal Variances -- 13.1.1 Discriminant Analysis for the Univariate Data -- 13.1.2 Example-Univariate Discriminant Analysis -- 13.2 Learning from the Univariate Data-Two Normal Populations with Unequal Variances -- 13.2.1 Classification of 25 Versicolor Iris Flowers -- 13.2.2 Classification of 25 Setosa Iris Flowers -- 13.2.3 Test of Homogeneity of Variances -- 13.3 Learning from the Multivariate Data -- 13.3.1 Classification of Versicolor and Setosa -- 13.3.2 Classification of Versicolor and Virginica -- 13.4 Logistic Regression -- 13.5 Exercises -- Bibliography
Chapter 14 Optimizing the Variance-Bias Trade‐Off -- 14.1 Variance-Bias Trade‐Off -- 14.1.1 Example 1 -- 14.1.2 Example 2 -- 14.1.3 Example 3 -- 14.2 Information in Data -- 14.3 Information and Design in Presence of a Covariate -- 14.3.1 Information -- 14.3.2 Optimum Design for a Covariate -- 14.4 Information and Design in Presence of Multiple Covariates -- 14.4.1 Information -- 14.4.2 Exponential Model -- 14.4.3 Exponential Regression Model with Multiple Covariates -- 14.4.4 Poisson Log‐Linear Model -- 14.4.5 Non‐parametric Regression Model -- 14.5 Exercises -- Bibliography
Chapter 15 Specification, Discrimination, Robustness, and Sensitivity -- 15.1 The Global and Local Optimal Models -- 15.2 The T‐Optimal Design -- 15.3 Convex and Concave Functions -- 15.4 The Kullback-Leibler (KL) Divergence -- 15.5 The KL Design Optimality -- 15.6 The Differential Entropy -- 15.7 Lindley Information Measure -- 15.8 Joint Entropy, Conditional Entropy, and Mutual Information -- 15.9 Maximum Entropy Sampling -- 15.10 Search Linear Models and Search Designs -- 15.10.1 Factorial Experiments -- 15.10.2 Search Probability Matrix -- 15.11 Robustness Against Unavailable Data -- 15.12 Influential Sets of Observations -- 15.13 Exercises -- Bibliography
Data Index -- Subject Index -- EULA.
Summary/abstract: Explore the foundations of, and cutting-edge developments in, statistics. Statistical Planning and Inference: Concepts and Applications delivers a robust introduction to statistical planning and inference, including classical and computer age developments in statistical science.
Authorized title: Statistical Planning and Inference
ISBN: 1-118-95889-6
1-118-95890-X
Format: Printed material
Bibliographic level: Monograph
Language of publication: English
Record no.: 9911040926203321
Available at: Univ. Federico II
Series: Wiley Series in Probability and Statistics