
Record no.

UNINA9910830090503321

Author

Chatterjee, Samprit <1938->

Title

Sensitivity analysis in linear regression [electronic resource] / Samprit Chatterjee, Ali S. Hadi

Publication/distribution/printing

New York : Wiley, c1988

ISBN

1-282-30736-3

9786612307362

0-470-31676-4

0-470-31742-6

Physical description

1 online resource (341 p.)

Series

Wiley series in probability and mathematical statistics. Applied probability and statistics

Other authors (Persons)

Hadi, Ali S.

Discipline

519.5

519.536

Subjects

Regression analysis

Perturbation (Mathematics)

Mathematical optimization

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

General notes

Description based upon print version of record.

Bibliography note

Includes bibliography and index.

Contents note

Sensitivity Analysis in Linear Regression; PREFACE; Contents; 1. INTRODUCTION; 1.1. Introduction; 1.2. Notations; 1.3. Standard Estimation Results in Least Squares; 1.4. Assumptions; 1.5. Iterative Regression Process; 1.6. Organization of the Book; 2. PREDICTION MATRIX; 2.1. Introduction; 2.2. Roles of P and (I - P) in Linear Regression; 2.3. Properties of the Prediction Matrix; 2.3.1. General Properties; 2.3.2. Omitting (Adding) Variables; 2.3.3. Omitting (Adding) an Observation; 2.3.4. Conditions for Large Values of pii; 2.3.5. Omitting Multiple Rows of X; 2.3.6. Eigenvalues of P and (I - P)

2.3.7. Distribution of pii; 2.4. Examples; 3. ROLE OF VARIABLES IN A REGRESSION EQUATION; 3.1. Introduction; 3.2. Effects of Underfitting; 3.3. Effects of Overfitting; 3.4. Interpreting Successive Fitting; 3.5. Computing Implications for Successive Fitting; 3.6. Introduction of One Additional Regressor; 3.7. Comparing Models: Comparison Criteria; 3.8. Diagnostic Plots for the Effects of Variables; 3.8.1. Added Variable (Partial Regression) Plots; 3.8.2. Residual Versus Predictor Plots; 3.8.3. Component-Plus-Residual (Partial Residual) Plots; 3.8.4. Augmented Partial Residual Plots

3.9. Effects of an Additional Regressor; 4. EFFECTS OF AN OBSERVATION ON A REGRESSION EQUATION; 4.1. Introduction; 4.2. Omission Approach; 4.2.1. Measures Based on Residuals; 4.2.1.1. Testing for a Single Outlier; 4.2.1.2. Graphical Methods; 4.2.2. Outliers, High-Leverage, and Influential Points; 4.2.3. Measures Based on Remoteness of Points in X-Y Space; 4.2.3.1. Diagonal Elements of P; 4.2.3.2. Mahalanobis Distance; 4.2.3.3. Weighted Squared Standardized Distance; 4.2.3.4. Diagonal Elements of Pz; 4.2.4. Influence Curve; 4.2.4.1. Definition of the Influence Curve

4.2.4.2. Influence Curves for β and σ²; 4.2.4.3. Approximating the Influence Curve; 4.2.5. Measures Based on the Influence Curve; 4.2.5.1. Cook's Distance; 4.2.5.2. Welsch-Kuh's Distance; 4.2.5.3. Welsch's Distance; 4.2.5.4. Modified Cook's Distance; 4.2.6. Measures Based on the Volume of Confidence Ellipsoids; 4.2.6.1. Andrews-Pregibon Statistic; 4.2.6.2. Variance Ratio; 4.2.6.3. Cook-Weisberg Statistic; 4.2.7. Measures Based on the Likelihood Function; 4.2.8. Measures Based on a Subset of the Regression Coefficients; 4.2.8.1. Influence on a Single Regression Coefficient

4.2.8.2. Influence on Linear Functions of β; 4.2.9. Measures Based on the Eigenstructure of X; 4.2.9.1. Condition Number and Collinearity Indices; 4.2.9.2. Collinearity-Influential Points; 4.2.9.3. Effects of an Observation on the Condition Number; 4.2.9.4. Diagnosing Collinearity-Influential Observations; 4.3. Differentiation Approach; 4.4. Summary and Concluding Remarks; 5. ASSESSING THE EFFECTS OF MULTIPLE OBSERVATIONS; 5.1. Introduction; 5.2. Measures Based on Residuals; 5.3. Measures Based on the Influence Curve; 5.3.1. Sample Influence Curve; 5.3.2. Empirical Influence Curve

5.3.3. Generalized Cook's Distance

Summary/abstract

Treats linear regression diagnostics as a tool for applying linear regression models to real-life data. The presentation makes extensive use of examples to illustrate the theory. Assesses the effect of measurement errors on the estimated coefficients, which is not accounted for in a standard least-squares estimate but is important where regression coefficients are used to apportion effects due to different variables. Also assesses, qualitatively and numerically, the robustness of the regression fit.