LEADER 05477nam 2200685Ia 450
001 9910830090503321
005 20170810195603.0
010 $a1-282-30736-3
010 $a9786612307362
010 $a0-470-31676-4
010 $a0-470-31742-6
035 $a(CKB)1000000000687550
035 $a(EBL)468906
035 $a(OCoLC)264621182
035 $a(SSID)ssj0000342784
035 $a(PQKBManifestationID)11247811
035 $a(PQKBTitleCode)TC0000342784
035 $a(PQKBWorkID)10286098
035 $a(PQKB)10013860
035 $a(MiAaPQ)EBC468906
035 $a(PPN)159487889
035 $a(EXLCZ)991000000000687550
100 $a19871005d1988 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 10$aSensitivity analysis in linear regression$b[electronic resource] /$fSamprit Chatterjee, Ali S. Hadi
210 $aNew York $cWiley$dc1988
215 $a1 online resource (341 p.)
225 1 $aWiley series in probability and mathematical statistics. Applied probability and statistics
300 $aDescription based upon print version of record.
311 $a0-471-82216-7
320 $aIncludes bibliography and index.
327 $aSensitivity Analysis in Linear Regression; PREFACE; Contents; 1. INTRODUCTION; 1.1. Introduction; 1.2. Notations; 1.3. Standard Estimation Results in Least Squares; 1.4. Assumptions; 1.5. Iterative Regression Process; 1.6. Organization of the Book; 2. PREDICTION MATRIX; 2.1. Introduction; 2.2. Roles of P and (I - P) in Linear Regression; 2.3. Properties of the Prediction Matrix; 2.3.1. General Properties; 2.3.2. Omitting (Adding) Variables; 2.3.3. Omitting (Adding) an Observation; 2.3.4. Conditions for Large Values of pii; 2.3.5. Omitting Multiple Rows of X; 2.3.6. Eigenvalues of P and (I - P)
327 $a2.3.7. Distribution of pii; 2.4. Examples; 3. ROLE OF VARIABLES IN A REGRESSION EQUATION; 3.1. Introduction; 3.2. Effects of Underfitting; 3.3. Effects of Overfitting; 3.4. Interpreting Successive Fitting; 3.5. Computing Implications for Successive Fitting; 3.6. Introduction of One Additional Regressor; 3.7. Comparing Models: Comparison Criteria; 3.8. Diagnostic Plots for the Effects of Variables; 3.8.1. Added Variable (Partial Regression) Plots; 3.8.2. Residual Versus Predictor Plots; 3.8.3. Component-Plus-Residual (Partial Residual) Plots; 3.8.4. Augmented Partial Residual Plots
327 $a3.9. Effects of an Additional Regressor; 4. EFFECTS OF AN OBSERVATION ON A REGRESSION EQUATION; 4.1. Introduction; 4.2. Omission Approach; 4.2.1. Measures Based on Residuals; 4.2.1.1. Testing for a Single Outlier; 4.2.1.2. Graphical Methods; 4.2.2. Outliers, High-Leverage, and Influential Points; 4.2.3. Measures Based on Remoteness of Points in X-Y Space; 4.2.3.1. Diagonal Elements of P; 4.2.3.2. Mahalanobis Distance; 4.2.3.3. Weighted Squared Standardized Distance; 4.2.3.4. Diagonal Elements of Pz; 4.2.4. Influence Curve; 4.2.4.1. Definition of the Influence Curve
327 $a4.2.4.2. Influence Curves for β̂ and σ̂²; 4.2.4.3. Approximating the Influence Curve; 4.2.5. Measures Based on the Influence Curve; 4.2.5.1. Cook's Distance; 4.2.5.2. Welsch-Kuh's Distance; 4.2.5.3. Welsch's Distance; 4.2.5.4. Modified Cook's Distance; 4.2.6. Measures Based on the Volume of Confidence Ellipsoids; 4.2.6.1. Andrews-Pregibon Statistic; 4.2.6.2. Variance Ratio; 4.2.6.3. Cook-Weisberg Statistic; 4.2.7. Measures Based on the Likelihood Function; 4.2.8. Measures Based on a Subset of the Regression Coefficients; 4.2.8.1. Influence on a Single Regression Coefficient
327 $a4.2.8.2. Influence on Linear Functions of β; 4.2.9. Measures Based on the Eigenstructure of X; 4.2.9.1. Condition Number and Collinearity Indices; 4.2.9.2. Collinearity-Influential Points; 4.2.9.3. Effects of an Observation on the Condition Number; 4.2.9.4. Diagnosing Collinearity-Influential Observations; 4.3. Differentiation Approach; 4.4. Summary and Concluding Remarks; 5. ASSESSING THE EFFECTS OF MULTIPLE OBSERVATIONS; 5.1. Introduction; 5.2. Measures Based on Residuals; 5.3. Measures Based on the Influence Curve; 5.3.1. Sample Influence Curve; 5.3.2. Empirical Influence Curve
327 $a5.3.3. Generalized Cook's Distance
330 $aTreats linear regression diagnostics as a tool for the application of linear regression models to real-life data. The presentation makes extensive use of examples to illustrate the theory. Assesses the effect of measurement errors on the estimated coefficients, which is not accounted for in a standard least squares estimate but is important where regression coefficients are used to apportion effects due to different variables. Also assesses, qualitatively and numerically, the robustness of the regression fit.
410 0$aWiley series in probability and mathematical statistics.$pApplied probability and statistics.
606 $aRegression analysis
606 $aPerturbation (Mathematics)
606 $aMathematical optimization
615 0$aRegression analysis.
615 0$aPerturbation (Mathematics)
615 0$aMathematical optimization.
676 $a519.5
676 $a519.536
700 $aChatterjee$b Samprit$f1938-$014454
701 $aHadi$b Ali S$021014
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910830090503321
996 $aSensitivity analysis in linear regression$91142811
997 $aUNINA