Learning regression analysis by simulation / Kunio Takezawa
Tokyo : Springer, [2014], ©2014
1 online resource (xii, 300 pages) : illustrations
Language: English

ISBN: 978-4-431-54321-3 / 4-431-54321-X (eBook); 978-4-431-54320-6 / 4-431-54320-1 (print)
DOI: 10.1007/978-4-431-54321-3
System control numbers: (CKB)3710000000025712; (EBL)1538802; (SSID)ssj0001049510; (PQKBManifestationID)11682087; (PQKBTitleCode)TC0001049510; (PQKBWorkID)11019572; (PQKB)11656704; (MiAaPQ)EBC1538802; (DE-He213)978-4-431-54321-3; (PPN)176126023; (EXLCZ)993710000000025712
Content type: text (rdacontent); Media type: computer (rdamedia); Carrier type: online resource (rdacarrier)

Note: Includes bibliographical references and index.

Contents:
Preface; Acknowledgments; Contents
Chapter 1: Linear Algebra; 1.1 Starting Up and Executing R; 1.2 Vectors; 1.3 Matrices; 1.4 Addition of Two Matrices; 1.5 Multiplying Two Matrices; 1.6 Identity and Inverse Matrices; 1.7 Simultaneous Equations; 1.8 Diagonalization of a Symmetric Matrix; 1.9 Quadratic Forms; References
Chapter 2: Distributions and Tests; 2.1 Sampling and Random Variables; 2.2 Probability Distribution; 2.3 Normal Distribution and the Central Limit Theorem; 2.4 Interval Estimation by t Distribution; 2.5 t-Test; 2.6 Interval Estimation of Population Variance and the χ2 Distribution; 2.7 F Distribution and F-Test; 2.8 Wilcoxon Signed-Rank Sum Test; References
Chapter 3: Simple Regression; 3.1 Derivation of Regression Coefficients; 3.2 Exchange Between Predictor Variable and Target Variable; 3.3 Regression to the Mean; 3.4 Confidence Interval of Regression Coefficients in Simple Regression; 3.5 t-Test in Simple Regression; 3.6 F-Test on Simple Regression; 3.7 Selection Between Constant and Nonconstant Regression Equations; 3.8 Prediction Error of Simple Regression; 3.9 Weighted Regression; 3.10 Least Squares Method and Prediction Error; References
Chapter 4: Multiple Regression; 4.1 Derivation of Regression Coefficients; 4.2 Test on Multiple Regression; 4.3 Prediction Error on Multiple Regression; 4.4 Notes on Model Selection Using Prediction Error; 4.5 Polynomial Regression; 4.6 Variance of Regression Coefficient and Multicollinearity; 4.7 Detection of Multicollinearity Using Variance Inflation Factors; 4.8 Hessian Matrix of Log-Likelihood; References
Chapter 5: Akaike's Information Criterion (AIC) and the Third Variance; 5.1 Cp and FPE; 5.2 AIC of a Multiple Regression Equation with Independent and Identical Normal Distribution; 5.3 Derivation of AIC for Multiple Regression; 5.4 AIC with Unbiased Estimator for Error Variance; 5.5 Error Variance by Maximizing Expectation of Log-Likelihood in Light of the Data in the Future and the "Third Variance"; 5.6 Relationship Between AIC (or GCV) and F-Test; 5.7 AIC on Poisson Regression; References
Chapter 6: Linear Mixed Model; 6.1 Random-Effects Model; 6.2 Random Intercept Model; 6.3 Random Intercept and Slope Model; 6.4 Generalized Linear Mixed Model; 6.5 Generalized Additive Mixed Model
Index

Summary: The standard approach of most introductory books on practical statistics is that readers first learn the minimum mathematical basics of statistics and rudimentary concepts of statistical methodology. They are then given examples of analyses of data obtained from natural and social phenomena so that they can grasp practical definitions of statistical methods. Finally, they go on to acquaint themselves with statistical software for the PC and analyze similar data to expand and deepen their understanding of statistical methods.
This book, however, takes a slightly different approach, using simulation data instead of actual data to illustrate what statistical methods do. The R programs listed in the book also help readers see clearly how these methods work to bring the intrinsic values of the data to the surface. R is free software that lets users handle vectors, matrices, data frames, and so on. For example, when statistical theory indicates that an event happens with 5% probability, readers can confirm, using R programs applied to data generated by pseudo-random numbers, that the event really does occur with roughly that probability (a sketch of such a check is given after this record). Simulation gives readers populations with known backgrounds, and the nature of the population can be adjusted easily. This feature of simulation data provides a clear picture of statistical methods painlessly. Most readers of introductory books on practical statistics do not like complex mathematical formulae, but they do not mind using a PC to produce a wide variety of numbers and graphs from data. If they know the characteristics of those numbers beforehand, they can work with them with ease. Struggling with actual data should come later. Conventional books on this topic frighten readers by indiscriminately presenting data whose characteristics are unknown to them. This book provides a new path to statistical concepts and practical skills in a readily accessible manner.

Subjects: Regression analysis; Simulation methods
Springer Nature product market codes:
Statistical Theory and Methods: https://scigraph.springernature.com/ontologies/product-market-codes/S11001
Statistics and Computing/Statistics Programs: https://scigraph.springernature.com/ontologies/product-market-codes/S12008
Statistics for Engineering, Physics, Computer Science, Chemistry and Earth Sciences: https://scigraph.springernature.com/ontologies/product-market-codes/S17020
Dewey classification: 519.536
Author: Takezawa, Kunio, 1959-
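Illustrative sketch (not taken from the book): the following R fragment shows the kind of simulation check the summary describes. Under the null hypothesis, a test at the 5% level should reject in roughly 5% of simulated data sets; the sample size, number of simulated data sets, and variable names below are assumptions chosen only for illustration.

    ## Confirm by simulation that a nominal 5% test rejects about 5% of the time
    ## when the null hypothesis is true.
    set.seed(1)                    # make the pseudo-random numbers reproducible
    n.sim  <- 10000                # number of simulated data sets
    n.obs  <- 20                   # observations per data set
    reject <- logical(n.sim)
    for (ii in 1:n.sim) {
      xx <- rnorm(n.obs, mean = 0, sd = 1)            # population mean really is 0
      reject[ii] <- t.test(xx, mu = 0)$p.value < 0.05  # one-sample t-test at the 5% level
    }
    mean(reject)                   # proportion of rejections, close to 0.05

Running this gives a rejection proportion near 0.05, which is the kind of agreement between theory and pseudo-random data that the book's simulations are described as demonstrating; because the population is generated by the reader, its properties are known exactly and can be changed at will.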