Record 1: UNISA bibliographic record 996384059303316 (EEBO collection eebo-0018)

Title:        An exhortacyon to the dylygent study of scripture: made by Erasmus of Roterodamus. And lately translated into Englysshe [electronic resource]
Imprint:      [London]: Imprynted by me Robert Wyer, dwellyng in saynt Martyns parysshe, in the bysshoppe of Norwytche rentes, [1534?]
Extent:       [148] p.
Language:     English
Notes:        A translation of: Paraclesis.
              Translated by William Roy?--STC.
              Printer's name and address from colophon; publication date conjectured by STC.
              Signatures: a-c d² ; ²a-²f.
              Part 2 only. ²a3v-²a4r missed in filming.
              "An exhortacyon to the study of readynge the Gospell", translated from the "Paraphrases in Novum Testamentum", has separate title page and register.
              The last leaf bears a printer's mark.
              Reproduction of original in the John Rylands University Library, Manchester, England.
Names:        Erasmus, Desiderius, d. 1536 (author); Roy, William, fl. 1527-1531 (translator, attributed per STC).
Identifiers:  (CKB)1000000000581488; (EEBO)2240898718; (UnM)99837500; (EXLCZ)991000000000581488
Cataloging:   Cu-RivES; CStRLIN; WaOLN
Format:       Book (electronic resource)

Record 2: UNINA bibliographic record 9911018945003321

Title:        Introduction to nonparametric regression / Kunio Takezawa
Author:       Takezawa, Kunio, 1959-
Imprint:      Hoboken, N.J.: Wiley-Interscience, c2006
Extent:       1 online resource (566 p.)
Language:     English
Series:       Wiley series in probability and statistics
ISBN:         9786610286980; 9781280286988; 1280286989; 9780470362617; 0470362618; 9780471771456; 0471771457; 9780471771449; 0471771449; 9780471745839; 0471745839
Notes:        Description based upon print version of record.
              Includes bibliographical references (p. 529-531) and index.
Contents:     Preface; Acknowledgments.
              1 Exordium: 1.1 Introduction; 1.2 Are the moving average and Fourier series sufficiently useful?; 1.3 Is a histogram or normal distribution sufficiently powerful?; 1.4 Is interpolation sufficiently powerful?; 1.5 Should we use a descriptive equation?; 1.6 Parametric regression and nonparametric regression.
              2 Smoothing for data with an equispaced predictor: 2.1 Introduction; 2.2 Moving average and binomial filter; 2.3 Hat matrix; 2.4 Local linear regression; 2.5 Smoothing spline; 2.6 Analysis on eigenvalue of hat matrix; 2.7 Examples of S-Plus object; References; Problems.
              3 Nonparametric regression for one-dimensional predictor: 3.1 Introduction; 3.2 Trade-off between bias and variance; 3.3 Index to select beneficial regression equations; 3.4 Nadaraya-Watson estimator; 3.5 Local polynomial regression; 3.6 Natural spline and smoothing spline; 3.7 LOESS; 3.8 Supersmoother; 3.9 LOWESS; 3.10 Examples of S-Plus object; References; Problems.
              4 Multidimensional smoothing: 4.1 Introduction; 4.2 Local polynomial regression for multidimensional predictor; 4.3 Thin plate smoothing splines; 4.4 LOESS and LOWESS with plural predictors; 4.5 Kriging; 4.6 Additive model; 4.7 ACE; 4.8 Projection pursuit regression; 4.9 Examples of S-Plus object; References; Problems.
              5 Nonparametric regression with predictors represented as distributions: 5.1 Introduction; 5.2 Use of distributions as predictors; 5.3 Nonparametric DVR method; 5.4 Form of nonparametric regression with predictors represented as distributions; 5.5 Examples of S-Plus object; References; Problems.
              6 Smoothing of histograms and nonparametric probability density functions: 6.1 Introduction; 6.2 Histogram; 6.3 Smoothing a histogram; 6.4 Nonparametric probability density function; 6.5 Examples of S-Plus object; References; Problems.
              7 Pattern recognition: 7.1 Introduction; 7.2 Bayes' decision rule; 7.3 Linear discriminant rule and quadratic discriminant rule; 7.4 Classification using nonparametric probability density function; 7.5 Logistic regression; 7.6 Neural networks; 7.7 Tree-based model; 7.8 k-nearest-neighbor classifier; 7.9 Nonparametric regression based on the least squares; 7.10 Transformation of feature vectors; 7.11 Examples of S-Plus object; References; Problems.
              Appendix A: Creation and applications of B-spline bases: A.1 Introduction; A.2 Method to create B-spline basis; A.3 Natural spline created by B-spline; A.4 Application to smoothing spline; A.5 Examples of S-Plus object; References.
              Appendix B: R objects: B.1 Introduction; B.2 Transformation of S-Plus objects in Chapter 2; B.3 Transformation of S-Plus objects in Chapter 3; B.4 Transformation of S-Plus objects in Chapter 4; B.5 Transformation of S-Plus objects in Chapter 5; B.6 Transformation of S-Plus objects in Chapter 6; B.7 Transformation of S-Plus objects in Chapter 7; B.8 Transformation of S-Plus objects in Appendix A.
Summary:      "An easy-to-grasp introduction to nonparametric regression. This book's straightforward, step-by-step approach provides an excellent introduction to the field for novices of nonparametric regression. Introduction to Nonparametric Regression clearly explains the basic concepts underlying nonparametric regression and features: * Thorough explanations of various techniques, which avoid complex mathematics and excessive abstract theory to help readers intuitively grasp the value of nonparametric regression methods * Statistical techniques accompanied by clear numerical examples that fur [...]"
Subjects:     Regression analysis -- Textbooks; Nonparametric statistics -- Textbooks; Regression analysis; Nonparametric statistics
Dewey:        519.5/36
Identifiers:  (CKB)1000000000354973; (EBL)242881; (OCoLC)71791637; (SSID)ssj0000182790; (PQKBManifestationID)11170399; (PQKBTitleCode)TC0000182790; (PQKBWorkID)10172849; (PQKB)10124791; (MiAaPQ)EBC242881; (Perlego)2762621; (EXLCZ)991000000000354973
Cataloging:   MiAaPQ
Format:       Book (online resource)
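The contents above list the Nadaraya-Watson estimator (section 3.4) among the one-dimensional smoothers. For orientation only, here is a minimal Python sketch of that estimator with a Gaussian kernel; the function name, bandwidth value, and toy data are illustrative assumptions, not material from the book, whose own worked examples are given as S-Plus and R objects.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=1.0):
    """Nadaraya-Watson kernel regression with a Gaussian kernel (illustrative sketch).

    Estimates m(x) = sum_i K((x - x_i)/h) y_i / sum_i K((x - x_i)/h).
    """
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    x_query = np.asarray(x_query, dtype=float)

    # Gaussian kernel weights between every query point and every training point.
    diffs = (x_query[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * diffs**2)

    # Locally weighted average of the responses at each query point.
    return (weights @ y_train) / weights.sum(axis=1)

if __name__ == "__main__":
    # Toy data: noisy sine curve, smoothed at a few query points.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 2.0 * np.pi, 100)
    y = np.sin(x) + rng.normal(scale=0.2, size=x.size)
    x_new = np.linspace(0.0, 2.0 * np.pi, 5)
    print(nadaraya_watson(x, y, x_new, bandwidth=0.5))
```

The bandwidth is the smoothing parameter governing the bias-variance trade-off discussed in the book's Chapter 3: larger values produce smoother but more biased fits, smaller values follow the data more closely at the cost of higher variance.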