Guidebook to R graphics using Microsoft Windows [electronic resource] / Kunio Takezawa
Author | Takezawa Kunio <1959->
Publication/distribution/printing | Hoboken, N.J. : John Wiley & Sons, Inc., c2012
Physical description | 1 online resource (280 p.)
Discipline | 006.6/633
Topical subjects |
Computer graphics
R (Computer program language)
ISBN |
1-280-58859-4
9786613618429
1-118-27015-0
1-118-27016-9
1-118-27013-4
Classification | MAT029000
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Guidebook to R Graphics Using Microsoft® Windows; CONTENTS; Preface; Acknowledgments; 1 Basic Graphics; 1.1 Introduction; 1.2 Downloading and installation of R; 1.3 Start-up of R, and construction and execution of R programs; 1.4 Coordinate axes; 1.5 Points and straight lines; 1.6 Reuse of graphs produced by R; 1.7 Text; 1.8 Various points and straight lines; 1.9 Fonts; 1.10 Figures such as circles and rectangles; 1.11 Legends and logarithmic plots; 1.12 Bar charts; 1.13 Pie charts; 1.14 Layout of multiple graphs; 1.15 Summary; Exercises; 2 Graphics for Statistical Analysis; 2.1 Introduction
2.2 Stem-and-leaf displays; 2.3 Histograms and probability density functions; 2.4 Strip chart; 2.5 Boxplots; 2.6 Multiple-axis layouts; 2.7 Display of confidence intervals; 2.8 Scatter plot matrices; 2.9 Radar charts and parallel charts; 2.10 Functions of one variable; 2.11 Functions of two variables; 2.12 Map graphs; 2.13 Histograms of two variables; 2.14 Time series graphs of two variables; 2.15 Implicit functions; 2.16 Probability density functions; 2.17 Differential values and values of integrals; 2.18 Summary; Exercises; 3 Interactive R Programs; 3.1 Introduction; 3.2 Positioning by mouse on a graphics window; 3.3 Inputting values on the console window to draw a graph; 3.4 Reading data from a data file; 3.5 Moving data on a natural spline; 3.6 Understanding simple regression; 3.7 Adjusting three-dimensional graphs; 3.8 Constructing polynomial regression equations interactively; 3.9 Understanding local linear regression; 3.10 Summary; Exercises; 4 Graphics Obtained Using Packages Based on R; 4.1 Introduction; 4.2 Package "rimage"; 4.3 Package "gplots"; 4.4 Package "ggplot2"; 4.5 Package "scatterplot3d"; 4.6 Package "rgl"; 4.7 Package "misc3d"; 4.8 Package "aplpack"; 4.9 Package "vegan"; 4.10 Package "tripack"; 4.11 Package "ade4"; 4.12 Package "vioplot"; 4.13 Package "plotrix"; 4.14 Package "rworldmap"; Exercises; 5 Appendix; A.1 Digital files; A.2 Free software; A.3 Data; Index
Record no. | UNINA-9910141319303321
Held at: Univ. Federico II
Guidebook to R graphics using Microsoft Windows [electronic resource] / Kunio Takezawa
Author | Takezawa Kunio <1959->
Publication/distribution/printing | Hoboken, N.J. : John Wiley & Sons, Inc., c2012
Physical description | 1 online resource (280 p.)
Discipline | 006.6/633
Topical subjects |
Computer graphics
R (Computer program language)
ISBN |
1-280-58859-4
9786613618429
1-118-27015-0
1-118-27016-9
1-118-27013-4
Classification | MAT029000
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Guidebook to R Graphics Using Microsoft® Windows; CONTENTS; Preface; Acknowledgments; 1 Basic Graphics; 1.1 Introduction; 1.2 Downloading and installation of R; 1.3 Start-up of R, and construction and execution of R programs; 1.4 Coordinate axes; 1.5 Points and straight lines; 1.6 Reuse of graphs produced by R; 1.7 Text; 1.8 Various points and straight lines; 1.9 Fonts; 1.10 Figures such as circles and rectangles; 1.11 Legends and logarithmic plots; 1.12 Bar charts; 1.13 Pie charts; 1.14 Layout of multiple graphs; 1.15 Summary; Exercises; 2 Graphics for Statistical Analysis; 2.1 Introduction
2.2 Stem-and-leaf displays; 2.3 Histograms and probability density functions; 2.4 Strip chart; 2.5 Boxplots; 2.6 Multiple-axis layouts; 2.7 Display of confidence intervals; 2.8 Scatter plot matrices; 2.9 Radar charts and parallel charts; 2.10 Functions of one variable; 2.11 Functions of two variables; 2.12 Map graphs; 2.13 Histograms of two variables; 2.14 Time series graphs of two variables; 2.15 Implicit functions; 2.16 Probability density functions; 2.17 Differential values and values of integrals; 2.18 Summary; Exercises; 3 Interactive R Programs; 3.1 Introduction; 3.2 Positioning by mouse on a graphics window; 3.3 Inputting values on the console window to draw a graph; 3.4 Reading data from a data file; 3.5 Moving data on a natural spline; 3.6 Understanding simple regression; 3.7 Adjusting three-dimensional graphs; 3.8 Constructing polynomial regression equations interactively; 3.9 Understanding local linear regression; 3.10 Summary; Exercises; 4 Graphics Obtained Using Packages Based on R; 4.1 Introduction; 4.2 Package "rimage"; 4.3 Package "gplots"; 4.4 Package "ggplot2"; 4.5 Package "scatterplot3d"; 4.6 Package "rgl"; 4.7 Package "misc3d"; 4.8 Package "aplpack"; 4.9 Package "vegan"; 4.10 Package "tripack"; 4.11 Package "ade4"; 4.12 Package "vioplot"; 4.13 Package "plotrix"; 4.14 Package "rworldmap"; Exercises; 5 Appendix; A.1 Digital files; A.2 Free software; A.3 Data; Index
Record no. | UNINA-9910816114103321
Held at: Univ. Federico II
Introduction to nonparametric regression [electronic resource] / Kunio Takezawa
Author | Takezawa Kunio <1959->
Publication/distribution/printing | Hoboken, N.J. : Wiley-Interscience, c2006
Physical description | 1 online resource (566 p.)
Discipline |
519.5/36
519.536
Series | Wiley series in probability and statistics
Topical subjects |
Regression analysis
Nonparametric statistics
Genre/form subject | Electronic books.
ISBN |
1-280-28698-9
9786610286980
0-470-36261-8
0-471-77145-7
0-471-77144-9
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
INTRODUCTION TO NONPARAMETRIC REGRESSION; CONTENTS; Preface; Acknowledgments; 1 Exordium; 1.1 Introduction; 1.2 Are the moving average and Fourier series sufficiently useful?; 1.3 Is a histogram or normal distribution sufficiently powerful?; 1.4 Is interpolation sufficiently powerful?; 1.5 Should we use a descriptive equation?; 1.6 Parametric regression and nonparametric regression; 2 Smoothing for data with an equispaced predictor; 2.1 Introduction; 2.2 Moving average and binomial filter; 2.3 Hat matrix; 2.4 Local linear regression; 2.5 Smoothing spline
2.6 Analysis on eigenvalue of hat matrix; 2.7 Examples of S-Plus object; References; Problems; 3 Nonparametric regression for one-dimensional predictor; 3.1 Introduction; 3.2 Trade-off between bias and variance; 3.3 Index to select beneficial regression equations; 3.4 Nadaraya-Watson estimator; 3.5 Local polynomial regression; 3.6 Natural spline and smoothing spline; 3.7 LOESS; 3.8 Supersmoother; 3.9 LOWESS; 3.10 Examples of S-Plus object; References; Problems; 4 Multidimensional smoothing; 4.1 Introduction; 4.2 Local polynomial regression for multidimensional predictor; 4.3 Thin plate smoothing splines; 4.4 LOESS and LOWESS with plural predictors; 4.5 Kriging; 4.6 Additive model; 4.7 ACE; 4.8 Projection pursuit regression; 4.9 Examples of S-Plus object; References; Problems; 5 Nonparametric regression with predictors represented as distributions; 5.1 Introduction; 5.2 Use of distributions as predictors; 5.3 Nonparametric DVR method; 5.4 Form of nonparametric regression with predictors represented as distributions; 5.5 Examples of S-Plus object; References; Problems; 6 Smoothing of histograms and nonparametric probability density functions; 6.1 Introduction; 6.2 Histogram; 6.3 Smoothing a histogram; 6.4 Nonparametric probability density function; 6.5 Examples of S-Plus object; References; Problems; 7 Pattern recognition; 7.1 Introduction; 7.2 Bayes' decision rule; 7.3 Linear discriminant rule and quadratic discriminant rule; 7.4 Classification using nonparametric probability density function; 7.5 Logistic regression; 7.6 Neural networks; 7.7 Tree-based model; 7.8 k-nearest-neighbor classifier; 7.9 Nonparametric regression based on the least squares; 7.10 Transformation of feature vectors; 7.11 Examples of S-Plus object; References; Problems; Appendix A: Creation and applications of B-spline bases; A.1 Introduction; A.2 Method to create B-spline basis; A.3 Natural spline created by B-spline; A.4 Application to smoothing spline; A.5 Examples of S-Plus object; References
Appendix B: R objects; B.1 Introduction; B.2 Transformation of S-Plus objects in Chapter 2; B.3 Transformation of S-Plus objects in Chapter 3; B.4 Transformation of S-Plus objects in Chapter 4; B.5 Transformation of S-Plus objects in Chapter 5; B.6 Transformation of S-Plus objects in Chapter 6; B.7 Transformation of S-Plus objects in Chapter 7; B.8 Transformation of S-Plus objects in Appendix A
Record no. | UNINA-9910143420503321
Held at: Univ. Federico II
Introduction to nonparametric regression [electronic resource] / Kunio Takezawa
Author | Takezawa Kunio <1959->
Publication/distribution/printing | Hoboken, N.J. : Wiley-Interscience, c2006
Physical description | 1 online resource (566 p.)
Discipline |
519.5/36
519.536
Series | Wiley series in probability and statistics
Topical subjects |
Regression analysis
Nonparametric statistics
ISBN |
1-280-28698-9
9786610286980
0-470-36261-8
0-471-77145-7
0-471-77144-9
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
INTRODUCTION TO NONPARAMETRIC REGRESSION; CONTENTS; Preface; Acknowledgments; 1 Exordium; 1.1 Introduction; 1.2 Are the moving average and Fourier series sufficiently useful?; 1.3 Is a histogram or normal distribution sufficiently powerful?; 1.4 Is interpolation sufficiently powerful?; 1.5 Should we use a descriptive equation?; 1.6 Parametric regression and nonparametric regression; 2 Smoothing for data with an equispaced predictor; 2.1 Introduction; 2.2 Moving average and binomial filter; 2.3 Hat matrix; 2.4 Local linear regression; 2.5 Smoothing spline
2.6 Analysis on eigenvalue of hat matrix; 2.7 Examples of S-Plus object; References; Problems; 3 Nonparametric regression for one-dimensional predictor; 3.1 Introduction; 3.2 Trade-off between bias and variance; 3.3 Index to select beneficial regression equations; 3.4 Nadaraya-Watson estimator; 3.5 Local polynomial regression; 3.6 Natural spline and smoothing spline; 3.7 LOESS; 3.8 Supersmoother; 3.9 LOWESS; 3.10 Examples of S-Plus object; References; Problems; 4 Multidimensional smoothing; 4.1 Introduction; 4.2 Local polynomial regression for multidimensional predictor; 4.3 Thin plate smoothing splines; 4.4 LOESS and LOWESS with plural predictors; 4.5 Kriging; 4.6 Additive model; 4.7 ACE; 4.8 Projection pursuit regression; 4.9 Examples of S-Plus object; References; Problems; 5 Nonparametric regression with predictors represented as distributions; 5.1 Introduction; 5.2 Use of distributions as predictors; 5.3 Nonparametric DVR method; 5.4 Form of nonparametric regression with predictors represented as distributions; 5.5 Examples of S-Plus object; References; Problems; 6 Smoothing of histograms and nonparametric probability density functions; 6.1 Introduction; 6.2 Histogram; 6.3 Smoothing a histogram; 6.4 Nonparametric probability density function; 6.5 Examples of S-Plus object; References; Problems; 7 Pattern recognition; 7.1 Introduction; 7.2 Bayes' decision rule; 7.3 Linear discriminant rule and quadratic discriminant rule; 7.4 Classification using nonparametric probability density function; 7.5 Logistic regression; 7.6 Neural networks; 7.7 Tree-based model; 7.8 k-nearest-neighbor classifier; 7.9 Nonparametric regression based on the least squares; 7.10 Transformation of feature vectors; 7.11 Examples of S-Plus object; References; Problems; Appendix A: Creation and applications of B-spline bases; A.1 Introduction; A.2 Method to create B-spline basis; A.3 Natural spline created by B-spline; A.4 Application to smoothing spline; A.5 Examples of S-Plus object; References
Appendix B: R objects; B.1 Introduction; B.2 Transformation of S-Plus objects in Chapter 2; B.3 Transformation of S-Plus objects in Chapter 3; B.4 Transformation of S-Plus objects in Chapter 4; B.5 Transformation of S-Plus objects in Chapter 5; B.6 Transformation of S-Plus objects in Chapter 6; B.7 Transformation of S-Plus objects in Chapter 7; B.8 Transformation of S-Plus objects in Appendix A
Record no. | UNINA-9910829967303321
Held at: Univ. Federico II
Introduction to nonparametric regression [electronic resource] / Kunio Takezawa
Author | Takezawa Kunio <1959->
Publication/distribution/printing | Hoboken, N.J. : Wiley-Interscience, c2006
Physical description | 1 online resource (566 p.)
Discipline |
519.5/36
519.536
Series | Wiley series in probability and statistics
Topical subjects |
Regression analysis
Nonparametric statistics
ISBN |
1-280-28698-9
9786610286980
0-470-36261-8
0-471-77145-7
0-471-77144-9
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
INTRODUCTION TO NONPARAMETRIC REGRESSION; CONTENTS; Preface; Acknowledgments; 1 Exordium; 1.1 Introduction; 1.2 Are the moving average and Fourier series sufficiently useful?; 1.3 Is a histogram or normal distribution sufficiently powerful?; 1.4 Is interpolation sufficiently powerful?; 1.5 Should we use a descriptive equation?; 1.6 Parametric regression and nonparametric regression; 2 Smoothing for data with an equispaced predictor; 2.1 Introduction; 2.2 Moving average and binomial filter; 2.3 Hat matrix; 2.4 Local linear regression; 2.5 Smoothing spline
2.6 Analysis on eigenvalue of hat matrix; 2.7 Examples of S-Plus object; References; Problems; 3 Nonparametric regression for one-dimensional predictor; 3.1 Introduction; 3.2 Trade-off between bias and variance; 3.3 Index to select beneficial regression equations; 3.4 Nadaraya-Watson estimator; 3.5 Local polynomial regression; 3.6 Natural spline and smoothing spline; 3.7 LOESS; 3.8 Supersmoother; 3.9 LOWESS; 3.10 Examples of S-Plus object; References; Problems; 4 Multidimensional smoothing; 4.1 Introduction; 4.2 Local polynomial regression for multidimensional predictor; 4.3 Thin plate smoothing splines; 4.4 LOESS and LOWESS with plural predictors; 4.5 Kriging; 4.6 Additive model; 4.7 ACE; 4.8 Projection pursuit regression; 4.9 Examples of S-Plus object; References; Problems; 5 Nonparametric regression with predictors represented as distributions; 5.1 Introduction; 5.2 Use of distributions as predictors; 5.3 Nonparametric DVR method; 5.4 Form of nonparametric regression with predictors represented as distributions; 5.5 Examples of S-Plus object; References; Problems; 6 Smoothing of histograms and nonparametric probability density functions; 6.1 Introduction; 6.2 Histogram; 6.3 Smoothing a histogram; 6.4 Nonparametric probability density function; 6.5 Examples of S-Plus object; References; Problems; 7 Pattern recognition; 7.1 Introduction; 7.2 Bayes' decision rule; 7.3 Linear discriminant rule and quadratic discriminant rule; 7.4 Classification using nonparametric probability density function; 7.5 Logistic regression; 7.6 Neural networks; 7.7 Tree-based model; 7.8 k-nearest-neighbor classifier; 7.9 Nonparametric regression based on the least squares; 7.10 Transformation of feature vectors; 7.11 Examples of S-Plus object; References; Problems; Appendix A: Creation and applications of B-spline bases; A.1 Introduction; A.2 Method to create B-spline basis; A.3 Natural spline created by B-spline; A.4 Application to smoothing spline; A.5 Examples of S-Plus object; References
Appendix B: R objects; B.1 Introduction; B.2 Transformation of S-Plus objects in Chapter 2; B.3 Transformation of S-Plus objects in Chapter 3; B.4 Transformation of S-Plus objects in Chapter 4; B.5 Transformation of S-Plus objects in Chapter 5; B.6 Transformation of S-Plus objects in Chapter 6; B.7 Transformation of S-Plus objects in Chapter 7; B.8 Transformation of S-Plus objects in Appendix A
Record no. | UNINA-9910841700203321
Held at: Univ. Federico II
Learning regression analysis by simulation / Kunio Takezawa
Author | Takezawa Kunio <1959->
Publication/distribution/printing | Tokyo : Springer, [2014]
Physical description | 1 online resource (xii, 300 pages) : illustrations
Discipline | 519.536
Topical subjects |
Regression analysis - Simulation methods
Statistical Theory and Methods
Statistics and Computing/Statistics Programs
Statistics for Engineering, Physics, Computer Science, Chemistry and Earth Sciences
ISBN |
4-431-54321-X
9784431543213
9784431543206
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Preface; Acknowledgments; Contents; Chapter 1: Linear Algebra; 1.1 Starting Up and Executing R; 1.2 Vectors; 1.3 Matrices; 1.4 Addition of Two Matrices; 1.5 Multiplying Two Matrices; 1.6 Identity and Inverse Matrices; 1.7 Simultaneous Equations; 1.8 Diagonalization of a Symmetric Matrix; 1.9 Quadratic Forms; References; Chapter 2: Distributions and Tests; 2.1 Sampling and Random Variables; 2.2 Probability Distribution; 2.3 Normal Distribution and the Central Limit Theorem; 2.4 Interval Estimation by t Distribution; 2.5 t-Test
2.6 Interval Estimation of Population Variance and the χ2 Distribution; 2.7 F Distribution and F-Test; 2.8 Wilcoxon Signed-Rank Sum Test; References; Chapter 3: Simple Regression; 3.1 Derivation of Regression Coefficients; 3.2 Exchange Between Predictor Variable and Target Variable; 3.3 Regression to the Mean; 3.4 Confidence Interval of Regression Coefficients in Simple Regression; 3.5 t-Test in Simple Regression; 3.6 F-Test on Simple Regression; 3.7 Selection Between Constant and Nonconstant Regression Equations; 3.8 Prediction Error of Simple Regression; 3.9 Weighted Regression; 3.10 Least Squares Method and Prediction Error; References; Chapter 4: Multiple Regression; 4.1 Derivation of Regression Coefficients; 4.2 Test on Multiple Regression; 4.3 Prediction Error on Multiple Regression; 4.4 Notes on Model Selection Using Prediction Error; 4.5 Polynomial Regression; 4.6 Variance of Regression Coefficient and Multicollinearity; 4.7 Detection of Multicollinearity Using Variance Inflation Factors; 4.8 Hessian Matrix of Log-Likelihood; References; Chapter 5: Akaike's Information Criterion (AIC) and the Third Variance; 5.1 Cp and FPE; 5.2 AIC of a Multiple Regression Equation with Independent and Identical Normal Distribution; 5.3 Derivation of AIC for Multiple Regression; 5.4 AIC with Unbiased Estimator for Error Variance; 5.5 Error Variance by Maximizing Expectation of Log-Likelihood in Light of the Data in the Future and the "Third Variance"; 5.6 Relationship Between AIC (or GCV) and F-Test; 5.7 AIC on Poisson Regression; References; Chapter 6: Linear Mixed Model; 6.1 Random-Effects Model; 6.2 Random Intercept Model; 6.3 Random Intercept and Slope Model; 6.4 Generalized Linear Mixed Model; 6.5 Generalized Additive Mixed Model; Index
Record no. | UNINA-9910300145603321
Held at: Univ. Federico II