Nonlinear parameter optimization using R tools / John C. Nash |
Author | Nash John C. <1947-> |
Edition | [1st edition] |
Publication/distribution/printing | Chichester, England : Wiley, 2014 |
Physical description | 1 online resource (305 p.) |
Discipline | 519.60285/5133 |
Topical subjects |
Mathematical optimization
Nonlinear theories
R (Computer program language) |
ISBN |
1-118-88400-0
1-118-88396-9
1-118-88475-2 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note |
Cover; Title Page; Copyright; Contents; Preface; Chapter 1 Optimization problem tasks and how they arise; 1.1 The general optimization problem; 1.2 Why the general problem is generally uninteresting; 1.3 (Non-)Linearity; 1.4 Objective function properties; 1.4.1 Sums of squares; 1.4.2 Minimax approximation; 1.4.3 Problems with multiple minima; 1.4.4 Objectives that can only be imprecisely computed; 1.5 Constraint types; 1.6 Solving sets of equations; 1.7 Conditions for optimality; 1.8 Other classifications; References; Chapter 2 Optimization algorithms-an overview
2.1 Methods that use the gradient; 2.2 Newton-like methods; 2.3 The promise of Newton's method; 2.4 Caution: convergence versus termination; 2.5 Difficulties with Newton's method; 2.6 Least squares: Gauss-Newton methods; 2.7 Quasi-Newton or variable metric method; 2.8 Conjugate gradient and related methods; 2.9 Other gradient methods; 2.10 Derivative-free methods; 2.10.1 Numerical approximation of gradients; 2.10.2 Approximate and descend; 2.10.3 Heuristic search; 2.11 Stochastic methods; 2.12 Constraint-based methods-mathematical programming; References; Chapter 3 Software structure and interfaces; 3.1 Perspective; 3.2 Issues of choice; 3.3 Software issues; 3.4 Specifying the objective and constraints to the optimizer; 3.5 Communicating exogenous data to problem definition functions; 3.5.1 Use of "global" data and variables; 3.6 Masked (temporarily fixed) optimization parameters; 3.7 Dealing with inadmissible results; 3.8 Providing derivatives for functions; 3.9 Derivative approximations when there are constraints; 3.10 Scaling of parameters and function; 3.11 Normal ending of computations; 3.12 Termination tests-abnormal ending; 3.13 Output to monitor progress of calculations; 3.14 Output of the optimization results; 3.15 Controls for the optimizer; 3.16 Default control settings; 3.17 Measuring performance; 3.18 The optimization interface; References; Chapter 4 One-parameter root-finding problems; 4.1 Roots; 4.2 Equations in one variable; 4.3 Some examples; 4.3.1 Exponentially speaking; 4.3.2 A normal concern; 4.3.3 Little Polly Nomial; 4.3.4 A hypothequial question; 4.4 Approaches to solving 1D root-finding problems; 4.5 What can go wrong?; 4.6 Being a smart user of root-finding programs; 4.7 Conclusions and extensions; References; Chapter 5 One-parameter minimization problems; 5.1 The optimize() function; 5.2 Using a root-finder; 5.3 But where is the minimum?; 5.4 Ideas for 1D minimizers; 5.5 The line-search subproblem; References; Chapter 6 Nonlinear least squares; 6.1 nls() from package stats; 6.1.1 A simple example; 6.1.2 Regression versus least squares; 6.2 A more difficult case; 6.3 The structure of the nls() solution; 6.4 Concerns with nls(); 6.4.1 Small residuals; 6.4.2 Robustness-"singular gradient" woes; 6.4.3 Bounds with nls(); 6.5 Some ancillary tools for nonlinear least squares |
Record no. | UNINA-9910139133403321 |
Find it at: Univ. Federico II |
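The contents note above lists, among other tools, R's optimize() function (Section 5.1) and nls() from package stats (Section 6.1). As a minimal illustrative sketch only, on synthetic data (the names f, tt, yy, a and b are assumptions for the example, not taken from the book), calls of the kind those chapters cover look like this:

## One-parameter minimization with optimize() (Chapter 5 topic):
## minimize f(x) = (x - 2)^2 + 1 on the interval [0, 5]
f <- function(x) (x - 2)^2 + 1
optimize(f, interval = c(0, 5))   # list with $minimum (~2) and $objective (~1)

## Nonlinear least squares with nls() from package stats (Chapter 6 topic):
## fit an exponential-decay model yy = a * exp(-b * tt) to noisy synthetic data
set.seed(1)
tt <- seq(0, 10, by = 0.5)
yy <- 5 * exp(-0.3 * tt) + rnorm(length(tt), sd = 0.1)
fit <- nls(yy ~ a * exp(-b * tt), start = list(a = 4, b = 0.2))
summary(fit)                      # estimates of a and b with standard errors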