| Author: | Chong, Edwin Kah Pin |
| Title: | An introduction to optimization / Edwin K. P. Chong, Stanislaw H. Żak |
| Publication: | Hoboken, New Jersey : Wiley, 2013 |
| | ©2013 |
| Edition: | Fourth edition. |
| Physical description: | 1 online resource (1007 p.) |
| Discipline: | 519.6 |
| Topical subject: | Mathematical optimization |
| Genre/form subject: | Electronic books. |
| Classification: | MAT008000 |
| Secondary responsibility (person): | Żak, Stanislaw H. |
| General notes: | Description based upon print version of record. |
| Bibliography note: | Includes bibliographical references and index. |
| Contents note: | Cover; Half Title page; Title page; Copyright page; Dedication; Preface; Part I: Mathematical Review; Chapter 1: Methods of Proof and Some Notation; 1.1 Methods of Proof; 1.2 Notation; Exercises; Chapter 2: Vector Spaces and Matrices; 2.1 Vector and Matrix; 2.2 Rank of a Matrix; 2.3 Linear Equations; 2.4 Inner Products and Norms; Exercises; Chapter 3: Transformations; 3.1 Linear Transformations; 3.2 Eigenvalues and Eigenvectors; 3.3 Orthogonal Projections; 3.4 Quadratic Forms; 3.5 Matrix Norms; Exercises; Chapter 4: Concepts from Geometry; 4.1 Line Segments; |
| | 4.2 Hyperplanes and Linear Varieties; 4.3 Convex Sets; 4.4 Neighborhoods; 4.5 Polytopes and Polyhedra; Exercises; Chapter 5: Elements of Calculus; 5.1 Sequences and Limits; 5.2 Differentiability; 5.3 The Derivative Matrix; 5.4 Differentiation Rules; 5.5 Level Sets and Gradients; 5.6 Taylor Series; Exercises; Part II: Unconstrained Optimization; Chapter 6: Basics of Set-Constrained and Unconstrained Optimization; 6.1 Introduction; 6.2 Conditions for Local Minimizers; Exercises; Chapter 7: One-Dimensional Search Methods; 7.1 Introduction; 7.2 Golden Section Search; 7.3 Fibonacci Method; |
| | 7.4 Bisection Method; 7.5 Newton's Method; 7.6 Secant Method; 7.7 Bracketing; 7.8 Line Search in Multidimensional Optimization; Exercises; Chapter 8: Gradient Methods; 8.1 Introduction; 8.2 The Method of Steepest Descent; 8.3 Analysis of Gradient Methods; Exercises; Chapter 9: Newton's Method; 9.1 Introduction; 9.2 Analysis of Newton's Method; 9.3 Levenberg-Marquardt Modification; 9.4 Newton's Method for Nonlinear Least Squares; Exercises; Chapter 10: Conjugate Direction Methods; 10.1 Introduction; 10.2 The Conjugate Direction Algorithm; 10.3 The Conjugate Gradient Algorithm; |
| | 10.4 The Conjugate Gradient Algorithm for Nonquadratic Problems; Exercises; Chapter 11: Quasi-Newton Methods; 11.1 Introduction; 11.2 Approximating the Inverse Hessian; 11.3 The Rank One Correction Formula; 11.4 The DFP Algorithm; 11.5 The BFGS Algorithm; Exercises; Chapter 12: Solving Linear Equations; 12.1 Least-Squares Analysis; 12.2 The Recursive Least-Squares Algorithm; 12.3 Solution to a Linear Equation with Minimum Norm; 12.4 Kaczmarz's Algorithm; 12.5 Solving Linear Equations in General; Exercises; Chapter 13: Unconstrained Optimization and Neural Networks; 13.1 Introduction; |
| | 13.2 Single-Neuron Training; 13.3 The Backpropagation Algorithm; Exercises; Chapter 14: Global Search Algorithms; 14.1 Introduction; 14.2 The Nelder-Mead Simplex Algorithm; 14.3 Simulated Annealing; 14.4 Particle Swarm Optimization; 14.5 Genetic Algorithms; Exercises; Part III: Linear Programming; Chapter 15: Introduction to Linear Programming; 15.1 Brief History of Linear Programming; 15.2 Simple Examples of Linear Programs; 15.3 Two-Dimensional Linear Programs; 15.4 Convex Polyhedra and Linear Programming; 15.5 Standard Form Linear Programs; 15.6 Basic Solutions; |
| | 15.7 Properties of Basic Solutions |
| Summary/abstract: | "The purpose of the book is to give the reader a working knowledge of optimization theory and methods"-- |
| Authorized title: | Introduction to optimization |
| ISBN: | 1-118-52369-5 |
| 1-118-51515-3 | |
| Format: | Printed material |
| Bibliographic level: | Monograph |
| Language of publication: | English |
| Record no.: | 9910462743603321 |
| Available at: | Univ. Federico II |
| OPAC: | Check availability here |