
Record no.

UNINA9910484234603321

Author

Andrei, Neculai

Title

Nonlinear Conjugate Gradient Methods for Unconstrained Optimization / by Neculai Andrei

Publication/distribution

Cham : Springer International Publishing : Imprint: Springer, 2020

ISBN

3-030-42950-4

Edition

[1st ed. 2020.]

Physical description

1 online resource (515 pages)

Series

Springer Optimization and Its Applications, 1931-6828 ; 158

Classification

512.5

512.94

Subjects

Mathematical optimization

Mathematical models

Optimization

Mathematical Modeling and Industrial Mathematics

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

Contents note

1. Introduction -- 2. Linear Conjugate Gradient Algorithm -- 3. General Convergence Results for Nonlinear Conjugate Gradient Methods -- 4. Standard Conjugate Gradient Methods -- 5. Acceleration of Conjugate Gradient Algorithms -- 6. Hybrid and Parameterized Conjugate Gradient Methods -- 7. Conjugate Gradient Methods as Modifications of the Standard Schemes -- 8. Conjugate Gradient Methods Memoryless BFGS Preconditioned -- 9. Three-Term Conjugate Gradient Methods -- 10. Other Conjugate Gradient Methods -- 11. Discussion and Conclusions -- References -- Author Index -- Subject Index.

Summary/abstract

Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton (truncated Newton) method and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include: linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid and parameterized methods, modifications of the standard scheme, memoryless BFGS preconditioned methods, and three-term methods. Other conjugate gradient methods, with clustering of the eigenvalues or with minimization of the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms, presented as a methodology, is developed with a clear, rigorous, and friendly exposition; readers will gain an understanding of their properties and convergence and will learn to develop and prove the convergence of their own methods. Numerous numerical studies are supplied, with comparisons and comments on the behavior of conjugate gradient algorithms for solving a collection of 800 unconstrained optimization problems of different structures and complexities, with the number of variables in the range [1000, 10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving complex unconstrained optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering, industry researchers, as well as graduate students in mathematics and Ph.D. and master's students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems by conjugate gradient methods.
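To illustrate the kind of standard nonlinear conjugate gradient scheme the summary refers to, the following Python sketch implements the Fletcher-Reeves variant with a backtracking (Armijo) line search and a steepest-descent restart. It is an illustrative example only, not code from the book; the line-search parameters, the restart rule, and the Rosenbrock test function are assumptions.

```python
# Minimal sketch of a nonlinear conjugate gradient method (Fletcher-Reeves
# variant with a backtracking Armijo line search). Illustrative only; the
# tolerances and step parameters below are assumptions.
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along the descent direction d
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves conjugacy parameter beta = ||g_new||^2 / ||g||^2
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        # Restart with steepest descent if d is no longer a descent direction
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage example on the two-dimensional Rosenbrock function (an assumption,
# chosen only to show the calling convention)
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(nonlinear_cg(rosen, rosen_grad, [-1.2, 1.0]))
```

The restart rule and the Armijo line search are only one of many possible choices; the book's chapters on standard, hybrid, memoryless BFGS preconditioned, and three-term methods differ precisely in how the search direction and the conjugacy parameter are constructed.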