Continuous Nonlinear Optimization for Engineering Applications in GAMS Technology / by Neculai Andrei |
Author | Andrei Neculai |
Edition | [1st ed. 2017.] |
Publication | Cham : Springer International Publishing : Imprint: Springer, 2017 |
Physical description | 1 online resource (XXIV, 506 p. 68 illus., 66 illus. in color.) |
Discipline | 519.3 |
Series | Springer Optimization and Its Applications |
Topical subject |
Mathematical optimization
Mathematical models
Algorithms
Optimization
Mathematical Modeling and Industrial Mathematics |
ISBN | 3-319-58356-5 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note | 1. Introduction -- 2. Mathematical modeling using algebraically oriented languages for nonlinear optimization -- 3. Introduction to GAMS technology -- 4. Applications of continuous nonlinear optimization -- 5. Optimality conditions for continuous nonlinear optimization -- 6. Simple bound constraint optimization -- 7. Penalty and augmented Lagrangian methods -- 8. Penalty-Barrier Algorithm -- 9. Linearly Constrained Augmented Lagrangian -- 10. Quadratic programming -- 11. Sequential quadratic programming -- 12. A SQP Method using only Equality Constrained Sub-problems -- 13. A Sequential Quadratic Programming Algorithm with Successive Error Restoration -- 14. Active-set Sequential Linear-Quadratic Programming -- 15. A SQP algorithm for Large-Scale Constrained Optimization -- 16. Generalized Reduced Gradient with sequential linearization -- 17. Interior point methods -- 18. Filter methods -- 19. Interior Point Sequential Linear-Quadratic Programming -- 20. Interior Point Filter Line-Search IPOPT -- 21. Numerical studies. |
Record no. | UNINA-9910254296603321 |
Held at: Univ. Federico II |
|
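As a flavor of the subject matter, the sketch below is a minimal GAMS nonlinear program of the kind this volume develops; the model name, data, and starting point are invented for illustration and are not taken from the book.

* Minimal GAMS NLP sketch: minimize a convex quadratic over a disk (illustrative only).
Variables  x1, x2, obj;
Equations  defobj, disk;
defobj ..  obj =e= sqr(x1 - 1) + sqr(x2 - 2);
disk ..    sqr(x1) + sqr(x2) =l= 4;
* Starting point passed to the NLP solver.
x1.l = 0.5;
x2.l = 0.5;
Model demo / all /;
Solve demo using nlp minimizing obj;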
A derivative-free two level random search method for unconstrained optimization / Neculai Andrei |
Author | Andrei Neculai |
Publication | Cham, Switzerland : Springer, [2021] |
Physical description | 1 online resource (126 pages) : illustrations |
Discipline | 519.6 |
Series | SpringerBriefs in Optimization |
Topical subject |
Mathematical optimization
Optimització matemàtica [Mathematical optimization] |
Genre/form subject | Llibres electrònics [Electronic books] |
ISBN | 3-030-68517-9 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Record no. | UNISA-996466399303316 |
Held at: Univ. di Salerno |
|
A derivative-free two level random search method for unconstrained optimization / Neculai Andrei |
Author | Andrei Neculai |
Publication | Cham, Switzerland : Springer, [2021] |
Physical description | 1 online resource (126 pages) : illustrations |
Discipline | 519.6 |
Series | SpringerBriefs in Optimization |
Topical subject |
Mathematical optimization
Optimització matemàtica [Mathematical optimization] |
Genre/form subject | Llibres electrònics [Electronic books] |
ISBN | 3-030-68517-9 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Record no. | UNINA-9910484845603321 |
Held at: Univ. Federico II |
|
Modern numerical nonlinear optimization / Neculai Andrei |
Author | Andrei Neculai |
Publication | Cham, Switzerland : Springer, [2022] |
Physical description | 1 online resource (824 pages) |
Discipline | 016.5192 |
Series | Springer Optimization and Its Applications |
Topical subject |
Mathematical optimization
Algebras, Linear
Optimització matemàtica [Mathematical optimization]
Àlgebra lineal [Linear algebra] |
Genre/form subject | Llibres electrònics [Electronic books] |
ISBN |
9783031087202
9783031087196 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note |
Intro -- Preface -- Contents -- List of Algorithms -- List of Applications -- List of Figures -- List of Tables -- 1: Introduction -- 1.1 Mathematical Modeling: Linguistic Models Versus Mathematical Models -- 1.2 Mathematical Modeling and Computational Sciences -- 1.3 The Modern Modeling Scheme for Optimization -- 1.4 Classification of Optimization Problems -- 1.5 Optimization Algorithms -- 1.6 Collections of Applications for Numerical Experiments -- 1.7 Comparison of Algorithms -- 1.8 The Structure of the Book -- 2: Fundamentals on Unconstrained Optimization. Stepsize Computation -- 2.1 The Problem -- 2.2 Fundamentals on the Convergence of the Line-Search Methods -- 2.3 The General Algorithm for Unconstrained Optimization -- 2.4 Convergence of the Algorithm with Exact Line-Search -- 2.5 Inexact Line-Search Methods -- 2.6 Convergence of the Algorithm with Inexact Line-Search -- 2.7 Three Fortran Implementations of the Inexact Line-Search -- 2.8 Numerical Studies: Stepsize Computation -- 3: Steepest Descent Methods -- 3.1 The Steepest Descent -- Convergence of the Steepest Descent Method for Quadratic Functions -- Inequality of Kantorovich -- Numerical Study -- Convergence of the Steepest Descent Method for General Functions -- 3.2 The Relaxed Steepest Descent -- Numerical Study: SDB Versus RSDB -- 3.3 The Accelerated Steepest Descent -- Numerical Study -- 3.4 Comments on the Acceleration Scheme -- 4: The Newton Method -- 4.1 The Newton Method for Solving Nonlinear Algebraic Systems -- 4.2 The Gauss-Newton Method -- 4.3 The Newton Method for Function Minimization -- 4.4 The Newton Method with Line-Search -- 4.5 Analysis of Complexity -- 4.6 The Modified Newton Method -- 4.7 The Newton Method with Finite-Differences -- 4.8 Errors in Functions, Gradients, and Hessians -- 4.9 Negative Curvature Direction Methods -- 4.10 The Composite Newton Method.
5: Conjugate Gradient Methods -- 5.1 The Concept of Nonlinear Conjugate Gradient -- 5.2 The Linear Conjugate Gradient Method -- The Linear Conjugate Gradient Algorithm -- Convergence Rate of the Linear Conjugate Gradient Algorithm -- Preconditioning -- Incomplete Cholesky Factorization -- Comparison of the Convergence Rate of the Linear Conjugate Gradient and of the Steepest Descent -- 5.3 General Convergence Results for Nonlinear Conjugate Gradient Methods -- Convergence Under the Strong Wolfe Line-Search -- Convergence Under the Wolfe Line-Search -- 5.4 Standard Conjugate Gradient Methods -- Conjugate Gradient Methods with $\|g_{k+1}\|^2$ in the Numerator of $\beta_k$ -- The Fletcher-Reeves Method -- The CD Method -- The Dai-Yuan Method -- Conjugate Gradient Methods with $g_{k+1}^T y_k$ in the Numerator of $\beta_k$ -- The Polak-Ribière-Polyak Method -- The Hestenes-Stiefel Method -- The Liu-Storey Method -- Numerical Study: Standard Conjugate Gradient Methods -- 5.5 Hybrid Conjugate Gradient Methods -- Hybrid Conjugate Gradient Methods Based on the Projection Concept -- Numerical Study: Hybrid Conjugate Gradient Methods -- Hybrid Conjugate Gradient Methods as Convex Combinations of the Standard Conjugate Gradient Methods -- The Hybrid Convex Combination of LS and DY -- Numerical Study: NDLSDY -- 5.6 Conjugate Gradient Methods as Modifications of the Standard Schemes -- The Dai-Liao Conjugate Gradient Method -- The Conjugate Gradient with Guaranteed Descent (CG-DESCENT) -- Numerical Study: CG-DESCENT -- The Conjugate Gradient with Guaranteed Descent and Conjugacy Conditions and a Modified Wolfe Line-Search (DESCON) -- Numerical Study: DESCON -- 5.7 Conjugate Gradient Methods Memoryless BFGS Preconditioned -- The Memoryless BFGS Preconditioned Conjugate Gradient (CONMIN) -- Numerical Study: CONMIN. |
The Conjugate Gradient Method Closest to the Scaled Memoryless BFGS Search Direction (DK/CGOPT) -- Numerical Study: DK/CGOPT -- 5.8 Solving Large-Scale Applications -- 6: Quasi-Newton Methods -- 6.1 DFP and BFGS Methods -- 6.2 Modifications of the BFGS Method -- 6.3 Quasi-Newton Methods with Diagonal Updating of the Hessian -- 6.4 Limited-Memory Quasi-Newton Methods -- 6.5 The SR1 Method -- 6.6 Sparse Quasi-Newton Updates -- 6.7 Quasi-Newton Methods and Separable Functions -- 6.8 Solving Large-Scale Applications -- 7: Inexact Newton Methods -- 7.1 The Inexact Newton Method for Nonlinear Algebraic Systems -- 7.2 Inexact Newton Methods for Function Minimization -- 7.3 The Line-Search Newton-CG Method -- 7.4 Comparison of TN Versus Conjugate Gradient Algorithms -- 7.5 Comparison of TN Versus L-BFGS -- 7.6 Solving Large-Scale Applications -- 8: The Trust-Region Method -- 8.1 The Trust-Region -- 8.2 Algorithms Based on the Cauchy Point -- 8.3 The Trust-Region Newton-CG Method -- 8.4 The Global Convergence -- 8.5 Iterative Solution of the Subproblem -- 8.6 The Scaled Trust-Region -- 9: Direct Methods for Unconstrained Optimization -- 9.1 The NELMED Algorithm -- 9.2 The NEWUOA Algorithm -- 9.3 The DEEPS Algorithm -- 9.4 Numerical Study: NELMED, NEWUOA, and DEEPS -- 10: Constrained Nonlinear Optimization Methods: An Overview -- 10.1 Convergence Tests -- 10.2 Infeasible Points -- 10.3 Approximate Subproblem: Local Models and Their Solving -- 10.4 Globalization Strategy: Convergence from Remote Starting Points -- 10.5 Refining the Local Model -- 11: Optimality Conditions for Nonlinear Optimization -- 11.1 General Concepts in Nonlinear Optimization -- 11.2 Optimality Conditions for Unconstrained Optimization -- 11.3 Optimality Conditions for Problems with Inequality Constraints -- 11.4 Optimality Conditions for Problems with Equality Constraints. |
11.5 Optimality Conditions for General Nonlinear Optimization Problems -- 11.6 Duality -- 12: Simple Bound Constrained Optimization -- 12.1 Necessary Conditions for Optimality -- 12.2 Sufficient Conditions for Optimality -- 12.3 Methods for Solving Simple Bound Optimization Problems -- 12.4 The Spectral Projected Gradient Method (SPG) -- Numerical Study-SPG: Quadratic Interpolation versus Cubic Interpolation -- 12.5 L-BFGS with Simple Bounds (L-BFGS-B) -- Numerical Study: L-BFGS-B Versus SPG -- 12.6 Truncated Newton with Simple Bounds (TNBC) -- 12.7 Applications -- Application A1 (Elastic-Plastic Torsion) -- Application A2 (Pressure Distribution in a Journal Bearing) -- Application A3 (Optimal Design with Composite Materials) -- Application A4 (Steady-State Combustion) -- Application A6 (Inhomogeneous Superconductors: 1-D Ginzburg-Landau) -- 13: Quadratic Programming -- 13.1 Equality Constrained Quadratic Programming -- Factorization of the Full KKT System -- The Schur-Complement Method -- The Null-Space Method -- Large-Scale Problems -- The Conjugate Gradient Applied to the Reduced System -- The Projected Conjugate Gradient Method -- 13.2 Inequality Constrained Quadratic Programming -- The Primal Active-Set Method -- An Algorithm for Positive Definite Hessian -- Reduced Gradient for Inequality Constraints -- The Reduced Gradient for Simple Bounds -- The Primal-Dual Active-Set Method -- 13.3 Interior Point Methods -- Stepsize Selection -- 13.4 Methods for Convex QP Problems with Equality Constraints -- 13.5 Quadratic Programming with Simple Bounds: The Gradient Projection Method -- The Cauchy Point -- Subproblem Minimization -- 13.6 Elimination of Variables -- 14: Penalty and Augmented Lagrangian Methods -- 14.1 The Quadratic Penalty Method -- 14.2 The Nonsmooth Penalty Method -- 14.3 The Augmented Lagrangian Method. 
14.4 Criticism of the Penalty and Augmented Lagrangian Methods -- 14.5 A Penalty-Barrier Algorithm (SPENBAR) -- The Penalty-Barrier Method -- Global Convergence -- Numerical Study-SPENBAR: Solving Applications from the LACOP Collection -- 14.6 The Linearly Constrained Augmented Lagrangian (MINOS) -- MINOS for Linear Constraints -- Numerical Study: MINOS for Linear Programming -- MINOS for Nonlinear Constraints -- Numerical Study-MINOS: Solving Applications from the LACOP Collection -- 15: Sequential Quadratic Programming -- 15.1 A Simple Approach to SQP -- 15.2 Reduced-Hessian Quasi-Newton Approximations -- 15.3 Merit Functions -- 15.4 Second-Order Correction (Maratos Effect) -- 15.5 The Line-Search SQP Algorithm -- 15.6 The Trust-Region SQP Algorithm -- 15.7 Sequential Linear-Quadratic Programming (SLQP) -- 15.8 A SQP Algorithm for Large-Scale Constrained Optimization (SNOPT) -- 15.9 A SQP Algorithm with Successive Error Restoration (NLPQLP) -- 15.10 Active-Set Sequential Linear-Quadratic Programming (KNITRO/ACTIVE) -- 16: Primal Methods: The Generalized Reduced Gradient with Sequential Linearization -- 16.1 Feasible Direction Methods -- 16.2 Active Set Methods -- 16.3 The Gradient Projection Method -- 16.4 The Reduced Gradient Method -- 16.5 The Convex Simplex Method -- 16.6 The Generalized Reduced Gradient Method (GRG) -- 16.7 GRG with Sequential Linear or Sequential Quadratic Programming (CONOPT) -- 17: Interior-Point Methods -- 17.1 Prototype of the Interior-Point Algorithm -- 17.2 Aspects of the Algorithmic Developments -- 17.3 Line-Search Interior-Point Algorithm -- 17.4 A Variant of the Line-Search Interior-Point Algorithm -- 17.5 Trust-Region Interior-Point Algorithm -- 17.6 Interior-Point Sequential Linear-Quadratic Programming (KNITRO/INTERIOR) -- 18: Filter Methods -- 18.1 Sequential Linear Programming Filter Algorithm -- 18.2 Sequential Quadratic Programming Filter Algorithm. |
Record no. | UNINA-9910619281203321 |
Held at: Univ. Federico II |
|
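For orientation, the inexact line-search and convergence material listed in Chapters 2 and 5 above rests on the standard Wolfe conditions; a minimal statement, with $f$ the objective, $d_k$ a descent direction, and $0 < c_1 < c_2 < 1$:

$$f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^T d_k, \qquad \nabla f(x_k + \alpha_k d_k)^T d_k \ge c_2 \nabla f(x_k)^T d_k.$$

The strong Wolfe variant replaces the second inequality by $|\nabla f(x_k + \alpha_k d_k)^T d_k| \le c_2 |\nabla f(x_k)^T d_k|$.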
Modern numerical nonlinear optimization / Neculai Andrei |
Author | Andrei Neculai |
Publication | Cham, Switzerland : Springer, [2022] |
Physical description | 1 online resource (824 pages) |
Discipline | 016.5192 |
Series | Springer Optimization and Its Applications |
Topical subject |
Mathematical optimization
Algebras, Linear
Optimització matemàtica [Mathematical optimization]
Àlgebra lineal [Linear algebra] |
Genre/form subject | Llibres electrònics [Electronic books] |
ISBN |
9783031087202
9783031087196 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note |
Intro -- Preface -- Contents -- List of Algorithms -- List of Applications -- List of Figures -- List of Tables -- 1: Introduction -- 1.1 Mathematical Modeling: Linguistic Models Versus Mathematical Models -- 1.2 Mathematical Modeling and Computational Sciences -- 1.3 The Modern Modeling Scheme for Optimization -- 1.4 Classification of Optimization Problems -- 1.5 Optimization Algorithms -- 1.6 Collections of Applications for Numerical Experiments -- 1.7 Comparison of Algorithms -- 1.8 The Structure of the Book -- 2: Fundamentals on Unconstrained Optimization. Stepsize Computation -- 2.1 The Problem -- 2.2 Fundamentals on the Convergence of the Line-Search Methods -- 2.3 The General Algorithm for Unconstrained Optimization -- 2.4 Convergence of the Algorithm with Exact Line-Search -- 2.5 Inexact Line-Search Methods -- 2.6 Convergence of the Algorithm with Inexact Line-Search -- 2.7 Three Fortran Implementations of the Inexact Line-Search -- 2.8 Numerical Studies: Stepsize Computation -- 3: Steepest Descent Methods -- 3.1 The Steepest Descent -- Convergence of the Steepest Descent Method for Quadratic Functions -- Inequality of Kantorovich -- Numerical Study -- Convergence of the Steepest Descent Method for General Functions -- 3.2 The Relaxed Steepest Descent -- Numerical Study: SDB Versus RSDB -- 3.3 The Accelerated Steepest Descent -- Numerical Study -- 3.4 Comments on the Acceleration Scheme -- 4: The Newton Method -- 4.1 The Newton Method for Solving Nonlinear Algebraic Systems -- 4.2 The Gauss-Newton Method -- 4.3 The Newton Method for Function Minimization -- 4.4 The Newton Method with Line-Search -- 4.5 Analysis of Complexity -- 4.6 The Modified Newton Method -- 4.7 The Newton Method with Finite-Differences -- 4.8 Errors in Functions, Gradients, and Hessians -- 4.9 Negative Curvature Direction Methods -- 4.10 The Composite Newton Method.
5: Conjugate Gradient Methods -- 5.1 The Concept of Nonlinear Conjugate Gradient -- 5.2 The Linear Conjugate Gradient Method -- The Linear Conjugate Gradient Algorithm -- Convergence Rate of the Linear Conjugate Gradient Algorithm -- Preconditioning -- Incomplete Cholesky Factorization -- Comparison of the Convergence Rate of the Linear Conjugate Gradient and of the Steepest Descent -- 5.3 General Convergence Results for Nonlinear Conjugate Gradient Methods -- Convergence Under the Strong Wolfe Line-Search -- Convergence Under the Wolfe Line-Search -- 5.4 Standard Conjugate Gradient Methods -- Conjugate Gradient Methods with $\|g_{k+1}\|^2$ in the Numerator of $\beta_k$ -- The Fletcher-Reeves Method -- The CD Method -- The Dai-Yuan Method -- Conjugate Gradient Methods with $g_{k+1}^T y_k$ in the Numerator of $\beta_k$ -- The Polak-Ribière-Polyak Method -- The Hestenes-Stiefel Method -- The Liu-Storey Method -- Numerical Study: Standard Conjugate Gradient Methods -- 5.5 Hybrid Conjugate Gradient Methods -- Hybrid Conjugate Gradient Methods Based on the Projection Concept -- Numerical Study: Hybrid Conjugate Gradient Methods -- Hybrid Conjugate Gradient Methods as Convex Combinations of the Standard Conjugate Gradient Methods -- The Hybrid Convex Combination of LS and DY -- Numerical Study: NDLSDY -- 5.6 Conjugate Gradient Methods as Modifications of the Standard Schemes -- The Dai-Liao Conjugate Gradient Method -- The Conjugate Gradient with Guaranteed Descent (CG-DESCENT) -- Numerical Study: CG-DESCENT -- The Conjugate Gradient with Guaranteed Descent and Conjugacy Conditions and a Modified Wolfe Line-Search (DESCON) -- Numerical Study: DESCON -- 5.7 Conjugate Gradient Methods Memoryless BFGS Preconditioned -- The Memoryless BFGS Preconditioned Conjugate Gradient (CONMIN) -- Numerical Study: CONMIN. |
The Conjugate Gradient Method Closest to the Scaled Memoryless BFGS Search Direction (DK/CGOPT) -- Numerical Study: DK/CGOPT -- 5.8 Solving Large-Scale Applications -- 6: Quasi-Newton Methods -- 6.1 DFP and BFGS Methods -- 6.2 Modifications of the BFGS Method -- 6.3 Quasi-Newton Methods with Diagonal Updating of the Hessian -- 6.4 Limited-Memory Quasi-Newton Methods -- 6.5 The SR1 Method -- 6.6 Sparse Quasi-Newton Updates -- 6.7 Quasi-Newton Methods and Separable Functions -- 6.8 Solving Large-Scale Applications -- 7: Inexact Newton Methods -- 7.1 The Inexact Newton Method for Nonlinear Algebraic Systems -- 7.2 Inexact Newton Methods for Function Minimization -- 7.3 The Line-Search Newton-CG Method -- 7.4 Comparison of TN Versus Conjugate Gradient Algorithms -- 7.5 Comparison of TN Versus L-BFGS -- 7.6 Solving Large-Scale Applications -- 8: The Trust-Region Method -- 8.1 The Trust-Region -- 8.2 Algorithms Based on the Cauchy Point -- 8.3 The Trust-Region Newton-CG Method -- 8.4 The Global Convergence -- 8.5 Iterative Solution of the Subproblem -- 8.6 The Scaled Trust-Region -- 9: Direct Methods for Unconstrained Optimization -- 9.1 The NELMED Algorithm -- 9.2 The NEWUOA Algorithm -- 9.3 The DEEPS Algorithm -- 9.4 Numerical Study: NELMED, NEWUOA, and DEEPS -- 10: Constrained Nonlinear Optimization Methods: An Overview -- 10.1 Convergence Tests -- 10.2 Infeasible Points -- 10.3 Approximate Subproblem: Local Models and Their Solving -- 10.4 Globalization Strategy: Convergence from Remote Starting Points -- 10.5 Refining the Local Model -- 11: Optimality Conditions for Nonlinear Optimization -- 11.1 General Concepts in Nonlinear Optimization -- 11.2 Optimality Conditions for Unconstrained Optimization -- 11.3 Optimality Conditions for Problems with Inequality Constraints -- 11.4 Optimality Conditions for Problems with Equality Constraints. |
11.5 Optimality Conditions for General Nonlinear Optimization Problems -- 11.6 Duality -- 12: Simple Bound Constrained Optimization -- 12.1 Necessary Conditions for Optimality -- 12.2 Sufficient Conditions for Optimality -- 12.3 Methods for Solving Simple Bound Optimization Problems -- 12.4 The Spectral Projected Gradient Method (SPG) -- Numerical Study-SPG: Quadratic Interpolation versus Cubic Interpolation -- 12.5 L-BFGS with Simple Bounds (L-BFGS-B) -- Numerical Study: L-BFGS-B Versus SPG -- 12.6 Truncated Newton with Simple Bounds (TNBC) -- 12.7 Applications -- Application A1 (Elastic-Plastic Torsion) -- Application A2 (Pressure Distribution in a Journal Bearing) -- Application A3 (Optimal Design with Composite Materials) -- Application A4 (Steady-State Combustion) -- Application A6 (Inhomogeneous Superconductors: 1-D Ginzburg-Landau) -- 13: Quadratic Programming -- 13.1 Equality Constrained Quadratic Programming -- Factorization of the Full KKT System -- The Schur-Complement Method -- The Null-Space Method -- Large-Scale Problems -- The Conjugate Gradient Applied to the Reduced System -- The Projected Conjugate Gradient Method -- 13.2 Inequality Constrained Quadratic Programming -- The Primal Active-Set Method -- An Algorithm for Positive Definite Hessian -- Reduced Gradient for Inequality Constraints -- The Reduced Gradient for Simple Bounds -- The Primal-Dual Active-Set Method -- 13.3 Interior Point Methods -- Stepsize Selection -- 13.4 Methods for Convex QP Problems with Equality Constraints -- 13.5 Quadratic Programming with Simple Bounds: The Gradient Projection Method -- The Cauchy Point -- Subproblem Minimization -- 13.6 Elimination of Variables -- 14: Penalty and Augmented Lagrangian Methods -- 14.1 The Quadratic Penalty Method -- 14.2 The Nonsmooth Penalty Method -- 14.3 The Augmented Lagrangian Method. 
14.4 Criticism of the Penalty and Augmented Lagrangian Methods -- 14.5 A Penalty-Barrier Algorithm (SPENBAR) -- The Penalty-Barrier Method -- Global Convergence -- Numerical Study-SPENBAR: Solving Applications from the LACOP Collection -- 14.6 The Linearly Constrained Augmented Lagrangian (MINOS) -- MINOS for Linear Constraints -- Numerical Study: MINOS for Linear Programming -- MINOS for Nonlinear Constraints -- Numerical Study-MINOS: Solving Applications from the LACOP Collection -- 15: Sequential Quadratic Programming -- 15.1 A Simple Approach to SQP -- 15.2 Reduced-Hessian Quasi-Newton Approximations -- 15.3 Merit Functions -- 15.4 Second-Order Correction (Maratos Effect) -- 15.5 The Line-Search SQP Algorithm -- 15.6 The Trust-Region SQP Algorithm -- 15.7 Sequential Linear-Quadratic Programming (SLQP) -- 15.8 A SQP Algorithm for Large-Scale Constrained Optimization (SNOPT) -- 15.9 A SQP Algorithm with Successive Error Restoration (NLPQLP) -- 15.10 Active-Set Sequential Linear-Quadratic Programming (KNITRO/ACTIVE) -- 16: Primal Methods: The Generalized Reduced Gradient with Sequential Linearization -- 16.1 Feasible Direction Methods -- 16.2 Active Set Methods -- 16.3 The Gradient Projection Method -- 16.4 The Reduced Gradient Method -- 16.5 The Convex Simplex Method -- 16.6 The Generalized Reduced Gradient Method (GRG) -- 16.7 GRG with Sequential Linear or Sequential Quadratic Programming (CONOPT) -- 17: Interior-Point Methods -- 17.1 Prototype of the Interior-Point Algorithm -- 17.2 Aspects of the Algorithmic Developments -- 17.3 Line-Search Interior-Point Algorithm -- 17.4 A Variant of the Line-Search Interior-Point Algorithm -- 17.5 Trust-Region Interior-Point Algorithm -- 17.6 Interior-Point Sequential Linear-Quadratic Programming (KNITRO/INTERIOR) -- 18: Filter Methods -- 18.1 Sequential Linear Programming Filter Algorithm -- 18.2 Sequential Quadratic Programming Filter Algorithm. |
Record no. | UNISA-996495169603316 |
Held at: Univ. di Salerno |
|
Nonlinear Conjugate Gradient Methods for Unconstrained Optimization [electronic resource] / by Neculai Andrei |
Author | Andrei Neculai |
Edition | [1st ed. 2020.] |
Publication | Cham : Springer International Publishing : Imprint: Springer, 2020 |
Physical description | 1 online resource (515 pages) |
Discipline | 512.5 |
Series | Springer Optimization and Its Applications |
Topical subject |
Mathematical optimization
Mathematical models
Optimization
Mathematical Modeling and Industrial Mathematics |
ISBN | 3-030-42950-4 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note | 1. Introduction -- 2. Linear Conjugate Gradient Algorithm -- 3. General Convergence Results for Nonlinear Conjugate Gradient Methods -- 4. Standard Conjugate Gradient Methods -- 5. Acceleration of Conjugate Gradient Algorithms -- 6. Hybrid and Parameterized Conjugate Gradient Methods -- 7. Conjugate Gradient Methods as Modifications of the Standard Schemes -- 8. Conjugate Gradient Methods Memoryless BFGS Preconditioned -- 9. Three-Term Conjugate Gradient Methods -- 10. Other Conjugate Gradient Methods -- 11. Discussion and Conclusions -- References -- Author Index -- Subject Index. |
Record no. | UNISA-996418199103316 |
Held at: Univ. di Salerno |
|
Nonlinear Conjugate Gradient Methods for Unconstrained Optimization / by Neculai Andrei |
Author | Andrei Neculai |
Edition | [1st ed. 2020.] |
Publication | Cham : Springer International Publishing : Imprint: Springer, 2020 |
Physical description | 1 online resource (515 pages) |
Discipline |
512.5
512.94 |
Series | Springer Optimization and Its Applications |
Topical subject |
Mathematical optimization
Mathematical models
Optimization
Mathematical Modeling and Industrial Mathematics |
ISBN | 3-030-42950-4 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note | 1. Introduction -- 2. Linear Conjugate Gradient Algorithm -- 3. General Convergence Results for Nonlinear Conjugate Gradient Methods -- 4. Standard Conjugate Gradient Methods -- 5. Acceleration of Conjugate Gradient Algorithms -- 6. Hybrid and Parameterized Conjugate Gradient Methods -- 7. Conjugate Gradient Methods as Modifications of the Standard Schemes -- 8. Conjugate Gradient Methods Memoryless BFGS Preconditioned -- 9. Three-Term Conjugate Gradient Methods -- 10. Other Conjugate Gradient Methods -- 11. Discussion and Conclusions -- References -- Author Index -- Subject Index. |
Record no. | UNINA-9910484234603321 |
Held at: Univ. Federico II |
|
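For reference, the standard conjugate gradient parameters surveyed in Chapter 4 of this monograph take the following well-known forms, with $g_k = \nabla f(x_k)$, $y_k = g_{k+1} - g_k$, and $d_k$ the search direction:

$$\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad \beta_k^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \qquad \beta_k^{HS} = \frac{g_{k+1}^T y_k}{y_k^T d_k}, \qquad \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{y_k^T d_k}.$$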
Nonlinear optimization applications using the GAMS technology / Neculai Andrei |
Author | Andrei Neculai |
Edition | [1st ed. 2013.] |
Publication | New York : Springer, 2013 |
Physical description | 1 online resource (xxii, 340 pages) : illustrations (some color) |
Discipline | 003.75 |
Series | Springer Optimization and Its Applications |
Topical subject |
Nonlinear theories
Mathematical analysis |
ISBN | 1-4614-6797-7 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note | Preface -- List of Figures -- List of Applications -- 1. Mathematical Modeling Using Algebraic Oriented Languages -- 2. Introduction to GAMS Technology -- 3. Nonlinear Optimization Applications in GAMS Technology -- References -- Subject Index -- Author Index. |
Record no. | UNINA-9910739433203321 |
Held at: Univ. Federico II |
|