Advanced metaheuristic algorithms and their applications in structural optimization / Ali Kaveh and Kiarash Biabani Hamedani
Author Kaveh, Ali
Imprint Cham, Switzerland : Springer, [2022]
Physical description 1 online resource (369 pages)
Discipline 016.5192
Series Studies in Computational Intelligence
Topical subject Mathematical optimization
ISBN 3-031-13429-X
Format Printed material
Bibliographic level Monograph
Language of publication eng
Record no. UNINA-9910595037303321
Available at: Univ. Federico II
Opac: check availability here
Advancing parametric optimization : on multiparametric linear complementarity problems with parameters in general locations / Nathan Adelgren
Author Adelgren, Nathan
Edition [1st ed. 2021.]
Imprint Cham, Switzerland : Springer, [2021]
Physical description 1 online resource (XII, 113 p. 8 illus., 7 illus. in color.)
Discipline 016.5192
Series SpringerBriefs in Optimization
Topical subject Mathematical optimization
Geometry, Algebraic
ISBN 3-030-61821-8
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note 1. Introduction -- 2. Background on mpLCP -- 3. Algebraic Properties of Invariancy Regions -- 4. Phase 2: Partitioning the Parameter Space -- 5. Phase 1: Determining an Initial Feasible Solution -- 6. Further Considerations -- 7. Assessment of Performance -- 8. Conclusion -- Appendix A. Tableaux for Example 2.1 -- Appendix B. Tableaux for Example 2.2 -- References.
Record no. UNINA-9910484421303321
Available at: Univ. Federico II
Opac: check availability here
Advancing parametric optimization : on multiparametric linear complementarity problems with parameters in general locations / Nathan Adelgren
Author Adelgren, Nathan
Edition [1st ed. 2021.]
Imprint Cham, Switzerland : Springer, [2021]
Physical description 1 online resource (XII, 113 p. 8 illus., 7 illus. in color.)
Discipline 016.5192
Series SpringerBriefs in Optimization
Topical subject Mathematical optimization
Geometry, Algebraic
ISBN 3-030-61821-8
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note 1. Introduction -- 2. Background on mpLCP -- 3. Algebraic Properties of Invariancy Regions -- 4. Phase 2: Partitioning the Parameter Space -- 5. Phase 1: Determining an Initial Feasible Solution -- 6. Further Considerations -- 7. Assessment of Performance -- 8. Conclusion -- Appendix A. Tableaux for Example 2.1 -- Appendix B. Tableaux for Example 2.2 -- References.
Record no. UNISA-996466551403316
Available at: Univ. di Salerno
Opac: check availability here
Modern numerical nonlinear optimization / Neculai Andrei
Author Andrei, Neculai
Imprint Cham, Switzerland : Springer, [2022]
Physical description 1 online resource (824 pages)
Discipline 016.5192
Series Springer Optimization and Its Applications
Topical subject Mathematical optimization
Algebras, Linear
Optimització matemàtica
Àlgebra lineal
Genre/form subject Llibres electrònics
ISBN 9783031087202
9783031087196
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Intro -- Preface -- Contents -- List of Algorithms -- List of Applications -- List of Figures -- List of Tables -- 1: Introduction -- 1.1 Mathematical Modeling: Linguistic Models Versus Mathematical Models -- 1.2 Mathematical Modeling and Computational Sciences -- 1.3 The Modern Modeling Scheme for Optimization -- 1.4 Classification of Optimization Problems -- 1.5 Optimization Algorithms -- 1.6 Collections of Applications for Numerical Experiments -- 1.7 Comparison of Algorithms -- 1.8 The Structure of the Book -- 2: Fundamentals on Unconstrained Optimization. Stepsize Computation -- 2.1 The Problem -- 2.2 Fundamentals on the Convergence of the Line-Search Methods -- 2.3 The General Algorithm for Unconstrained Optimization -- 2.4 Convergence of the Algorithm with Exact Line-Search -- 2.5 Inexact Line-Search Methods -- 2.6 Convergence of the Algorithm with Inexact Line-Search -- 2.7 Three Fortran Implementations of the Inexact Line-Search -- 2.8 Numerical Studies: Stepsize Computation -- 3: Steepest Descent Methods -- 3.1 The Steepest Descent -- Convergence of the Steepest Descent Method for Quadratic Functions -- Inequality of Kantorovich -- Numerical Study -- Convergence of the Steepest Descent Method for General Functions -- 3.2 The Relaxed Steepest Descent -- Numerical Study: SDB Versus RSDB -- 3.3 The Accelerated Steepest Descent -- Numerical Study -- 3.4 Comments on the Acceleration Scheme -- 4: The Newton Method -- 4.1 The Newton Method for Solving Nonlinear Algebraic Systems -- 4.2 The Gauss-Newton Method -- 4.3 The Newton Method for Function Minimization -- 4.4 The Newton Method with Line-Search -- 4.5 Analysis of Complexity -- 4.6 The Modified Newton Method -- 4.7 The Newton Method with Finite-Differences -- 4.8 Errors in Functions, Gradients, and Hessians -- 4.9 Negative Curvature Direction Methods -- 4.10 The Composite Newton Method.
5: Conjugate Gradient Methods -- 5.1 The Concept of Nonlinear Conjugate Gradient -- 5.2 The Linear Conjugate Gradient Method -- The Linear Conjugate Gradient Algorithm -- Convergence Rate of the Linear Conjugate Gradient Algorithm -- Preconditioning -- Incomplete Cholesky Factorization -- Comparison of the Convergence Rate of the Linear Conjugate Gradient and of the Steepest Descent -- 5.3 General Convergence Results for Nonlinear Conjugate Gradient Methods -- Convergence Under the Strong Wolfe Line-Search -- Convergence Under the Wolfe Line-Search -- 5.4 Standard Conjugate Gradient Methods -- Conjugate Gradient Methods with gk+12 in the Numerator of βk -- The Fletcher-Reeves Method -- The CD Method -- The Dai-Yuan Method -- Conjugate Gradient Methods with in the Numerator of βk -- The Polak-Ribière-Polyak Method -- The Hestenes-Stiefel Method -- The Liu-Storey Method -- Numerical Study: Standard Conjugate Gradient Methods -- 5.5 Hybrid Conjugate Gradient Methods -- Hybrid Conjugate Gradient Methods Based on the Projection Concept -- Numerical Study: Hybrid Conjugate Gradient Methods -- Hybrid Conjugate Gradient Methods as Convex Combinations of the Standard Conjugate Gradient Methods -- The Hybrid Convex Combination of LS and DY -- Numerical Study: NDLSDY -- 5.6 Conjugate Gradient Methods as Modifications of the Standard Schemes -- The Dai-Liao Conjugate Gradient Method -- The Conjugate Gradient with Guaranteed Descent (CG-DESCENT) -- Numerical Study: CG-DESCENT -- The Conjugate Gradient with Guaranteed Descent and Conjugacy Conditions and a Modified Wolfe Line-Search (DESCON) -- Numerical Study: DESCON -- 5.7 Conjugate Gradient Methods Memoryless BFGS Preconditioned -- The Memoryless BFGS Preconditioned Conjugate Gradient (CONMIN) -- Numerical Study: CONMIN.
The Conjugate Gradient Method Closest to the Scaled Memoryless BFGS Search Direction (DK / CGOPT) -- Numerical Study: DK/CGOPT -- 5.8 Solving Large-Scale Applications -- 6: Quasi-Newton Methods -- 6.1 DFP and BFGS Methods -- 6.2 Modifications of the BFGS Method -- 6.3 Quasi-Newton Methods with Diagonal Updating of the Hessian -- 6.4 Limited-Memory Quasi-Newton Methods -- 6.5 The SR1 Method -- 6.6 Sparse Quasi-Newton Updates -- 6.7 Quasi-Newton Methods and Separable Functions -- 6.8 Solving Large-Scale Applications -- 7: Inexact Newton Methods -- 7.1 The Inexact Newton Method for Nonlinear Algebraic Systems -- 7.2 Inexact Newton Methods for Functions Minimization -- 7.3 The Line-Search Newton-CG Method -- 7.4 Comparison of TN Versus Conjugate Gradient Algorithms -- 7.5 Comparison of TN Versus L-BFGS -- 7.6 Solving Large-Scale Applications -- 8: The Trust-Region Method -- 8.1 The Trust-Region -- 8.2 Algorithms Based on the Cauchy Point -- 8.3 The Trust-Region Newton-CG Method -- 8.4 The Global Convergence -- 8.5 Iterative Solution of the Subproblem -- 8.6 The Scaled Trust-Region -- 9: Direct Methods for Unconstrained Optimization -- 9.1 The NELMED Algorithm -- 9.2 The NEWUOA Algorithm -- 9.3 The DEEPS Algorithm -- 9.4 Numerical Study: NELMED, NEWUOA, and DEEPS -- 10: Constrained Nonlinear Optimization Methods: An Overview -- 10.1 Convergence Tests -- 10.2 Infeasible Points -- 10.3 Approximate Subproblem: Local Models and Their Solving -- 10.4 Globalization Strategy: Convergence from Remote Starting Points -- 10.5 The Refining the Local Model -- 11: Optimality Conditions for Nonlinear Optimization -- 11.1 General Concepts in Nonlinear Optimization -- 11.2 Optimality Conditions for Unconstrained Optimization -- 11.3 Optimality Conditions for Problems with Inequality Constraints -- 11.4 Optimality Conditions for Problems with Equality Constraints.
11.5 Optimality Conditions for General Nonlinear Optimization Problems -- 11.6 Duality -- 12: Simple Bound Constrained Optimization -- 12.1 Necessary Conditions for Optimality -- 12.2 Sufficient Conditions for Optimality -- 12.3 Methods for Solving Simple Bound Optimization Problems -- 12.4 The Spectral Projected Gradient Method (SPG) -- Numerical Study-SPG: Quadratic Interpolation versus Cubic Interpolation -- 12.5 L-BFGS with Simple Bounds (L-BFGS-B) -- Numerical Study: L-BFGS-B Versus SPG -- 12.6 Truncated Newton with Simple Bounds (TNBC) -- 12.7 Applications -- Application A1 (Elastic-Plastic Torsion) -- Application A2 (Pressure Distribution in a Journal Bearing) -- Application A3 (Optimal Design with Composite Materials) -- Application A4 (Steady-State Combustion) -- Application A6 (Inhomogeneous Superconductors: 1-D Ginzburg-Landau) -- 13: Quadratic Programming -- 13.1 Equality Constrained Quadratic Programming -- Factorization of the Full KKT System -- The Schur-Complement Method -- The Null-Space Method -- Large-Scale Problems -- The Conjugate Gradient Applied to the Reduced System -- The Projected Conjugate Gradient Method -- 13.2 Inequality Constrained Quadratic Programming -- The Primal Active-Set Method -- An Algorithm for Positive Definite Hessian -- Reduced Gradient for Inequality Constraints -- The Reduced Gradient for Simple Bounds -- The Primal-Dual Active-Set Method -- 13.3 Interior Point Methods -- Stepsize Selection -- 13.4 Methods for Convex QP Problems with Equality Constraints -- 13.5 Quadratic Programming with Simple Bounds: The Gradient Projection Method -- The Cauchy Point -- Subproblem Minimization -- 13.6 Elimination of Variables -- 14: Penalty and Augmented Lagrangian Methods -- 14.1 The Quadratic Penalty Method -- 14.2 The Nonsmooth Penalty Method -- 14.3 The Augmented Lagrangian Method.
14.4 Criticism of the Penalty and Augmented Lagrangian Methods -- 14.5 A Penalty-Barrier Algorithm (SPENBAR) -- The Penalty-Barrier Method -- Global Convergence -- Numerical Study-SPENBAR: Solving Applications from the LACOP Collection -- 14.6 The Linearly Constrained Augmented Lagrangian (MINOS) -- MINOS for Linear Constraints -- Numerical Study: MINOS for Linear Programming -- MINOS for Nonlinear Constraints -- Numerical Study-MINOS: Solving Applications from the LACOP Collection -- 15: Sequential Quadratic Programming -- 15.1 A Simple Approach to SQP -- 15.2 Reduced-Hessian Quasi-Newton Approximations -- 15.3 Merit Functions -- 15.4 Second-Order Correction (Maratos Effect) -- 15.5 The Line-Search SQP Algorithm -- 15.6 The Trust-Region SQP Algorithm -- 15.7 Sequential Linear-Quadratic Programming (SLQP) -- 15.8 A SQP Algorithm for Large-Scale-Constrained Optimization (SNOPT) -- 15.9 A SQP Algorithm with Successive Error Restoration (NLPQLP) -- 15.10 Active-Set Sequential Linear-Quadratic Programming (KNITRO/ACTIVE) -- 16: Primal Methods: The Generalized Reduced Gradient with Sequential Linearization -- 16.1 Feasible Direction Methods -- 16.2 Active Set Methods -- 16.3 The Gradient Projection Method -- 16.4 The Reduced Gradient Method -- 16.5 The Convex Simplex Method -- 16.6 The Generalized Reduced Gradient Method (GRG) -- 16.7 GRG with Sequential Linear or Sequential Quadratic Programming (CONOPT) -- 17: Interior-Point Methods -- 17.1 Prototype of the Interior-Point Algorithm -- 17.2 Aspects of the Algorithmic Developments -- 17.3 Line-Search Interior-Point Algorithm -- 17.4 A Variant of the Line-Search Interior-Point Algorithm -- 17.5 Trust-Region Interior-Point Algorithm -- 17.6 Interior-Point Sequential Linear-Quadratic Programming (KNITRO/INTERIOR) -- 18: Filter Methods -- 18.1 Sequential Linear Programming Filter Algorithm.
18.2 Sequential Quadratic Programming Filter Algorithm.
Record no. UNINA-9910619281203321
Available at: Univ. Federico II
Opac: check availability here
Modern numerical nonlinear optimization / Neculai Andrei
Author Andrei, Neculai
Imprint Cham, Switzerland : Springer, [2022]
Physical description 1 online resource (824 pages)
Discipline 016.5192
Series Springer Optimization and Its Applications
Topical subject Mathematical optimization
Algebras, Linear
Optimització matemàtica
Àlgebra lineal
Genre/form subject Llibres electrònics
ISBN 9783031087202
9783031087196
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Intro -- Preface -- Contents -- List of Algorithms -- List of Applications -- List of Figures -- List of Tables -- 1: Introduction -- 1.1 Mathematical Modeling: Linguistic Models Versus Mathematical Models -- 1.2 Mathematical Modeling and Computational Sciences -- 1.3 The Modern Modeling Scheme for Optimization -- 1.4 Classification of Optimization Problems -- 1.5 Optimization Algorithms -- 1.6 Collections of Applications for Numerical Experiments -- 1.7 Comparison of Algorithms -- 1.8 The Structure of the Book -- 2: Fundamentals on Unconstrained Optimization. Stepsize Computation -- 2.1 The Problem -- 2.2 Fundamentals on the Convergence of the Line-Search Methods -- 2.3 The General Algorithm for Unconstrained Optimization -- 2.4 Convergence of the Algorithm with Exact Line-Search -- 2.5 Inexact Line-Search Methods -- 2.6 Convergence of the Algorithm with Inexact Line-Search -- 2.7 Three Fortran Implementations of the Inexact Line-Search -- 2.8 Numerical Studies: Stepsize Computation -- 3: Steepest Descent Methods -- 3.1 The Steepest Descent -- Convergence of the Steepest Descent Method for Quadratic Functions -- Inequality of Kantorovich -- Numerical Study -- Convergence of the Steepest Descent Method for General Functions -- 3.2 The Relaxed Steepest Descent -- Numerical Study: SDB Versus RSDB -- 3.3 The Accelerated Steepest Descent -- Numerical Study -- 3.4 Comments on the Acceleration Scheme -- 4: The Newton Method -- 4.1 The Newton Method for Solving Nonlinear Algebraic Systems -- 4.2 The Gauss-Newton Method -- 4.3 The Newton Method for Function Minimization -- 4.4 The Newton Method with Line-Search -- 4.5 Analysis of Complexity -- 4.6 The Modified Newton Method -- 4.7 The Newton Method with Finite-Differences -- 4.8 Errors in Functions, Gradients, and Hessians -- 4.9 Negative Curvature Direction Methods -- 4.10 The Composite Newton Method.
5: Conjugate Gradient Methods -- 5.1 The Concept of Nonlinear Conjugate Gradient -- 5.2 The Linear Conjugate Gradient Method -- The Linear Conjugate Gradient Algorithm -- Convergence Rate of the Linear Conjugate Gradient Algorithm -- Preconditioning -- Incomplete Cholesky Factorization -- Comparison of the Convergence Rate of the Linear Conjugate Gradient and of the Steepest Descent -- 5.3 General Convergence Results for Nonlinear Conjugate Gradient Methods -- Convergence Under the Strong Wolfe Line-Search -- Convergence Under the Wolfe Line-Search -- 5.4 Standard Conjugate Gradient Methods -- Conjugate Gradient Methods with gk+12 in the Numerator of βk -- The Fletcher-Reeves Method -- The CD Method -- The Dai-Yuan Method -- Conjugate Gradient Methods with in the Numerator of βk -- The Polak-Ribière-Polyak Method -- The Hestenes-Stiefel Method -- The Liu-Storey Method -- Numerical Study: Standard Conjugate Gradient Methods -- 5.5 Hybrid Conjugate Gradient Methods -- Hybrid Conjugate Gradient Methods Based on the Projection Concept -- Numerical Study: Hybrid Conjugate Gradient Methods -- Hybrid Conjugate Gradient Methods as Convex Combinations of the Standard Conjugate Gradient Methods -- The Hybrid Convex Combination of LS and DY -- Numerical Study: NDLSDY -- 5.6 Conjugate Gradient Methods as Modifications of the Standard Schemes -- The Dai-Liao Conjugate Gradient Method -- The Conjugate Gradient with Guaranteed Descent (CG-DESCENT) -- Numerical Study: CG-DESCENT -- The Conjugate Gradient with Guaranteed Descent and Conjugacy Conditions and a Modified Wolfe Line-Search (DESCON) -- Numerical Study: DESCON -- 5.7 Conjugate Gradient Methods Memoryless BFGS Preconditioned -- The Memoryless BFGS Preconditioned Conjugate Gradient (CONMIN) -- Numerical Study: CONMIN.
The Conjugate Gradient Method Closest to the Scaled Memoryless BFGS Search Direction (DK / CGOPT) -- Numerical Study: DK/CGOPT -- 5.8 Solving Large-Scale Applications -- 6: Quasi-Newton Methods -- 6.1 DFP and BFGS Methods -- 6.2 Modifications of the BFGS Method -- 6.3 Quasi-Newton Methods with Diagonal Updating of the Hessian -- 6.4 Limited-Memory Quasi-Newton Methods -- 6.5 The SR1 Method -- 6.6 Sparse Quasi-Newton Updates -- 6.7 Quasi-Newton Methods and Separable Functions -- 6.8 Solving Large-Scale Applications -- 7: Inexact Newton Methods -- 7.1 The Inexact Newton Method for Nonlinear Algebraic Systems -- 7.2 Inexact Newton Methods for Functions Minimization -- 7.3 The Line-Search Newton-CG Method -- 7.4 Comparison of TN Versus Conjugate Gradient Algorithms -- 7.5 Comparison of TN Versus L-BFGS -- 7.6 Solving Large-Scale Applications -- 8: The Trust-Region Method -- 8.1 The Trust-Region -- 8.2 Algorithms Based on the Cauchy Point -- 8.3 The Trust-Region Newton-CG Method -- 8.4 The Global Convergence -- 8.5 Iterative Solution of the Subproblem -- 8.6 The Scaled Trust-Region -- 9: Direct Methods for Unconstrained Optimization -- 9.1 The NELMED Algorithm -- 9.2 The NEWUOA Algorithm -- 9.3 The DEEPS Algorithm -- 9.4 Numerical Study: NELMED, NEWUOA, and DEEPS -- 10: Constrained Nonlinear Optimization Methods: An Overview -- 10.1 Convergence Tests -- 10.2 Infeasible Points -- 10.3 Approximate Subproblem: Local Models and Their Solving -- 10.4 Globalization Strategy: Convergence from Remote Starting Points -- 10.5 The Refining the Local Model -- 11: Optimality Conditions for Nonlinear Optimization -- 11.1 General Concepts in Nonlinear Optimization -- 11.2 Optimality Conditions for Unconstrained Optimization -- 11.3 Optimality Conditions for Problems with Inequality Constraints -- 11.4 Optimality Conditions for Problems with Equality Constraints.
11.5 Optimality Conditions for General Nonlinear Optimization Problems -- 11.6 Duality -- 12: Simple Bound Constrained Optimization -- 12.1 Necessary Conditions for Optimality -- 12.2 Sufficient Conditions for Optimality -- 12.3 Methods for Solving Simple Bound Optimization Problems -- 12.4 The Spectral Projected Gradient Method (SPG) -- Numerical Study-SPG: Quadratic Interpolation versus Cubic Interpolation -- 12.5 L-BFGS with Simple Bounds (L-BFGS-B) -- Numerical Study: L-BFGS-B Versus SPG -- 12.6 Truncated Newton with Simple Bounds (TNBC) -- 12.7 Applications -- Application A1 (Elastic-Plastic Torsion) -- Application A2 (Pressure Distribution in a Journal Bearing) -- Application A3 (Optimal Design with Composite Materials) -- Application A4 (Steady-State Combustion) -- Application A6 (Inhomogeneous Superconductors: 1-D Ginzburg-Landau) -- 13: Quadratic Programming -- 13.1 Equality Constrained Quadratic Programming -- Factorization of the Full KKT System -- The Schur-Complement Method -- The Null-Space Method -- Large-Scale Problems -- The Conjugate Gradient Applied to the Reduced System -- The Projected Conjugate Gradient Method -- 13.2 Inequality Constrained Quadratic Programming -- The Primal Active-Set Method -- An Algorithm for Positive Definite Hessian -- Reduced Gradient for Inequality Constraints -- The Reduced Gradient for Simple Bounds -- The Primal-Dual Active-Set Method -- 13.3 Interior Point Methods -- Stepsize Selection -- 13.4 Methods for Convex QP Problems with Equality Constraints -- 13.5 Quadratic Programming with Simple Bounds: The Gradient Projection Method -- The Cauchy Point -- Subproblem Minimization -- 13.6 Elimination of Variables -- 14: Penalty and Augmented Lagrangian Methods -- 14.1 The Quadratic Penalty Method -- 14.2 The Nonsmooth Penalty Method -- 14.3 The Augmented Lagrangian Method.
14.4 Criticism of the Penalty and Augmented Lagrangian Methods -- 14.5 A Penalty-Barrier Algorithm (SPENBAR) -- The Penalty-Barrier Method -- Global Convergence -- Numerical Study-SPENBAR: Solving Applications from the LACOP Collection -- 14.6 The Linearly Constrained Augmented Lagrangian (MINOS) -- MINOS for Linear Constraints -- Numerical Study: MINOS for Linear Programming -- MINOS for Nonlinear Constraints -- Numerical Study-MINOS: Solving Applications from the LACOP Collection -- 15: Sequential Quadratic Programming -- 15.1 A Simple Approach to SQP -- 15.2 Reduced-Hessian Quasi-Newton Approximations -- 15.3 Merit Functions -- 15.4 Second-Order Correction (Maratos Effect) -- 15.5 The Line-Search SQP Algorithm -- 15.6 The Trust-Region SQP Algorithm -- 15.7 Sequential Linear-Quadratic Programming (SLQP) -- 15.8 A SQP Algorithm for Large-Scale-Constrained Optimization (SNOPT) -- 15.9 A SQP Algorithm with Successive Error Restoration (NLPQLP) -- 15.10 Active-Set Sequential Linear-Quadratic Programming (KNITRO/ACTIVE) -- 16: Primal Methods: The Generalized Reduced Gradient with Sequential Linearization -- 16.1 Feasible Direction Methods -- 16.2 Active Set Methods -- 16.3 The Gradient Projection Method -- 16.4 The Reduced Gradient Method -- 16.5 The Convex Simplex Method -- 16.6 The Generalized Reduced Gradient Method (GRG) -- 16.7 GRG with Sequential Linear or Sequential Quadratic Programming (CONOPT) -- 17: Interior-Point Methods -- 17.1 Prototype of the Interior-Point Algorithm -- 17.2 Aspects of the Algorithmic Developments -- 17.3 Line-Search Interior-Point Algorithm -- 17.4 A Variant of the Line-Search Interior-Point Algorithm -- 17.5 Trust-Region Interior-Point Algorithm -- 17.6 Interior-Point Sequential Linear-Quadratic Programming (KNITRO/INTERIOR) -- 18: Filter Methods -- 18.1 Sequential Linear Programming Filter Algorithm.
18.2 Sequential Quadratic Programming Filter Algorithm.
Record no. UNISA-996495169603316
Available at: Univ. di Salerno
Opac: check availability here
Reliability-based optimization of floating wind turbine support structures / Mareike Leimeister
Author Leimeister, Mareike
Edition [1st ed. 2022.]
Imprint Cham, Switzerland : Springer, [2023]
Physical description 1 online resource (336 pages)
Discipline 016.5192
Series Springer Theses, Recognizing Outstanding Ph.D. Research
Topical subject Mathematical optimization
Offshore wind power plants - Law and legislation
ISBN 9783030968892
9783030968885
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Introduction -- Review of Reliability-Based Risk Analysis Methods Used in the Offshore Wind Industry -- Floating Offshore Wind Turbine Systems -- Modeling, Automated Simulation, and Optimization -- Design Optimization of Floating Wind Turbine Support Structures -- Reliability-Based Design Optimization of a Spar-Type Floating Wind Turbine Support Structure -- Discussion -- Conclusions.
Record no. UNINA-9910637719203321
Available at: Univ. Federico II
Opac: check availability here