Linear and Nonlinear Programming / David G. Luenberger, Yinyu Ye. 5th ed.
Cham, Switzerland : Springer, [2021]. ©2021.
1 online resource (609 pages).
Series: International Series in Operations Research and Management Science ; v. 228
ISBN: 3-030-85450-7; 3-030-85449-3
Includes bibliographical references and index.

Contents:
Preface
1 Introduction
  1.1 Optimization
  1.2 Types of Problems: Linear Programming; Conic Linear Programming; Unconstrained Problems; Constrained Problems
  1.3 Complexity of Problems
  1.4 Iterative Algorithms and Convergence
Part I: Linear Programming
2 Basic Properties of Linear Programs
  2.1 Introduction
  2.2 Examples of Linear Programming Problems
  2.3 Basic Feasible Solutions
  2.4 The Fundamental Theorem of Linear Programming
  2.5 Relations to Convex Geometry
  2.6 Farkas' Lemma and Alternative Systems
  2.7 Summary
  2.8 Exercises
  References
3 Duality and Complementarity
  3.1 Dual Linear Programs and Interpretations
  3.2 The Duality Theorem
  3.3 Geometric and Economic Interpretations: Dual Multipliers-Shadow Prices
  3.4 Sensitivity and Complementary Slackness: Sensitivity; Complementary Slackness
  3.5 Selected Applications of the Duality: Robust and Distributionally Robust Optimization; Online Linear Programming
  3.6 Max Flow-Min Cut Theorem: Max Flow Augmenting Algorithm; Max Flow-Min Cut Theorem; Relation to Duality
  3.7 Summary
  3.8 Exercises
  References
4 The Simplex Method
  4.1 Adjacent Basic Feasible Solutions (Extreme Points): Nondegeneracy Assumption; Determination of Vector to Leave Basis; Conic Combination Interpretations
  4.2 The Primal Simplex Method: Determining an Optimal Feasible Solution; The Simplex Procedure; Finding an Initial Basic Feasible Solution
  4.3 The Dual Simplex Method: The Primal-Dual Algorithm
  4.4 The Simplex Tableau Method: Decomposition
  4.5 The Simplex Method for Transportation Problems: Finding a Basic Feasible Solution; The Northwest Corner Rule; Basis Triangularity; The Transportation Simplex Method; Simplex Multipliers; Cycle of Change; The Transportation Simplex Algorithm
  4.6 Efficiency Analysis of the Simplex Method
  4.7 Summary
  4.8 Exercises
  References
5 Interior-Point Methods
  5.1 Elements of Complexity Theory
  5.2 The Simplex Method Is Not Polynomial-Time
  5.3 The Ellipsoid Method: Cutting Plane and New Containing Ellipsoid; Convergence; Ellipsoid Method for Usual Form of LP
  5.4 The Analytic Center: Cutting Plane and Analytic Volume of Reduction
  5.5 The Central Path: Dual Central Path; Primal-Dual Central Path; Duality Gap
  5.6 Solution Strategies: Primal Barrier Method; Primal-Dual Path-Following; Primal-Dual Potential Reduction Algorithm; Iteration Complexity
  5.7 Termination and Initialization: Termination; Initialization; The HSD Algorithm
  5.8 Summary
  5.9 Exercises
  References
6 Conic Linear Programming
  6.1 Convex Cones
  6.2 Conic Linear Programming Problem
  6.3 Farkas' Lemma for Conic Linear Programming
  6.4 Conic Linear Programming Duality
  6.5 Complementarity and Solution Rank of SDP: Null-Space Rank Reduction; Gaussian Projection Rank Reduction; Randomized Binary Rank Reduction; Objective-Guide Rank Reduction
  6.6 Interior-Point Algorithms for Conic Linear Programming: Initialization: The HSD Algorithm
  6.7 Summary
  6.8 Exercises
  References
Part II: Unconstrained Problems
7 Basic Properties of Solutions and Algorithms
  7.1 First-Order Necessary Conditions: Feasible and Descent Directions
  7.2 Examples of Unconstrained Problems
  7.3 Second-Order Conditions: Sufficient Conditions for a Relative Minimum
  7.4 Convex and Concave Functions: Properties of Convex Functions; Properties of Differentiable Convex Functions
  7.5 Minimization and Maximization of Convex Functions
  7.6 Global Convergence of Descent Algorithms: Iterative Algorithms; Descent; Closed Mappings; Global Convergence Theorem; Spacer Steps
  7.7 Speed of Convergence: Order of Convergence; Linear Convergence; Arithmetic Convergence; Average Rates; Convergence of Vectors; Complexity
  7.8 Summary
  7.9 Exercises
  References
8 Basic Descent Methods
  8.1 Line Search Algorithms: 0th-Order Method: Golden Section Search and Curve Fitting; Search by Golden Section; Quadratic Fit; 1st-Order Method: Bisection and Curve Fitting Methods; The Bisection Method; Quadratic Fit: Method of False Position; Cubic Fit; 2nd-Order Method: Newton's Method; Global Convergence of Curve Fitting; Closedness of Line Search Algorithms; Inaccurate Line Search; Armijo's Rule
  8.2 The Method of Steepest Descent: First-Order: The Method; Global Convergence and Convergence Speed; The Quadratic Case; The Nonquadratic Case
  8.3 Applications of the Convergence Theory and Preconditioning: Scaling as Preconditioning
  8.4 Accelerated Steepest Descent: The Heavy Ball Method; The Method of False Position
  8.5 Multiplicative Steepest Descent: Affine-Scaling Method; Mirror-Descent Method
  8.6 Newton's Method: Second-Order: Order Two Convergence; Modifications; Newton's Method and Logarithms; Self-concordant Functions
  8.7 Sequential Quadratic Optimization Methods: Trust Region Method; A Homotopy or Path-Following Method
  8.8 Coordinate and Stochastic Gradient Descent Methods: Global Convergence; Local Convergence Rate; Convergence Speed of a Randomized Coordinate Descent Method; Stochastic Gradient Descent (SGD) Method
  8.9 Summary
  8.10 Exercises
  References
9 Conjugate Direction Methods
  9.1 Conjugate Directions
  9.2 Descent Properties of the Conjugate Direction Method
  9.3 The Conjugate Gradient Method: Conjugate Gradient Algorithm; Verification of the Algorithm
  9.4 The C-G Method as an Optimal Process: Bounds on Convergence
  9.5 The Partial Conjugate Gradient Method
  9.6 Extension to Nonquadratic Problems: Quadratic Approximation; Line Search Methods; Convergence; Preconditioning and Partial Methods
  9.7 Parallel Tangents
  9.8 Exercises
  References
10 Quasi-Newton Methods
  10.1 Modified Newton Method: Other Modified Newton's Methods
  10.2 Construction of the Inverse: Rank One Correction
  10.3 Davidon-Fletcher-Powell Method: Positive Definiteness; Finite Step Convergence
  10.4 The Broyden Family: Partial Quasi-Newton Methods
  10.5 Convergence Properties: Global Convergence; Local Convergence
  10.6 Scaling: Improvement of Eigenvalue Ratio; Scale Factors; A Self-Scaling Quasi-Newton Algorithm
  10.7 Memoryless Quasi-Newton Methods: Scaling and Preconditioning
  10.8 Combination of Steepest Descent and Newton's Method
  10.9 Summary
  10.10 Exercises
  References
Part III: Constrained Optimization
11 Constrained Optimization Conditions
  11.1 Constraints and Tangent Plane: Tangent Plane
  11.2 First-Order Necessary Conditions (Equality Constraints): Sensitivity
  11.3 Equality Constrained Optimization Examples: Large-Scale Applications
  11.4 Second-Order Conditions (Equality Constraints): Eigenvalues in Tangent Subspace; Projected Hessians
  11.5 Inequality Constraints: First-Order Necessary Conditions; The Lagrangian and First-Order Conditions; Second-Order Conditions; Sensitivity
  11.6 Mix-Constrained Optimization Examples
  11.7 Lagrangian Duality and Zero-Order Conditions
  11.8 Rules for Constructing the Lagrangian Dual Explicitly
  11.9 Summary
  11.10 Exercises
  References
12 Primal Methods
  12.1 Infeasible Direction and the Steepest Descent Projection Method
  12.2 Feasible Direction Methods: Sequential Linear Programming
  12.3 The Gradient Projection Method: Linear Constraints; Nonlinear Constraints
  12.4 Convergence Rate of the Gradient Projection Method: Geodesic Descent; Geodesics; Lagrangian and Geodesics; Rate of Convergence; Problems with Inequalities
  12.5 The Reduced Gradient Method: Linear Constraints; Global Convergence; Nonlinear Constraints
  12.6 Convergence Rate of the Reduced Gradient Method
  12.7 Sequential Quadratic Optimization Methods
  12.8 Active Set Methods: Changes in Working Set
  12.9 Summary
  12.10 Exercises
  References
13 Penalty and Barrier Methods
  13.1 Penalty Methods: The Method; Convergence
  13.2 Barrier Methods: The Method; Convergence
  13.3 Lagrange Multipliers in Penalty and Barrier Methods: Lagrange Multipliers in the Penalty Method; The Hessian Matrix; Lagrange Multipliers in the Barrier Method
  13.4 Newton's Method for the Logarithmic Barrier Optimization: The KKT Condition System of the Logarithmic Barrier Function; The KKT System of a "Shifted" Barrier; The Interior Ellipsoidal-Trust Region Method with Barrier
  13.5 Newton's Method for Equality Constrained Optimization: Normalization of Penalty Functions; Inequalities
  13.6 Conjugate Gradients and Penalty Methods
  13.7 Penalty Functions and Gradient Projection: Underlying Concept; Implementing the First Step; Inequality Constraints
  13.8 Summary
  13.9 Exercises
  References
14 Local Duality and Dual Methods
  14.1 Local Duality and the Lagrangian Method: Inequality Constraints; Partial Duality; The Lagrangian Method: Dual Steepest Ascent; Preconditioning or Scaling
  14.2 Separable Problems and Their Duals: Decomposition
  14.3 The Augmented Lagrangian and Interpretation: The Penalty Viewpoint; Geometric Interpretation

Subjects: Linear programming
Dewey classification: 519.72
Authors: Luenberger, David G., 1937- ; Ye, Yinyu