Self-Regularity: A New Paradigm for Primal-Dual Interior-Point Algorithms / Jiming Peng, Cornelis Roos, Tamás Terlaky.

Edition: Course Book.
Published: Princeton, NJ : Princeton University Press, [2009]; ©2003.
Physical description: 1 online resource (201 p.).
Series: Princeton Series in Applied Mathematics ; 22.
ISBN: 1-4008-2513-X (e-book); 1-282-08760-6; 9786612087608; 0-691-09193-5 (print version).
DOI: 10.1515/9781400825134.
Note: Description based upon print version of record.

Contents: Frontmatter -- Contents -- Preface -- Acknowledgments -- Notation -- List of Abbreviations -- Chapter 1. Introduction and Preliminaries -- Chapter 2. Self-Regular Functions and Their Properties -- Chapter 3. Primal-Dual Algorithms for Linear Optimization Based on Self-Regular Proximities -- Chapter 4. Interior-Point Methods for Complementarity Problems Based on Self-Regular Proximities -- Chapter 5. Primal-Dual Interior-Point Methods for Semidefinite Optimization Based on Self-Regular Proximities -- Chapter 6. Primal-Dual Interior-Point Methods for Second-Order Conic Optimization Based on Self-Regular Proximities -- Chapter 7. Initialization: Embedding Models for Linear Optimization, Complementarity Problems, Semidefinite Optimization and Second-Order Conic Optimization -- Chapter 8. Conclusions -- References -- Index.

Summary: Research on interior-point methods (IPMs) has dominated the field of mathematical programming for the last two decades. Two contrasting approaches in the analysis and implementation of IPMs are the so-called small-update and large-update methods, although, until now, there has been a notorious gap between the theory and practical performance of these two strategies. This book comes close to bridging that gap, presenting a new framework for the theory of primal-dual IPMs based on the notion of the self-regularity of a function. The authors deal with linear optimization, nonlinear complementarity problems, semidefinite optimization, and second-order conic optimization problems. The framework also covers large classes of linear complementarity problems and convex optimization. The algorithm considered can be interpreted as a path-following method or a potential reduction method. Starting from a primal-dual strictly feasible point, the algorithm chooses a search direction defined by some Newton-type system derived from the self-regular proximity. The iterate is then updated, with the iterates staying in a certain neighborhood of the central path until an approximate solution to the problem is found. By extensively exploring some intriguing properties of self-regular functions, the authors establish that the complexity of large-update IPMs can come arbitrarily close to the best known iteration bounds of IPMs. Researchers and postgraduate students in all areas of linear and nonlinear optimization will find this book an important and invaluable aid to their work.

Subjects: Interior-point methods; Mathematical optimization; Programming (Mathematics).
HILCC categories: Civil & Environmental Engineering; Engineering & Applied Sciences; Operations Research.
Keywords: Accuracy and precision; Algorithm; Analysis of algorithms; Analytic function; Associative property; Barrier function; Binary number; Block matrix; Combination; Combinatorial optimization; Combinatorics; Complexity; Conic optimization; Continuous optimization; Control theory; Convex optimization; Delft University of Technology; Derivative; Differentiable function; Directional derivative; Division by zero; Dual space; Duality (mathematics); Duality gap; Eigenvalues and eigenvectors; Embedding; Equation; Estimation; Existential quantification; Explanation; Feasible region; Filter design; Function (mathematics); Implementation; Instance (computer science); Invertible matrix; Iteration; Jacobian matrix and determinant; Jordan algebra; Karmarkar's algorithm; Karush–Kuhn–Tucker conditions; Line search; Linear complementarity problem; Linear function; Linear programming; Lipschitz continuity; Local convergence; Loss function; Mathematical optimization; Mathematician; Mathematics; Matrix function; McMaster University; Monograph; Multiplication operator; Newton's method; Nonlinear programming; Nonlinear system; Notation; Operations research; Optimal control; Optimization problem; Parameter (computer programming); Parameter; Pattern recognition; Polyhedron; Polynomial; Positive semidefinite; Positive-definite matrix; Quadratic function; Requirement; Result; Scientific notation; Second derivative; Self-concordant function; Sensitivity analysis; Sign (mathematics); Signal processing; Simplex algorithm; Simultaneous equations; Singular value; Smoothness; Solution set; Solver; Special case; Subset; Suggestion; Technical report; Theorem; Theory; Time complexity; Two-dimensional space; Upper and lower bounds; Variable (computer science); Variable (mathematics); Variational inequality; Variational principle; Without loss of generality; Worst-case complexity; Yurii Nesterov.
Dewey classification: 519.6.
Authors: Peng, Jiming; Roos, Cornelis; Terlaky, Tamás.
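Illustrative sketch: the summary above describes one interior-point iteration as a Newton-type step derived from a proximity measure, followed by a damped update that keeps the iterates in a neighborhood of the central path. The minimal Python sketch below shows such a loop for linear optimization (minimize c'x subject to Ax = b, x >= 0), started from a strictly feasible primal-dual point. It uses the classical logarithmic-barrier centering term rather than the self-regular proximities developed in the book, and the function name ipm_lp, the centering parameter sigma, and the step-size rule are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def ipm_lp(A, x, y, s, sigma=0.1, tol=1e-8, max_iter=100):
        """Minimal primal-dual path-following sketch for min c'x, Ax = b, x >= 0.

        (x, y, s) is assumed strictly feasible (Ax = b, A'y + s = c, x, s > 0),
        so b and c are not needed explicitly: the residuals stay zero.
        """
        n = x.size
        for _ in range(max_iter):
            mu = x @ s / n                          # duality measure
            if mu < tol:                            # approximate solution found
                return x, y, s
            # Newton-type direction for the perturbed optimality conditions,
            # here with the classical log-barrier centering term sigma*mu/s.
            d = x / s                               # diagonal of X S^{-1}
            M = A @ (d[:, None] * A.T)              # normal-equations matrix A X S^{-1} A'
            dy = np.linalg.solve(M, A @ (x - sigma * mu / s))
            ds = -A.T @ dy
            dx = -x + sigma * mu / s - d * ds
            # Damped step keeping x and s strictly positive, so the iterates
            # remain in a neighborhood of the central path.
            alpha = 1.0
            for v, dv in ((x, dx), (s, ds)):
                neg = dv < 0
                if neg.any():
                    alpha = min(alpha, 0.9995 * float(np.min(-v[neg] / dv[neg])))
            x, y, s = x + alpha * dx, y + alpha * dy, s + alpha * ds
        return x, y, s

In the book's framework, as the summary indicates, the centering term would instead be derived from a self-regular proximity, which is what allows large-update variants of such a loop to approach the best known iteration bounds.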