Title: The projected subgradient algorithm in convex optimization / Alexander J. Zaslavski
Edition: 1st ed. 2020
Published: Cham, Switzerland : Springer, [2020], ©2020
Description: 1 online resource (VI, 146 p.)
Series: SpringerBriefs in Optimization, ISSN 2190-8354
ISBN: 3-030-60299-0; 3-030-60300-8
DOI: 10.1007/978-3-030-60300-7
Note: Includes bibliographical references.
Contents: 1. Introduction -- 2. Nonsmooth Convex Optimization -- 3. Extensions -- 4. Zero-sum Games with Two Players -- 5. Quasiconvex Optimization -- References.
Summary: This focused monograph presents a study of subgradient algorithms for constrained minimization problems in a Hilbert space. It is of interest to experts in applications of optimization to engineering and economics. The goal is to obtain a good approximate solution of the problem in the presence of computational errors. The discussion takes into account the fact that each iteration of an algorithm consists of several steps and that the computational errors of different steps are, in general, different. The book is especially useful because it contains solutions to a number of difficult and interesting problems in numerical optimization. The subgradient projection algorithm is one of the most important tools in optimization theory and its applications. An optimization problem is described by an objective function and a set of feasible points. For this algorithm, each iteration consists of two steps: the first requires calculating a subgradient of the objective function; the second requires calculating a projection onto the feasible set. The computational errors in these two steps are different. The book shows that the algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. Moreover, if the computational errors for the two steps are known, one can determine an approximate solution and how many iterations are needed to reach it. In addition to their mathematical interest, the generalizations considered in this book have significant practical meaning.
Subjects: Mathematical optimization; Subgradient methods; Numerical analysis
Dewey: 519.6
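The two-step iteration described in the summary — a subgradient step followed by a projection onto the feasible set — can be sketched as follows. This is an illustrative example only, not the book's Hilbert-space setting: the l1 objective, the Euclidean-ball constraint, and the diminishing step size 1/sqrt(k) are assumptions chosen for demonstration.

```python
import numpy as np

def subgradient(x, b):
    # Step 1: a subgradient of the nonsmooth objective f(x) = ||x - b||_1.
    return np.sign(x - b)

def project_ball(x, r):
    # Step 2: Euclidean projection onto the feasible set {x : ||x|| <= r}.
    n = np.linalg.norm(x)
    return x if n <= r else (r / n) * x

def projected_subgradient(b, r, steps=200):
    # Minimize ||x - b||_1 over the ball of radius r, tracking the best
    # point seen (subgradient methods are not monotone in the objective).
    x = np.zeros_like(b)
    best, best_val = x, float(np.abs(x - b).sum())
    for k in range(1, steps + 1):
        g = subgradient(x, b)
        x = project_ball(x - g / np.sqrt(k), r)  # diminishing step size
        val = float(np.abs(x - b).sum())
        if val < best_val:
            best, best_val = x, val
    return best, best_val

x_star, f_star = projected_subgradient(np.array([2.0, -1.0]), r=1.0)
```

In this toy problem the optimal value over the unit ball is 3 - sqrt(2); the sketch also mirrors the book's concern that each of the two steps (subgradient evaluation, projection) is a separate source of computational error.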