LEADER 00803nam0 2200289 450
001 9910846495603321
005 20240502145410.0
010 $a9781787386921
100 $a20240502d2022----km y0itay50 ba
101 0 $aeng
102 $aGB
105 $a 001yy
200 1 $aAgainst decolonisation$etaking African agency seriously$fOlúfẹ́mi Táíwò
210 $aLondon$cHurst$d2022
215 $aXVII, 270 p.$d21 cm
225 1 $aAfrican arguments
454 0$12001
610 0 $aDecolonizzazione$aAfrica
676 $a325.3$v22$zita
700 1$aTáíwò,$bOlúfẹ́mi$0919495
801 0$aIT$bUNINA$gREICAT$2UNIMARC
901 $aBK
912 $a9910846495603321
952 $a325.3 TAI 1$b11669$fBFS
959 $aBFS
996 $aAgainst decolonisation$94156407
997 $aUNINA

LEADER 05065nam 22005655 450
001 9910616207103321
005 20251009105919.0
010 $a9783031126444$b(electronic bk.)
010 $z9783031126437
024 7 $a10.1007/978-3-031-12644-4
035 $a(MiAaPQ)EBC7102365
035 $a(Au-PeEL)EBL7102365
035 $a(CKB)24950564900041
035 $a(PPN)26495565X
035 $a(DE-He213)978-3-031-12644-4
035 $a(OCoLC)1347369791
035 $a(EXLCZ)9924950564900041
100 $a20220929d2022 u| 0
101 0 $aeng
135 $aurcnu||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aOptimization in Banach Spaces /$fby Alexander J. Zaslavski
205 $a1st ed. 2022.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2022.
215 $a1 online resource (132 pages)
225 1 $aSpringerBriefs in Optimization,$x2191-575X
311 08$aPrint version: Zaslavski, Alexander J. Optimization in Banach Spaces Cham : Springer International Publishing AG,c2022 9783031126437
320 $aIncludes bibliographical references and index.
327 $aPreface -- Introduction -- Convex optimization -- Nonconvex optimization -- Continuous algorithms -- References.
330 $aThe book is devoted to the study of constrained minimization problems on closed and convex sets in Banach spaces with a Fréchet differentiable objective function. Such problems are well studied in a finite-dimensional space and in an infinite-dimensional Hilbert space. When the space is Hilbert there are many algorithms for solving optimization problems, including the gradient projection algorithm, which is one of the most important tools in optimization theory, nonlinear analysis and their applications. An optimization problem is described by an objective function and a set of feasible points. For the gradient projection algorithm each iteration consists of two steps. The first step is a calculation of the gradient of the objective function, while in the second we calculate a projection on the feasible set. In each of these two steps there is a computational error. In our recent research we show that the gradient projection algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. It should be mentioned that the properties of a Hilbert space play an important role. When we consider an optimization problem in a general Banach space the situation becomes more difficult and less understood. On the other hand, such problems arise in approximation theory. The book is of interest for mathematicians working in optimization. It can also be useful in preparation courses for graduate students. The main feature of the book which appeals specifically to this audience is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. The book is also of interest for experts in applications of optimization to approximation theory. In this book the goal is to obtain a good approximate solution of the constrained optimization problem in a general Banach space in the presence of computational errors. It is shown that the algorithm generates a good approximate solution if the sequence of computational errors is bounded from above by a small constant. The book consists of four chapters. In the first we discuss several algorithms which are studied in the book and prove a convergence result for an unconstrained problem which is a prototype of our results for the constrained problem. In Chapter 2 we analyze convex optimization problems. Nonconvex optimization problems are studied in Chapter 3. In Chapter 4 we study continuous algorithms for minimization problems in the presence of computational errors.
410 0$aSpringerBriefs in Optimization,$x2191-575X
606 $aMathematical optimization
606 $aNumerical analysis
606 $aOptimization
606 $aNumerical Analysis
615 0$aMathematical optimization.
615 0$aNumerical analysis.
615 14$aOptimization.
615 24$aNumerical Analysis.
676 $a515.732
676 $a519.6
700 $aZaslavski$b Alexander J.$0721713
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
912 $a9910616207103321
996 $aOptimization in Banach Spaces$92920192
997 $aUNINA
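The 330 abstract above describes the gradient projection method: each iteration evaluates the gradient of the objective and then projects onto the feasible set, with a bounded computational error allowed in both steps. The following is a minimal Python sketch of that generic scheme on a toy finite-dimensional (Hilbert-space) example; the function names, fixed step size, and bounded-perturbation error model are illustrative assumptions, not taken from the book.

import numpy as np

def gradient_projection(grad, project, x0, step=0.1, iters=200,
                        grad_err=0.0, proj_err=0.0, rng=None):
    """Gradient projection with bounded computational errors: each iteration
    takes a (possibly perturbed) gradient step and then a (possibly perturbed)
    projection back onto the feasible set."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        if grad_err:
            # perturbation bounded by grad_err, standing in for the error in the gradient step
            g = g + grad_err * rng.uniform(-1.0, 1.0, size=x.shape)
        x = project(x - step * g)
        if proj_err:
            # perturbation bounded by proj_err, standing in for the error in the projection step
            x = x + proj_err * rng.uniform(-1.0, 1.0, size=x.shape)
    return x

# Toy example: minimize ||x - b||^2 over the closed unit ball in R^2.
b = np.array([2.0, 0.0])
grad_f = lambda x: 2.0 * (x - b)
proj_ball = lambda y: y / max(1.0, np.linalg.norm(y))  # Euclidean projection onto the unit ball
print(gradient_projection(grad_f, proj_ball, np.zeros(2), grad_err=1e-3, proj_err=1e-3))
# With small error bounds the iterates stay close to the exact minimizer (1, 0),
# mirroring the "good approximate solution under small computational errors" statement.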