Accelerated optimization for machine learning : first-order algorithms / Zhouchen Lin, Huan Li, Cong Fang
Author: Lin Zhouchen
Published: Singapore : Springer, [2020]
Physical description: 1 online resource (286 pages)
Dewey classification: 006.31
Topical subjects: Machine learning - Mathematics; Mathematical optimization; Computer mathematics; Machine Learning; Optimization; Math Applications in Computer Science; Computational Mathematics and Numerical Analysis
ISBN: 981-15-2910-8; 9789811529108
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Chapter 1. Introduction -- Chapter 2. Accelerated Algorithms for Unconstrained Convex Optimization -- Chapter 3. Accelerated Algorithms for Constrained Convex Optimization -- Chapter 4. Accelerated Algorithms for Nonconvex Optimization -- Chapter 5. Accelerated Stochastic Algorithms -- Chapter 6. Accelerated Parallel Algorithms -- Chapter 7. Conclusions.
Record no.: UNISA-996465342403316
Held at: Univ. di Salerno

Accelerated optimization for machine learning : first-order algorithms / Zhouchen Lin, Huan Li, Cong Fang
Author: Lin Zhouchen
Published: Singapore : Springer, [2020]
Physical description: 1 online resource (286 pages)
Dewey classification: 006.31
Topical subjects: Machine learning - Mathematics; Mathematical optimization; Computer mathematics; Machine Learning; Optimization; Math Applications in Computer Science; Computational Mathematics and Numerical Analysis
ISBN: 981-15-2910-8; 9789811529108
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Chapter 1. Introduction -- Chapter 2. Accelerated Algorithms for Unconstrained Convex Optimization -- Chapter 3. Accelerated Algorithms for Constrained Convex Optimization -- Chapter 4. Accelerated Algorithms for Nonconvex Optimization -- Chapter 5. Accelerated Stochastic Algorithms -- Chapter 6. Accelerated Parallel Algorithms -- Chapter 7. Conclusions.
Record no.: UNINA-9910409667103321
Held at: Univ. Federico II

Black box optimization, machine learning, and no-free lunch theorems / Panos M. Pardalos, Varvara Rasskazova, Michael N. Vrahatis, editors
Published: Cham, Switzerland : Springer, [2021]
Physical description: 1 online resource (393 pages)
Dewey classification: 006.31
Series: Springer Optimization and Its Applications
Topical subjects: Machine learning - Mathematics; Aprenentatge automàtic [machine learning]; Optimització matemàtica [mathematical optimization]; Algorismes computacionals [computer algorithms]
Genre/form: Llibres electrònics [electronic books]
ISBN: 3-030-66515-1
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Intro -- Preface -- Contents -- Learning Enabled Constrained Black-Box Optimization -- 1 Introduction -- 2 Constrained Black-Box Optimization -- 3 The Basic Probabilistic Framework -- 3.1 Gaussian Processes -- 3.2 GP-Based Optimization -- 4 Constrained Bayesian Optimization -- 5 Constrained Bayesian Optimization for Partially Defined Objective Functions -- 6 Software for the Generation of Constrained Test Problems -- 6.1 Emmental-Type GKLS Generator -- 7 Conclusions -- References -- Black-Box Optimization: Methods and Applications -- 1 Introduction -- 2 Overview of BBO Methods -- 2.1 Direct Search Methods -- 2.1.1 Simplex Search -- 2.1.2 Coordinate Search -- 2.1.3 Generalized Pattern Search -- 2.1.4 Mesh Adaptive Direct Search -- 2.2 Model-Based Methods -- 2.2.1 Model-Based Trust Region -- 2.2.2 Projection-Based Methods -- 2.3 Heuristic Methods -- 2.3.1 DIRECT -- 2.3.2 Multilevel Coordinate Search -- 2.3.3 Hit-and-Run algorithms -- 2.3.4 Simulated Annealing -- 2.3.5 Genetic Algorithm -- 2.3.6 Particle Swarm Optimization -- 2.3.7 Surrogate Management Framework -- 2.3.8 Branch and Fit -- 2.4 Hybrid Methods -- 2.5 Extension to Constrained Problems -- 2.5.1 Penalty Method -- 2.5.2 Augmented Lagrangian -- 2.5.3 Filter Method -- 2.5.4 Surrogate Modeling -- 3 BBO Solvers -- 4 Recent Applications -- 4.1 Automatic Machine Learning -- 4.2 Optimization Solvers -- 4.3 Fluid Mechanics -- 4.4 Oilfield Development and Operations -- 4.5 Chemical and Biochemical Engineering -- 5 Open Problems and Future Research Directions -- References -- Tuning Algorithms for Stochastic Black-Box Optimization: State of the Art and Future Perspectives -- 1 Introduction -- 2 Tuning: Strategies -- 2.1 Key Topics -- 2.2 Stochastic Optimization Algorithms -- 2.3 Algorithm Tuning -- 2.4 Example: Grefenstette's Study of Control Parameters for Genetic Algorithms.
2.5 No Free Lunch Theorems -- 2.6 Tuning for Deterministic Algorithms -- 3 Test Sets -- 3.1 Test Functions -- 3.2 Application Domains -- 3.2.1 Tuning in Industry -- 3.2.2 Energy -- 3.2.3 Water Industry -- 3.2.4 Steel Industry -- 3.2.5 Automotive -- 3.2.6 Information Technology -- 4 Statistical Considerations -- 4.1 Experimental Setup -- 4.2 Design of Experiments -- 4.3 Measuring Performance -- 4.4 Reporting Results -- 5 Parallelization -- 5.1 Overview -- 5.2 Simplistic Approaches -- 5.3 Parallelization in Surrogate Model-Based Optimization -- 5.3.1 Uncertainty-Based Methods -- 5.3.2 Surrogate-Assisted Algorithms -- 6 Tuning Approaches -- 6.1 Overview -- 6.2 Manual Tuning -- 6.3 Automatic Tuning -- 6.4 Interactive Tuning -- 6.5 Internal Tuning -- 7 Tuning Software -- 7.1 Overview -- 7.2 IRACE -- 7.3 SPOT -- 7.4 SMAC -- 7.5 ParamILS -- 7.6 GGA -- 7.7 Usability and Availability of Tuning Software -- 7.8 Example: SPOT -- 8 Research Directions and Open Problems -- 9 Summary and Outlook -- References -- Quality-Diversity Optimization: A Novel Branch of Stochastic Optimization -- 1 Introduction -- 2 Problem Formulation -- 2.1 Collections of Solutions -- 2.2 How Do We Measure the Performance of a QD Algorithm? -- 3 Optimizing a Collection of Solutions -- 3.1 MAP-Elites -- 3.2 A Unified Framework -- 3.2.1 Containers -- 3.2.2 Selection Operators -- 3.2.3 Population Scores -- 3.3 Considerations of Quality-Diversity Optimization -- 4 Origins and Related Work -- 4.1 Searching for Diverse Behaviors -- 4.2 Connections to Multimodal Optimization -- 4.3 Connections to Multitask Optimization -- 5 Current Topics -- 5.1 Expensive Objective Functions -- 5.2 High-Dimensional Feature Space -- 5.3 Learning the Behavior Descriptor -- 5.4 Improving Variation Operators -- 5.5 Noisy Functions -- 6 Conclusion -- References.
Multi-Objective Evolutionary Algorithms: Past, Present, and Future -- 1 Introduction -- 2 Basic Concepts -- 3 The Past -- 3.1 Non-Elitist Non-Pareto Approaches -- 3.1.1 Linear Aggregating Functions -- 3.1.2 Vector Evaluated Genetic Algorithm (VEGA) -- 3.1.3 Lexicographic Ordering -- 3.1.4 Target-Vector Approaches -- 3.2 Non-Elitist Pareto-Based Approaches -- 3.2.1 Multi-Objective Genetic Algorithm (MOGA) -- 3.2.2 Nondominated Sorting Genetic Algorithm (NSGA) -- 3.2.3 Niched-Pareto Genetic Algorithm (NPGA) -- 3.3 Elitist Pareto-Based Approaches -- 3.3.1 The Strength Pareto Evolutionary Algorithm (SPEA) -- 3.3.2 The Pareto Archived Evolution Strategy (PAES) -- 3.3.3 The Nondominated Sorting Genetic Algorithm-II (NSGA-II) -- 4 The Present -- 4.1 Some Applications -- 5 The Future -- 6 Conclusions -- References -- Black-Box and Data-Driven Computation -- 1 Introduction -- 2 Black Box and Oracle -- 3 Reduction -- 4 Data-Driven Computation -- References -- Mathematically Rigorous Global Optimization and FuzzyOptimization -- 1 Introduction -- 2 Interval Analysis: Fundamentals and Philosophy -- 2.1 Overview -- 2.2 Interval Logic -- 2.3 Extensions -- 2.4 History and References -- 3 Fuzzy Sets: Fundamentals and Philosophy -- 3.1 Fuzzy Logic -- 3.2 A Brief History -- 4 The Branch and Bound Framework: Some Definitions and Details -- 5 Interval Technology: Some Details -- 5.1 Interval Newton Methods -- 5.2 Constraint Propagation -- 5.3 Relaxations -- 5.4 Interval Arithmetic Software -- 6 Fuzzy Technology: A Few Details -- 7 Conclusions -- References -- Optimization Under Uncertainty Explains Empirical Success of Deep Learning Heuristics -- 1 Formulation of the Problem -- 2 Why Rectified Linear Neurons Are Efficient: A Theoretical Explanation -- 3 Why Sigmoid Activation Functions -- 4 Selection of Poolings -- 5 Why Softmax -- 6 Which Averaging Should We Choose.
7 Proofs -- References -- Variable Neighborhood Programming as a Tool of Machine Learning -- 1 Introduction -- 2 Variable Neighborhood Search -- 3 Variable Neighborhood Programming -- 3.1 Solution Presentation -- 3.2 Neighborhood Structures -- 3.3 Elementary Tree Transformation in Automatic Programming -- 3.3.1 ETT in the Tree of an Undirected Graph -- 3.3.2 ETT in AP Tree -- 3.3.3 Bound on Cardinality of AP-ETT(T) -- 4 VNP for Symbolic Regression -- 4.1 Test Instances and Parameter Values -- 4.2 Comparison of BVNP with Other Methods -- 5 Life Expectancy Estimation as a Symbolic Regression Problem Solved by VNP: Case Study on Russian Districts -- 5.1 Life Expectancy Estimation as a Machine Learning Problem -- 5.2 VNP for Estimating Life Expectancy Problem -- 5.3 Case Study at Russian Districts -- 5.3.1 One-Attribute Analysis -- 5.3.2 Results and Discussion on 3-Attribute Data -- 5.4 Conclusions -- 6 Preventive Maintenance in Railway Planning as a Machine Learning Problem -- 6.1 Literature Review and Motivation -- 6.2 Reduced VNP for Solving the Preventive Maintenance Planning of Railway Infrastructure -- 6.2.1 Learning for Stage 1: Prediction -- 6.2.2 Learning for Stage 2: Classification -- 6.3 Computation Results -- 6.3.1 Prediction -- 6.3.2 Classification -- 6.4 Conclusions and Future Work -- 7 Conclusions -- References -- Non-lattice Covering and Quantization of High Dimensional Sets -- 1 Introduction -- 2 Weak Covering -- 2.1 Comparison of Designs from the View Point of Weak Covering -- 2.2 Reduction to the Probability of Covering a Point by One Ball -- 2.3 Designs of Theoretical Interest -- 3 Approximation of Cd(Zn,r) for Design 1 -- 3.1 Normal Approximation for PU,δ,α,r -- 3.2 Refined Approximation for PU,δ,α,r -- 3.3 Approximation for Cd(Zn,r) for Design 1 -- 4 Approximating Cd(Zn,r) for Design 2a -- 4.1 Normal Approximation for PU,δ,0,r.
4.2 Refined Approximation for PU,δ,0,r -- 4.3 Approximation for Cd(Zn,r) -- 5 Approximating Cd(Zn,r) for Design 2b -- 5.1 Establishing a Connection Between Sampling with and Without Replacement: General Case -- 5.2 Approximation of Cd(Zn,r) for Design 2b. -- 6 Numerical Study -- 6.1 Assessing Accuracy of Approximations of Cd(Zn,r) and Studying Their Dependence on δ -- 6.2 Comparison Across α -- 7 Quantization in a Cube -- 7.1 Quantization Error and Its Relation to Weak Covering -- 7.2 Quantization Error for Design 1 -- 7.3 Quantization Error for Design 2a -- 7.4 Quantization Error for Design 2b -- 7.5 Accuracy of Approximations for Quantization Error and the δ-Effect -- 8 Comparative Numerical Studies of Covering Properties for Several Designs -- 8.1 Covering Comparisons -- 8.2 Quantization Comparisons -- 9 Covering and Quantization in the d-Simplex -- 9.1 Characteristics of Interest -- 9.2 Numerical Investigation of the δ-Effect for d-Simplex -- 10 Appendix: An Auxiliary Lemma -- References -- Finding Effective SAT Partitionings Via Black-Box Optimization -- 1 Introduction -- 2 Preliminaries -- 2.1 Boolean Satisfiability Problem (SAT) -- 2.2 SAT-Based Cryptanalysis -- 3 Decomposition Sets and Backdoors in SAT with Application to Inversion of Discrete Functions -- 3.1 On Interconnection Between Plain Partitionings and Cryptographic Attacks -- 3.2 Using Monte Carlo Method to Estimate Runtime of SAT-Based Guess-and-Determine Attacks -- 4 Practical Aspects of Evaluating Effectiveness of SAT Partitionings -- 4.1 Narrowing Search Space to SUPBS -- 4.2 Applications of Incremental SAT Solving -- 4.3 Finding Partitionings via Incremental SAT -- 5 Employed Optimization Algorithms -- 6 Experimental Results -- 6.1 Considered Problems -- 6.2 Implementations of Objective Functions -- 6.3 Finding Effective SAT Partitionings.
6.4 Solving Hard SAT Instances via Found Partitionings.
Record no.: UNISA-996466410203316
Held at: Univ. di Salerno

Black box optimization, machine learning, and no-free lunch theorems / Panos M. Pardalos, Varvara Rasskazova, Michael N. Vrahatis, editors
Published: Cham, Switzerland : Springer, [2021]
Physical description: 1 online resource (393 pages)
Dewey classification: 006.31
Series: Springer Optimization and Its Applications
Topical subjects: Machine learning - Mathematics; Aprenentatge automàtic [machine learning]; Optimització matemàtica [mathematical optimization]; Algorismes computacionals [computer algorithms]
Genre/form: Llibres electrònics [electronic books]
ISBN: 3-030-66515-1
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Intro -- Preface -- Contents -- Learning Enabled Constrained Black-Box Optimization -- 1 Introduction -- 2 Constrained Black-Box Optimization -- 3 The Basic Probabilistic Framework -- 3.1 Gaussian Processes -- 3.2 GP-Based Optimization -- 4 Constrained Bayesian Optimization -- 5 Constrained Bayesian Optimization for Partially Defined Objective Functions -- 6 Software for the Generation of Constrained Test Problems -- 6.1 Emmental-Type GKLS Generator -- 7 Conclusions -- References -- Black-Box Optimization: Methods and Applications -- 1 Introduction -- 2 Overview of BBO Methods -- 2.1 Direct Search Methods -- 2.1.1 Simplex Search -- 2.1.2 Coordinate Search -- 2.1.3 Generalized Pattern Search -- 2.1.4 Mesh Adaptive Direct Search -- 2.2 Model-Based Methods -- 2.2.1 Model-Based Trust Region -- 2.2.2 Projection-Based Methods -- 2.3 Heuristic Methods -- 2.3.1 DIRECT -- 2.3.2 Multilevel Coordinate Search -- 2.3.3 Hit-and-Run algorithms -- 2.3.4 Simulated Annealing -- 2.3.5 Genetic Algorithm -- 2.3.6 Particle Swarm Optimization -- 2.3.7 Surrogate Management Framework -- 2.3.8 Branch and Fit -- 2.4 Hybrid Methods -- 2.5 Extension to Constrained Problems -- 2.5.1 Penalty Method -- 2.5.2 Augmented Lagrangian -- 2.5.3 Filter Method -- 2.5.4 Surrogate Modeling -- 3 BBO Solvers -- 4 Recent Applications -- 4.1 Automatic Machine Learning -- 4.2 Optimization Solvers -- 4.3 Fluid Mechanics -- 4.4 Oilfield Development and Operations -- 4.5 Chemical and Biochemical Engineering -- 5 Open Problems and Future Research Directions -- References -- Tuning Algorithms for Stochastic Black-Box Optimization: State of the Art and Future Perspectives -- 1 Introduction -- 2 Tuning: Strategies -- 2.1 Key Topics -- 2.2 Stochastic Optimization Algorithms -- 2.3 Algorithm Tuning -- 2.4 Example: Grefenstette's Study of Control Parameters for Genetic Algorithms.
2.5 No Free Lunch Theorems -- 2.6 Tuning for Deterministic Algorithms -- 3 Test Sets -- 3.1 Test Functions -- 3.2 Application Domains -- 3.2.1 Tuning in Industry -- 3.2.2 Energy -- 3.2.3 Water Industry -- 3.2.4 Steel Industry -- 3.2.5 Automotive -- 3.2.6 Information Technology -- 4 Statistical Considerations -- 4.1 Experimental Setup -- 4.2 Design of Experiments -- 4.3 Measuring Performance -- 4.4 Reporting Results -- 5 Parallelization -- 5.1 Overview -- 5.2 Simplistic Approaches -- 5.3 Parallelization in Surrogate Model-Based Optimization -- 5.3.1 Uncertainty-Based Methods -- 5.3.2 Surrogate-Assisted Algorithms -- 6 Tuning Approaches -- 6.1 Overview -- 6.2 Manual Tuning -- 6.3 Automatic Tuning -- 6.4 Interactive Tuning -- 6.5 Internal Tuning -- 7 Tuning Software -- 7.1 Overview -- 7.2 IRACE -- 7.3 SPOT -- 7.4 SMAC -- 7.5 ParamILS -- 7.6 GGA -- 7.7 Usability and Availability of Tuning Software -- 7.8 Example: SPOT -- 8 Research Directions and Open Problems -- 9 Summary and Outlook -- References -- Quality-Diversity Optimization: A Novel Branch of Stochastic Optimization -- 1 Introduction -- 2 Problem Formulation -- 2.1 Collections of Solutions -- 2.2 How Do We Measure the Performance of a QD Algorithm? -- 3 Optimizing a Collection of Solutions -- 3.1 MAP-Elites -- 3.2 A Unified Framework -- 3.2.1 Containers -- 3.2.2 Selection Operators -- 3.2.3 Population Scores -- 3.3 Considerations of Quality-Diversity Optimization -- 4 Origins and Related Work -- 4.1 Searching for Diverse Behaviors -- 4.2 Connections to Multimodal Optimization -- 4.3 Connections to Multitask Optimization -- 5 Current Topics -- 5.1 Expensive Objective Functions -- 5.2 High-Dimensional Feature Space -- 5.3 Learning the Behavior Descriptor -- 5.4 Improving Variation Operators -- 5.5 Noisy Functions -- 6 Conclusion -- References.
Multi-Objective Evolutionary Algorithms: Past, Present, and Future -- 1 Introduction -- 2 Basic Concepts -- 3 The Past -- 3.1 Non-Elitist Non-Pareto Approaches -- 3.1.1 Linear Aggregating Functions -- 3.1.2 Vector Evaluated Genetic Algorithm (VEGA) -- 3.1.3 Lexicographic Ordering -- 3.1.4 Target-Vector Approaches -- 3.2 Non-Elitist Pareto-Based Approaches -- 3.2.1 Multi-Objective Genetic Algorithm (MOGA) -- 3.2.2 Nondominated Sorting Genetic Algorithm (NSGA) -- 3.2.3 Niched-Pareto Genetic Algorithm (NPGA) -- 3.3 Elitist Pareto-Based Approaches -- 3.3.1 The Strength Pareto Evolutionary Algorithm (SPEA) -- 3.3.2 The Pareto Archived Evolution Strategy (PAES) -- 3.3.3 The Nondominated Sorting Genetic Algorithm-II (NSGA-II) -- 4 The Present -- 4.1 Some Applications -- 5 The Future -- 6 Conclusions -- References -- Black-Box and Data-Driven Computation -- 1 Introduction -- 2 Black Box and Oracle -- 3 Reduction -- 4 Data-Driven Computation -- References -- Mathematically Rigorous Global Optimization and FuzzyOptimization -- 1 Introduction -- 2 Interval Analysis: Fundamentals and Philosophy -- 2.1 Overview -- 2.2 Interval Logic -- 2.3 Extensions -- 2.4 History and References -- 3 Fuzzy Sets: Fundamentals and Philosophy -- 3.1 Fuzzy Logic -- 3.2 A Brief History -- 4 The Branch and Bound Framework: Some Definitions and Details -- 5 Interval Technology: Some Details -- 5.1 Interval Newton Methods -- 5.2 Constraint Propagation -- 5.3 Relaxations -- 5.4 Interval Arithmetic Software -- 6 Fuzzy Technology: A Few Details -- 7 Conclusions -- References -- Optimization Under Uncertainty Explains Empirical Success of Deep Learning Heuristics -- 1 Formulation of the Problem -- 2 Why Rectified Linear Neurons Are Efficient: A Theoretical Explanation -- 3 Why Sigmoid Activation Functions -- 4 Selection of Poolings -- 5 Why Softmax -- 6 Which Averaging Should We Choose.
7 Proofs -- References -- Variable Neighborhood Programming as a Tool of Machine Learning -- 1 Introduction -- 2 Variable Neighborhood Search -- 3 Variable Neighborhood Programming -- 3.1 Solution Presentation -- 3.2 Neighborhood Structures -- 3.3 Elementary Tree Transformation in Automatic Programming -- 3.3.1 ETT in the Tree of an Undirected Graph -- 3.3.2 ETT in AP Tree -- 3.3.3 Bound on Cardinality of AP-ETT(T) -- 4 VNP for Symbolic Regression -- 4.1 Test Instances and Parameter Values -- 4.2 Comparison of BVNP with Other Methods -- 5 Life Expectancy Estimation as a Symbolic Regression Problem Solved by VNP: Case Study on Russian Districts -- 5.1 Life Expectancy Estimation as a Machine Learning Problem -- 5.2 VNP for Estimating Life Expectancy Problem -- 5.3 Case Study at Russian Districts -- 5.3.1 One-Attribute Analysis -- 5.3.2 Results and Discussion on 3-Attribute Data -- 5.4 Conclusions -- 6 Preventive Maintenance in Railway Planning as a Machine Learning Problem -- 6.1 Literature Review and Motivation -- 6.2 Reduced VNP for Solving the Preventive Maintenance Planning of Railway Infrastructure -- 6.2.1 Learning for Stage 1: Prediction -- 6.2.2 Learning for Stage 2: Classification -- 6.3 Computation Results -- 6.3.1 Prediction -- 6.3.2 Classification -- 6.4 Conclusions and Future Work -- 7 Conclusions -- References -- Non-lattice Covering and Quantization of High Dimensional Sets -- 1 Introduction -- 2 Weak Covering -- 2.1 Comparison of Designs from the View Point of Weak Covering -- 2.2 Reduction to the Probability of Covering a Point by One Ball -- 2.3 Designs of Theoretical Interest -- 3 Approximation of Cd(Zn,r) for Design 1 -- 3.1 Normal Approximation for PU,δ,α,r -- 3.2 Refined Approximation for PU,δ,α,r -- 3.3 Approximation for Cd(Zn,r) for Design 1 -- 4 Approximating Cd(Zn,r) for Design 2a -- 4.1 Normal Approximation for PU,δ,0,r.
4.2 Refined Approximation for PU,δ,0,r -- 4.3 Approximation for Cd(Zn,r) -- 5 Approximating Cd(Zn,r) for Design 2b -- 5.1 Establishing a Connection Between Sampling with and Without Replacement: General Case -- 5.2 Approximation of Cd(Zn,r) for Design 2b. -- 6 Numerical Study -- 6.1 Assessing Accuracy of Approximations of Cd(Zn,r) and Studying Their Dependence on δ -- 6.2 Comparison Across α -- 7 Quantization in a Cube -- 7.1 Quantization Error and Its Relation to Weak Covering -- 7.2 Quantization Error for Design 1 -- 7.3 Quantization Error for Design 2a -- 7.4 Quantization Error for Design 2b -- 7.5 Accuracy of Approximations for Quantization Error and the δ-Effect -- 8 Comparative Numerical Studies of Covering Properties for Several Designs -- 8.1 Covering Comparisons -- 8.2 Quantization Comparisons -- 9 Covering and Quantization in the d-Simplex -- 9.1 Characteristics of Interest -- 9.2 Numerical Investigation of the δ-Effect for d-Simplex -- 10 Appendix: An Auxiliary Lemma -- References -- Finding Effective SAT Partitionings Via Black-Box Optimization -- 1 Introduction -- 2 Preliminaries -- 2.1 Boolean Satisfiability Problem (SAT) -- 2.2 SAT-Based Cryptanalysis -- 3 Decomposition Sets and Backdoors in SAT with Application to Inversion of Discrete Functions -- 3.1 On Interconnection Between Plain Partitionings and Cryptographic Attacks -- 3.2 Using Monte Carlo Method to Estimate Runtime of SAT-Based Guess-and-Determine Attacks -- 4 Practical Aspects of Evaluating Effectiveness of SAT Partitionings -- 4.1 Narrowing Search Space to SUPBS -- 4.2 Applications of Incremental SAT Solving -- 4.3 Finding Partitionings via Incremental SAT -- 5 Employed Optimization Algorithms -- 6 Experimental Results -- 6.1 Considered Problems -- 6.2 Implementations of Objective Functions -- 6.3 Finding Effective SAT Partitionings.
6.4 Solving Hard SAT Instances via Found Partitionings.
Record no.: UNINA-9910483695503321
Held at: Univ. Federico II

Ensemble methods : foundations and algorithms / Zhi-Hua Zhou
Author: Zhou Zhi-Hua, Ph. D.
Published: Boca Raton, Fla. : CRC Press, 2012
Physical description: 1 online resource (234 p.)
Dewey classification: 006.3/1
Series: Chapman & Hall/CRC machine learning & pattern recognition series
Topical subjects: Machine learning - Mathematics; Algorithms
ISBN: 0-429-15109-8; 1-4398-3005-3
Classification (BISAC): BUS061000; COM021030; COM037000
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Front Cover; Preface; Notations; Contents; 1. Introduction; 2. Boosting; 3. Bagging; 4. Combination Methods; 5. Diversity; 6. Ensemble Pruning; 7. Clustering Ensembles; 8. Advanced Topics; References
Record no.: UNINA-9910790494803321
Held at: Univ. Federico II

Ensemble methods : foundations and algorithms / Zhi-Hua Zhou
Author: Zhou Zhi-Hua, Ph. D.
Edition: [1st ed.]
Published: Boca Raton, Fla. : CRC Press, 2012
Physical description: 1 online resource (234 p.)
Dewey classification: 006.3/1
Series: Chapman & Hall/CRC machine learning & pattern recognition series
Topical subjects: Machine learning - Mathematics; Algorithms
ISBN: 0-429-15109-8; 1-4398-3005-3
Classification (BISAC): BUS061000; COM021030; COM037000
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Front Cover; Preface; Notations; Contents; 1. Introduction; 2. Boosting; 3. Bagging; 4. Combination Methods; 5. Diversity; 6. Ensemble Pruning; 7. Clustering Ensembles; 8. Advanced Topics; References
Record no.: UNINA-9910808092203321
Held at: Univ. Federico II

Mathematical foundations for data analysis / Jeff M. Phillips
Author: Phillips Jeff M.
Published: Cham, Switzerland : Springer, [2021]
Physical description: 1 online resource (299 pages)
Dewey classification: 006.312
Series: Springer Series in the Data Sciences
Topical subjects: Data mining - Mathematics; Machine learning - Mathematics; Mineria de dades [data mining]; Aprenentatge automàtic [machine learning]; Matemàtica [mathematics]
Genre/form: Llibres electrònics [electronic books]
ISBN: 3-030-62341-6
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Intro -- Preface -- Acknowledgements -- Contents -- 1 Probability Review -- 1.1 Sample Spaces -- 1.2 Conditional Probability and Independence -- 1.3 Density Functions -- 1.4 Expected Value -- 1.5 Variance -- 1.6 Joint, Marginal, and Conditional Distributions -- 1.7 Bayes' Rule -- 1.7.1 Model Given Data -- 1.8 Bayesian Inference -- Exercises -- 2 Convergence and Sampling -- 2.1 Sampling and Estimation -- 2.2 Probably Approximately Correct (PAC) -- 2.3 Concentration of Measure -- 2.3.1 Markov Inequality -- 2.3.2 Chebyshev Inequality -- 2.3.3 Chernoff-Hoeffding Inequality -- 2.3.4 Union Bound and Examples -- 2.4 Importance Sampling -- 2.4.1 Sampling Without Replacement with Priority Sampling -- Exercises -- 3 Linear Algebra Review -- 3.1 Vectors and Matrices -- 3.2 Addition and Multiplication -- 3.3 Norms -- 3.4 Linear Independence -- 3.5 Rank -- 3.6 Square Matrices and Properties -- 3.7 Orthogonality -- Exercises -- 4 Distances and Nearest Neighbors -- 4.1 Metrics -- 4.2 Lp Distances and their Relatives -- 4.2.1 Lp Distances -- 4.2.2 Mahalanobis Distance -- 4.2.3 Cosine and Angular Distance -- 4.2.4 KL Divergence -- 4.3 Distances for Sets and Strings -- 4.3.1 Jaccard Distance -- 4.3.2 Edit Distance -- 4.4 Modeling Text with Distances -- 4.4.1 Bag-of-Words Vectors -- 4.4.2 k-Grams -- 4.5 Similarities -- 4.5.1 Set Similarities -- 4.5.2 Normed Similarities -- 4.5.3 Normed Similarities between Sets -- 4.6 Locality Sensitive Hashing -- 4.6.1 Properties of Locality Sensitive Hashing -- 4.6.2 Prototypical Tasks for LSH -- 4.6.3 Banding to Amplify LSH -- 4.6.4 LSH for Angular Distance -- 4.6.5 LSH for Euclidean Distance -- 4.6.6 Min Hashing as LSH for Jaccard Distance -- Exercises -- 5 Linear Regression -- 5.1 Simple Linear Regression -- 5.2 Linear Regression with Multiple Explanatory Variables -- 5.3 Polynomial Regression -- 5.4 Cross-Validation.
5.4.1 Other ways to Evaluate Linear Regression Models -- 5.5 Regularized Regression -- 5.5.1 Tikhonov Regularization for Ridge Regression -- 5.5.2 Lasso -- 5.5.3 Dual Constrained Formulation -- 5.5.4 Matching Pursuit -- Exercises -- 6 Gradient Descent -- 6.1 Functions -- 6.2 Gradients -- 6.3 Gradient Descent -- 6.3.1 Learning Rate -- 6.4 Fitting a Model to Data -- 6.4.1 Least Mean Squares Updates for Regression -- 6.4.2 Decomposable Functions -- Exercises -- 7 Dimensionality Reduction -- 7.1 Data Matrices -- 7.1.1 Projections -- 7.1.2 Sum of Squared Errors Goal -- 7.2 Singular Value Decomposition -- 7.2.1 Best Rank-k Approximation of a Matrix -- 7.3 Eigenvalues and Eigenvectors -- 7.4 The Power Method -- 7.5 Principal Component Analysis -- 7.6 Multidimensional Scaling -- 7.6.1 Why does Classical MDS work? -- 7.7 Linear Discriminant Analysis -- 7.8 Distance Metric Learning -- 7.9 Matrix Completion -- 7.10 Random Projections -- Exercises -- 8 Clustering -- 8.1 Voronoi Diagrams -- 8.1.1 Delaunay Triangulation -- 8.1.2 Connection to Assignment-Based Clustering -- 8.2 Gonzalez's Algorithm for k-Center Clustering -- 8.3 Lloyd's Algorithm for k-Means Clustering -- 8.3.1 Lloyd's Algorithm -- 8.3.2 k-Means++ -- 8.3.3 k-Mediod Clustering -- 8.3.4 Soft Clustering -- 8.4 Mixture of Gaussians -- 8.4.1 Expectation-Maximization -- 8.5 Hierarchical Clustering -- 8.6 Density-Based Clustering and Outliers -- 8.6.1 Outliers -- 8.7 Mean Shift Clustering -- Exercises -- 9 Classification -- 9.1 Linear Classifiers -- 9.1.1 Loss Functions -- 9.1.2 Cross-Validation and Regularization -- 9.2 Perceptron Algorithm -- 9.3 Support Vector Machines and Kernels -- 9.3.1 The Dual: Mistake Counter -- 9.3.2 Feature Expansion -- 9.3.3 Support Vector Machines -- 9.4 Learnability and VC dimension -- 9.5 kNN Classifiers -- 9.6 Decision Trees -- 9.7 Neural Networks.
9.7.1 Training with Back-propagation -- 10 Graph Structured Data -- 10.1 Markov Chains -- 10.1.1 Ergodic Markov Chains -- 10.1.2 Metropolis Algorithm -- 10.2 PageRank -- 10.3 Spectral Clustering on Graphs -- 10.3.1 Laplacians and their EigenStructures -- 10.4 Communities in Graphs -- 10.4.1 Preferential Attachment -- 10.4.2 Betweenness -- 10.4.3 Modularity -- Exercises -- 11 Big Data and Sketching -- 11.1 The Streaming Model -- 11.1.1 Mean and Variance -- 11.1.2 Reservoir Sampling -- 11.2 Frequent Items -- 11.2.1 Warm-Up: Majority -- 11.2.2 Misra-Gries Algorithm -- 11.2.3 Count-Min Sketch -- 11.2.4 Count Sketch -- 11.3 Matrix Sketching -- 11.3.1 Covariance Matrix Summation -- 11.3.2 Frequent Directions -- 11.3.3 Row Sampling -- 11.3.4 Random Projections and Count Sketch Hashing -- Exercises -- Index.
Record no.: UNINA-9910483358803321
Held at: Univ. Federico II

Mathematical foundations for data analysis / Jeff M. Phillips
Author: Phillips Jeff M.
Published: Cham, Switzerland : Springer, [2021]
Physical description: 1 online resource (299 pages)
Dewey classification: 006.312
Series: Springer Series in the Data Sciences
Topical subjects: Data mining - Mathematics; Machine learning - Mathematics; Mineria de dades [data mining]; Aprenentatge automàtic [machine learning]; Matemàtica [mathematics]
Genre/form: Llibres electrònics [electronic books]
ISBN: 3-030-62341-6
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Intro -- Preface -- Acknowledgements -- Contents -- 1 Probability Review -- 1.1 Sample Spaces -- 1.2 Conditional Probability and Independence -- 1.3 Density Functions -- 1.4 Expected Value -- 1.5 Variance -- 1.6 Joint, Marginal, and Conditional Distributions -- 1.7 Bayes' Rule -- 1.7.1 Model Given Data -- 1.8 Bayesian Inference -- Exercises -- 2 Convergence and Sampling -- 2.1 Sampling and Estimation -- 2.2 Probably Approximately Correct (PAC) -- 2.3 Concentration of Measure -- 2.3.1 Markov Inequality -- 2.3.2 Chebyshev Inequality -- 2.3.3 Chernoff-Hoeffding Inequality -- 2.3.4 Union Bound and Examples -- 2.4 Importance Sampling -- 2.4.1 Sampling Without Replacement with Priority Sampling -- Exercises -- 3 Linear Algebra Review -- 3.1 Vectors and Matrices -- 3.2 Addition and Multiplication -- 3.3 Norms -- 3.4 Linear Independence -- 3.5 Rank -- 3.6 Square Matrices and Properties -- 3.7 Orthogonality -- Exercises -- 4 Distances and Nearest Neighbors -- 4.1 Metrics -- 4.2 Lp Distances and their Relatives -- 4.2.1 Lp Distances -- 4.2.2 Mahalanobis Distance -- 4.2.3 Cosine and Angular Distance -- 4.2.4 KL Divergence -- 4.3 Distances for Sets and Strings -- 4.3.1 Jaccard Distance -- 4.3.2 Edit Distance -- 4.4 Modeling Text with Distances -- 4.4.1 Bag-of-Words Vectors -- 4.4.2 k-Grams -- 4.5 Similarities -- 4.5.1 Set Similarities -- 4.5.2 Normed Similarities -- 4.5.3 Normed Similarities between Sets -- 4.6 Locality Sensitive Hashing -- 4.6.1 Properties of Locality Sensitive Hashing -- 4.6.2 Prototypical Tasks for LSH -- 4.6.3 Banding to Amplify LSH -- 4.6.4 LSH for Angular Distance -- 4.6.5 LSH for Euclidean Distance -- 4.6.6 Min Hashing as LSH for Jaccard Distance -- Exercises -- 5 Linear Regression -- 5.1 Simple Linear Regression -- 5.2 Linear Regression with Multiple Explanatory Variables -- 5.3 Polynomial Regression -- 5.4 Cross-Validation.
5.4.1 Other ways to Evaluate Linear Regression Models -- 5.5 Regularized Regression -- 5.5.1 Tikhonov Regularization for Ridge Regression -- 5.5.2 Lasso -- 5.5.3 Dual Constrained Formulation -- 5.5.4 Matching Pursuit -- Exercises -- 6 Gradient Descent -- 6.1 Functions -- 6.2 Gradients -- 6.3 Gradient Descent -- 6.3.1 Learning Rate -- 6.4 Fitting a Model to Data -- 6.4.1 Least Mean Squares Updates for Regression -- 6.4.2 Decomposable Functions -- Exercises -- 7 Dimensionality Reduction -- 7.1 Data Matrices -- 7.1.1 Projections -- 7.1.2 Sum of Squared Errors Goal -- 7.2 Singular Value Decomposition -- 7.2.1 Best Rank-k Approximation of a Matrix -- 7.3 Eigenvalues and Eigenvectors -- 7.4 The Power Method -- 7.5 Principal Component Analysis -- 7.6 Multidimensional Scaling -- 7.6.1 Why does Classical MDS work? -- 7.7 Linear Discriminant Analysis -- 7.8 Distance Metric Learning -- 7.9 Matrix Completion -- 7.10 Random Projections -- Exercises -- 8 Clustering -- 8.1 Voronoi Diagrams -- 8.1.1 Delaunay Triangulation -- 8.1.2 Connection to Assignment-Based Clustering -- 8.2 Gonzalez's Algorithm for k-Center Clustering -- 8.3 Lloyd's Algorithm for k-Means Clustering -- 8.3.1 Lloyd's Algorithm -- 8.3.2 k-Means++ -- 8.3.3 k-Mediod Clustering -- 8.3.4 Soft Clustering -- 8.4 Mixture of Gaussians -- 8.4.1 Expectation-Maximization -- 8.5 Hierarchical Clustering -- 8.6 Density-Based Clustering and Outliers -- 8.6.1 Outliers -- 8.7 Mean Shift Clustering -- Exercises -- 9 Classification -- 9.1 Linear Classifiers -- 9.1.1 Loss Functions -- 9.1.2 Cross-Validation and Regularization -- 9.2 Perceptron Algorithm -- 9.3 Support Vector Machines and Kernels -- 9.3.1 The Dual: Mistake Counter -- 9.3.2 Feature Expansion -- 9.3.3 Support Vector Machines -- 9.4 Learnability and VC dimension -- 9.5 kNN Classifiers -- 9.6 Decision Trees -- 9.7 Neural Networks.
9.7.1 Training with Back-propagation -- 10 Graph Structured Data -- 10.1 Markov Chains -- 10.1.1 Ergodic Markov Chains -- 10.1.2 Metropolis Algorithm -- 10.2 PageRank -- 10.3 Spectral Clustering on Graphs -- 10.3.1 Laplacians and their EigenStructures -- 10.4 Communities in Graphs -- 10.4.1 Preferential Attachment -- 10.4.2 Betweenness -- 10.4.3 Modularity -- Exercises -- 11 Big Data and Sketching -- 11.1 The Streaming Model -- 11.1.1 Mean and Variance -- 11.1.2 Reservoir Sampling -- 11.2 Frequent Items -- 11.2.1 Warm-Up: Majority -- 11.2.2 Misra-Gries Algorithm -- 11.2.3 Count-Min Sketch -- 11.2.4 Count Sketch -- 11.3 Matrix Sketching -- 11.3.1 Covariance Matrix Summation -- 11.3.2 Frequent Directions -- 11.3.3 Row Sampling -- 11.3.4 Random Projections and Count Sketch Hashing -- Exercises -- Index.
Record no.: UNISA-996466554403316
Held at: Univ. di Salerno

Optimization for learning and control / Anders Hansson and Martin Andersen
Author: Hansson Anders
Published: Hoboken, New Jersey : John Wiley & Sons, Inc., [2023]
Physical description: 1 online resource (435 pages)
Dewey classification: 519.3
Topical subjects: System analysis - Mathematics; Mathematical optimization; Machine learning - Mathematics; Signal processing - Mathematics
ISBN: 1-119-80918-5; 1-119-80914-2
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Cover -- Title Page -- Copyright -- Contents -- Preface -- Acknowledgments -- Glossary -- Acronyms -- About the Companion Website -- Part I Introductory Part -- Chapter 1 Introduction -- 1.1 Optimization -- 1.2 Unsupervised Learning -- 1.3 Supervised Learning -- 1.4 System Identification -- 1.5 Control -- 1.6 Reinforcement Learning -- 1.7 Outline -- Chapter 2 Linear Algebra -- 2.1 Vectors and Matrices -- 2.2 Linear Maps and Subspaces -- 2.2.1 Four Fundamental Subspaces -- 2.2.2 Square Matrices -- 2.2.3 Affine Sets -- 2.3 Norms -- 2.4 Algorithm Complexity -- 2.5 Matrices with Structure -- 2.5.1 Diagonal Matrices -- 2.5.2 Orthogonal Matrices -- 2.5.3 Triangular Matrices -- 2.5.4 Symmetric and Skew‐Symmetric Matrices -- 2.5.5 Toeplitz and Hankel Matrices -- 2.5.6 Sparse Matrices -- 2.5.7 Band Matrices -- 2.6 Quadratic Forms and Definiteness -- 2.7 Spectral Decomposition -- 2.8 Singular Value Decomposition -- 2.9 Moore-Penrose Pseudoinverse -- 2.10 Systems of Linear Equations -- 2.10.1 Gaussian Elimination -- 2.10.2 Block Elimination -- 2.11 Factorization Methods -- 2.11.1 LU Factorization -- 2.11.2 Cholesky Factorization -- 2.11.3 Indefinite LDL Factorization -- 2.11.4 QR Factorization -- 2.11.5 Sparse Factorizations -- 2.11.6 Block Factorization -- 2.11.7 Positive Semidefinite Block Factorization -- 2.12 Saddle‐Point Systems -- 2.12.1 H Positive Definite -- 2.12.2 H Positive Semidefinite -- 2.13 Vector and Matrix Calculus -- Exercises -- Chapter 3 Probability Theory -- 3.1 Probability Spaces -- 3.1.1 Probability Measure -- 3.1.2 Probability Function -- 3.1.3 Probability Density Function -- 3.2 Conditional Probability -- 3.3 Independence -- 3.4 Random Variables -- 3.4.1 Vector‐Valued Random Variable -- 3.4.2 Marginal Distribution -- 3.4.3 Independence of Random Variables -- 3.4.4 Function of Random Variable -- 3.5 Conditional Distributions.
3.5.1 Conditional Probability Function -- 3.5.2 Conditional Probability Density Function -- 3.6 Expectations -- 3.6.1 Moments -- 3.6.2 Expected Value of Function of Random Variable -- 3.6.3 Covariance -- 3.7 Conditional Expectations -- 3.8 Convergence of Random Variables -- 3.9 Random Processes -- 3.10 Markov Processes -- 3.11 Hidden Markov Models -- 3.12 Gaussian Processes -- Exercises -- Part II Optimization -- Chapter 4 Optimization Theory -- 4.1 Basic Concepts and Terminology -- 4.1.1 Optimization Problems -- 4.1.2 Equivalent Problems -- 4.2 Convex Sets -- 4.2.1 Convexity‐Preserving Operations -- 4.2.1.1 Intersection -- 4.2.1.2 Affine Transformation -- 4.2.1.3 Perspective Transformation -- 4.2.2 Examples of Convex Sets -- 4.2.2.1 Hyperplanes and Halfspaces -- 4.2.2.2 Polyhedral Sets -- 4.2.2.3 Norm Balls and Ellipsoids -- 4.2.2.4 Convex Cones -- 4.2.3 Generalized Inequalities -- 4.3 Convex Functions -- 4.3.1 First‐ and Second‐Order Conditions for Convexity -- 4.3.2 Convexity‐Preserving Operations -- 4.3.2.1 Scaling, Sums, and Integrals -- 4.3.2.2 Pointwise Maximum and Supremum -- 4.3.2.3 Affine Transformation -- 4.3.2.4 Perspective Transformation -- 4.3.2.5 Partial Infimum -- 4.3.2.6 Square of Nonnegative Convex Functions -- 4.3.3 Examples of Convex Functions -- 4.3.3.1 Norms -- 4.3.3.2 Indicator and Support Functions -- 4.3.4 Conjugation -- 4.3.5 Dual Norms -- 4.4 Subdifferentiability -- 4.4.1 Subdifferential Calculus -- 4.4.1.1 Nonnegative Scaling -- 4.4.1.2 Summation -- 4.4.1.3 Affine Transformation -- 4.4.1.4 Pointwise Maximum -- 4.4.1.5 Subgradients of Conjugate Functions -- 4.5 Convex Optimization Problems -- 4.5.1 Optimality Condition -- 4.5.2 Equality Constrained Convex Problems -- 4.6 Duality -- 4.6.1 Lagrangian Duality -- 4.6.2 Lagrange Dual Problem -- 4.6.3 Fenchel Duality -- 4.7 Optimality Conditions.
4.7.1 Convex Optimization Problems -- 4.7.2 Nonconvex Optimization Problems -- Exercises -- Chapter 5 Optimization Problems -- 5.1 Least‐Squares Problems -- 5.2 Quadratic Programs -- 5.3 Conic Optimization -- 5.3.1 Conic Duality -- 5.3.2 Epigraphical Cones -- 5.4 Rank Optimization -- 5.5 Partially Separability -- 5.5.1 Minimization of Partially Separable Functions -- 5.5.2 Principle of Optimality -- 5.6 Multiparametric Optimization -- 5.7 Stochastic Optimization -- Exercises -- Chapter 6 Optimization Methods -- 6.1 Basic Principles -- 6.1.1 Smoothness -- 6.1.2 Descent Methods -- 6.1.3 Line Search Methods -- 6.1.3.1 Backtracking Line Search -- 6.1.3.2 Bisection Method for Wolfe Conditions -- 6.1.4 Surrogation Methods -- 6.1.4.1 Trust‐Region Methods -- 6.1.4.2 Majorization Minimization -- 6.1.5 Convergence of Sequences -- 6.2 Gradient Descent -- 6.2.1 L‐Smooth Functions -- 6.2.2 Smooth and Convex Functions -- 6.2.3 Smooth and Strongly Convex Functions -- 6.3 Newton's Method -- 6.3.1 The Newton Decrement -- 6.3.2 Analysis of Newton's Method -- 6.3.2.1 Affine Invariance -- 6.3.2.2 Pure Newton Phase -- 6.3.2.3 Damped Newton Phase -- 6.3.3 Equality Constrained Minimization -- 6.4 Variable Metric Methods -- 6.4.1 Quasi‐Newton Updates -- 6.4.1.1 The BFGS Update -- 6.4.1.2 The DFP Update -- 6.4.1.3 The SR1 Update -- 6.4.2 The Barzilai-Borwein Method -- 6.5 Proximal Gradient Method -- 6.5.1 Gradient Projection Method -- 6.5.2 Proximal Quasi‐Newton -- 6.5.3 Accelerated Proximal Gradient Method -- 6.6 Sequential Convex Optimization -- 6.7 Methods for Nonlinear Least‐Squares -- 6.7.1 The Levenberg‐Marquardt Algorithm -- 6.7.2 The Variable Projection Method -- 6.8 Stochastic Optimization Methods -- 6.8.1 Smooth Functions -- 6.8.2 Smooth and Strongly Convex Functions -- 6.8.3 Incremental Methods -- 6.8.4 Adaptive Methods -- 6.8.4.1 AdaGrad -- 6.8.4.2 RMSprop.
6.8.4.3 Adam -- 6.9 Coordinate Descent Methods -- 6.10 Interior‐Point Methods -- 6.10.1 Path‐Following Method -- 6.10.2 Generalized Inequalities -- 6.11 Augmented Lagrangian Methods -- 6.11.1 Method of Multipliers -- 6.11.2 Alternating Direction Method of Multipliers -- 6.11.3 Variable Splitting -- Exercises -- Part III Optimal Control -- Chapter 7 Calculus of Variations -- 7.1 Extremum of Functionals -- 7.1.1 Necessary Condition for Extremum -- 7.1.2 Sufficient Condition for Optimality -- 7.1.3 Constrained Problem -- 7.1.4 Du Bois-Reymond Lemma -- 7.1.5 Generalizations -- 7.2 The Pontryagin Maximum Principle -- 7.2.1 Linear Quadratic Control -- 7.2.2 The Riccati Equation -- 7.3 The Euler-Lagrange Equations -- 7.3.1 Beltrami's Identity -- 7.4 Extensions -- 7.5 Numerical Solutions -- 7.5.1 The Gradient Method -- 7.5.2 The Shooting Method -- 7.5.3 The Discretization Method -- 7.5.4 The Multiple Shooting Method -- 7.5.5 The Collocation Method -- Exercises -- Chapter 8 Dynamic Programming -- 8.1 Finite Horizon Optimal Control -- 8.1.1 Standard Optimization Problem -- 8.1.2 Dynamic Programming -- 8.2 Parametric Approximations -- 8.2.1 Fitted‐Value Iteration -- 8.3 Infinite Horizon Optimal Control -- 8.3.1 Bellman Equation -- 8.4 Value Iterations -- 8.5 Policy Iterations -- 8.5.1 Approximation -- 8.6 Linear Programming Formulation -- 8.6.1 Approximations -- 8.7 Model Predictive Control -- 8.7.1 Infinite Horizon Problem -- 8.7.2 Guessing the Value Function -- 8.7.3 Finite Horizon Approximation -- 8.7.4 Receding Horizon Approximation -- 8.8 Explicit MPC -- 8.9 Markov Decision Processes -- 8.9.1 Stochastic Dynamic Programming -- 8.9.2 Infinite Time Horizon -- 8.9.3 Stochastic Bellman Equation -- 8.10 Appendix -- 8.10.1 Stability and Optimality of Infinite Horizon Problem -- 8.10.2 Stability and Optimality of Stochastic Infinite Time Horizon Problem.
8.10.3 Stability of MPC -- Exercises -- Part IV Learning -- Chapter 9 Unsupervised Learning -- 9.1 Chebyshev Bounds -- 9.2 Entropy -- 9.2.1 Categorical Distribution -- 9.2.2 Ising Distribution -- 9.2.3 Normal Distribution -- 9.3 Prediction -- 9.3.1 Conditional Expectation Predictor -- 9.3.2 Affine Predictor -- 9.3.3 Linear Regression -- 9.4 The Viterbi Algorithm -- 9.5 Kalman Filter on Innovation Form -- 9.6 Viterbi Decoder -- 9.7 Graphical Models -- 9.7.1 Ising Distribution -- 9.7.2 Normal Distribution -- 9.7.3 Markov Random Field -- 9.8 Maximum Likelihood Estimation -- 9.8.1 Categorical Distribution -- 9.8.2 Ising Distribution -- 9.8.3 Normal Distribution -- 9.8.4 Generalizations -- 9.9 Relative Entropy and Cross Entropy -- 9.9.1 Gibbs' Inequality -- 9.9.2 Cross Entropy -- 9.10 The Expectation Maximization Algorithm -- 9.11 Mixture Models -- 9.12 Gibbs Sampling -- 9.13 Boltzmann Machine -- 9.14 Principal Component Analysis -- 9.14.1 Solution -- 9.14.2 Relation to Rank‐Constrained Optimization -- 9.15 Mutual Information -- 9.15.1 Channel Model -- 9.15.2 Orthogonal Case -- 9.15.3 Nonorthogonal Case -- 9.15.4 Relationship to PCA -- 9.16 Cluster Analysis -- Exercises -- Chapter 10 Supervised Learning -- 10.1 Linear Regression -- 10.1.1 Least‐Squares Estimation -- 10.1.2 Maximum Likelihood Estimation -- 10.1.3 Maximum a Posteriori Estimation -- 10.2 Regression in Hilbert Spaces -- 10.2.1 Infinite‐Dimensional LS Problem -- 10.2.2 The Kernel Trick -- 10.3 Gaussian Processes -- 10.3.1 Gaussian MAP Estimate -- 10.3.2 The Kernel Trick -- 10.4 Classification -- 10.4.1 Linear Regression -- 10.4.2 Logistic Regression -- 10.5 Support Vector Machines -- 10.5.1 Hebbian Learning -- 10.5.2 Quadratic Programming Formulation -- 10.5.3 Soft Margin Classification -- 10.5.4 The Dual Problem -- 10.5.5 Recovering the Primal Solution -- 10.5.6 The Kernel Trick.
10.6 Restricted Boltzmann Machine.
Record no.: UNINA-9910830085103321
Held at: Univ. Federico II

Statistical learning with math and R : 100 exercises for building logic / Joe Suzuki
Author: Suzuki Joe
Edition: [1st ed. 2020.]
Published: Gateway East, Singapore : Springer, [2020]
Physical description: 1 online resource (XI, 217 p. 70 illus., 65 illus. in color.)
Dewey classification: 006.31
Topical subjects: Machine learning - Mathematics; Logic, Symbolic and mathematical; Artificial intelligence - Mathematics; R (Computer program language)
ISBN: 981-15-7568-1
Format: Printed material
Bibliographic level: Monograph
Language of publication: eng
Contents: Chapter 1: Linear Algebra -- Chapter 2: Linear Regression -- Chapter 3: Classification -- Chapter 4: Resampling -- Chapter 5: Information Criteria -- Chapter 6: Regularization -- Chapter 7: Nonlinear Regression -- Chapter 8: Decision Trees -- Chapter 9: Support Vector Machine -- Chapter 10: Unsupervised Learning.
Record no.: UNINA-9910427703303321
Held at: Univ. Federico II