Alternating direction method of multipliers for machine learning / Zhouchen Lin, Huan Li, and Cong Fang
Author | Lin Zhouchen
Publication/distribution | Singapore : Springer, [2022]
Physical description | 1 online resource (274 pages)
Discipline | 005.1
Topical subject |
Computer algorithms
Machine learning - Statistical methods |
ISBN | 981-16-9840-6
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record No. | UNINA-9910578685303321
Available at: Univ. Federico II
Alternating direction method of multipliers for machine learning / Zhouchen Lin, Huan Li, and Cong Fang
Author | Lin Zhouchen
Publication/distribution | Singapore : Springer, [2022]
Physical description | 1 online resource (274 pages)
Discipline | 005.1
Topical subject |
Computer algorithms
Machine learning - Statistical methods |
ISBN | 981-16-9840-6
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record No. | UNISA-996478866403316
Available at: Univ. di Salerno
Bayesian tensor decomposition for signal processing and machine learning : modeling, tuning-free algorithms and applications / Lei Cheng, Zhongtao Chen, and Yik-Chung Wu
Author | Cheng Lei
Edition | [1st ed. 2023.]
Publication/distribution | Cham, Switzerland : Springer, [2023]
Physical description | 1 online resource (189 pages)
Discipline | 006.31
Topical subject |
Machine learning - Statistical methods
Signal processing - Statistical methods |
ISBN | 3-031-22438-8
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | Tensor decomposition: Basics, algorithms, and recent advances -- Bayesian learning for sparsity-aware modeling -- Bayesian tensor CPD: Modeling and inference -- Bayesian tensor CPD: Performance and real-world applications -- When stochastic optimization meets VI: Scaling Bayesian CPD to massive data -- Bayesian tensor CPD with nonnegative factors -- Complex-valued CPD, orthogonality constraint and beyond Gaussian noises -- Handling missing value: A case study in direction-of-arrival estimation -- From CPD to other tensor decompositions.
Record No. | UNINA-9910672446803321
Available at: Univ. Federico II
An elementary introduction to statistical learning theory [electronic resource] / Sanjeev Kulkarni, Gilbert Harman
Author | Kulkarni Sanjeev
Edition | [1st ed.]
Publication/distribution | Hoboken, N.J. : Wiley, c2011
Physical description | 1 online resource (235 p.)
Discipline |
006.3/1
006.31 |
Other authors (Persons) | Harman, Gilbert
Series | Wiley series in probability and statistics
Topical subject |
Machine learning - Statistical methods
Pattern recognition systems |
ISBN |
1-283-09868-7
9786613098689
1-118-02346-3
1-118-02347-1
1-118-02343-9 |
Classification | ST 300
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
An Elementary Introduction to Statistical Learning Theory; Contents; Preface; 1 Introduction: Classification, Learning, Features, and Applications; 1.1 Scope; 1.2 Why Machine Learning?; 1.3 Some Applications; 1.3.1 Image Recognition; 1.3.2 Speech Recognition; 1.3.3 Medical Diagnosis; 1.3.4 Statistical Arbitrage; 1.4 Measurements, Features, and Feature Vectors; 1.5 The Need for Probability; 1.6 Supervised Learning; 1.7 Summary; 1.8 Appendix: Induction; 1.9 Questions; 1.10 References; 2 Probability; 2.1 Probability of Some Basic Events; 2.2 Probabilities of Compound Events; 2.3 Conditional Probability; 2.4 Drawing Without Replacement; 2.5 A Classic Birthday Problem; 2.6 Random Variables; 2.7 Expected Value; 2.8 Variance; 2.9 Summary; 2.10 Appendix: Interpretations of Probability; 2.11 Questions; 2.12 References; 3 Probability Densities; 3.1 An Example in Two Dimensions; 3.2 Random Numbers in [0,1]; 3.3 Density Functions; 3.4 Probability Densities in Higher Dimensions; 3.5 Joint and Conditional Densities; 3.6 Expected Value and Variance; 3.7 Laws of Large Numbers; 3.8 Summary; 3.9 Appendix: Measurability; 3.10 Questions; 3.11 References; 4 The Pattern Recognition Problem; 4.1 A Simple Example; 4.2 Decision Rules; 4.3 Success Criterion; 4.4 The Best Classifier: Bayes Decision Rule; 4.5 Continuous Features and Densities; 4.6 Summary; 4.7 Appendix: Uncountably Many; 4.8 Questions; 4.9 References; 5 The Optimal Bayes Decision Rule; 5.1 Bayes Theorem; 5.2 Bayes Decision Rule; 5.3 Optimality and Some Comments; 5.4 An Example; 5.5 Bayes Theorem and Decision Rule with Densities; 5.6 Summary; 5.7 Appendix: Defining Conditional Probability; 5.8 Questions; 5.9 References; 6 Learning from Examples; 6.1 Lack of Knowledge of Distributions; 6.2 Training Data; 6.3 Assumptions on the Training Data; 6.4 A Brute Force Approach to Learning; 6.5 Curse of Dimensionality, Inductive Bias, and No Free Lunch; 6.6 Summary; 6.7 Appendix: What Sort of Learning?; 6.8 Questions; 6.9 References; 7 The Nearest Neighbor Rule; 7.1 The Nearest Neighbor Rule; 7.2 Performance of the Nearest Neighbor Rule; 7.3 Intuition and Proof Sketch of Performance; 7.4 Using more Neighbors; 7.5 Summary; 7.6 Appendix: When People use Nearest Neighbor Reasoning; 7.6.1 Who Is a Bachelor?; 7.6.2 Legal Reasoning; 7.6.3 Moral Reasoning; 7.7 Questions; 7.8 References; 8 Kernel Rules; 8.1 Motivation; 8.2 A Variation on Nearest Neighbor Rules; 8.3 Kernel Rules; 8.4 Universal Consistency of Kernel Rules; 8.5 Potential Functions; 8.6 More General Kernels; 8.7 Summary; 8.8 Appendix: Kernels, Similarity, and Features; 8.9 Questions; 8.10 References; 9 Neural Networks: Perceptrons; 9.1 Multilayer Feedforward Networks; 9.2 Neural Networks for Learning and Classification; 9.3 Perceptrons; 9.3.1 Threshold; 9.4 Learning Rule for Perceptrons; 9.5 Representational Capabilities of Perceptrons; 9.6 Summary; 9.7 Appendix: Models of Mind; 9.8 Questions; 9.9 References; 10 Multilayer Networks |
Record No. | UNINA-9910139455203321
Available at: Univ. Federico II
An elementary introduction to statistical learning theory [electronic resource] / Sanjeev Kulkarni, Gilbert Harman
Author | Kulkarni Sanjeev
Edition | [1st ed.]
Publication/distribution | Hoboken, N.J. : Wiley, c2011
Physical description | 1 online resource (235 p.)
Discipline |
006.3/1
006.31 |
Other authors (Persons) | Harman, Gilbert
Series | Wiley series in probability and statistics
Topical subject |
Machine learning - Statistical methods
Pattern recognition systems |
ISBN |
1-283-09868-7
9786613098689
1-118-02346-3
1-118-02347-1
1-118-02343-9 |
Classification | ST 300
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
An Elementary Introduction to Statistical Learning Theory; Contents; Preface; 1 Introduction: Classification, Learning, Features, and Applications; 1.1 Scope; 1.2 Why Machine Learning?; 1.3 Some Applications; 1.3.1 Image Recognition; 1.3.2 Speech Recognition; 1.3.3 Medical Diagnosis; 1.3.4 Statistical Arbitrage; 1.4 Measurements, Features, and Feature Vectors; 1.5 The Need for Probability; 1.6 Supervised Learning; 1.7 Summary; 1.8 Appendix: Induction; 1.9 Questions; 1.10 References; 2 Probability; 2.1 Probability of Some Basic Events; 2.2 Probabilities of Compound Events; 2.3 Conditional Probability; 2.4 Drawing Without Replacement; 2.5 A Classic Birthday Problem; 2.6 Random Variables; 2.7 Expected Value; 2.8 Variance; 2.9 Summary; 2.10 Appendix: Interpretations of Probability; 2.11 Questions; 2.12 References; 3 Probability Densities; 3.1 An Example in Two Dimensions; 3.2 Random Numbers in [0,1]; 3.3 Density Functions; 3.4 Probability Densities in Higher Dimensions; 3.5 Joint and Conditional Densities; 3.6 Expected Value and Variance; 3.7 Laws of Large Numbers; 3.8 Summary; 3.9 Appendix: Measurability; 3.10 Questions; 3.11 References; 4 The Pattern Recognition Problem; 4.1 A Simple Example; 4.2 Decision Rules; 4.3 Success Criterion; 4.4 The Best Classifier: Bayes Decision Rule; 4.5 Continuous Features and Densities; 4.6 Summary; 4.7 Appendix: Uncountably Many; 4.8 Questions; 4.9 References; 5 The Optimal Bayes Decision Rule; 5.1 Bayes Theorem; 5.2 Bayes Decision Rule; 5.3 Optimality and Some Comments; 5.4 An Example; 5.5 Bayes Theorem and Decision Rule with Densities; 5.6 Summary; 5.7 Appendix: Defining Conditional Probability; 5.8 Questions; 5.9 References; 6 Learning from Examples; 6.1 Lack of Knowledge of Distributions; 6.2 Training Data; 6.3 Assumptions on the Training Data; 6.4 A Brute Force Approach to Learning; 6.5 Curse of Dimensionality, Inductive Bias, and No Free Lunch; 6.6 Summary; 6.7 Appendix: What Sort of Learning?; 6.8 Questions; 6.9 References; 7 The Nearest Neighbor Rule; 7.1 The Nearest Neighbor Rule; 7.2 Performance of the Nearest Neighbor Rule; 7.3 Intuition and Proof Sketch of Performance; 7.4 Using more Neighbors; 7.5 Summary; 7.6 Appendix: When People use Nearest Neighbor Reasoning; 7.6.1 Who Is a Bachelor?; 7.6.2 Legal Reasoning; 7.6.3 Moral Reasoning; 7.7 Questions; 7.8 References; 8 Kernel Rules; 8.1 Motivation; 8.2 A Variation on Nearest Neighbor Rules; 8.3 Kernel Rules; 8.4 Universal Consistency of Kernel Rules; 8.5 Potential Functions; 8.6 More General Kernels; 8.7 Summary; 8.8 Appendix: Kernels, Similarity, and Features; 8.9 Questions; 8.10 References; 9 Neural Networks: Perceptrons; 9.1 Multilayer Feedforward Networks; 9.2 Neural Networks for Learning and Classification; 9.3 Perceptrons; 9.3.1 Threshold; 9.4 Learning Rule for Perceptrons; 9.5 Representational Capabilities of Perceptrons; 9.6 Summary; 9.7 Appendix: Models of Mind; 9.8 Questions; 9.9 References; 10 Multilayer Networks |
Record No. | UNINA-9910818427003321
Available at: Univ. Federico II
Introduction to statistical machine learning / Masashi Sugiyama
Author | Sugiyama Masashi <1974->
Publication/distribution | Amsterdam : Elsevier, [2016]
Physical description | 1 online resource (535 p.)
Discipline | 006.3/1
Topical subject |
Machine learning - Statistical methods
Information science - Statistical methods
Pattern recognition systems |
ISBN | 0-12-802350-3
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Front Cover; Introduction to Statistical Machine Learning; Copyright; Table of Contents; Biography; Preface; 1 INTRODUCTION; 1 Statistical Machine Learning; 1.1 Types of Learning; 1.2 Examples of Machine Learning Tasks; 1.2.1 Supervised Learning; 1.2.2 Unsupervised Learning; 1.2.3 Further Topics; 1.3 Structure of This Textbook; 2 STATISTICS AND PROBABILITY; 2 Random Variables and Probability Distributions; 2.1 Mathematical Preliminaries; 2.2 Probability; 2.3 Random Variable and Probability Distribution; 2.4 Properties of Probability Distributions; 2.4.1 Expectation, Median, and Mode; 2.4.2 Variance and Standard Deviation; 2.4.3 Skewness, Kurtosis, and Moments; 2.5 Transformation of Random Variables; 3 Examples of Discrete Probability Distributions; 3.1 Discrete Uniform Distribution; 3.2 Binomial Distribution; 3.3 Hypergeometric Distribution; 3.4 Poisson Distribution; 3.5 Negative Binomial Distribution; 3.6 Geometric Distribution; 4 Examples of Continuous Probability Distributions; 4.1 Continuous Uniform Distribution; 4.2 Normal Distribution; 4.3 Gamma Distribution, Exponential Distribution, and Chi-Squared Distribution; 4.4 Beta Distribution; 4.5 Cauchy Distribution and Laplace Distribution; 4.6 t-Distribution and F-Distribution; 5 Multidimensional Probability Distributions; 5.1 Joint Probability Distribution; 5.2 Conditional Probability Distribution; 5.3 Contingency Table; 5.4 Bayes' Theorem; 5.5 Covariance and Correlation; 5.6 Independence; 6 Examples of Multidimensional Probability Distributions; 6.1 Multinomial Distribution; 6.2 Multivariate Normal Distribution; 6.3 Dirichlet Distribution; 6.4 Wishart Distribution; 7 Sum of Independent Random Variables; 7.1 Convolution; 7.2 Reproductive Property; 7.3 Law of Large Numbers; 7.4 Central Limit Theorem; 8 Probability Inequalities; 8.1 Union Bound; 8.2 Inequalities for Probabilities; 8.2.1 Markov's Inequality and Chernoff's Inequality; 8.2.2 Cantelli's Inequality and Chebyshev's Inequality; 8.3 Inequalities for Expectation; 8.3.1 Jensen's Inequality; 8.3.2 Hölder's Inequality and Schwarz's Inequality; 8.3.3 Minkowski's Inequality; 8.3.4 Kantorovich's Inequality; 8.4 Inequalities for the Sum of Independent Random Variables; 8.4.1 Chebyshev's Inequality and Chernoff's Inequality; 8.4.2 Hoeffding's Inequality and Bernstein's Inequality; 8.4.3 Bennett's Inequality; 9 Statistical Estimation; 9.1 Fundamentals of Statistical Estimation; 9.2 Point Estimation; 9.2.1 Parametric Density Estimation; 9.2.2 Nonparametric Density Estimation; 9.2.3 Regression and Classification; 9.2.4 Model Selection; 9.3 Interval Estimation; 9.3.1 Interval Estimation for Expectation of Normal Samples; 9.3.2 Bootstrap Confidence Interval; 9.3.3 Bayesian Credible Interval; 10 Hypothesis Testing; 10.1 Fundamentals of Hypothesis Testing; 10.2 Test for Expectation of Normal Samples; 10.3 Neyman-Pearson Lemma; 10.4 Test for Contingency Tables; 10.5 Test for Difference in Expectations of Normal Samples |
Record No. | UNINA-9910583088403321
Available at: Univ. Federico II
Machine learning with R / Brett Lantz
Author | Lantz Brett
Edition | [1st edition]
Publication/distribution | Birmingham : Packt Publishing, 2013
Physical description | 1 online resource (396 p.)
Series | Community experience distilled
Topical subject |
Machine learning - Statistical methods
R (Computer program language)
Programming languages (Electronic computers) |
Genre/form | Electronic books.
ISBN |
1-68015-358-7
1-78216-215-1 |
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record No. | UNINA-9910453395303321
Available at: Univ. Federico II
Machine learning with R / Brett Lantz
Author | Lantz Brett
Edition | [1st edition]
Publication/distribution | Birmingham : Packt Publishing, 2013
Physical description | 1 online resource (396 p.)
Series | Community experience distilled
Topical subject |
Machine learning - Statistical methods
R (Computer program language)
Programming languages (Electronic computers) |
ISBN |
1-68015-358-7
1-78216-215-1 |
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record No. | UNINA-9910790503303321
Available at: Univ. Federico II
Machine learning with R / Brett Lantz
Author | Lantz Brett
Edition | [1st edition]
Publication/distribution | Birmingham : Packt Publishing, 2013
Physical description | 1 online resource (396 p.)
Series | Community experience distilled
Topical subject |
Machine learning - Statistical methods
R (Computer program language)
Programming languages (Electronic computers) |
ISBN |
1-68015-358-7
1-78216-215-1 |
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record No. | UNINA-9910824972803321
Available at: Univ. Federico II
TensorFlow 2.0 quick start guide : get up to speed with the newly introduced features of TensorFlow 2.0 / Tony Holdroyd
Author | Holdroyd Tony
Edition | [1st edition]
Publication/distribution | Birmingham, England ; Mumbai : Packt, 2019
Physical description | 1 online resource (185 pages)
Discipline | 006.31
Topical subject | Machine learning - Statistical methods
ISBN | 1-78953-696-0
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record No. | UNINA-9910793540903321
Available at: Univ. Federico II