F# for machine learning essentials : get up and running with machine learning with F# in a fun and functional way / Sudipta Mukherjee ; foreword by Dr. Ralf Herbrich, director of machine learning science at Amazon
Author | Mukherjee Sudipta
Edition | [1.]
Publication/distribution | Birmingham, England ; Mumbai, [India] : Packt Publishing, 2016
Physical description | 1 online resource (194 p.)
Classification | 005.133
Series | Community Experience Distilled
Subject | F# (Computer program language) ; Machine learning
ISBN | 1-78398-935-1
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Cover; Copyright; Credits; Foreword; About the Author; Acknowledgments; About the Reviewers; www.PacktPub.com; Table of Contents; Preface
Chapter 1: Introduction to Machine Learning; Objective; Getting in touch; Different areas where machine learning is being used; Why use F#?; Supervised machine learning; Training and test dataset/corpus; Some motivating real life examples of supervised learning; Nearest Neighbour algorithm (a.k.a k-NN algorithm); Distance metrics; Decision tree algorithms; Unsupervised learning; Machine learning frameworks; Machine learning for fun and profit; Recognizing handwritten digits - your "Hello World" ML program; How does this work?; Summary
Chapter 2: Linear Regression; Objective; Different types of linear regression algorithms; APIs used; Math.NET Numerics for F# 3.7.0; Getting Math.NET; Experimenting with Math.NET; The basics of matrices and vectors (a short and sweet refresher); Creating a vector; Creating a matrix; Finding the transpose of a matrix; Finding the inverse of a matrix; Trace of a matrix; QR decomposition of a matrix; SVD of a matrix; Linear regression method of least square; Finding linear regression coefficients using F#; Finding the linear regression coefficients using Math.NET; Putting it together with Math.NET and FsPlot; Multiple linear regression; Multiple linear regression and variations using Math.NET; Weighted linear regression; Plotting the result of multiple linear regression; Ridge regression; Multivariate multiple linear regression; Feature scaling; Summary
Chapter 3: Classification Techniques; Objective; Different classification algorithms you will learn; Some interesting things you can do; Binary classification using k-NN; How does it work?; Finding cancerous cells using k-NN: a case study; Understanding logistic regression; The sigmoid function chart; Binary classification using logistic regression (using Accord.NET); Multiclass classification using logistic regression; How does it work?; Multiclass classification using decision trees; Obtaining and using WekaSharp; How does it work?; Predicting a traffic jam using a decision tree: a case study; Challenge yourself!; Summary
Chapter 4: Information Retrieval; Objective; Different IR algorithms you will learn; What interesting things can you do?; Information retrieval using tf-idf; Measures of similarity; Generating a PDF from a histogram; Minkowski family; L1 family; Intersection family; Inner Product family; Fidelity family or squared-chord family; Squared L2 family; Shannon's Entropy family; Similarity of asymmetric binary attributes; Some example usages of distance metrics; Finding similar cookies using asymmetric binary similarity measures; Grouping/clustering color images based on Canberra distance; Summary
Chapter 5: Collaborative Filtering; Objective; Different classification algorithms you will learn; Vocabulary of collaborative filtering
Variant titles | F sharp for machine learning essentials
Record no. | UNINA-9910798003903321
Held at: Univ. Federico II
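The contents note above lists "Linear regression method of least square" and "Finding the linear regression coefficients using Math.NET" among the Chapter 2 topics. The snippet below is a minimal sketch of that idea and is not taken from the book: it assumes an F# script with the MathNet.Numerics and MathNet.Numerics.FSharp packages referenced, and the design matrix and response values are made up for illustration.

```fsharp
// Minimal sketch: ordinary least squares via the normal equations,
//   beta = (X^T X)^(-1) X^T y
// using Math.NET Numerics from F#. The data below is illustrative only.
open MathNet.Numerics.LinearAlgebra

// Design matrix with a leading column of ones for the intercept term
let X = matrix [ [ 1.0; 1.0 ]
                 [ 1.0; 2.0 ]
                 [ 1.0; 3.0 ]
                 [ 1.0; 4.0 ] ]

// Observed responses
let y = vector [ 2.1; 3.9; 6.2; 8.1 ]

// Least-squares coefficients: beta.[0] is the intercept, beta.[1] the slope
let beta = (X.Transpose() * X).Inverse() * X.Transpose() * y

printfn "intercept = %f, slope = %f" beta.[0] beta.[1]
```

In practice a QR- or SVD-based solver (both decompositions also appear in the chapter's section headings) is numerically safer than inverting X^T X directly; the normal-equations form is shown only because it matches the "method of least square" heading.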
F# for machine learning essentials : get up and running with machine learning with F# in a fun and functional way / Sudipta Mukherjee ; foreword by Dr. Ralf Herbrich, director of machine learning science at Amazon
Author | Mukherjee Sudipta
Edition | [1.]
Publication/distribution | Birmingham, England ; Mumbai, [India] : Packt Publishing, 2016
Physical description | 1 online resource (194 p.)
Classification | 005.133
Series | Community Experience Distilled
Subject | F# (Computer program language) ; Machine learning
ISBN | 1-78398-935-1
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Cover; Copyright; Credits; Foreword; About the Author; Acknowledgments; About the Reviewers; www.PacktPub.com; Table of Contents; Preface
Chapter 1: Introduction to Machine Learning; Objective; Getting in touch; Different areas where machine learning is being used; Why use F#?; Supervised machine learning; Training and test dataset/corpus; Some motivating real life examples of supervised learning; Nearest Neighbour algorithm (a.k.a k-NN algorithm); Distance metrics; Decision tree algorithms; Unsupervised learning; Machine learning frameworks; Machine learning for fun and profit; Recognizing handwritten digits - your "Hello World" ML program; How does this work?; Summary
Chapter 2: Linear Regression; Objective; Different types of linear regression algorithms; APIs used; Math.NET Numerics for F# 3.7.0; Getting Math.NET; Experimenting with Math.NET; The basics of matrices and vectors (a short and sweet refresher); Creating a vector; Creating a matrix; Finding the transpose of a matrix; Finding the inverse of a matrix; Trace of a matrix; QR decomposition of a matrix; SVD of a matrix; Linear regression method of least square; Finding linear regression coefficients using F#; Finding the linear regression coefficients using Math.NET; Putting it together with Math.NET and FsPlot; Multiple linear regression; Multiple linear regression and variations using Math.NET; Weighted linear regression; Plotting the result of multiple linear regression; Ridge regression; Multivariate multiple linear regression; Feature scaling; Summary
Chapter 3: Classification Techniques; Objective; Different classification algorithms you will learn; Some interesting things you can do; Binary classification using k-NN; How does it work?; Finding cancerous cells using k-NN: a case study; Understanding logistic regression; The sigmoid function chart; Binary classification using logistic regression (using Accord.NET); Multiclass classification using logistic regression; How does it work?; Multiclass classification using decision trees; Obtaining and using WekaSharp; How does it work?; Predicting a traffic jam using a decision tree: a case study; Challenge yourself!; Summary
Chapter 4: Information Retrieval; Objective; Different IR algorithms you will learn; What interesting things can you do?; Information retrieval using tf-idf; Measures of similarity; Generating a PDF from a histogram; Minkowski family; L1 family; Intersection family; Inner Product family; Fidelity family or squared-chord family; Squared L2 family; Shannon's Entropy family; Similarity of asymmetric binary attributes; Some example usages of distance metrics; Finding similar cookies using asymmetric binary similarity measures; Grouping/clustering color images based on Canberra distance; Summary
Chapter 5: Collaborative Filtering; Objective; Different classification algorithms you will learn; Vocabulary of collaborative filtering
Variant titles | F sharp for machine learning essentials
Record no. | UNINA-9910824490003321
Held at: Univ. Federico II
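Chapter 3 in the contents note covers "Binary classification using k-NN" with a cancer-detection case study, and Chapter 4 covers distance metrics. The sketch below is a rough, self-contained illustration of that technique, not code from the book; the tiny dataset, labels, and function names (euclidean, knnClassify) are hypothetical.

```fsharp
// Minimal sketch of k-nearest-neighbour classification with Euclidean distance.
// Not from the book; the dataset and names are made up for illustration.

// Euclidean distance between two feature vectors of equal length
let euclidean (a: float list) (b: float list) =
    List.map2 (fun x y -> (x - y) ** 2.0) a b
    |> List.sum
    |> sqrt

// Classify a query point by majority vote among its k nearest labelled neighbours
let knnClassify k (training: (float list * string) list) (query: float list) =
    training
    |> List.sortBy (fun (features, _) -> euclidean features query)
    |> List.truncate k        // keep the k closest points
    |> List.countBy snd       // count votes per label
    |> List.maxBy snd         // label with the most votes
    |> fst

// Made-up two-feature dataset with two classes
let training =
    [ [ 1.0; 1.1 ], "benign"
      [ 1.2; 0.9 ], "benign"
      [ 4.8; 5.1 ], "malignant"
      [ 5.0; 5.2 ], "malignant" ]

printfn "%s" (knnClassify 3 training [ 4.9; 5.0 ])   // prints "malignant"
```

With k = 3 the two "malignant" neighbours outvote the single nearest "benign" point, which is the majority-vote behaviour the chapter headings describe.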
The NeurIPS '18 Competition [electronic resource] : From Machine Learning to Intelligent Conversations / edited by Sergio Escalera, Ralf Herbrich
Edition | [1st ed. 2020.]
Publication/distribution | Cham : Springer International Publishing : Imprint: Springer, 2020
Physical description | 1 online resource (345 pages)
Classification | 006.3
Series | The Springer Series on Challenges in Machine Learning
Subject | Artificial intelligence ; Optical data processing ; Pattern recognition ; Artificial Intelligence ; Image Processing and Computer Vision ; Pattern Recognition
ISBN | 3-030-29135-9
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record no. | UNISA-996465462403316
Held at: Univ. di Salerno
The NeurIPS '18 Competition : From Machine Learning to Intelligent Conversations / edited by Sergio Escalera, Ralf Herbrich
Edition | [1st ed. 2020.]
Publication/distribution | Cham : Springer International Publishing : Imprint: Springer, 2020
Physical description | 1 online resource (345 pages)
Classification | 006.3
Series | The Springer Series on Challenges in Machine Learning
Subject | Artificial intelligence ; Optical data processing ; Pattern recognition ; Artificial Intelligence ; Image Processing and Computer Vision ; Pattern Recognition
ISBN | 3-030-29135-9
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record no. | UNINA-9910366659803321
Held at: Univ. Federico II