LEADER 01037nam a2200253 i 4500
001 991003231289707536
005 20020509114456.0
008 951113s1970 ||| ||| | ger
020 $a3205036409
035 $ab11129165-39ule_inst
035 $aPARLA177790$9ExL
040 $aDip.to Filol. Class. e Med.$bita
082 0 $a882
100 1 $aJobst, Werner$0213239
245 14$aDie Höhle im griechischen Theater des 5. und 4. Jahrhunderts v. Chr. :$beine Untersuchung zur Inszenierung klassischer Dramen /$cWerner Jobst
260 $aWien [etc.] :$bHermann Böhlaus,$c1970
300 $a166 p. :$b[7] p. di tav. ;$c24 cm.
650 4$aTeatro greco - 5.-4. sec. a.C.$xSaggio critico
907 $a.b11129165$b02-04-14$c28-06-02
912 $a991003231289707536
945 $aLE007 T 3772$g1$i2007000025796$lle007$o-$pE0.00$q-$rl$s- $t0$u0$v0$w0$x0$y.i1126777x$z28-06-02
996 $aHöhle im griechischen Theater des 5. und 4. Jahrhunderts v. Chr.$9872075
997 $aUNISALENTO
998 $ale007$b01-01-95$cm$da $e-$feng$gxx $h4$i1

LEADER 04584nam 2200565 450
001 9910824490003321
005 20170919022152.0
010 $a1-78398-935-1
035 $a(CKB)3710000000604294
035 $a(EBL)4520803
035 $a(MiAaPQ)EBC4520803
035 $z(PPN)220206333
035 $a(PPN)19329379X
035 $a(EXLCZ)993710000000604294
100 $a20160810h20162016 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $2rdacontent
182 $2rdamedia
183 $2rdacarrier
200 10$aF# for machine learning essentials $eget up and running with machine learning with F# in a fun and functional way /$fSudipta Mukherjee ; foreword by Dr. Ralf Herbrich, director of machine learning science at Amazon
205 $a1.
210 1$aBirmingham, England ;$aMumbai, [India] :$cPackt Publishing,$d2016.
210 4$d©2016
215 $a1 online resource (194 p.)
225 1 $aCommunity Experience Distilled
300 $aIncludes index.
311 $a1-78398-934-3
327 $aCover; Copyright; Credits; Foreword; About the Author; Acknowledgments; About the Reviewers; www.PacktPub.com; Table of Contents; Preface; Chapter 1: Introduction to Machine Learning; Objective; Getting in touch; Different areas where machine learning is being used; Why use F#?; Supervised machine learning; Training and test dataset/corpus; Some motivating real life examples of supervised learning; Nearest Neighbour algorithm (a.k.a. k-NN algorithm); Distance metrics; Decision tree algorithms; Unsupervised learning; Machine learning frameworks; Machine learning for fun and profit
327 $aRecognizing handwritten digits - your "Hello World" ML program; How does this work?; Summary; Chapter 2: Linear Regression; Objective; Different types of linear regression algorithms; APIs used; Math.NET Numerics for F# 3.7.0; Getting Math.NET; Experimenting with Math.NET; The basics of matrices and vectors (a short and sweet refresher); Creating a vector; Creating a matrix; Finding the transpose of a matrix; Finding the inverse of a matrix; Trace of a matrix; QR decomposition of a matrix; SVD of a matrix; Linear regression method of least square
327 $aFinding linear regression coefficients using F#; Finding the linear regression coefficients using Math.NET; Putting it together with Math.NET and FsPlot; Multiple linear regression; Multiple linear regression and variations using Math.NET; Weighted linear regression; Plotting the result of multiple linear regression; Ridge regression; Multivariate multiple linear regression; Feature scaling; Summary; Chapter 3: Classification Techniques; Objective; Different classification algorithms you will learn; Some interesting things you can do; Binary classification using k-NN; How does it work?
327 $aFinding cancerous cells using k-NN: a case study; Understanding logistic regression; The sigmoid function chart; Binary classification using logistic regression (using Accord.NET); Multiclass classification using logistic regression; How does it work?; Multiclass classification using decision trees; Obtaining and using WekaSharp; How does it work?; Predicting a traffic jam using a decision tree: a case study; Challenge yourself!; Summary; Chapter 4: Information Retrieval; Objective; Different IR algorithms you will learn; What interesting things can you do?
327 $aInformation retrieval using tf-idf; Measures of similarity; Generating a PDF from a histogram; Minkowski family; L1 family; Intersection family; Inner Product family; Fidelity family or squared-chord family; Squared L2 family; Shannon's Entropy family; Similarity of asymmetric binary attributes; Some example usages of distance metrics; Finding similar cookies using asymmetric binary similarity measures; Grouping/clustering color images based on Canberra distance; Summary; Chapter 5: Collaborative Filtering; Objective; Different classification algorithms you will learn
327 $aVocabulary of collaborative filtering
410 0$aCommunity experience distilled.
517 3 $aF sharp for machine learning essentials
606 $aF# (Computer program language)
606 $aMachine learning
615 0$aF# (Computer program language)
615 0$aMachine learning.
676 $a005.133
700 $aMukherjee$b Sudipta$0892441
702 $aHerbrich$b Ralf
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910824490003321
996 $aF# for machine learning essentials$94013458
997 $aUNINA