LEADER 03018nam 2200433 450
001 996475762603316
005 20221204134625.0
010 $a9789811904011$b(electronic bk.)
010 $z9789811904004
035 $a(MiAaPQ)EBC6986761
035 $a(Au-PeEL)EBL6986761
035 $a(CKB)22371876200041
035 $a(PPN)269151966
035 $a(EXLCZ)9922371876200041
100 $a20221204d2022 uy 0
101 0 $aeng
135 $aurcnu||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aKernel methods for machine learning with math and python :$e100 exercises for building logic /$fJoe Suzuki
210 1$aGateway East, Singapore :$cSpringer,$d[2022]
210 4$d©2022
215 $a1 online resource (216 pages)
311 08$aPrint version: Suzuki, Joe Kernel Methods for Machine Learning with Math and Python Singapore : Springer,c2022 9789811904004
320 $aIncludes bibliographical references and index.
327 $aIntro -- Preface -- How to Overcome Your Kernel Weakness -- What Makes KMMP Unique? -- Acknowledgments -- Contents -- 1 Positive Definite Kernels -- 1.1 Positive Definiteness of a Matrix -- 1.2 Kernels -- 1.3 Positive Definite Kernels -- 1.4 Probability -- 1.5 Bochner's Theorem -- 1.6 Kernels for Strings, Trees, and Graphs -- Appendix -- Exercises 1∼15 -- 2 Hilbert Spaces -- 2.1 Metric Spaces and Their Completeness -- 2.2 Linear Spaces and Inner Product Spaces -- 2.3 Hilbert Spaces -- 2.4 Projection Theorem -- 2.5 Linear Operators -- 2.6 Compact Operators -- Appendix: Proofs of Propositions -- Exercises 16∼30 -- 3 Reproducing Kernel Hilbert Space -- 3.1 RKHSs -- 3.2 Sobolev Space -- 3.3 Mercer's Theorem -- Appendix -- Exercises 31∼45 -- 4 Kernel Computations -- 4.1 Kernel Ridge Regression -- 4.2 Kernel Principal Component Analysis -- 4.3 Kernel SVM -- 4.4 Spline Curves -- 4.5 Random Fourier Features -- 4.6 Nyström Approximation -- 4.7 Incomplete Cholesky Decomposition -- Appendix -- Exercises 46∼64 -- 5 The MMD and HSIC -- 5.1 Random Variables in RKHSs -- 5.2 The MMD and Two-Sample Problem -- 5.3 The HSIC and Independence Test -- 5.4 Characteristic and Universal Kernels -- 5.5 Introduction to Empirical Processes -- Appendix -- Exercises 65∼83 -- 6 Gaussian Processes and Functional Data Analyses -- 6.1 Regression -- 6.2 Classification -- 6.3 Gaussian Processes with Inducing Variables -- 6.4 Karhunen-Loève Expansion -- 6.5 Functional Data Analysis -- Appendix -- Exercises 84∼100 -- Appendix Bibliography.
606 $aArtificial intelligence
606 $aArtificial intelligence$xData processing
615 0$aArtificial intelligence.
615 0$aArtificial intelligence$xData processing.
676 $a515.9
700 $aSuzuki$b Joe$0846228
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
912 $a996475762603316
996 $aKernel Methods for Machine Learning with Math and Python$92851093
997 $aUNISA

LEADER 00870nam a22002291i 4500
001 991001510389707536
005 20030121182757.0
008 030121s1970 uik|||||||||||||||||eng
035 $ab12157314-39ule_inst
035 $aARCHE-024711$9ExL
040 $aDip.to Filologia Ling. e Lett.$bita$cA.t.i. Arché s.c.r.l. Pandora Sicilia s.r.l.
100 1 $aMcVeagh, John$0450902
245 10$aElizabeth Gaskell /$cby John McVeagh
260 $aLondon :$bRoutledge & Kegan,$c1970
300 $aVIII, 104 p. ;$c19 cm
440 4$aThe profiles in literature series
907 $a.b12157314$b02-04-14$c01-04-03
912 $a991001510389707536
945 $aLE008 FL.M. (IN) F 84$g1$i2008000306120$lle008$o-$pE0.00$q-$rl$s- $t0$u0$v0$w0$x0$y.i12487764$z01-04-03
996 $aElizabeth Gaskell$9147818
997 $aUNISALENTO
998 $ale008$b01-04-03$cm$da $e-$feng$guik$h0$i1