LEADER 04280nam 22006735 450
001 9910427690903321
005 20200820043533.0
010 $a3-030-46444-X
024 7 $a10.1007/978-3-030-46444-8
035 $a(CKB)4100000011392572
035 $a(DE-He213)978-3-030-46444-8
035 $a(MiAaPQ)EBC6313862
035 $a(PPN)250214628
035 $a(EXLCZ)994100000011392572
100 $a20200820d2020 u| 0
101 0 $aeng
135 $aurnn#008mamaa
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aStatistical Field Theory for Neural Networks /$fby Moritz Helias, David Dahmen
205 $a1st ed. 2020.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2020.
215 $a1 online resource (XVII, 203 p. 127 illus., 5 illus. in color.)
225 1 $aLecture Notes in Physics,$x0075-8450 ;$v970
311 $a3-030-46443-1
327 $aIntroduction -- Probabilities, moments, cumulants -- Gaussian distribution and Wick's theorem -- Perturbation expansion -- Linked cluster theorem -- Functional preliminaries -- Functional formulation of stochastic differential equations -- Ornstein-Uhlenbeck process: The free Gaussian theory -- Perturbation theory for stochastic differential equations -- Dynamic mean-field theory for random networks -- Vertex generating function -- Application: TAP approximation -- Expansion of cumulants into tree diagrams of vertex functions -- Loopwise expansion of the effective action - Tree level -- Loopwise expansion in the MSRDJ formalism -- Nomenclature.
330 $aThis book presents a self-contained introduction to techniques from field theory applied to stochastic and collective dynamics in neuronal networks. These powerful analytical techniques, well established in other fields of physics, are the basis of current developments and offer solutions to pressing open problems in theoretical neuroscience and machine learning. They enable a systematic and quantitative understanding of the dynamics in recurrent and stochastic neuronal networks. The book is intended for physicists, mathematicians, and computer scientists; it is designed for self-study by researchers who want to enter the field, or as the main text for a one-semester course at the advanced undergraduate or graduate level. The theoretical concepts are developed systematically from the very beginning, requiring only basic knowledge of analysis and linear algebra.
410 0$aLecture Notes in Physics,$x0075-8450 ;$v970
606 $aStatistical physics
606 $aNeurosciences
606 $aMachine learning
606 $aNeural networks (Computer science)
606 $aMathematical statistics
606 $aStatistical Physics and Dynamical Systems$3https://scigraph.springernature.com/ontologies/product-market-codes/P19090
606 $aNeurosciences$3https://scigraph.springernature.com/ontologies/product-market-codes/B18006
606 $aMachine Learning$3https://scigraph.springernature.com/ontologies/product-market-codes/I21010
606 $aMathematical Models of Cognitive Processes and Neural Networks$3https://scigraph.springernature.com/ontologies/product-market-codes/M13100
606 $aProbability and Statistics in Computer Science$3https://scigraph.springernature.com/ontologies/product-market-codes/I17036
615 0$aStatistical physics.
615 0$aNeurosciences.
615 0$aMachine learning.
615 0$aNeural networks (Computer science)
615 0$aMathematical statistics.
615 14$aStatistical Physics and Dynamical Systems.
615 24$aNeurosciences.
615 24$aMachine Learning.
615 24$aMathematical Models of Cognitive Processes and Neural Networks.
615 24$aProbability and Statistics in Computer Science.
676 $a519.2
700 $aHelias$b Moritz$4aut$4http://id.loc.gov/vocabulary/relators/aut$0932868
702 $aDahmen$b David$4aut$4http://id.loc.gov/vocabulary/relators/aut
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910427690903321
996 $aStatistical Field Theory for Neural Networks$92099673
997 $aUNINA