04280nam 22006735 450 991042769090332120200820043533.03-030-46444-X10.1007/978-3-030-46444-8(CKB)4100000011392572(DE-He213)978-3-030-46444-8(MiAaPQ)EBC6313862(PPN)250214628(EXLCZ)99410000001139257220200820d2020 u| 0engurnn#008mamaatxtrdacontentcrdamediacrrdacarrierStatistical Field Theory for Neural Networks /by Moritz Helias, David Dahmen1st ed. 2020.Cham :Springer International Publishing :Imprint: Springer,2020.1 online resource (XVII, 203 p. 127 illus., 5 illus. in color.)Lecture Notes in Physics,0075-8450 ;9703-030-46443-1 Introduction -- Probabilities, moments, cumulants -- Gaussian distribution and Wick’s theorem -- Perturbation expansion -- Linked cluster theorem -- Functional preliminaries -- Functional formulation of stochastic differential equations -- Ornstein-Uhlenbeck process: The free Gaussian theory -- Perturbation theory for stochastic differential equations -- Dynamic mean-field theory for random networks -- Vertex generating function -- Application: TAP approximation -- Expansion of cumulants into tree diagrams of vertex functions -- Loopwise expansion of the effective action - Tree level -- Loopwise expansion in the MSRDJ formalism -- Nomenclature.This book presents a self-contained introduction to techniques from field theory applied to stochastic and collective dynamics in neuronal networks. These powerful analytical techniques, which are well established in other fields of physics, are the basis of current developments and offer solutions to pressing open problems in theoretical neuroscience and also machine learning. They enable a systematic and quantitative understanding of the dynamics in recurrent and stochastic neuronal networks. This book is intended for physicists, mathematicians, and computer scientists and it is designed for self-study by researchers who want to enter the field or as the main text for a one semester course at advanced undergraduate or graduate level. 
The theoretical concepts presented in this book are systematically developed from the very beginning, requiring only basic knowledge of analysis and linear algebra.

Subjects: Statistical physics. Neurosciences. Machine learning. Neural networks (Computer science). Mathematical statistics.

Springer product-market codes:
- Statistical Physics and Dynamical Systems: https://scigraph.springernature.com/ontologies/product-market-codes/P19090
- Neurosciences: https://scigraph.springernature.com/ontologies/product-market-codes/B18006
- Machine Learning: https://scigraph.springernature.com/ontologies/product-market-codes/I21010
- Mathematical Models of Cognitive Processes and Neural Networks: https://scigraph.springernature.com/ontologies/product-market-codes/M13100
- Probability and Statistics in Computer Science: https://scigraph.springernature.com/ontologies/product-market-codes/I17036

Dewey classification: 519.2

Authors:
- Helias, Moritz (aut, http://id.loc.gov/vocabulary/relators/aut)
- Dahmen, David (aut, http://id.loc.gov/vocabulary/relators/aut)

Record type: BOOK
Cataloging source: MiAaPQ
Record ID: 9910427690903321 (UNINA)