LEADER 05396nam 2200649Ia 450
001 9910830752003321
005 20230124181247.0
010 $a1-280-36756-3
010 $a9786610367566
010 $a0-470-31226-2
010 $a0-471-46421-X
010 $a0-471-22154-6
035 $a(CKB)111056485580890
035 $a(EBL)152678
035 $a(OCoLC)475872145
035 $a(SSID)ssj0000080453
035 $a(PQKBManifestationID)11125840
035 $a(PQKBTitleCode)TC0000080453
035 $a(PQKBWorkID)10095355
035 $a(PQKB)11781840
035 $a(MiAaPQ)EBC152678
035 $a(EXLCZ)99111056485580890
100 $a20010720d2001 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 00$aKalman filtering and neural networks$b[electronic resource] /$fedited by Simon Haykin
210 $aNew York $cWiley$dc2001
215 $a1 online resource (302 p.)
225 1 $aAdaptive and learning systems for signal processing, communications, and control
300 $aDescription based upon print version of record.
311 $a0-471-36998-5
320 $aIncludes bibliographical references and index.
327 $aKALMAN FILTERING AND NEURAL NETWORKS; CONTENTS; Preface; Contributors; 1 Kalman Filters; 1.1 Introduction; 1.2 Optimum Estimates; 1.3 Kalman Filter; 1.4 Divergence Phenomenon: Square-Root Filtering; 1.5 Rauch-Tung-Striebel Smoother; 1.6 Extended Kalman Filter; 1.7 Summary; References; 2 Parameter-Based Kalman Filter Training: Theory and Implementation; 2.1 Introduction; 2.2 Network Architectures; 2.3 The EKF Procedure; 2.3.1 Global EKF Training; 2.3.2 Learning Rate and Scaled Cost Function; 2.3.3 Parameter Settings; 2.4 Decoupled EKF (DEKF); 2.5 Multistream Training
327 $a2.5.1 Some Insight into the Multistream Technique; 2.5.2 Advantages and Extensions of Multistream Training; 2.6 Computational Considerations; 2.6.1 Derivative Calculations; 2.6.2 Computationally Efficient Formulations for Multiple-Output Problems; 2.6.3 Avoiding Matrix Inversions; 2.6.4 Square-Root Filtering; 2.7 Other Extensions and Enhancements; 2.7.1 EKF Training with Constrained Weights; 2.7.2 EKF Training with an Entropic Cost Function; 2.7.3 EKF Training with Scalar Errors; 2.8 Automotive Applications of EKF Training; 2.8.1 Air/Fuel Ratio Control; 2.8.2 Idle Speed Control
327 $a2.8.3 Sensor-Catalyst Modeling; 2.8.4 Engine Misfire Detection; 2.8.5 Vehicle Emissions Estimation; 2.9 Discussion; 2.9.1 Virtues of EKF Training; 2.9.2 Limitations of EKF Training; 2.9.3 Guidelines for Implementation and Use; References; 3 Learning Shape and Motion from Image Sequences; 3.1 Introduction; 3.2 Neurobiological and Perceptual Foundations of our Model; 3.3 Network Description; 3.4 Experiment 1; 3.5 Experiment 2; 3.6 Experiment 3; 3.7 Discussion; References; 4 Chaotic Dynamics; 4.1 Introduction; 4.2 Chaotic (Dynamic) Invariants; 4.3 Dynamic Reconstruction
327 $a4.4 Modeling Numerically Generated Chaotic Time Series; 4.4.1 Logistic Map; 4.4.2 Ikeda Map; 4.4.3 Lorenz Attractor; 4.5 Nonlinear Dynamic Modeling of Real-World Time Series; 4.5.1 Laser Intensity Pulsations; 4.5.2 Sea Clutter Data; 4.6 Discussion; References; 5 Dual Extended Kalman Filter Methods; 5.1 Introduction; 5.2 Dual EKF-Prediction Error; 5.2.1 EKF-State Estimation; 5.2.2 EKF-Weight Estimation; 5.2.3 Dual Estimation; 5.3 A Probabilistic Perspective; 5.3.1 Joint Estimation Methods; 5.3.2 Marginal Estimation Methods; 5.3.3 Dual EKF Algorithms; 5.3.4 Joint EKF
327 $a5.4 Dual EKF Variance Estimation; 5.5 Applications; 5.5.1 Noisy Time-Series Estimation and Prediction; 5.5.2 Economic Forecasting-Index of Industrial Production; 5.5.3 Speech Enhancement; 5.6 Conclusions; Acknowledgments; Appendix A: Recurrent Derivative of the Kalman Gain; Appendix B: Dual EKF with Colored Measurement Noise; References; 6 Learning Nonlinear Dynamical Systems Using the Expectation-Maximization Algorithm; 6.1 Learning Stochastic Nonlinear Dynamics; 6.1.1 State Inference and Model Learning; 6.1.2 The Kalman Filter; 6.1.3 The EM Algorithm; 6.2 Combining EKS and EM
327 $a6.2.1 Extended Kalman Smoothing (E-step)
330 $aState-of-the-art coverage of Kalman filter methods for the design of neural networks. This self-contained book consists of seven chapters by expert contributors that discuss Kalman filtering as applied to the training and use of neural networks. Although the traditional approach to the subject is almost always linear, this book recognizes and deals with the fact that real problems are most often nonlinear. The first chapter offers an introductory treatment of Kalman filters with an emphasis on basic Kalman filter theory, the Rauch-Tung-Striebel smoother, and the extended Kalman filter.
410 0$aAdaptive and learning systems for signal processing, communications, and control.
606 $aKalman filtering
606 $aNeural networks (Computer science)
615 0$aKalman filtering.
615 0$aNeural networks (Computer science)
676 $a006.3/2
676 $a621.3815324
701 $aHaykin$b Simon S.$f1931-$08857
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910830752003321
996 $aKalman filtering and neural networks$94065234
997 $aUNINA

LEADER 03454nam 22007335 450
001 9910427708603321
005 20250609110955.0
010 $a3-030-60887-5
024 7 $a10.1007/978-3-030-60887-3
035 $a(CKB)4100000011493370
035 $a(DE-He213)978-3-030-60887-3
035 $a(MiAaPQ)EBC6369407
035 $a(PPN)25452723X
035 $a(MiAaPQ)EBC6368876
035 $a(EXLCZ)994100000011493370
100 $a20201006d2020 u| 0
101 0 $aeng
135 $aurnn|008mamaa
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aAdvances in Computational Intelligence $e19th Mexican International Conference on Artificial Intelligence, MICAI 2020, Mexico City, Mexico, October 12–17, 2020, Proceedings, Part II /$fedited by Lourdes Martínez-Villaseñor, Oscar Herrera-Alcántara, Hiram Ponce, Félix A. Castro-Espinoza
205 $a1st ed. 2020.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2020.
215 $a1 online resource (XXII, 489 p. 202 illus., 115 illus. in color.)
225 1 $aLecture Notes in Artificial Intelligence,$x2945-9141 ;$v12469
311 08$a3-030-60886-7
320 $aIncludes bibliographical references and index.
327 $aNatural Language Processing -- Image Processing and Pattern Recognition -- Intelligent Applications and Robotics.
330 $aThe two-volume set LNAI 12468 and 12469 constitutes the proceedings of the 19th Mexican International Conference on Artificial Intelligence, MICAI 2020, held in Mexico City, Mexico, in October 2020. The total of 77 papers presented in these two volumes was carefully reviewed and selected from 186 submissions. The contributions are organized in topical sections as follows: Part I: machine and deep learning, evolutionary and metaheuristic algorithms, and soft computing. Part II: natural language processing, image processing and pattern recognition, and intelligent applications and robotics.
410 0$aLecture Notes in Artificial Intelligence,$x2945-9141 ;$v12469
606 $aArtificial intelligence
606 $aSocial sciences$xData processing
606 $aComputers
606 $aComputers, Special purpose
606 $aData mining
606 $aApplication software
606 $aArtificial Intelligence
606 $aComputer Application in Social and Behavioral Sciences
606 $aComputing Milieux
606 $aSpecial Purpose and Application-Based Systems
606 $aData Mining and Knowledge Discovery
606 $aComputer and Information Systems Applications
615 0$aArtificial intelligence.
615 0$aSocial sciences$xData processing.
615 0$aComputers.
615 0$aComputers, Special purpose.
615 0$aData mining.
615 0$aApplication software.
615 14$aArtificial Intelligence.
615 24$aComputer Application in Social and Behavioral Sciences.
615 24$aComputing Milieux.
615 24$aSpecial Purpose and Application-Based Systems.
615 24$aData Mining and Knowledge Discovery.
615 24$aComputer and Information Systems Applications.
676 $a006.3
702 $aMartínez-Villaseñor$b María de Lourdes
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910427708603321
996 $aAdvances in Computational Intelligence$94209925
997 $aUNINA
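The `$`-prefixed subfield convention used throughout these records (tag, optional indicators, then `$` plus a one-character code before each subfield value) can be read mechanically. A minimal sketch, assuming the simplified field-per-line text layout shown above rather than the binary ISO 2709 serialization; `parse_field` is a hypothetical helper name, not part of any standard library:

```python
import re

def parse_field(line):
    """Split one text-layout UNIMARC field line into (tag, indicators, subfields).

    Assumes 'TAG [indicators] $a...$b...' layout; subfields is a list of
    (code, value) pairs in record order.
    """
    tag, _, rest = line.partition(" ")
    # Everything before the first '$' is the (possibly empty) indicators.
    head, sep, body = rest.partition("$")
    indicators = head.strip()
    # Each subfield is '$', a one-character code, then text up to the next '$'.
    subfields = [(m.group(1), m.group(2).strip())
                 for m in re.finditer(r"\$(.)([^$]*)", sep + body)]
    return tag, indicators, subfields

tag, ind, subs = parse_field(
    "200 00$aKalman filtering and neural networks$fedited by Simon Haykin"
)
```

Fields without indicators (e.g. the `035` identifiers) yield an empty indicator string, since nothing precedes the first `$`.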