
Nonlinear filters : theory and applications / Peyman Setoodeh, Saeid Habibi, Simon Haykin




Author: Setoodeh, Peyman <1974->
Title: Nonlinear filters : theory and applications / Peyman Setoodeh, Saeid Habibi, Simon Haykin
Publication: Hoboken, New Jersey : John Wiley & Sons, Inc., [2022]
©2022
Physical description: 1 online resource (307 pages)
Discipline: 629.8/36
Topical subject: Nonlinear control theory
Digital filters (Mathematics)
Signal processing - Digital techniques
Secondary responsibility: Habibi, Saeid
Haykin, Simon S. <1931->
Bibliography note: Includes bibliographical references and index.
Contents note: Cover -- Title Page -- Copyright -- Contents -- List of Figures -- List of Tables -- Preface -- Acknowledgments -- Acronyms -- Chapter 1 Introduction -- 1.1 State of a Dynamic System -- 1.2 State Estimation -- 1.3 Construals of Computing -- 1.4 Statistical Modeling -- 1.5 Vision for the Book -- Chapter 2 Observability -- 2.1 Introduction -- 2.2 State‐Space Model -- 2.3 The Concept of Observability -- 2.4 Observability of Linear Time‐Invariant Systems -- 2.4.1 Continuous‐Time LTI Systems -- 2.4.2 Discrete‐Time LTI Systems -- 2.4.3 Discretization of LTI Systems -- 2.5 Observability of Linear Time‐Varying Systems -- 2.5.1 Continuous‐Time LTV Systems -- 2.5.2 Discrete‐Time LTV Systems -- 2.5.3 Discretization of LTV Systems -- 2.6 Observability of Nonlinear Systems -- 2.6.1 Continuous‐Time Nonlinear Systems -- 2.6.2 Discrete‐Time Nonlinear Systems -- 2.6.3 Discretization of Nonlinear Systems -- 2.7 Observability of Stochastic Systems -- 2.8 Degree of Observability -- 2.9 Invertibility -- 2.10 Concluding Remarks -- Chapter 3 Observers -- 3.1 Introduction -- 3.2 Luenberger Observer -- 3.3 Extended Luenberger‐Type Observer -- 3.4 Sliding‐Mode Observer -- 3.5 Unknown‐Input Observer -- 3.6 Concluding Remarks -- Chapter 4 Bayesian Paradigm and Optimal Nonlinear Filtering -- 4.1 Introduction -- 4.2 Bayes' Rule -- 4.3 Optimal Nonlinear Filtering -- 4.4 Fisher Information -- 4.5 Posterior Cramér-Rao Lower Bound -- 4.6 Concluding Remarks -- Chapter 5 Kalman Filter -- 5.1 Introduction -- 5.2 Kalman Filter -- 5.3 Kalman Smoother -- 5.4 Information Filter -- 5.5 Extended Kalman Filter -- 5.6 Extended Information Filter -- 5.7 Divided‐Difference Filter -- 5.8 Unscented Kalman Filter -- 5.9 Cubature Kalman Filter -- 5.10 Generalized PID Filter -- 5.11 Gaussian‐Sum Filter -- 5.12 Applications -- 5.12.1 Information Fusion -- 5.12.2 Augmented Reality.
5.12.3 Urban Traffic Network -- 5.12.4 Cybersecurity of Power Systems -- 5.12.5 Incidence of Influenza -- 5.12.6 COVID‐19 Pandemic -- 5.13 Concluding Remarks -- Chapter 6 Particle Filter -- 6.1 Introduction -- 6.2 Monte Carlo Method -- 6.3 Importance Sampling -- 6.4 Sequential Importance Sampling -- 6.5 Resampling -- 6.6 Sample Impoverishment -- 6.7 Choosing the Proposal Distribution -- 6.8 Generic Particle Filter -- 6.9 Applications -- 6.9.1 Simultaneous Localization and Mapping -- 6.10 Concluding Remarks -- Chapter 7 Smooth Variable‐Structure Filter -- 7.1 Introduction -- 7.2 The Switching Gain -- 7.3 Stability Analysis -- 7.4 Smoothing Subspace -- 7.5 Filter Corrective Term for Linear Systems -- 7.6 Filter Corrective Term for Nonlinear Systems -- 7.7 Bias Compensation -- 7.8 The Secondary Performance Indicator -- 7.9 Second‐Order Smooth Variable Structure Filter -- 7.10 Optimal Smoothing Boundary Design -- 7.11 Combination of SVSF with Other Filters -- 7.12 Applications -- 7.12.1 Multiple Target Tracking -- 7.12.2 Battery State‐of‐Charge Estimation -- 7.12.3 Robotics -- 7.13 Concluding Remarks -- Chapter 8 Deep Learning -- 8.1 Introduction -- 8.2 Gradient Descent -- 8.3 Stochastic Gradient Descent -- 8.4 Natural Gradient Descent -- 8.5 Neural Networks -- 8.6 Backpropagation -- 8.7 Backpropagation Through Time -- 8.8 Regularization -- 8.9 Initialization -- 8.10 Convolutional Neural Network -- 8.11 Long Short‐Term Memory -- 8.12 Hebbian Learning -- 8.13 Gibbs Sampling -- 8.14 Boltzmann Machine -- 8.15 Autoencoder -- 8.16 Generative Adversarial Network -- 8.17 Transformer -- 8.18 Concluding Remarks -- Chapter 9 Deep Learning‐Based Filters -- 9.1 Introduction -- 9.2 Variational Inference -- 9.3 Amortized Variational Inference -- 9.4 Deep Kalman Filter -- 9.5 Backpropagation Kalman Filter -- 9.6 Differentiable Particle Filter.
9.7 Deep Rao-Blackwellized Particle Filter -- 9.8 Deep Variational Bayes Filter -- 9.9 Kalman Variational Autoencoder -- 9.10 Deep Variational Information Bottleneck -- 9.11 Wasserstein Distributionally Robust Kalman Filter -- 9.12 Hierarchical Invertible Neural Transport -- 9.13 Applications -- 9.13.1 Prediction of Drug Effect -- 9.13.2 Autonomous Driving -- 9.14 Concluding Remarks -- Chapter 10 Expectation Maximization -- 10.1 Introduction -- 10.2 Expectation Maximization Algorithm -- 10.3 Particle Expectation Maximization -- 10.4 Expectation Maximization for Gaussian Mixture Models -- 10.5 Neural Expectation Maximization -- 10.6 Relational Neural Expectation Maximization -- 10.7 Variational Filtering Expectation Maximization -- 10.8 Amortized Variational Filtering Expectation Maximization -- 10.9 Applications -- 10.9.1 Stochastic Volatility -- 10.9.2 Physical Reasoning -- 10.9.3 Speech, Music, and Video Modeling -- 10.10 Concluding Remarks -- Chapter 11 Reinforcement Learning‐Based Filter -- 11.1 Introduction -- 11.2 Reinforcement Learning -- 11.3 Variational Inference as Reinforcement Learning -- 11.4 Application -- 11.4.1 Battery State‐of‐Charge Estimation -- 11.5 Concluding Remarks -- Chapter 12 Nonparametric Bayesian Models -- 12.1 Introduction -- 12.2 Parametric vs Nonparametric Models -- 12.3 Measure‐Theoretic Probability -- 12.4 Exchangeability -- 12.5 Kolmogorov Extension Theorem -- 12.6 Extension of Bayesian Models -- 12.7 Conjugacy -- 12.8 Construction of Nonparametric Bayesian Models -- 12.9 Posterior Computability -- 12.10 Algorithmic Sufficiency -- 12.11 Applications -- 12.11.1 Multiple Object Tracking -- 12.11.2 Data‐Driven Probabilistic Optimal Power Flow -- 12.11.3 Analyzing Single‐Molecule Tracks -- 12.12 Concluding Remarks -- References -- Index -- EULA.
Summary/abstract: "This book fills the gap between the literature on nonlinear filters and nonlinear observers by presenting a new state estimation strategy, the smooth variable structure filter (SVSF). The book is a valuable resource to researchers outside of the control society, where literature on nonlinear observers is less well-known. SVSF is a predictor-corrector estimator that is formulated based on a stability theorem, to confine the estimated states within a neighborhood of their true values. It has the potential to improve performance in the presence of severe and changing modeling uncertainties and noise. An important advantage of the SVSF is the availability of a set of secondary performance indicators that pertain to each estimate. This allows for dynamic refinement of the filter model. The combination of SVSF's robust stability and its secondary indicators of performance make it a powerful estimation tool, capable of compensating for uncertainties that are abruptly introduced in the system"--
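As a rough illustration of the predictor-corrector scheme the summary describes, the sketch below implements one SVSF-style step in Python. It is a minimal sketch under stated assumptions, not the book's formulation: the system matrices A, B, H, the convergence rate gamma, the smoothing-boundary widths psi, and the function name svsf_style_step are illustrative placeholders.

import numpy as np

def svsf_style_step(x_prev, e_post_prev, u, z, A, B, H, gamma, psi):
    # Prediction: propagate the previous estimate and form the
    # a priori measurement error.
    x_prior = A @ x_prev + B @ u
    e_prior = z - H @ x_prior

    # Corrective term: magnitude from the current and previous measurement
    # errors, direction from a switching function that is smoothed (saturated)
    # inside the boundary layer psi. Placeholder form, not the book's exact gain.
    magnitude = np.abs(e_prior) + gamma * np.abs(e_post_prev)
    switching = np.clip(e_prior / psi, -1.0, 1.0)   # sat(e / psi)
    correction = np.linalg.pinv(H) @ (magnitude * switching)

    # Correction: update the estimate and the a posteriori measurement error.
    x_post = x_prior + correction
    e_post = z - H @ x_post
    return x_post, e_post

# Toy usage with placeholder scalar values.
A = np.array([[1.0]]); B = np.array([[0.0]]); H = np.array([[1.0]])
x, e = np.array([0.0]), np.array([0.0])
for z in (1.2, 0.9, 1.1, 1.0):
    x, e = svsf_style_step(x, e, np.array([0.0]), np.array([z]),
                           A, B, H, gamma=0.5, psi=np.array([0.3]))
print(x)   # the estimate settles near the measured level of about 1.0

In the filter itself, the gain and the smoothing boundary follow from the stability analysis and design procedures covered in Chapter 7; the values above only show the mechanics of the prediction-correction cycle.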
Authorized title: Nonlinear filters
ISBN: 1-119-07818-0
1-119-07815-6
1-119-07816-4
Format: Print material
Bibliographic level: Monograph
Language of publication: English
Record no.: 9910831039103321
Held by: Univ. Federico II