Bayesian inference / Anthony O'Hagan
| Author | O'Hagan, Anthony |
| Published | London : E. Arnold, 1994 |
| Physical description | xii, 330 p. ; 26 cm. |
| Discipline | 519.5 |
| Other authors (Persons) | Kendall, Maurice G. |
| Series | Kendall's advanced theory of statistics ; 2B |
| Topical subjects |
Bayesian inference
Mathematical statistics |
| ISBN | 0340529229 |
| Classification | AMS 62F15 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Record No. | UNISALENTO-991000715799707536 |
| Available at | Univ. del Salento |
Benchmark Priors Revisited : On Adaptive Shrinkage and the Supermodel Effect in Bayesian Model Averaging / Martin Feldkircher, Stefan Zeugner
| Author | Feldkircher, Martin |
| Published | Washington, D.C. : International Monetary Fund, 2009 |
| Physical description | 39 p. : col. ill. |
| Other authors (Persons) | Zeugner, Stefan |
| Series | IMF Working Papers |
| Topical subjects |
Bayesian statistical decision theory
Economic development - Mathematical models
Econometrics
Inflation
Labor
Public Finance
Data Processing
Bayesian Analysis: General
Data Collection and Data Estimation Methodology
Computer Programs: General
National Government Expenditures and Related Policies: Infrastructures
Other Public Investment and Capital Stock
Human Capital
Skills
Occupational Choice
Labor Productivity
Price Level
Deflation
Bayesian inference
Data capture & analysis
Public finance & taxation
Labour
Income economics
Macroeconomics
Bayesian models
Data processing
Public investment and public-private partnerships (PPP)
Human capital
Econometric models
Electronic data processing
Public-private sector cooperation
Prices |
| ISBN |
1-4623-4466-6
1-4518-7349-2
9786612844096
1-4527-6923-0
1-282-84409-1 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Record No. | UNINA-9910788226903321 |
| Available at | Univ. Federico II |
Benchmark Priors Revisited : On Adaptive Shrinkage and the Supermodel Effect in Bayesian Model Averaging / Martin Feldkircher, Stefan Zeugner
| Author | Feldkircher, Martin |
| Edition | [1st ed.] |
| Published | Washington, D.C. : International Monetary Fund, 2009 |
| Physical description | 39 p. : col. ill. |
| Discipline | 332.015195 |
| Other authors (Persons) | Zeugner, Stefan |
| Series | IMF Working Papers |
| Topical subjects |
Bayesian statistical decision theory
Economic development - Mathematical models
Bayesian Analysis: General
Bayesian inference
Bayesian models
Computer Programs: General
Data capture & analysis
Data Collection and Data Estimation Methodology
Data Processing
Data processing
Deflation
Econometric models
Econometrics
Electronic data processing
Human Capital
Human capital
Income economics
Inflation
Labor Productivity
Labor
Labour
Macroeconomics
National Government Expenditures and Related Policies: Infrastructures
Occupational Choice
Other Public Investment and Capital Stock
Price Level
Prices
Public finance & taxation
Public Finance
Public investment and public-private partnerships (PPP)
Public-private sector cooperation
Skills |
| ISBN |
9786612844096
9781462344666
1462344666
9781451873498
1451873492
9781452769233
1452769230
9781282844094
1282844091 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Contents note | Cover Page -- Title Page -- Copyright Page -- Contents -- I Introduction -- II Bayesian Model Averaging under Zellner's g Prior -- II. 1 Popular Settings for Zellner's g -- III The Hyper-g Prior: A Beta Prior on the Shrinkage Factor -- IV A Simulation Study -- V Growth Determinants Revisited -- VI Concluding Remarks -- A Technical Appendix -- A. 1 Consistency of the Hyper-g Prior -- A. 2 Relationship between Hyper-g Prior and EBL -- A. 3 The Shrinkage Factor and Goodness-of-Fit -- A. 4 The Posterior Predictive Distribution and the Hyper-g Prior -- A. 5 The Beta-binomial Prior over the Model Space -- A. 6 Charts and Tables -- References -- Footnotes. |
| Record No. | UNINA-9910961802503321 |
| Available at | Univ. Federico II |
Data Assimilation Fundamentals : A Unified Formulation of the State and Parameter Estimation Problem
| Author | Evensen, Geir |
| Edition | [1st ed.] |
| Published | Cham : Springer International Publishing AG, 2022 |
| Physical description | 1 online resource (251 p.) |
| Other authors (Persons) |
Vossepoel, Femke C.
Leeuwen, Peter Jan van |
| Series | Springer Textbooks in Earth Sciences, Geography and Environment |
| Topical subjects |
Earth sciences
Probability & statistics
Bayesian inference |
| Uncontrolled subjects |
Data Assimilation
Parameter Estimation
Ensemble Kalman Filter
4DVar
Representer Method
Ensemble Methods
Particle Filter
Particle Flow |
| ISBN | 3-030-96709-3 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Contents note |
Intro -- Preface -- Contents -- Symbols -- List of Approximations -- 1 Introduction -- 2 Problem Formulation -- 2.1 Bayesian Formulation -- 2.1.1 Assimilation Windows -- 2.1.2 Model with Uncertain Inputs -- 2.1.3 Model State -- 2.1.4 State Vector -- 2.1.5 Formulation Over Multiple Assimilation Windows -- 2.1.6 Measurements with Errors -- 2.1.7 Bayesian Inference -- 2.2 Recursive Bayesian Formulation -- 2.2.1 Markov Model -- 2.2.2 Independent Measurements -- 2.2.3 Recursive form of Bayes' -- 2.2.4 Marginal Bayes' for Filtering -- 2.3 Error Propagation -- 2.3.1 Fokker-Planck Equation -- 2.3.2 Covariance Evolution Equation -- 2.3.3 Ensemble Predictions -- 2.4 Various Problem Formulations -- 2.4.1 General Smoother Formulation -- 2.4.2 Filter Formulation -- 2.4.3 Recursive Smoother Formulation -- 2.4.4 A Smoother Formulation for Perfect Models -- 2.4.5 Parameter Estimation -- 2.4.6 Estimating Initial Conditions, Parameters, Controls, and Errors -- 2.5 Including the Predicted Measurements in Bayes Theorem -- 3 Maximum a Posteriori Solution -- 3.1 Maximum a Posteriori (MAP) Estimate -- 3.2 Gaussian Prior and Likelihood -- 3.3 Iterative Solutions -- 3.4 Gauss-Newton Iterations -- 3.5 Incremental Form of Gauss-Newton Iterations -- 4 Strong-Constraint 4DVar -- 4.1 Standard Strong-Constraint 4DVar Method -- 4.1.1 Data-Assimilation Problem -- 4.1.2 Lagrangian Formulation -- 4.1.3 Explaining the Measurement Operator -- 4.1.4 Euler-Lagrange Equations -- 4.2 Incremental Strong-Constraint 4DVar -- 4.2.1 Incremental Formulation -- 4.2.2 Lagrangian Formulation for the Inner Iterations -- 4.2.3 Euler-Lagrange Equations for the Inner Iterations -- 4.3 Preconditioning in Incremental SC-4DVar -- 4.4 Summary of SC-4DVar -- 5 Weak Constraint 4DVar -- 5.1 Forcing Formulation -- 5.2 State-Space Formulation -- 5.3 Incremental Form of the Generalized Inverse.
5.4 Minimizing the Cost Function for the Increment -- 5.5 Observation Space Formulation -- 5.5.1 Original Representer Method -- 5.5.2 Efficient Weak-Constraint Solution in Observation Space -- 6 Kalman Filters and 3DVar -- 6.1 Linear Update from Predicted Measurements -- 6.2 3DVar -- 6.3 Kalman Filter -- 6.4 Optimal Interpolation -- 6.5 Extended Kalman Filter -- 7 Randomized-Maximum-Likelihood Sampling -- 7.1 RML Sampling -- 7.2 Approximate EKF Sampling -- 7.3 Approximate Gauss-Newton Sampling -- 7.4 Least-Squares Best-Fit Model Sensitivity -- 8 Low-Rank Ensemble Methods -- 8.1 Ensemble Approximation -- 8.2 Definition of Ensemble Matrices -- 8.3 Cost Function in the Ensemble Subspace -- 8.4 Ensemble Subspace RML -- 8.5 Ensemble Kalman Filter (EnKF) Update -- 8.6 Ensemble DA with Multiple Updating (ESMDA) -- 8.7 Ensemble 4DVar with Consistent Error Statistics -- 8.8 Square-Root EnKF -- 8.9 Ensemble Subspace Inversion -- 8.10 A Note on the EnKF Analysis Equation -- 9 Fully Nonlinear Data Assimilation -- 9.1 Particle Approximation -- 9.2 Particle Filters -- 9.2.1 The Standard Particle Filter -- 9.2.2 Proposal Densities -- 9.2.3 The Optimal Proposal Density -- 9.2.4 Other Particle Filter Schemes -- 9.3 Particle-Flow Filters -- 9.3.1 Particle Flow Filters via Likelihood Factorization -- 9.3.2 Particle Flows via Distance Minimization -- 10 Localization and Inflation -- 10.1 Background -- 10.2 Various Forms of the EnKF Update -- 10.3 Impact of Sampling Errors in the EnKF Update -- 10.3.1 Spurious Correlations -- 10.3.2 Update Confined to Ensemble Subspace -- 10.3.3 Ensemble Representation of the Measurement Information -- 10.4 Localization in Ensemble Kalman Filters -- 10.4.1 Covariance Localization -- 10.4.2 Localization in Observation Space -- 10.4.3 Localization in Ensemble Space -- 10.4.4 Local Analysis -- 10.5 Adaptive Localization. 
10.6 Localization in Time -- 10.7 Inflation -- 10.8 Localization in Particle Filters -- 10.9 Summary -- 11 Methods' Summary -- 11.1 Discussion of Methods -- 11.2 So Which Method to Use? -- Part II Examples and Applications -- 12 A Kalman Filter with the Roessler Model -- 12.1 Roessler Model System -- 12.2 Kalman Filter with the Roessler System -- 12.3 Extended Kalman Filter with the Roessler System -- 13 Linear EnKF Update -- 13.1 EnKF Update Example -- 13.2 Solution Methods -- 13.3 Example 1 (Large Ensemble Size) -- 13.4 Example 2 (Ensemble Size of 100) -- 13.5 Example 3 (Augmenting the Measurement Perturbations) -- 13.6 Example 4 (Large Number of Measurements) -- 14 EnKF for an Advection Equation -- 14.1 Experiment Description -- 14.2 Assimilation Experiment -- 15 EnKF with the Lorenz Equations -- 15.1 The Lorenz'63 Model -- 15.2 Ensemble Smoother Solution -- 15.3 Ensemble Kalman Filter Solution -- 15.4 Ensemble Kalman Smoother Solution -- 16 3DVar and SC-4DVar for the Lorenz 63 Model -- 16.1 Data Assimilation Set up -- 16.2 Comparing 3DVar and SC-4DVar -- 16.3 Sensitivity to Observation Density in SC-4DVar -- 16.4 3DVar and SC-4DVar with Partial Observations -- 16.5 Sensitivity to the Length of Assimilation Window -- 16.6 SC-4DVar with Multiple Assimilation Windows -- 16.7 A Comparison with Ensemble Methods -- 17 Representer Method with an Ekman-Flow Model -- 17.1 Ekman-Flow Model -- 17.2 Example Experiment -- 17.3 Assimilation of Real Measurements -- 18 Comparison of Methods on a Scalar Model -- 18.1 Scalar Model and Inverse Problem -- 18.2 Discussion of Data-Assimilation Examples -- 18.3 Summary -- 19 Particle Filter for Seismic-Cycle Estimation -- 19.1 Particle Filter for State and Parameter Estimation -- 19.2 Seismic Cycle Model -- 19.3 Data-Assimilation Experiments -- 19.4 Case A: State Estimation.
19.5 Case B: State Estimation with Increased Model Error -- 19.6 Case C: State- and Parameter Estimation -- 19.7 Summary -- 20 Particle Flow for a Quasi-Geostrophic Model -- 20.1 Introduction -- 20.2 Application to the QG Model -- 20.3 Data-Assimilation Experiment -- 20.4 Results -- 21 EnRML for History Matching Petroleum Models -- 21.1 Reservoir Modeling -- 21.2 History Matching Reservoir Models -- 21.3 Example -- 22 ESMDA with a SARS-COV-2 Pandemic Model -- 22.1 An Extended SEIR Model -- 22.2 Example -- 22.3 Sensitivity to Ensemble Size -- 22.4 Sensitivity to MDA Steps -- 22.5 Summary -- 23 Final Summary -- 23.1 Classification of the Nonlinearity -- 23.1.1 Linear to Weakly-Nonlinear Systems with Gaussian Priors -- 23.1.2 Weakly Nonlinear Systems with Gaussian Priors -- 23.1.3 Strongly Nonlinear Systems -- 23.2 Purpose of the Data Assimilation -- 23.2.1 Hindcasts and Re-analyses -- 23.2.2 Prediction Systems -- 23.2.3 Uncertainty Quantification and Risk Assessment -- 23.2.4 Model Improvement and Parameter Estimation -- 23.2.5 Scenario Forecasts and Optimal Controls -- 23.3 How to Reduce Computational Costs -- 23.4 What Will the Future Hold? -- References -- Author Index -- Author Index -- Index -- Index. |
| Record No. | UNINA-9910564680903321 |
| Available at | Univ. Federico II |
Limited Information Bayesian Model Averaging for Dynamic Panels with Short Time Periods / Alin Mirestean, Charalambos Tsangarides, Huigang Chen
| Author | Mirestean, Alin |
| Published | Washington, D.C. : International Monetary Fund, 2009 |
| Physical description | 1 online resource (45 p.) |
| Other authors (Persons) |
Tsangarides, Charalambos
Chen, Huigang |
| Series | IMF Working Papers |
| Topical subjects |
Panel analysis
Bayesian statistical decision theory
Econometrics
Data Processing
Bayesian Analysis: General
Estimation
Data Collection and Data Estimation Methodology
Computer Programs: General
Bayesian inference
Econometrics & economic statistics
Data capture & analysis
Bayesian models
Estimation techniques
Data processing
Econometric models
Electronic data processing |
| ISBN |
1-4623-7192-2
1-4527-1274-3
9786612842955
1-4518-7221-6
1-282-84295-1 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Contents note |
Contents; I. Introduction; II. Model Uncertainty in the Bayesian Context; A. Model Selection and Hypothesis Testing; B. Bayesian Model Averaging; C. Choice of Priors; III. Limited Information Bayesian Model Averaging; A. A Dynamic Panel Data Model with Endogenous Regressors; B. Estimation and Moment Conditions; C. The Limited Information Criterion; IV. Monte Carlo Simulations and Results; A. The Data Generating Process; B. Simulation Results; V. Conclusion; References; Tables; 1. Posterior Probability of the True Model; 2. Posterior Probability Ratio of True Model/Best among the Other Models;
3. Probability of Retrieving the True Model; 4. Model Recovery: Medians and Variances of Posterior Inclusion Probability for Each Variable; 5. Model Recovery: Medians and Variances of Estimated Parameter Values; 6. Posterior Probability of the True Model (Non-Gaussian Case); 7. Posterior Probability Ratio: True Model/Best among the Other Models (Non-Gaussian Case); 8. Probability of Retrieving the True Model (Non-Gaussian Case); 9. Model Recovery: Medians and Variances of Posterior Inclusion Probability for Each Variable (Non-Gaussian Case); 10. Model Recovery: Medians and Variances of Estimated Parameter Values (Non-Gaussian Case); Appendix A; Figures; 1. Posterior Densities for the Probabilities in Table 1; 2. Posterior Densities for the Probabilities in Table 2; 3. Box Plots for Parameters in Table 5; 4. Posterior Densities for the Probabilities in Table 6; 5. Posterior Densities for the Probabilities in Table 7; 6. Box Plots for Parameters in Table 10 |
| Record No. | UNINA-9910788337703321 |
| Available at | Univ. Federico II |
Limited Information Bayesian Model Averaging for Dynamic Panels with Short Time Periods / Alin Mirestean, Charalambos Tsangarides, Huigang Chen
| Author | Mirestean, Alin |
| Edition | [1st ed.] |
| Published | Washington, D.C. : International Monetary Fund, 2009 |
| Physical description | 1 online resource (45 p.) |
| Discipline | 332.152 |
| Other authors (Persons) |
Chen, Huigang
Tsangarides, Charalambos |
| Series | IMF Working Papers |
| Topical subjects |
Panel analysis
Bayesian statistical decision theory
Bayesian Analysis: General
Bayesian inference
Bayesian models
Computer Programs: General
Data capture & analysis
Data Collection and Data Estimation Methodology
Data Processing
Data processing
Econometric models
Econometrics & economic statistics
Econometrics
Electronic data processing
Estimation techniques
Estimation |
| ISBN |
9786612842955
9781462371921
1462371922
9781452712741
1452712743
9781451872217
1451872216
9781282842953
1282842951 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Contents note |
Contents; I. Introduction; II. Model Uncertainty in the Bayesian Context; A. Model Selection and Hypothesis Testing; B. Bayesian Model Averaging; C. Choice of Priors; III. Limited Information Bayesian Model Averaging; A. A Dynamic Panel Data Model with Endogenous Regressors; B. Estimation and Moment Conditions; C. The Limited Information Criterion; IV. Monte Carlo Simulations and Results; A. The Data Generating Process; B. Simulation Results; V. Conclusion; References; Tables; 1. Posterior Probability of the True Model; 2. Posterior Probability Ratio of True Model/Best among the Other Models;
3. Probability of Retrieving the True Model; 4. Model Recovery: Medians and Variances of Posterior Inclusion Probability for Each Variable; 5. Model Recovery: Medians and Variances of Estimated Parameter Values; 6. Posterior Probability of the True Model (Non-Gaussian Case); 7. Posterior Probability Ratio: True Model/Best among the Other Models (Non-Gaussian Case); 8. Probability of Retrieving the True Model (Non-Gaussian Case); 9. Model Recovery: Medians and Variances of Posterior Inclusion Probability for Each Variable (Non-Gaussian Case); 10. Model Recovery: Medians and Variances of Estimated Parameter Values (Non-Gaussian Case); Appendix A; Figures; 1. Posterior Densities for the Probabilities in Table 1; 2. Posterior Densities for the Probabilities in Table 2; 3. Box Plots for Parameters in Table 5; 4. Posterior Densities for the Probabilities in Table 6; 5. Posterior Densities for the Probabilities in Table 7; 6. Box Plots for Parameters in Table 10 |
| Record No. | UNINA-9910961807103321 |
| Available at | Univ. Federico II |
Practical Model-Based Monetary Policy Analysis : A How-To Guide / Douglas Laxton, Andrew Berg, Philippe Karam
| Author | Laxton, Douglas |
| Edition | [1st ed.] |
| Published | Washington, D.C. : International Monetary Fund, 2006 |
| Physical description | 1 online resource (69 p.) |
| Discipline | 332.8/2/0971 |
| Other authors (Persons) |
Berg, Andrew
Karam, Philippe |
| Series | IMF Working Papers |
| Topical subjects |
Monetary policy - Econometric models - Canada
Economic forecasting - Econometric models - Canada
Aggregate demand
Bayesian inference
Central bank
Consumer price index
Demand
Disinflation
Dynamic stochastic general equilibrium
Econometrics
Economic equilibrium
Economic model |
| ISBN |
9786613822123
9781462372010
1462372015
9781452790510
1452790515
9781282545366
1282545361
9781451908763
1451908768 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Contents note | Contents; I. INTRODUCTION; II. THE MODEL; III. BUILDING THE MODEL; IV. FORECASTING AND POLICY ANALYSIS; V. AN EXAMPLE; VI. CONCLUSIONS; REFERENCES |
| Record No. | UNINA-9910965332703321 |
| Available at | Univ. Federico II |
Regularized System Identification : Learning Dynamic Models from Data
| Author | Pillonetto, Gianluigi |
| Edition | [1st ed.] |
| Published | Cham : Springer International Publishing AG, 2022 |
| Physical description | 1 online resource (394 p.) |
| Other authors (Persons) |
Chen, Tianshi
Chiuso, Alessandro
De Nicolao, Giuseppe
Ljung, Lennart |
| Series | Communications and Control Engineering |
| Topical subjects |
Machine learning
Automatic control engineering
Statistical physics
Bayesian inference
Probability & statistics
Cybernetics & systems theory |
| Uncontrolled subjects |
System Identification
Machine Learning
Linear Dynamical Systems
Nonlinear Dynamical Systems
Kernel-based Regularization
Bayesian Interpretation of Regularization
Gaussian Processes
Reproducing Kernel Hilbert Spaces
Estimation Theory
Support Vector Machines
Regularization Networks |
| ISBN | 3-030-95860-4 |
| Classification | COM004000 ; MAT029000 ; MAT029010 ; SCI055000 ; SCI064000 ; TEC004000 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Nota di contenuto |
Intro -- Preface -- Acknowledgements -- Contents -- Abbreviations and Notation -- Notation -- Abbreviations -- 1 Bias -- 1.1 The Stein Effect -- 1.1.1 The James-Stein Estimator -- 1.1.2 Extensions of the James-Stein Estimator -- 1.2 Ridge Regression -- 1.3 Further Topics and Advanced Reading -- 1.4 Appendix: Proof of Theorem 1.1 -- References -- 2 Classical System Identification -- 2.1 The State-of-the-Art Identification Setup -- 2.2 ℳ: Model Structures -- 2.2.1 Linear Time-Invariant Models -- 2.2.2 Nonlinear Models -- 2.3 ℐ: Identification Methods-Criteria -- 2.3.1 A Maximum Likelihood (ML) View -- 2.4 Asymptotic Properties of the Estimated Models -- 2.4.1 Bias and Variance -- 2.4.2 Properties of the PEM Estimate as N → ∞ -- 2.4.3 Trade-Off Between Bias and Variance -- 2.5 X: Experiment Design -- 2.6 𝒱: Model Validation -- 2.6.1 Falsifying Models: Residual Analysis -- 2.6.2 Comparing Different Models -- 2.6.3 Cross-Validation -- References -- 3 Regularization of Linear Regression Models -- 3.1 Linear Regression -- 3.2 The Least Squares Method -- 3.2.1 Fundamentals of the Least Squares Method -- 3.2.2 Mean Squared Error and Model Order Selection -- 3.3 Ill-Conditioning -- 3.3.1 Ill-Conditioned Least Squares Problems -- 3.3.2 Ill-Conditioning in System Identification -- 3.4 Regularized Least Squares with Quadratic Penalties -- 3.4.1 Making an Ill-Conditioned LS Problem Well Conditioned -- 3.4.2 Equivalent Degrees of Freedom -- 3.5 Regularization Tuning for Quadratic Penalties -- 3.5.1 Mean Squared Error and Expected Validation Error -- 3.5.2 Efficient Sample Reuse -- 3.5.3 Expected In-Sample Validation Error -- 3.6 Regularized Least Squares with Other Types of Regularizers -- 3.6.1 ℓ1-Norm Regularization -- 3.6.2 Nuclear Norm Regularization -- 3.7 Further Topics and Advanced Reading -- 3.8 Appendix.
3.8.1 Fundamentals of Linear Algebra -- 3.8.2 Proof of Lemma 3.1 -- 3.8.3 Derivation of Predicted Residual Error Sum of Squares (PRESS) -- 3.8.4 Proof of Theorem 3.7 -- 3.8.5 A Variant of the Expected In-Sample Validation Error and Its Unbiased Estimator -- References -- 4 Bayesian Interpretation of Regularization -- 4.1 Preliminaries -- 4.2 Incorporating Prior Knowledge via Bayesian Estimation -- 4.2.1 Multivariate Gaussian Variables -- 4.2.2 The Gaussian Case -- 4.2.3 The Linear Gaussian Model -- 4.2.4 Hierarchical Bayes: Hyperparameters -- 4.3 Bayesian Interpretation of the James-Stein Estimator -- 4.4 Full and Empirical Bayes Approaches -- 4.5 Improper Priors and the Bias Space -- 4.6 Maximum Entropy Priors -- 4.7 Model Approximation via Optimal Projection -- 4.8 Equivalent Degrees of Freedom -- 4.9 Bayesian Function Reconstruction -- 4.10 Markov Chain Monte Carlo Estimation -- 4.11 Model Selection Using Bayes Factors -- 4.12 Further Topics and Advanced Reading -- 4.13 Appendix -- 4.13.1 Proof of Theorem 4.1 -- 4.13.2 Proof of Theorem 4.2 -- 4.13.3 Proof of Lemma 4.1 -- 4.13.4 Proof of Theorem 4.3 -- 4.13.5 Proof of Theorem 4.6 -- 4.13.6 Proof of Proposition 4.3 -- 4.13.7 Proof of Theorem 4.8 -- References -- 5 Regularization for Linear System Identification -- 5.1 Preliminaries -- 5.2 MSE and Regularization -- 5.3 Optimal Regularization for FIR Models -- 5.4 Bayesian Formulation and BIBO Stability -- 5.5 Smoothness and Contractivity: Time- and Frequency-Domain Interpretations -- 5.5.1 Maximum Entropy Priors for Smoothness and Stability: From Splines to Dynamical Systems -- 5.6 Regularization and Basis Expansion -- 5.7 Hankel Nuclear Norm Regularization -- 5.8 Historical Overview -- 5.8.1 The Distributed Lag Estimator: Prior Means and Smoothing -- 5.8.2 Frequency-Domain Smoothing and Stability. 
5.8.3 Exponential Stability and Stochastic Embedding -- 5.9 Further Topics and Advanced Reading -- 5.10 Appendix -- 5.10.1 Optimal Kernel -- 5.10.2 Proof of Lemma 5.1 -- 5.10.3 Proof of Theorem 5.5 -- 5.10.4 Proof of Corollary 5.1 -- 5.10.5 Proof of Lemma 5.2 -- 5.10.6 Proof of Theorem 5.6 -- 5.10.7 Proof of Lemma 5.5 -- 5.10.8 Forward Representations of Stable-Splines Kernels -- References -- 6 Regularization in Reproducing Kernel Hilbert Spaces -- 6.1 Preliminaries -- 6.2 Reproducing Kernel Hilbert Spaces -- 6.2.1 Reproducing Kernel Hilbert Spaces Induced by Operations on Kernels -- 6.3 Spectral Representations of Reproducing Kernel Hilbert Spaces -- 6.3.1 More General Spectral Representation -- 6.4 Kernel-Based Regularized Estimation -- 6.4.1 Regularization in Reproducing Kernel Hilbert Spaces and the Representer Theorem -- 6.4.2 Representer Theorem Using Linear and Bounded Functionals -- 6.5 Regularization Networks and Support Vector Machines -- 6.5.1 Regularization Networks -- 6.5.2 Robust Regression via Huber Loss -- 6.5.3 Support Vector Regression -- 6.5.4 Support Vector Classification -- 6.6 Kernels Examples -- 6.6.1 Linear Kernels, Regularized Linear Regression and System Identification -- 6.6.2 Kernels Given by a Finite Number of Basis Functions -- 6.6.3 Feature Map and Feature Space -- 6.6.4 Polynomial Kernels -- 6.6.5 Translation Invariant and Radial Basis Kernels -- 6.6.6 Spline Kernels -- 6.6.7 The Bias Space and the Spline Estimator -- 6.7 Asymptotic Properties -- 6.7.1 The Regression Function/Optimal Predictor -- 6.7.2 Regularization Networks: Statistical Consistency -- 6.7.3 Connection with Statistical Learning Theory -- 6.8 Further Topics and Advanced Reading -- 6.9 Appendix -- 6.9.1 Fundamentals of Functional Analysis -- 6.9.2 Proof of Theorem 6.1 -- 6.9.3 Proof of Theorem 6.10 -- 6.9.4 Proof of Theorem 6.13. 
6.9.5 Proofs of Theorems 6.15 and 6.16 -- 6.9.6 Proof of Theorem 6.21 -- References -- 7 Regularization in Reproducing Kernel Hilbert Spaces for Linear System Identification -- 7.1 Regularized Linear System Identification in Reproducing Kernel Hilbert Spaces -- 7.1.1 Discrete-Time Case -- 7.1.2 Continuous-Time Case -- 7.1.3 More General Use of the Representer Theorem for Linear System Identification -- 7.1.4 Connection with Bayesian Estimation of Gaussian Processes -- 7.1.5 A Numerical Example -- 7.2 Kernel Tuning -- 7.2.1 Marginal Likelihood Maximization -- 7.2.2 Stein's Unbiased Risk Estimator -- 7.2.3 Generalized Cross-Validation -- 7.3 Theory of Stable Reproducing Kernel Hilbert Spaces -- 7.3.1 Kernel Stability: Necessary and Sufficient Conditions -- 7.3.2 Inclusions of Reproducing Kernel Hilbert Spaces in More General Lebesgue Spaces -- 7.4 Further Insights into Stable Reproducing Kernel Hilbert Spaces -- 7.4.1 Inclusions Between Notable Kernel Classes -- 7.4.2 Spectral Decomposition of Stable Kernels -- 7.4.3 Mercer Representations of Stable Reproducing Kernel Hilbert Spaces and of Regularized Estimators -- 7.4.4 Necessary and Sufficient Stability Condition Using Kernel Eigenvectors and Eigenvalues -- 7.5 Minimax Properties of the Stable Spline Estimator -- 7.5.1 Data Generator and Minimax Optimality -- 7.5.2 Stable Spline Estimator -- 7.5.3 Bounds on the Estimation Error and Minimax Properties -- 7.6 Further Topics and Advanced Reading -- 7.7 Appendix -- 7.7.1 Derivation of the First-Order Stable Spline Norm -- 7.7.2 Proof of Proposition 7.1 -- 7.7.3 Proof of Theorem 7.5 -- 7.7.4 Proof of Theorem 7.7 -- 7.7.5 Proof of Theorem 7.9 -- References -- 8 Regularization for Nonlinear System Identification -- 8.1 Nonlinear System Identification -- 8.2 Kernel-Based Nonlinear System Identification.
8.2.1 Connection with Bayesian Estimation of Gaussian Random Fields -- 8.2.2 Kernel Tuning -- 8.3 Kernels for Nonlinear System Identification -- 8.3.1 A Numerical Example -- 8.3.2 Limitations of the Gaussian and Polynomial Kernel -- 8.3.3 Nonlinear Stable Spline Kernel -- 8.3.4 Numerical Example Revisited: Use of the Nonlinear Stable Spline Kernel -- 8.4 Explicit Regularization of Volterra Models -- 8.5 Other Examples of Regularization in Nonlinear System Identification -- 8.5.1 Neural Networks and Deep Learning Models -- 8.5.2 Static Nonlinearities and Gaussian Process (GP) -- 8.5.3 Block-Oriented Models -- 8.5.4 Hybrid Models -- 8.5.5 Sparsity and Variable Selection -- References -- 9 Numerical Experiments and Real World Cases -- 9.1 Identification of Discrete-Time Output Error Models -- 9.1.1 Monte Carlo Studies with a Fixed Output Error Model -- 9.1.2 Monte Carlo Studies with Different Output Error Models -- 9.1.3 Real Data: A Robot Arm -- 9.1.4 Real Data: A Hairdryer -- 9.2 Identification of ARMAX Models -- 9.2.1 Monte Carlo Experiment -- 9.2.2 Real Data: Temperature Prediction -- 9.3 Multi-task Learning and Population Approaches -- 9.3.1 Kernel-Based Multi-task Learning -- 9.3.2 Numerical Example: Real Pharmacokinetic Data -- References -- Appendix Index -- Index. |
| Record No. | UNINA-9910568256103321 |
| Available at | Univ. Federico II |