Preface  xiii

Frequently Used Notation  xvii
I Foundations  1 (130)

1 …  3 (30)
  1.1 …  3 (2)
  1.2 …  5 (10)
    1.2.1 Second-order stationary processes  5 (4)
    1.2.2 Spectral representation  9 (4)
    1.2.3 …  13 (2)
  1.3 …  15 (10)
    1.3.1 What are linear Gaussian processes?  15 (1)
    1.3.2 …  16 (3)
    1.3.3 …  19 (2)
    1.3.4 …  21 (4)
  1.4 The multivariate cases  25 (3)
    1.4.1 …  25 (2)
    1.4.2 …  27 (1)
  …  28 (2)
  …  30 (3)
|
2 Linear Gaussian State Space Models  33 (28)
  2.1 …  33 (3)
  2.2 Filtering, smoothing, and forecasting  36 (6)
  2.3 Maximum likelihood estimation  42 (3)
    2.3.1 …  42 (1)
    2.3.2 …  43 (2)
  2.4 Smoothing splines and the Kalman smoother  45 (2)
  2.5 Asymptotic distribution of the MLE  47 (2)
  2.6 Missing data modifications  49 (1)
  2.7 Structural component models  50 (3)
  2.8 State-space models with correlated errors  53 (3)
    2.8.1 …  54 (2)
    2.8.2 Regression with autocorrelated errors  56 (1)
  …  56 (5)
|
|
3 …  61 (30)
  3.1 Nonlinear non-Gaussian data  62 (6)
  3.2 Volterra series expansion  68 (1)
  3.3 Cumulants and higher-order spectra  69 (3)
  3.4 …  72 (1)
  3.5 Conditionally heteroscedastic models  73 (4)
  3.6 Threshold ARMA models  77 (1)
  3.7 Functional autoregressive models  78 (1)
  3.8 Linear processes with infinite variance  79 (2)
  3.9 …  81 (3)
    3.9.1 Integer valued models  81 (2)
    3.9.2 Generalized linear models  83 (1)
  …  84 (5)
  …  89 (2)
|
4 Stochastic Recurrence Equations  91 (40)
  4.1 …  93 (14)
    4.1.1 Strict stationarity  93 (5)
    4.1.2 …  98 (4)
    4.1.3 …  102 (5)
  4.2 …  107 (11)
    4.2.1 Strict stationarity  109 (2)
    4.2.2 …  111 (3)
    4.2.3 …  114 (4)
  4.3 Iterated random functions  118 (5)
    4.3.1 Strict stationarity  118 (3)
    4.3.2 …  121 (2)
  …  123 (8)
II Markovian Models  131 (154)

5 Markov Models: Construction and Definitions  133 (32)
  5.1 Markov chains: Past, future, and forgetfulness  133 (1)
  5.2 …  134 (2)
  5.3 Homogeneous Markov chain  136 (2)
  5.4 Canonical representation  138 (1)
  5.5 …  139 (3)
  5.6 Observation-driven models  142 (1)
  5.7 Iterated random functions  143 (9)
  5.8 …  152 (5)
    5.8.1 Metropolis-Hastings algorithm  152 (3)
    5.8.2 …  155 (2)
  …  157 (8)
|
6 Stability and Convergence  165 (30)
  6.1 …  166 (7)
    6.1.1 Total variation distance  166 (1)
    6.1.2 Dobrushin coefficient  167 (2)
    6.1.3 The Doeblin condition  169 (1)
    6.1.4 …  169 (4)
  6.2 V-geometric ergodicity  173 (13)
    6.2.1 V-total variation distance  173 (1)
    6.2.2 V-Dobrushin coefficient  174 (1)
    6.2.3 Drift and minorization conditions  175 (5)
    6.2.4 …  180 (6)
  …  186 (2)
  …  188 (1)
  …  189 (6)
|
7 Sample Paths and Limit Theorems  195 (44)
  7.1 …  196 (15)
    7.1.1 Dynamical system and ergodicity  196 (7)
    7.1.2 Markov chain ergodicity  203 (8)
  7.2 Central limit theorem  211 (7)
  7.3 Deviation inequalities for additive functionals  218 (7)
    7.3.1 Rosenthal type inequality  218 (3)
    7.3.2 Concentration inequality  221 (4)
  …  225 (6)
  …  231 (8)
|
8 Inference for Markovian Models  239 (46)
  8.1 …  239 (6)
  8.2 Consistency and asymptotic normality of the MLE  245 (9)
    8.2.1 …  245 (2)
    8.2.2 Asymptotic normality  247 (7)
  8.3 Observation-driven models  254 (9)
  …  263 (8)
  …  271 (3)
  …  274 (1)
  …  275 (10)
III State Space and Hidden Markov Models  285 (182)

9 Non-Gaussian and Nonlinear State Space Models  287 (34)
  9.1 Definitions and basic properties  287 (15)
    9.1.1 Discrete-valued state space HMM  287 (8)
    9.1.2 Continuous-valued state-space models  295 (2)
    9.1.3 Conditionally Gaussian linear state-space models  297 (3)
    9.1.4 Switching processes with Markov regimes  300 (2)
  9.2 Filtering and smoothing  302 (12)
    9.2.1 Discrete-valued state-space HMM  303 (7)
    9.2.2 Continuous-valued state-space HMM  310 (4)
  …  314 (1)
  …  315 (6)
|
|
10 …  321 (40)
  10.1 …  321 (8)
  10.2 Sequential importance sampling  329 (5)
  10.3 Sampling importance resampling  334 (3)
    10.3.1 Algorithm description  335 (1)
    10.3.2 Resampling techniques  336 (1)
  10.4 …  337 (4)
    10.4.1 Sequential importance sampling  337 (2)
    10.4.2 Auxiliary sampling  339 (2)
  10.5 Convergence of the particle filter  341 (8)
    10.5.1 Exponential deviation inequalities  341 (2)
    10.5.2 Time-uniform bounds  343 (6)
  …  349 (1)
  …  350 (11)
|
|
11 …  361 (44)
  11.1 Poor man's smoother algorithm  362 (3)
  11.2 …  365 (2)
  11.3 …  367 (2)
  11.4 Smoothing functionals  369 (1)
  11.5 Particle independent Metropolis-Hastings  370 (6)
  11.6 …  376 (5)
  11.7 Convergence of the FFBSm and FFBSi algorithms  381 (13)
    11.7.1 Exponential deviation inequality  384 (2)
    11.7.2 Asymptotic normality  386 (4)
    11.7.3 Time uniform bounds  390 (4)
  …  394 (3)
  …  397 (8)
|
12 Inference for Nonlinear State Space Models  405 (36)
  12.1 Monte Carlo maximum likelihood estimation  407 (11)
    12.1.1 Particle approximation of the likelihood function  407 (3)
    12.1.2 Particle stochastic gradient  410 (2)
    12.1.3 Particle Monte Carlo EM algorithms  412 (3)
    12.1.4 Particle stochastic approximation EM  415 (3)
  12.2 …  418 (15)
    12.2.1 Gaussian linear state space models  419 (4)
    12.2.2 Gibbs sampling for NLSS model  423 (5)
    12.2.3 Particle marginal Markov chain Monte Carlo  428 (3)
    12.2.4 Particle Gibbs algorithm  431 (2)
  …  433 (2)
  …  435 (6)
|
13 Asymptotics of the MLE for NLSS  441 (26)
  13.1 Strong consistency of the MLE  442 (11)
    13.1.1 Forgetting the initial distribution  442 (2)
    13.1.2 Approximation by conditional likelihood  444 (1)
    13.1.3 Strong consistency  445 (7)
    13.1.4 Identifiability of mixture densities  452 (1)
  13.2 Asymptotic normality  453 (8)
    13.2.1 Convergence of the observed information  458 (2)
    13.2.2 Limit distribution of the MLE  460 (1)
  …  461 (1)
  …  462 (5)
IV Appendices  467 (38)

Appendix A Some Mathematical Background  469 (6)
  A.1 …  469 (2)
  A.2 Some probability theory  471 (4)

Appendix B …  475 (8)
  B.1 Definitions and elementary properties  475 (2)
  B.2 …  477 (6)

Appendix C Stochastic Approximation  483 (8)
  C.1 Robbins-Monro algorithm: Elementary results  484 (3)
  C.2 …  487 (1)
  C.3 Stepsize selection and averaging  488 (1)
  C.4 The Kiefer-Wolfowitz procedure  488 (3)
|
Appendix D Data Augmentation  491 (14)
  D.1 The EM algorithm in the incomplete data model  492 (2)
  D.2 The Fisher and Louis identities  494 (1)
  D.3 Monte Carlo EM algorithm  495 (3)
    D.3.1 Stochastic approximation EM  496 (2)
  D.4 Convergence of the EM algorithm  498 (2)
  D.5 Convergence of the MCEM algorithm  500 (5)
    D.5.1 Convergence of perturbed dynamical systems  500 (2)
    D.5.2 Convergence of the MCEM algorithm  502 (3)
References  505 (22)

Index  527