
E-book: Nonlinear Time Series: Theory, Methods and Applications with R Examples

Randal Douc (Telecom SudParis, Evry, France), Eric Moulines (Telecom ParisTech, Paris, France), David Stoffer (University of Pittsburgh, Pennsylvania, USA)

DRM restrictions

  • Copying:

    not allowed

  • Printing:

    not allowed

  • E-book usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means that free software must be installed in order to unlock and read it. To read this e-book, you must create an Adobe ID. More information here. The e-book can be downloaded to 6 devices (one user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you need to install this free app: PocketBook Reader (iOS / Android)

    To read this e-book on a PC or Mac, you need Adobe Digital Editions (a free application designed specifically for e-books; it is not the same as Adobe Reader, which you probably already have on your computer).

    You cannot read this e-book on an Amazon Kindle.

Designed for researchers and students, Nonlinear Time Series: Theory, Methods and Applications with R Examples familiarizes readers with the principles behind nonlinear time series models without overwhelming them with difficult mathematical developments. By focusing on basic principles and theory, the authors give readers the background required to craft their own stochastic models, numerical methods, and software. Readers will also be able to assess the advantages and disadvantages of different approaches, and thus choose the right methods for their purposes.

The first part can be seen as a crash course on "classical" time series, with a special emphasis on linear state space models and detailed coverage of random coefficient autoregressions, including ARCH and GARCH models. The second part introduces Markov chains, discussing stability, the existence of a stationary distribution, ergodicity, limit theorems, and statistical inference. The book concludes with a self-contained account of nonlinear state space and sequential Monte Carlo methods. An elementary introduction to nonlinear state space modeling and sequential Monte Carlo, this section touches on current topics, from the theory of statistical inference to advanced computational methods.

The book can be used as a support to an advanced course on these methods, or an introduction to this field before studying more specialized texts. Several chapters highlight recent developments such as explicit rate of convergence of Markov chains and sequential Monte Carlo techniques. And while the chapters are organized in a logical progression, the three parts can be studied independently.

Statistics is not a spectator sport, so the book contains more than 200 exercises to challenge readers. These problems strengthen intellectual muscles strained by the introduction of new theory and go on to extend the theory in significant ways. The book helps readers hone their skills in nonlinear time series analysis and their applications.

Reviews

"This book is very suitable for mathematicians requiring a very rigorous and complete introduction to nonlinear time series and their applications in several fields." Zentralblatt MATH 1306

"This book focuses on theory and methods, with applications in mind. It is quite theory-heavy, with many rigorously established theoretical results. It is also very timely and covers many recent developments in nonlinear time series analysis … readers can get a very up-to-date view of the current developments in nonlinear time series analysis from this book." Journal of the American Statistical Association, December 2014

"… the book will definitely help readers who are very mathematically inclined and keen on rigour and interested in further pursuing the probabilistic aspects of nonlinear time series. I have no doubt the book will be useful and timely, and I have no hesitation in recommending the book …" T. Subba Rao, Journal of Time Series Analysis, 2014

Contents

Preface xiii
Frequently Used Notation xvii
I Foundations 1(130)
1 Linear Models 3(30)
1.1 Stochastic processes 3(2)
1.2 The covariance world 5(10)
1.2.1 Second-order stationary processes 5(4)
1.2.2 Spectral representation 9(4)
1.2.3 Wold decomposition 13(2)
1.3 Linear processes 15(10)
1.3.1 What are linear Gaussian processes? 15(1)
1.3.2 ARMA models 16(3)
1.3.3 Prediction 19(2)
1.3.4 Estimation 21(4)
1.4 The multivariate cases 25(3)
1.4.1 Time domain 25(2)
1.4.2 Frequency domain 27(1)
1.5 Numerical examples 28(2)
Exercises 30(3)
2 Linear Gaussian State Space Models 33(28)
2.1 Model basics 33(3)
2.2 Filtering, smoothing, and forecasting 36(6)
2.3 Maximum likelihood estimation 42(3)
2.3.1 Newton-Raphson 42(1)
2.3.2 EM algorithm 43(2)
2.4 Smoothing splines and the Kalman smoother 45(2)
2.5 Asymptotic distribution of the MLE 47(2)
2.6 Missing data modifications 49(1)
2.7 Structural component models 50(3)
2.8 State-space models with correlated errors 53(3)
2.8.1 ARMAX models 54(2)
2.8.2 Regression with autocorrelated errors 56(1)
Exercises 56(5)
3 Beyond Linear Models 61(30)
3.1 Nonlinear non-Gaussian data 62(6)
3.2 Volterra series expansion 68(1)
3.3 Cumulants and higher-order spectra 69(3)
3.4 Bilinear models 72(1)
3.5 Conditionally heteroscedastic models 73(4)
3.6 Threshold ARMA models 77(1)
3.7 Functional autoregressive models 78(1)
3.8 Linear processes with infinite variance 79(2)
3.9 Models for counts 81(3)
3.9.1 Integer valued models 81(2)
3.9.2 Generalized linear models 83(1)
3.10 Numerical examples 84(5)
Exercises 89(2)
4 Stochastic Recurrence Equations 91(40)
4.1 The Scalar Case 93(14)
4.1.1 Strict stationarity 93(5)
4.1.2 Weak stationarity 98(4)
4.1.3 GARCH(1,1) 102(5)
4.2 The Vector Case 107(11)
4.2.1 Strict stationarity 109(2)
4.2.2 Weak stationarity 111(3)
4.2.3 GARCH(p, q) 114(4)
4.3 Iterated random function 118(5)
4.3.1 Strict stationarity 118(3)
4.3.2 Weak stationarity 121(2)
Exercises 123(8)
II Markovian Models 131(154)
5 Markov Models: Construction and Definitions 133(32)
5.1 Markov chains: Past, future, and forgetfulness 133(1)
5.2 Kernels 134(2)
5.3 Homogeneous Markov chain 136(2)
5.4 Canonical representation 138(1)
5.5 Invariant measures 139(3)
5.6 Observation-driven models 142(1)
5.7 Iterated random functions 143(9)
5.8 MCMC methods 152(5)
5.8.1 Metropolis-Hastings algorithm 152(3)
5.8.2 Gibbs sampling 155(2)
Exercises 157(8)
6 Stability and Convergence 165(30)
6.1 Uniform ergodicity 166(7)
6.1.1 Total variation distance 166(1)
6.1.2 Dobrushin coefficient 167(2)
6.1.3 The Doeblin condition 169(1)
6.1.4 Examples 169(4)
6.2 V-geometric ergodicity 173(13)
6.2.1 V-total variation distance 173(1)
6.2.2 V-Dobrushin coefficient 174(1)
6.2.3 Drift and minorization conditions 175(5)
6.2.4 Examples 180(6)
6.3 Some proofs 186(2)
6.4 Endnotes 188(1)
Exercises 189(6)
7 Sample Paths and Limit Theorems 195(44)
7.1 Law of large numbers 196(15)
7.1.1 Dynamical system and ergodicity 196(7)
7.1.2 Markov chain ergodicity 203(8)
7.2 Central limit theorem 211(7)
7.3 Deviation inequalities for additive functionals 218(7)
7.3.1 Rosenthal type inequality 218(3)
7.3.2 Concentration inequality 221(4)
7.4 Some proofs 225(6)
Exercises 231(8)
8 Inference for Markovian Models 239(46)
8.1 Likelihood inference 239(6)
8.2 Consistency and asymptotic normality of the MLE 245(9)
8.2.1 Consistency 245(2)
8.2.2 Asymptotic normality 247(7)
8.3 Observation-driven models 254(9)
8.4 Bayesian inference 263(8)
8.5 Some proofs 271(3)
8.6 Endnotes 274(1)
Exercises 275(10)
III State Space and Hidden Markov Models 285(182)
9 Non-Gaussian and Nonlinear State Space Models 287(34)
9.1 Definitions and basic properties 287(15)
9.1.1 Discrete-valued state space HMM 287(8)
9.1.2 Continuous-valued state-space models 295(2)
9.1.3 Conditionally Gaussian linear state-space models 297(3)
9.1.4 Switching processes with Markov regimes 300(2)
9.2 Filtering and smoothing 302(12)
9.2.1 Discrete-valued state-space HMM 303(7)
9.2.2 Continuous-valued state-space HMM 310(4)
9.3 Endnotes 314(1)
Exercises 315(6)
10 Particle Filtering 321(40)
10.1 Importance sampling 321(8)
10.2 Sequential importance sampling 329(5)
10.3 Sampling importance resampling 334(3)
10.3.1 Algorithm description 335(1)
10.3.2 Resampling techniques 336(1)
10.4 Particle filter 337(4)
10.4.1 Sequential importance sampling 337(2)
10.4.2 Auxiliary sampling 339(2)
10.5 Convergence of the particle filter 341(8)
10.5.1 Exponential deviation inequalities 341(2)
10.5.2 Time-uniform bounds 343(6)
10.6 Endnotes 349(1)
Exercises 350(11)
11 Particle Smoothing 361(44)
11.1 Poor man's smoother algorithm 362(3)
11.2 FFBSm algorithm 365(2)
11.3 FFBSi algorithm 367(2)
11.4 Smoothing functionals 369(1)
11.5 Particle independent Metropolis-Hastings 370(6)
11.6 Particle Gibbs 376(5)
11.7 Convergence of the FFBSm and FFBSi algorithms 381(13)
11.7.1 Exponential deviation inequality 384(2)
11.7.2 Asymptotic normality 386(4)
11.7.3 Time uniform bounds 390(4)
11.8 Endnotes 394(3)
Exercises 397(8)
12 Inference for Nonlinear State Space Models 405(36)
12.1 Monte Carlo maximum likelihood estimation 407(11)
12.1.1 Particle approximation of the likelihood function 407(3)
12.1.2 Particle stochastic gradient 410(2)
12.1.3 Particle Monte Carlo EM algorithms 412(3)
12.1.4 Particle stochastic approximation EM 415(3)
12.2 Bayesian analysis 418(15)
12.2.1 Gaussian linear state space models 419(4)
12.2.2 Gibbs sampling for NLSS model 423(5)
12.2.3 Particle marginal Markov chain Monte Carlo 428(3)
12.2.4 Particle Gibbs algorithm 431(2)
12.3 Endnotes 433(2)
Exercises 435(6)
13 Asymptotics of the MLE for NLSS 441(26)
13.1 Strong consistency of the MLE 442(11)
13.1.1 Forgetting the initial distribution 442(2)
13.1.2 Approximation by conditional likelihood 444(1)
13.1.3 Strong consistency 445(7)
13.1.4 Identifiability of mixture densities 452(1)
13.2 Asymptotic normality 453(8)
13.2.1 Convergence of the observed information 458(2)
13.2.2 Limit distribution of the MLE 460(1)
13.3 Endnotes 461(1)
Exercises 462(5)
IV Appendices 467(38)
Appendix A Some Mathematical Background 469(6)
A.1 Some measure theory 469(2)
A.2 Some probability theory 471(4)
Appendix B Martingales 475(8)
B.1 Definitions and elementary properties 475(2)
B.2 Limit theorems 477(6)
Appendix C Stochastic Approximation 483(8)
C.1 Robbins-Monro algorithm: Elementary results 484(3)
C.2 Stochastic gradient 487(1)
C.3 Stepsize selection and averaging 488(1)
C.4 The Kiefer-Wolfowitz procedure 488(3)
Appendix D Data Augmentation 491(14)
D.1 The EM algorithm in the incomplete data model 492(2)
D.2 The Fisher and Louis identities 494(1)
D.3 Monte Carlo EM algorithm 495(3)
D.3.1 Stochastic approximation EM 496(2)
D.4 Convergence of the EM algorithm 498(2)
D.5 Convergence of the MCEM algorithm 500(5)
D.5.1 Convergence of perturbed dynamical systems 500(2)
D.5.2 Convergence of the MCEM algorithm 502(3)
References 505(22)
Index 527
Randal Douc, Eric Moulines, David Stoffer