
Time Series: A First Course with Bootstrap Starter [Hardcover]

Tucker S. McElroy (U.S. Census Bureau, Washington, D.C., USA) and Dimitris N. Politis (University of California, San Diego, USA)

Time Series: A First Course with Bootstrap Starter provides an introductory course on time series analysis that satisfies the triptych of (i) mathematical completeness, (ii) computational illustration and implementation, and (iii) conciseness and accessibility to upper-level undergraduate and M.S. students. Basic theoretical results are presented in a mathematically convincing way, and the methods of data analysis are developed through examples and exercises parsed in R. A student with a basic course in mathematical statistics will learn both how to analyze time series and how to interpret the results.

The book provides the foundation of time series methods, including linear filters and a geometric approach to prediction. The important paradigm of ARMA models is studied in-depth, as well as frequency domain methods. Entropy and other information theoretic notions are introduced, with applications to time series modeling. The second half of the book focuses on statistical inference, the fitting of time series models, as well as computational facets of forecasting. Many time series of interest are nonlinear in which case classical inference methods can fail, but bootstrap methods may come to the rescue. Distinctive features of the book are the emphasis on geometric notions and the frequency domain, the discussion of entropy maximization, and a thorough treatment of recent computer-intensive methods for time series such as subsampling and the bootstrap. There are more than 600 exercises, half of which involve R coding and/or data analysis. Supplements include a website with 12 key data sets and all R code for the book's examples, as well as the solutions to exercises.
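The AR models and block bootstrap mentioned above can be illustrated with a short sketch. The book's own examples are in R; the snippet below is a hypothetical Python/NumPy stand-in (not taken from the book) that simulates an AR(1) series and applies a moving-block bootstrap to the sample mean, one of the resampling ideas the book develops in depth.

```python
# Illustrative sketch (not from the book): an AR(1) simulation plus a
# moving-block bootstrap for the sampling distribution of the mean.
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(n, phi=0.6, sigma=1.0):
    """Generate n observations from X_t = phi * X_{t-1} + e_t."""
    x = np.zeros(n)
    e = rng.normal(scale=sigma, size=n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x

def block_bootstrap_means(x, block_len=20, n_boot=500):
    """Resample overlapping blocks, rebuilding series of the same length,
    to preserve the serial dependence that an i.i.d. bootstrap would break."""
    n = len(x)
    n_blocks = n // block_len
    starts = rng.integers(0, n - block_len + 1, size=(n_boot, n_blocks))
    means = np.empty(n_boot)
    for b in range(n_boot):
        pieces = [x[s:s + block_len] for s in starts[b]]
        means[b] = np.concatenate(pieces).mean()
    return means

x = simulate_ar1(500)
boot = block_bootstrap_means(x)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"sample mean: {x.mean():.3f}, 95% bootstrap CI: ({lo:.3f}, {hi:.3f})")
```

The block length trades off bias (blocks too short miss the dependence) against variance (blocks too long leave few distinct blocks); the book treats such choices, and alternatives like subsampling and sieve bootstraps, formally.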

Reviews

"The authors should be congratulated for providing many concise and compact proofs for various technical assertions in time series. (There are many seemingly inconspicuous but intriguing technical details in time series!) The authors' strength and perhaps also their preference in frequency domain methods are well-reflected in the treatments in Chapters 6, 7 and 9, and also some parts of Chapters 10 and 11. Chapter 12 introduces several of the most popular bootstrap methods for time series, including AR-sieve bootstrap, block bootstrap and frequency domain bootstrap. In terms of the mathematical level, the book is for students with a solid mathematical background. The style of the presentation would also better suit courses offered in statistics, mathematics or engineering programmes for which spectral analysis is pertinent." ~International Statistical Review

"The first eight chapters of this book mainly focus on understanding the structure of time series. From the ninth chapter onwards, they discuss statistical inference based on time series data. ... Since the book includes a large number of exercises, teachers of a course on time series may find this book useful. Overall, researchers working in the area of time series may also find this book a useful reference. Finally, applied researchers involved with time series data may also find this book helpful." ~ISCB News

"This new monograph by McElroy (US Census Bureau) and Politis (Univ. of California, San Diego) is a timely publication, as the more well-known time series monographs were published long ago (in the 1980s and 1990s). ... This volume stands out as an ideal source for readers exploring time series analysis both theoretically and empirically. ... Some unique topics are introduced, for example, information entropy in time series, time-series-specific statistical inference, and dependent data bootstrapping. The latter represents an important recent advancement in time series analysis." ~CHOICE

Preface xiii
1 Introduction 1(24)
1.1 Time Series Data 1(4)
1.2 Cycles in Time Series Data 5(3)
1.3 Spanning and Scaling Time Series 8(3)
1.4 Time Series Regression and Autoregression 11(5)
1.5 Overview 16(2)
1.6 Exercises 18(7)
2 The Probabilistic Structure of Time Series 25(28)
2.1 Random Vectors 25(4)
2.2 Time Series and Stochastic Processes 29(3)
2.3 Marginals and Strict Stationarity 32(3)
2.4 Autocovariance and Weak Stationarity 35(5)
2.5 Illustrations of Stochastic Processes 40(4)
2.6 Three Examples of White Noise 44(2)
2.7 Overview 46(1)
2.8 Exercises 47(6)
3 Trends, Seasonality, and Filtering 53(40)
3.1 Nonparametric Smoothing 53(3)
3.2 Linear Filters and Linear Time Series 56(2)
3.3 Some Common Types of Filters 58(4)
3.4 Trends 62(7)
3.5 Seasonality 69(7)
3.6 Trend and Seasonality Together 76(4)
3.7 Integrated Processes 80(4)
3.8 Overview 84(2)
3.9 Exercises 86(7)
4 The Geometry of Random Variables 93(36)
4.1 Vector Space Geometry and Inner Products 93(4)
4.2 L2(Ω, P, F): The Space of Random Variables with Finite Second Moment 97(1)
4.3 Hilbert Space Geometry [*] 98(3)
4.4 Projection in Hilbert Space 101(3)
4.5 Prediction of Time Series 104(4)
4.6 Linear Prediction of Time Series 108(3)
4.7 Orthonormal Sets and Infinite Projection 111(2)
4.8 Projection of Signals [*] 113(6)
4.9 Overview 119(1)
4.10 Exercises 120(9)
5 ARMA Models with White Noise Residuals 129(40)
5.1 Definition of the ARMA Recursion 129(3)
5.2 Difference Equations 132(5)
5.3 Stationarity and Causality of the AR(1) 137(3)
5.4 Causality of ARMA Processes 140(4)
5.5 Invertibility of ARMA Processes 144(3)
5.6 The Autocovariance Generating Function 147(5)
5.7 Computing ARMA Autocovariances via the MA Representation 152(3)
5.8 Recursive Computation of ARMA Autocovariances 155(4)
5.9 Overview 159(1)
5.10 Exercises 160(9)
6 Time Series in the Frequency Domain 169(38)
6.1 The Spectral Density 169(6)
6.2 Filtering in the Frequency Domain 175(6)
6.3 Inverse Autocovariances 181(4)
6.4 Spectral Representation of Toeplitz Covariance Matrices 185(4)
6.5 Partial Autocorrelations 189(4)
6.6 Application to Model Identification 193(3)
6.7 Overview 196(1)
6.8 Exercises 197(10)
7 The Spectral Representation [*] 207(40)
7.1 The Herglotz Theorem 207(5)
7.2 The Discrete Fourier Transform 212(3)
7.3 The Spectral Representation 215(5)
7.4 Optimal Filtering 220(5)
7.5 Kolmogorov's Formula 225(4)
7.6 The Wold Decomposition 229(3)
7.7 Spectral Approximation and the Cepstrum 232(5)
7.8 Overview 237(2)
7.9 Exercises 239(8)
8 Information and Entropy [*] 247(32)
8.1 Introduction 247(4)
8.2 Events and Information Sets 251(3)
8.3 Maximum Entropy Distributions 254(4)
8.4 Entropy in Time Series 258(4)
8.5 Markov Time Series 262(3)
8.6 Modeling Time Series via Entropy 265(3)
8.7 Relative Entropy and Kullback-Leibler Discrepancy 268(3)
8.8 Overview 271(1)
8.9 Exercises 272(7)
9 Statistical Estimation 279(46)
9.1 Weak Correlation and Weak Dependence 279(2)
9.2 The Sample Mean 281(5)
9.3 CLT for Weakly Dependent Time Series [*] 286(2)
9.4 Estimating Serial Correlation 288(3)
9.5 The Sample Autocovariance 291(4)
9.6 Spectral Means 295(6)
9.7 Statistical Properties of the Periodogram 301(5)
9.8 Spectral Density Estimation 306(5)
9.9 Refinements of Spectral Analysis 311(5)
9.10 Overview 316(2)
9.11 Exercises 318(7)
10 Fitting Time Series Models 325(60)
10.1 MA Model Identification 325(3)
10.2 EXP Model Identification [*] 328(3)
10.3 AR Model Identification 331(5)
10.4 Optimal Prediction Estimators 336(5)
10.5 Relative Entropy Minimization 341(4)
10.6 Computation of Optimal Predictors 345(4)
10.7 Computation of the Gaussian Likelihood 349(5)
10.8 Model Evaluation 354(5)
10.9 Model Parsimony and Information Criteria 359(2)
10.10 Model Comparisons 361(5)
10.11 Iterative Forecasting 366(4)
10.12 Applications to Imputation and Signal Extraction 370(3)
10.13 Overview 373(3)
10.14 Exercises 376(9)
11 Nonlinear Time Series Analysis 385(30)
11.1 Types of Nonlinearity 385(4)
11.2 The Generalized Linear Process [*] 389(3)
11.3 The ARCH Model 392(4)
11.4 The GARCH Model 396(4)
11.5 The Bi-spectral Density 400(4)
11.6 Volatility Filtering 404(5)
11.7 Overview 409(2)
11.8 Exercises 411(4)
12 The Bootstrap 415(52)
12.1 Sampling Distributions of Statistics 415(3)
12.2 Parameter Functionals and Monte Carlo 418(5)
12.3 The Plug-In Principle and the Bootstrap 423(4)
12.4 Model-Based Bootstrap and Residuals 427(6)
12.5 Sieve Bootstraps 433(6)
12.6 Time Frequency Toggle Bootstrap 439(5)
12.7 Subsampling 444(6)
12.8 Block Bootstrap Methods 450(8)
12.9 Overview 458(2)
12.10 Exercises 460(7)
A Probability 467(20)
A.1 Probability Spaces 467(3)
A.2 Random Variables 470(4)
A.3 Expectation and Variance 474(4)
A.4 Joint Distributions 478(4)
A.5 The Normal Distribution 482(1)
A.6 Exercises 483(4)
B Mathematical Statistics 487(20)
B.1 Data 487(2)
B.2 Sampling Distributions 489(2)
B.3 Estimation 491(2)
B.4 Inference 493(2)
B.5 Confidence Intervals 495(3)
B.6 Hypothesis Testing 498(4)
B.7 Exercises 502(5)
C Asymptotics 507(22)
C.1 Convergence Topologies 507(3)
C.2 Convergence Results for Random Variables 510(4)
C.3 Asymptotic Distributions 514(5)
C.4 Central Limit Theory for Time Series 519(9)
C.5 Exercises 528(1)
D Fourier Series 529(6)
D.1 Complex Random Variables 529(2)
D.2 Trigonometric Polynomials 531(4)
E Stieltjes Integration 535(12)
E.1 Deterministic Integration 535(3)
E.2 Stochastic Integration 538(9)
Index 547
Tucker S. McElroy is Senior Time Series Mathematical Statistician at the U.S. Census Bureau, where he has contributed to developing time series research and software for the last 15 years. He has published more than 80 papers and is a recipient of the Arthur S. Flemming award (2011).

Dimitris N. Politis is Distinguished Professor of Mathematics at the University of California at San Diego, where he also serves as Associate Director of the Halıcıoğlu Data Science Institute. He has co-authored two research monographs and more than 100 journal papers. He is a recipient of the Tjalling C. Koopmans Econometric Theory Prize (2009-2011) and is Co-Editor of the Journal of Time Series Analysis.