
Hidden Markov Models for Time Series: An Introduction Using R, Second Edition [Hardback]

4.00/5 (13 ratings by Goodreads)
Walter Zucchini (University of Göttingen, Germany), Iain L. MacDonald (University of Cape Town, South Africa), Roland Langrock (Bielefeld University, Germany)
Hidden Markov Models for Time Series: An Introduction Using R, Second Edition illustrates the great flexibility of hidden Markov models (HMMs) as general-purpose models for time series data. The book provides a broad understanding of the models and their uses.

After presenting the basic model formulation, the book covers estimation, forecasting, decoding, prediction, model selection, and Bayesian inference for HMMs. Through examples and applications, the authors describe how to extend and generalize the basic model so that it can be applied in a rich variety of situations.

The book demonstrates how HMMs can be applied to a wide range of types of time series: continuous-valued, circular, multivariate, binary, bounded and unbounded counts, and categorical observations. It also discusses how to employ the freely available computing environment R to carry out the computations.
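As a flavour of the computations the blurb refers to, the likelihood of a Poisson-HMM can be evaluated with a scaled forward recursion (the topic of Chapters 2 and 3). The sketch below is illustrative Python, not the authors' R code; the function names are invented for this example.

```python
from math import exp, factorial, log

def poisson_pmf(k, lam):
    # Poisson probability mass function P(X = k) for mean lam
    return exp(-lam) * lam ** k / factorial(k)

def poisson_hmm_loglik(x, gamma, lam, delta):
    """Log-likelihood of counts x under an m-state Poisson-HMM.

    gamma: m x m transition probability matrix (rows sum to 1)
    lam:   state-dependent Poisson means
    delta: initial state distribution
    Uses the scaled forward recursion: at each step the forward vector is
    normalized and the log of the scale factor accumulated, avoiding underflow.
    """
    m = len(lam)
    # initialization: delta_i * p_i(x_1)
    phi = [delta[i] * poisson_pmf(x[0], lam[i]) for i in range(m)]
    c = sum(phi)
    ll = log(c)
    phi = [p / c for p in phi]
    # recursion: phi_t = phi_{t-1} Gamma P(x_t), renormalized each step
    for t in range(1, len(x)):
        phi = [sum(phi[i] * gamma[i][j] for i in range(m)) * poisson_pmf(x[t], lam[j])
               for j in range(m)]
        c = sum(phi)
        ll += log(c)
        phi = [p / c for p in phi]
    return ll

# Example: two well-separated states with sticky transitions
gamma = [[0.9, 0.1], [0.2, 0.8]]
ll = poisson_hmm_loglik([1, 2, 10, 12, 1], gamma, [2.0, 10.0], [0.5, 0.5])
```

With a single state the recursion collapses to the ordinary independent-Poisson log-likelihood, which is a convenient sanity check.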

Features

- Presents an accessible overview of HMMs
- Explores a variety of applications in ecology, finance, epidemiology, climatology, and sociology
- Includes numerous theoretical and programming exercises
- Provides most of the analysed data sets online

New to the second edition

- A total of five chapters on extensions, including HMMs for longitudinal data, hidden semi-Markov models and models with continuous-valued state process
- New case studies on animal movement, rainfall occurrence and capture-recapture data

Reviews

"This book is an excellent resource for researchers of all levels, from undergraduate students to researchers already working with hidden Markov models. The book initially provides the mathematical theory and underlying intuition of hidden Markov models in a clear and concise manner before describing more advanced, recently developed techniques and a wide range of applications using real data. One focus of the book is the practical application of hidden Markov models. R code is usefully provided throughout the text (and combined within the appendix) aiding researchers in applying the techniques to their own problems, in addition to the description of some specific R packages. Thus the book is a valuable resource for both researchers new to hidden Markov models and as a reference for individuals already familiar with the models and concepts. In particular, the inclusion of the new Part II ("Extensions") for the second edition relating to the recent advanced techniques is an excellent addition, providing a clear description of state-of-the-art hidden Markov-type models and associated issues. Overall, the book is exceptionally well written and will be a well thumbed book in my collection." Ruth King, Thomas Bayes' Chair of Statistics, University of Edinburgh

"this is far and away the most accessible, up-to-date, and comprehensive introductory text on HMMs that there is, for students, applied statisticians, and indeed any quantitatively able researcher. It doubles as an excellent reference text for researchers who use HMMs. The addition of new R code and illustration of the use of HMM packages in R makes the text all the more useful, and the new chapters on applications in ecology and the environment will extend the appeal of the book into an area in which the huge potential of HMMs has only recently become apparent. If you want to find out about and use HMMs, ranging from the simplest to those at the cutting-edge research, this is the book for you!" David Borchers, Professor of Statistics, University of St Andrews

"The authoritative text on HMMs has become even better. This second edition is welcome and timely, filled with many examples of HMMs in the real world, and very useful snippets of code to help us get going. The authors have once again hit the jackpot." Trevor Hastie, Statistics Department, Stanford University

"The first edition of Hidden Markov Models for Time Series: An Introduction using R was the clearest and most comprehensive description of the theory and applications of HMMs in print. This new second edition from Zucchini et al contains a highly useful update to the already impressive body of material covered in the first edition. New additions include chapters on Hidden Semi-Markov Models, continuous-valued state processes, and new application sections detailing the use of HMMs for animal movement and survival estimation. The R code provided outlines key computational procedures and provides a workable foundation upon which researchers can build their own bespoke implementations of HMMs and understand the working of other software packages, which are now considered in detail. This book is structured in an accessible, yet thorough, manner which will be appreciated by statistically literate researchers and students from a variety of disciplines. This book is highly recommended for anyone wishing to understand or use Hidden Markov models." Dr. Toby Patterson, Senior Research Scientist, CSIRO Oceans and Atmosphere

"The first edition profoundly influenced my research and this new edition adds substantial material on R packages, hidden semi-Markov models and more. The book is a must have for any applied statistician interested in modeling incomplete encounter history or movement data for animals. The simplicity and generality of hidden Markov models make them an elegant solution for many applications and an essential method to have in an applied statistician's toolbox." Prof. Jeff Laake, Marine Mammal Laboratory, Alaska Fisheries Science Center, Seattle

"This book is an essential for all researchers in the area of hidden Markov models and indeed, more generally, in the broad arena of statistical modelling. The theory underpinning hidden Markov models (HMMs) is meticulously delineated and perfectly complemented by a broad range of applications chosen from real-world settings in, for example, finance, zoology and the health sciences. This second edition of the book now includes particularly valuable chapters on recent extensions to HMMs and intriguing new applications in ecology and the environment. Fragments of R code are provided throughout the text and in the Appendix and serve to fix ideas relating to both theory and practice. In summary, the book is a most welcome addition to the statistician's armoury and can be used both as a comprehensive reference work and as a well-crafted textbook." Linda Haines, Emeritus Professor, Department of Statistical Sciences, University of Cape Town.

Preface xxi
Preface to first edition xxiii
Notation and abbreviations xxvii
I Model structure, properties and methods 1(130)
1 Preliminaries: mixtures and Markov chains 3(26)
1.1 Introduction 3(3)
1.2 Independent mixture models 6(8)
1.2.1 Definition and properties 6(3)
1.2.2 Parameter estimation 9(2)
1.2.3 Unbounded likelihood in mixtures 11(1)
1.2.4 Examples of fitted mixture models 12(2)
1.3 Markov chains 14(9)
1.3.1 Definitions and example 14(3)
1.3.2 Stationary distributions 17(1)
1.3.3 Autocorrelation function 18(1)
1.3.4 Estimating transition probabilities 19(1)
1.3.5 Higher-order Markov chains 20(3)
Exercises 23(6)
2 Hidden Markov models: definition and properties 29(18)
2.1 A simple hidden Markov model 29(1)
2.2 The basics 30(4)
2.2.1 Definition and notation 30(2)
2.2.2 Marginal distributions 32(1)
2.2.3 Moments 33(1)
2.3 The likelihood 34(7)
2.3.1 The likelihood of a two-state Bernoulli-HMM 35(1)
2.3.2 The likelihood in general 36(3)
2.3.3 HMMs are not Markov processes 39(1)
2.3.4 The likelihood when data are missing 40(1)
2.3.5 The likelihood when observations are interval-censored 41(1)
Exercises 41(6)
3 Estimation by direct maximization of the likelihood 47(18)
3.1 Introduction 47(1)
3.2 Scaling the likelihood computation 48(2)
3.3 Maximization of the likelihood subject to constraints 50(3)
3.3.1 Reparametrization to avoid constraints 50(2)
3.3.2 Embedding in a continuous-time Markov chain 52(1)
3.4 Other problems 53(1)
3.4.1 Multiple maxima in the likelihood 53(1)
3.4.2 Starting values for the iterations 53(1)
3.4.3 Unbounded likelihood 53(1)
3.5 Example: earthquakes 54(2)
3.6 Standard errors and confidence intervals 56(3)
3.6.1 Standard errors via the Hessian 56(2)
3.6.2 Bootstrap standard errors and confidence intervals 58(1)
3.7 Example: the parametric bootstrap applied to the three-state model for the earthquakes data 59(1)
Exercises 60(5)
4 Estimation by the EM algorithm 65(16)
4.1 Forward and backward probabilities 65(4)
4.1.1 Forward probabilities 66(1)
4.1.2 Backward probabilities 67(1)
4.1.3 Properties of forward and backward probabilities 68(1)
4.2 The EM algorithm 69(5)
4.2.1 EM in general 70(1)
4.2.2 EM for HMMs 70(2)
4.2.3 M step for Poisson- and normal-HMMs 72(1)
4.2.4 Starting from a specified state 73(1)
4.2.5 EM for the case in which the Markov chain is stationary 73(1)
4.3 Examples of EM applied to Poisson-HMMs 74(3)
4.3.1 Earthquakes 74(2)
4.3.2 Foetal movement counts 76(1)
4.4 Discussion 77(1)
Exercises 78(3)
5 Forecasting, decoding and state prediction 81(16)
5.1 Introduction 81(1)
5.2 Conditional distributions 82(1)
5.3 Forecast distributions 83(2)
5.4 Decoding 85(7)
5.4.1 State probabilities and local decoding 86(2)
5.4.2 Global decoding 88(4)
5.5 State prediction 92(1)
5.6 HMMs for classification 93(1)
Exercises 94(3)
6 Model selection and checking 97(14)
6.1 Model selection by AIC and BIC 97(4)
6.2 Model checking with pseudo-residuals 101(5)
6.2.1 Introducing pseudo-residuals 101(4)
6.2.2 Ordinary pseudo-residuals 105(1)
6.2.3 Forecast pseudo-residuals 105(1)
6.3 Examples 106(3)
6.3.1 Ordinary pseudo-residuals for the earthquakes 106(2)
6.3.2 Dependent ordinary pseudo-residuals 108(1)
6.4 Discussion 109(1)
Exercises 109(2)
7 Bayesian inference for Poisson-hidden Markov models 111(12)
7.1 Applying the Gibbs sampler to Poisson-HMMs 111(3)
7.1.1 Introduction and outline 111(2)
7.1.2 Generating sample paths of the Markov chain 113(1)
7.1.3 Decomposing the observed counts into regime contributions 114(1)
7.1.4 Updating the parameters 114(1)
7.2 Bayesian estimation of the number of states 114(2)
7.2.1 Use of the integrated likelihood 115(1)
7.2.2 Model selection by parallel sampling 116(1)
7.3 Example: earthquakes 116(3)
7.4 Discussion 119(1)
Exercises 120(3)
8 R packages 123(8)
8.1 The package depmixS4 123(1)
8.1.1 Model formulation and estimation 123(1)
8.1.2 Decoding 124(1)
8.2 The package HiddenMarkov 124(2)
8.2.1 Model formulation and estimation 124(2)
8.2.2 Decoding 126(1)
8.2.3 Residuals 126(1)
8.3 The package msm 126(2)
8.3.1 Model formulation and estimation 126(2)
8.3.2 Decoding 128(1)
8.4 The package R2OpenBUGS 128(1)
8.5 Discussion 129(2)
II Extensions 131(66)
9 HMMs with general state-dependent distribution 133(12)
9.1 Introduction 133(1)
9.2 General univariate state-dependent distribution 133(3)
9.2.1 HMMs for unbounded counts 133(1)
9.2.2 HMMs for binary data 134(1)
9.2.3 HMMs for bounded counts 134(1)
9.2.4 HMMs for continuous-valued series 135(1)
9.2.5 HMMs for proportions 135(1)
9.2.6 HMMs for circular-valued series 136(1)
9.3 Multinomial and categorical HMMs 136(2)
9.3.1 Multinomial-HMM 136(1)
9.3.2 HMMs for categorical data 137(1)
9.3.3 HMMs for compositional data 138(1)
9.4 General multivariate state-dependent distribution 138(4)
9.4.1 Longitudinal conditional independence 138(2)
9.4.2 Contemporaneous conditional independence 140(1)
9.4.3 Further remarks on multivariate HMMs 141(1)
Exercises 142(3)
10 Covariates and other extra dependencies 145(10)
10.1 Introduction 145(1)
10.2 HMMs with covariates 145(3)
10.2.1 Covariates in the state-dependent distributions 146(1)
10.2.2 Covariates in the transition probabilities 147(1)
10.3 HMMs based on a second-order Markov chain 148(2)
10.4 HMMs with other additional dependencies 150(2)
Exercises 152(3)
11 Continuous-valued state processes 155(10)
11.1 Introduction 155(1)
11.2 Models with continuous-valued state process 156(4)
11.2.1 Numerical integration of the likelihood 157(1)
11.2.2 Evaluation of the approximate likelihood via forward recursion 158(2)
11.2.3 Parameter estimation and related issues 160(1)
11.3 Fitting an SSM to the earthquake data 160(2)
11.4 Discussion 162(3)
12 Hidden semi-Markov models and their representation as HMMs 165(22)
12.1 Introduction 165(1)
12.2 Semi-Markov processes, hidden semi-Markov models and approximating HMMs 165(2)
12.3 Examples of HSMMs represented as HMMs 167(6)
12.3.1 A simple two-state Poisson-HSMM 167(2)
12.3.2 Example of HSMM with three states 169(2)
12.3.3 A two-state HSMM with general dwell-time distribution in one state 171(2)
12.4 General HSMM 173(3)
12.5 R code 176(2)
12.6 Some examples of dwell-time distributions 178(3)
12.6.1 Geometric distribution 178(1)
12.6.2 Shifted Poisson distribution 178(1)
12.6.3 Shifted negative binomial distribution 179(1)
12.6.4 Shifted binomial distribution 180(1)
12.6.5 A distribution with unstructured start and geometric tail 180(1)
12.7 Fitting HSMMs via the HMM representation 181(1)
12.8 Example: earthquakes 182(2)
12.9 Discussion 184(1)
Exercises 184(3)
13 HMMs for longitudinal data 187(10)
13.1 Introduction 187(2)
13.2 Models that assume some parameters to be constant across component series 189(1)
13.3 Models with random effects 190(5)
13.3.1 HMMs with continuous-valued random effects 191(2)
13.3.2 HMMs with discrete-valued random effects 193(2)
13.4 Discussion 195(1)
Exercises 196(1)
III Applications 197(134)
14 Introduction to applications 199(2)
15 Epileptic seizures 201(6)
15.1 Introduction 201(1)
15.2 Models fitted 201(3)
15.3 Model checking by pseudo-residuals 204(2)
Exercises 206(1)
16 Daily rainfall occurrence 207(6)
16.1 Introduction 207(1)
16.2 Models fitted 207(6)
17 Eruptions of the Old Faithful geyser 213(14)
17.1 Introduction 213(1)
17.2 The data 213(1)
17.3 The binary time series of short and long eruptions 214(6)
17.3.1 Markov chain models 214(2)
17.3.2 Hidden Markov models 216(3)
17.3.3 Comparison of models 219(1)
17.3.4 Forecast distributions 219(1)
17.4 Univariate normal-HMMs for durations and waiting times 220(3)
17.5 Bivariate normal-HMM for durations and waiting times 223(1)
Exercises 224(3)
18 HMMs for animal movement 227(18)
18.1 Introduction 227(1)
18.2 Directional data 228(1)
18.2.1 Directional means 228(1)
18.2.2 The von Mises distribution 228(1)
18.3 HMMs for movement data 229(3)
18.3.1 Movement data 229(1)
18.3.2 HMMs as multi-state random walks 230(2)
18.4 A basic HMM for Drosophila movement 232(3)
18.5 HMMs and HSMMs for bison movement 235(3)
18.6 Mixed HMMs for woodpecker movement 238(4)
Exercises 242(3)
19 Wind direction at Koeberg 245(14)
19.1 Introduction 245(1)
19.2 Wind direction classified into 16 categories 245(6)
19.2.1 Three HMMs for hourly averages of wind direction 245(3)
19.2.2 Model comparisons and other possible models 248(3)
19.3 Wind direction as a circular variable 251(6)
19.3.1 Daily at hour 24: von Mises-HMMs 251(2)
19.3.2 Modelling hourly change of direction 253(1)
19.3.3 Transition probabilities varying with lagged speed 253(1)
19.3.4 Concentration parameter varying with lagged speed 254(3)
Exercises 257(2)
20 Models for financial series 259(16)
20.1 Financial series I: A multivariate normal-HMM for returns on four shares 259(3)
20.2 Financial series II: Discrete state-space stochastic volatility models 262(11)
20.2.1 Stochastic volatility models without leverage 263(2)
20.2.2 Application: FTSE 100 returns 265(1)
20.2.3 Stochastic volatility models with leverage 265(3)
20.2.4 Application: TOPIX returns 268(2)
20.2.5 Non-standard stochastic volatility models 270(1)
20.2.6 A model with a mixture AR(1) volatility process 271(1)
20.2.7 Application: S&P 500 returns 272(1)
Exercises 273(2)
21 Births at Edendale Hospital 275(12)
21.1 Introduction 275(1)
21.2 Models for the proportion Caesarean 275(7)
21.3 Models for the total number of deliveries 282(3)
21.4 Conclusion 285(2)
22 Homicides and suicides in Cape Town, 1986-1991 287(10)
22.1 Introduction 287(1)
22.2 Firearm homicides as a proportion of all homicides, suicides and legal intervention homicides 287(2)
22.3 The number of firearm homicides 289(2)
22.4 Firearm homicides as a proportion of all homicides, and firearm suicides as a proportion of all suicides 291(4)
22.5 Proportion in each of the five categories 295(2)
23 A model for animal behaviour which incorporates feedback 297(20)
23.1 Introduction 297(1)
23.2 The model 298(2)
23.3 Likelihood evaluation 300(2)
23.3.1 The likelihood as a multiple sum 301(1)
23.3.2 Recursive evaluation 301(1)
23.4 Parameter estimation by maximum likelihood 302(1)
23.5 Model checking 302(1)
23.6 Inferring the underlying state 303(1)
23.7 Models for a heterogeneous group of subjects 304(2)
23.7.1 Models assuming some parameters to be constant across subjects 304(1)
23.7.2 Mixed models 305(1)
23.7.3 Inclusion of covariates 306(1)
23.8 Other modifications or extensions 306(1)
23.8.1 Increasing the number of states 306(1)
23.8.2 Changing the nature of the state-dependent distribution 306(1)
23.9 Application to caterpillar feeding behaviour 307(7)
23.9.1 Data description and preliminary analysis 307(1)
23.9.2 Parameter estimates and model checking 307(4)
23.9.3 Runlength distributions 311(2)
23.9.4 Joint models for seven subjects 313(1)
23.10 Discussion 314(3)
24 Estimating the survival rates of Soay sheep from mark-recapture-recovery data 317(14)
24.1 Introduction 317(1)
24.2 MRR data without use of covariates 318(3)
24.3 MRR data involving individual-specific time-varying continuous-valued covariates 321(3)
24.4 Application to Soay sheep data 324(4)
24.5 Conclusion 328(3)
A Examples of R code 331(10)
A.1 The functions 331(7)
A.1.1 Transforming natural parameters to working 332(1)
A.1.2 Transforming working parameters to natural 332(1)
A.1.3 Computing minus the log-likelihood from the working parameters 332(1)
A.1.4 Computing the MLEs, given starting values for the natural parameters 333(1)
A.1.5 Generating a sample 333(1)
A.1.6 Global decoding by the Viterbi algorithm 334(1)
A.1.7 Computing log(forward probabilities) 334(1)
A.1.8 Computing log(backward probabilities) 334(1)
A.1.9 Conditional probabilities 335(1)
A.1.10 Pseudo-residuals 336(1)
A.1.11 State probabilities 336(1)
A.1.12 State prediction 336(1)
A.1.13 Local decoding 337(1)
A.1.14 Forecast probabilities 337(1)
A.2 Examples of code using the above functions 338(3)
A.2.1 Fitting Poisson-HMMs to the earthquakes series 338(1)
A.2.2 Forecast probabilities 339(2)
B Some proofs 341(4)
B.1 A factorization needed for the forward probabilities 341(1)
B.2 Two results needed for the backward probabilities 342(1)
B.3 Conditional independence of X1, ..., Xt and Xt+1, ..., XT 343(2)
References 345(14)
Author index 359(6)
Subject index 365
Walter Zucchini, Iain L. MacDonald, Roland Langrock