Preface  xv
Editors  xvii
Contributors  xix
List of Symbols  xxi

I Foundations and Methods  1
|
1 Introduction to Finite Mixtures  3
  1.1 Introduction and Motivation  3
    1.1.3 Latent allocation variables  6
    1.2.2 Continuous mixtures  9
    1.2.3 Finite mixtures with nonparametric components  10
    1.2.4 Covariates and mixtures of experts  11
    1.2.5 Hidden Markov models  11
  1.3 Some Technical Concerns  13
    1.4.1 Frequentist inference, and the role of EM  14
    1.4.2 Bayesian inference, and the role of MCMC  15
    1.4.3 Variable number of components  16
    1.4.4 Modes versus components  16
    1.4.5 Clustering and classification  17
|
2 EM Methods for Finite Mixtures  21
    2.2.1 Description of EM for finite mixtures  22
    2.2.2 EM as an alternating-maximization algorithm  24
  2.3 Convergence and Behavior of EM  25
  2.4 Cousin Algorithms of EM  26
    2.4.1 Stochastic versions of the EM algorithm  27
    2.4.2 The Classification EM algorithm  29
  2.5 Accelerating the EM Algorithm  30
  2.6 Initializing the EM Algorithm  32
    2.6.1 Random initialization  33
    2.6.2 Hierarchical initialization  33
    2.6.3 Recursive initialization  33
  2.7 Avoiding Spurious Local Maximizers  34
|
3 An Expansive View of EM Algorithms  41
  Prabhani Kuruppumullage Don
  3.2 The Product-of-Sums Formulation  42
    3.2.1 Iterative algorithms and the ascent property  43
    3.2.2 Creating a minorizing surrogate function  43
  3.3 Likelihood as a Product of Sums  44
  3.4 Non-standard Examples of EM Algorithms  46
  3.5 Stopping Rules for EM Algorithms  50
|
4 Bayesian Mixture Models: Theory and Methods  53
  4.2 Bayesian Mixtures: From Priors to Posteriors  54
    4.2.1 Models and representations  54
    4.2.2 Impact of the prior distribution  56
      4.2.2.2 Improper and non-informative priors  57
      4.2.2.3 Data-dependent priors  59
      4.2.2.4 Priors for overfitted mixtures  60
  4.3 Asymptotic Properties of the Posterior Distribution in the Finite Case  61
    4.3.1 Posterior concentration around the marginal density  61
    4.3.2 Recovering the parameters in the well-behaved case  63
    4.3.3 Boundary parameters: overfitted mixtures  64
    4.3.4 Asymptotic behaviour of posterior estimates of the number of components  68
|
5 Computational Solutions for Bayesian Inference in Mixture Models  73
  5.2 Algorithms for Posterior Sampling  75
    5.2.1 A computational problem? Which computational problem?  75
    5.2.3 Metropolis-Hastings schemes  80
    5.2.4 Reversible jump MCMC  81
    5.2.5 Sequential Monte Carlo  82
  5.3 Bayesian Inference in the Model-Based Clustering Context  83
    5.4.1 Known number of components  86
    5.4.2 Unknown number of components  87
  5.5 Gibbs Sampling for High-Dimensional Mixtures  88
    5.5.1 Determinant coefficient of determination  89
    5.5.2 Simulation study using the determinant criterion  91
|
6 Bayesian Nonparametric Mixture Models  97
  6.2 Dirichlet Process Mixtures  100
    6.2.1 The Dirichlet process prior  100
    6.2.2 Posterior simulation in Dirichlet process mixture models  102
    6.2.3 Dependent mixtures - the dependent Dirichlet process model  104
  6.3 Normalized Generalized Gamma Process Mixtures  104
    6.3.2 Posterior simulation for normalized generalized gamma process mixtures  106
  6.4 Bayesian Nonparametric Mixtures with Random Partitions  108
    6.4.1 Locally weighted mixtures  108
    6.4.2 Conditional regression  109
  6.5 Repulsive Mixtures (Determinantal Point Process)  110
|
7 Model Selection for Mixture Models - Perspectives and Strategies  117
  Sylvia Frühwirth-Schnatter
  7.2 Selecting G as a Density Estimation Problem  119
    7.2.1 Testing the order of a finite mixture through likelihood ratio tests  121
    7.2.2 Information criteria for order selection  122
      7.2.2.2 The slope heuristics  123
      7.2.2.4 The minimum message length  125
    7.2.3 Bayesian model choice based on marginal likelihoods  126
      7.2.3.1 Chib's method, limitations and extensions  126
      7.2.3.2 Sampling-based approximations  127
  7.3 Selecting G in the Framework of Model-Based Clustering  131
    7.3.1 Mixtures as partition models  133
    7.3.2 Classification-based information criteria  134
      7.3.2.1 The integrated complete-data likelihood criterion  134
      7.3.2.2 The conditional classification likelihood  135
      7.3.2.3 Exact derivation of the ICL  137
    7.3.3 Bayesian clustering  138
    7.3.4 Selecting G under model misspecification  140
  7.4 One-Sweep Methods for Cross-model Inference on G  141
    7.4.1 Overfitting mixtures  142
    7.4.2 Reversible jump MCMC  142
    7.4.3 Allocation sampling  142
    7.4.4 Bayesian nonparametric methods  143
    7.4.5 Sparse finite mixtures for model-based clustering  145
II Mixture Modelling and Extensions  155

8 Model-Based Clustering  157
    8.1.1 Heuristic clustering  159
    8.1.2 From k-means to Gaussian mixture modelling  160
    8.1.3 Specifying the clustering problem  161
    8.2.1 Components corresponding to clusters  165
    8.2.2 Combining components into clusters  168
    8.2.3 Selecting the clustering base  170
    8.2.4 Selecting the number of clusters  172
  8.3 Post-processing the Fitted Model  173
    8.3.1 Identifying the model  173
    8.3.2 Determining a partition  173
    8.3.3 Characterizing clusters  175
    8.3.4 Validating clusters  175
    8.3.5 Visualizing cluster solutions  177
  8.4 Illustrative Applications  180
    8.4.1 Bioinformatics: Analysing gene expression data  181
    8.4.2 Marketing: Determining market segments  181
    8.4.3 Psychology and sociology: Revealing latent structures  182
    8.4.4 Economics and finance: Clustering time series  183
    8.4.5 Medicine and biostatistics: Unobserved heterogeneity  183
|
9 Mixture Modelling of Discrete Data  193
  9.2 Mixtures of Univariate Count Data  194
    9.2.2 Finite mixtures of Poisson and related distributions  195
    9.2.3 Zero-inflated models  197
    9.3.1 Mixtures of time series count data  197
    9.3.2 Hidden Markov models  198
    9.3.3 Mixture of regression models for discrete data  199
  9.4 Mixtures of Multivariate Count Data  200
    9.4.1 Some models for multivariate counts  200
      9.4.1.1 Multivariate reduction approach  200
    9.4.2 Finite mixture for multivariate counts  204
      9.4.2.1 Conditional independence  204
      9.4.2.2 Conditional dependence  205
      9.4.2.3 Finite mixtures of multivariate Poisson distributions  205
    9.4.3 Zero-inflated multivariate models  206
    9.4.4 Copula-based models  206
    9.4.5 Finite mixture of bivariate Poisson regression models  208
  9.5 Other Mixtures for Discrete Data  209
    9.5.1 Latent class models  209
    9.5.2 Mixtures for ranking data  209
    9.5.3 Mixtures of multinomial distributions  210
    9.5.4 Mixtures of Markov chains  211
|
10 Continuous Mixtures with Skewness and Heavy Tails  219
|
11 Mixture Modelling of High-Dimensional Data  239
  11.2 High-Dimensional Data  240
    11.2.1 Continuous data: Italian wine  241
    11.2.2 Categorical data: lower back pain  241
    11.2.3 Mixed data: prostate cancer  242
  11.3 Mixtures for High-Dimensional Data  242
    11.3.1 Curse of dimensionality/modeling issues  243
  11.4 Mixtures for Continuous Data  245
    11.4.1 Diagonal covariance  246
    11.4.2 Eigendecomposed covariance  247
    11.4.3 Mixtures of factor analyzers and probabilistic principal components analyzers  248
    11.4.4 High-dimensional models  249
  11.5 Mixtures for Categorical Data  251
    11.5.1 Local independence models and latent class analysis  251
  11.6 Mixtures for Mixed Data  253
    11.7.1 Wrapper-based methods  254
    11.7.2 Stepwise approaches for continuous data  255
    11.7.3 Stepwise approaches for categorical data  257
    11.8.1 Continuous data: Italian wine  258
    11.8.2 Categorical data: lower back pain  260
    11.8.3 Mixed data: prostate cancer  262
|
12 Mixture of Experts Models  271
  Sylvia Frühwirth-Schnatter
  12.2 The Mixture of Experts Framework  272
    12.2.1 A mixture of experts model  272
    12.2.3 The suite of mixture of experts models  274
  12.3 Statistical Inference for Mixture of Experts Models  276
    12.3.1 Maximum likelihood estimation  276
    12.3.2 Bayesian estimation  278
  12.4 Illustrative Applications  282
    12.4.1 Analysing marijuana use through mixture of experts Markov chain models  282
    12.4.2 A mixture of experts model for ranked preference data  285
    12.4.3 A mixture of experts latent position cluster model  289
  12.5 Identifiability of Mixture of Experts Models  293
    12.5.1 Identifiability of binomial mixtures  294
    12.5.2 Identifiability for mixtures of regression models  296
    12.5.3 Identifiability for mixture of experts models  300
|
13 Hidden Markov Models in Time Series, with Applications in Economics  309
  13.2 Regime Switching: Mixture Modelling over Time  312
    13.2.1 Preliminaries and model specification  312
    13.2.2 The functional form of state transition  313
      13.2.2.1 Time-invariant switching  313
      13.2.2.2 Time-varying switching  314
      13.2.2.3 Nested alternatives  315
    13.2.4 Some considerations on parameterization  316
    13.2.5 Stability conditions: combining stable and unstable processes  317
    13.3.1 The complete-data likelihood and the FFBS algorithm  322
    13.3.2 Maximum likelihood estimation  322
    13.3.3 Bayesian estimation  324
      13.3.3.1 Prior specifications for the transition distribution  324
      13.3.3.2 Posterior inference  325
    13.3.4 Sampler efficiency: logit versus probit  327
    13.3.5 Posterior state identification  329
  13.4 Informative Regime Switching in Applications  331
    13.4.1 Time-invariant switching  331
      13.4.1.1 Unconditional switching  331
      13.4.1.2 Structured Markov switching  331
    13.4.2 Time-varying switching  333
      13.4.2.1 Duration dependence and state-identifying restrictions  333
      13.4.2.2 Shape restrictions  334
|
14 Mixtures of Nonparametric Components and Hidden Markov Models  343
  14.2 Mixtures with One Known Component  345
    14.2.1 The case where the other component is symmetric  345
    14.2.2 Mixture of a uniform and a non-decreasing density  345
  14.3 Translation Mixtures  346
    14.3.1 Translation of a symmetric density  346
    14.3.2 Translation of any distribution and hidden Markov models  347
  14.4 Multivariate Mixtures  349
    14.4.2 Estimation with spectral methods  351
    14.4.3 Estimation with nonparametric methods  351
    14.4.4 Hidden Markov models  354
    14.5.3 Semi-parametric estimation  356
    14.5.4 Regressions with random (observed or non-observed) design  356
|
III Selected Applications  361
|
15 Applications in Industry  363
  15.2 Mixtures for Monitoring  364
  15.3 Health Resource Usage  366
    15.3.1 Assessing the effectiveness of a measles vaccination  366
    15.3.2 Spatio-temporal disease mapping: identifying unstable trends in congenital malformations  367
    15.4.2 Resulting clusters  371
    15.5.2 Posterior sampling and summaries  378
|
16 Mixture Models for Image Analysis  385
  16.2 Hidden Markov Model Based Clustering  386
    16.2.2 Markov random fields: Potts model and extensions  387
    16.2.3 Hidden Markov field with independent noise  388
  16.3 Markov Model Based Segmentation via Variational EM  389
    16.3.1 Links with the iterated conditional mode and the Gibbs sampler  392
  16.4 Illustration: MRI Brain Scan Segmentation  393
    16.4.1 Healthy brain tissue and structure segmentation  393
      16.4.1.1 A Markov random field approach to segmentation and registration  394
      16.4.1.2 Experiments: Joint tissue and structure segmentation  397
    16.4.2 Brain tumor detection from multiple MR sequences  398
      16.4.2.1 Tissue interaction modelling  398
      16.4.2.2 Experiments: Lesion segmentation  400
|
17 Applications in Finance  407
  17.2 Finite Mixture Models  408
    17.2.1 i.i.d. mixture models with volatility dynamics  408
    17.2.2 Markov switching models  408
    17.2.3 Markov switching volatility models  411
  17.3 Infinite Mixture Models  414
    17.3.1 Dirichlet process mixture model  414
    17.3.2 GARCH-DPM and SV-DPM  420
    17.3.3 Infinite hidden Markov model  425
|
18 Applications in Genomics  439
  18.2 Mixture Models in Transcriptome and Genome Analysis  440
    18.2.1 Analyzing the genetic structure of a population  440
    18.2.2 Finding sets of co-transcribed genes  441
    18.2.3 Variable selection for clustering with Gaussian mixture models  443
    18.2.4 Mixture models in the specific case of multiple testing  444
  18.3 Hidden Markov Models in Genomics: Some Specificities  446
    18.3.1 A typical case: Copy number variations  446
    18.3.2 Complex emission distributions  447
    18.3.3 Complex hidden states  450
    18.3.4 Non-standard hidden Markov structures  451
  18.4 Complex Dependency Structures  453
    18.4.1 Markov random fields  453
    18.4.2 Stochastic block model  454
|
19 Applications in Astronomy  463
  19.2 Clusters of Stars and Galaxies  464
    19.2.2 Young star clusters  466
      19.2.2.1 Star-cluster models  467
      19.2.2.2 Model fitting and validation  469
      19.2.2.3 Results from the mixture model approach  471
  19.3 Classification of Astronomical Objects  472
    19.3.1 Tests for multiple components  472
    19.3.2 Two or three classes of gamma-ray bursts?  474
    19.3.3 Removal of contaminants  475
    19.3.4 Red and blue galaxies  476
  19.4 Advanced Mixture Model Applications  477
    19.4.1 Regression with heteroscedastic uncertainties  477
    19.4.2 Deconvolution of distributions from data with heteroscedastic errors and missing information  479
Index  491