Preface  xv

Residuals versus fitted values  4
Residuals versus the predictors  6
Residuals versus the response  7
|
Introduction to 2D Scatterplots  14
Response plots in simple regression  14
New Zealand horse mussels  15
Transforming y via inverse response plots  20
Response transformations: Mussel data  24
Regression graphics in the 1920s  32
Ezekiel's successive approximations  32
|
Constructing 3D Scatterplots  40
Getting an impression of 3D  40
|
Interpreting 3D Scatterplots  47
Structural dimensionality  49
One-dimensional structure  51
Two-dimensional structure  55
Identifying semiparametric regression functions  56
Assessing structural dimensionality  58
A visual metaphor for structural dimension  59
A first method for deciding d = 1 or 2  59
Using uncorrelated 2D views  65
Uncorrelated 2D views: Haystack data  67
Intraslice orthogonalization  71
|
Binary Response Variables  78
Comparison with previous checking methods  84
Exploiting the binary response  85
Australian Institute of Sport  86
Visualizing a logistic model  94
Conditionally normal predictors  95
Other predictor distributions  98
|
Dimension-Reduction Subspaces  101
Dimension-reduction subspaces  103
Guaranteeing S_y|x by constraining  108
Importance of central subspaces  112
|
Introduction to graphical regression  120
Example: Linear regression  125
Example: S_y|x1 = S(η1), but S_y|x2 ≠ S(η2)  126
Location regressions for the predictors  128
Elliptically contoured distributions  129
Elliptically contoured predictors  131
Example: x1 independent of x2, but S_y|x1 ≠ S(η1)  137
Conditions for S_y|x1 = S(η1)  137
Marginal consistency assumption  139
Visual fitting with h-level response plots  140
|
Fitting with linear kernels  143
Using the Li-Duan Proposition  146
The predictor distribution  150
Reweighting for elliptical contours  153
|
Graphical Regression Studies  159
|
Inverse Regression Graphics  187
Inverse regression function  187
Mean checking condition: Wheat protein  193
Mean checking condition: Mussel data  194
Inverse variance function  196
Variance checking condition  199
Variance checking condition: Ethanol data  200
|
Sliced Inverse Regression  203
Inverse regression subspace  203
Asymptotic distribution of Λ_d  206
Distribution of Λ_d with constraints  210
|
Principal Hessian Directions  224
Connecting S_e|z and S_ezz when  227
E(z | ρ^T z) = P_ρ z and Var(z | ρ^T z) = Q_ρ  230
z is normally distributed  231
Asymptotic distribution of Δ_k  232
An algorithm for inference on k  235
Asymptotic distribution of Δ_k with constraints  236
Testing e independent of z  238
Using stronger assumptions  243
|
Studying Predictor Effects  254
Introduction to net-effect plots  254
Natural rubber: Net-effect plots  255
Reducing brushing dimensions  259
Post-model net-effect plots  264
Minneapolis schools again  269
|
Predictor Transformations  272
CERES plots when E(x1 | x2) is  279
Highly dependent predictors  284
Transforming more than one predictor  289
Environmental contamination  296
Assessing relative importance  297
|
Graphics for Model Assessment  303
Using residuals in graphical regression  306
Residual plots: Kyphosis data  309
Interpreting residual plots  312
Marginal regression functions  314
Marginal variance functions  316
Marginal model plots: Kyphosis data  325

Bibliography  329
Author Index  339
Subject Index  343