
E-book: Regression Graphics: Ideas for Studying Regressions Through Graphics [Wiley Online]

R. Dennis Cook (The University of Minnesota, St. Paul)
  • Wiley Online
  • Price: €200.89*
* This price includes unlimited concurrent access for an unlimited period.
Developed from lecture notes for a one-quarter graduate course for students with at least a year of mathematical statistics and two quarters of linear models behind them. Focuses narrowly on new ideas for the graphical representation of statistics made possible by the growing power of computers. Emphasizes a relatively new statistical context for regression and regression graphics that is intended to blend with, rather than replace, more traditional paradigms for regression analysis. Annotation © Book News, Inc., Portland, OR.

An exploration of regression graphics through computer graphics.

Recent developments in computer technology have stimulated new and exciting uses for graphics in statistical analyses. Regression Graphics, one of the first graduate-level textbooks on the subject, demonstrates how statisticians, both theoretical and applied, can use these exciting innovations. After developing a relatively new regression context that requires few scope-limiting conditions, Regression Graphics guides readers through the process of analyzing regressions graphically and assessing and selecting models. This innovative reference makes use of a wide range of graphical tools, including 2D and 3D scatterplots, 3D binary response plots, and scatterplot matrices. Supplemented by a companion FTP site, it features numerous data sets and applied examples that are used to elucidate the theory.

Other important features of this book include:
* Extensive coverage of a relatively new regression context based on dimension-reduction subspaces and sufficient summary plots
* Graphical regression, an iterative visualization process for constructing sufficient regression views
* Graphics for regressions with a binary response
* Graphics for model assessment, including residual plots
* Net-effects plots for assessing predictor contributions
* Graphics for predictor and response transformations
* Inverse regression methods
* Access to a Web site of supplemental plots, data sets, and 3D color displays.

An ideal text for students in graduate-level courses on statistical analysis, Regression Graphics is also an excellent reference for professional statisticians.
Preface xv
Introduction 1(13)
C C & I 1(3)
Construction 1(1)
Characterization 2(1)
Inference 3(1)
Illustrations 4(5)
Residuals versus fitted values 4(2)
Residuals versus the predictors 6(1)
Residuals versus the response 7(2)
On things to come 9(1)
Notational conventions 10(4)
Problems 13(1)
Introduction to 2D Scatterplots 14(26)
Response plots in simple regression 14(1)
New Zealand horse mussels 15(5)
Transforming y via inverse response plots 20(5)
Response transformations 21(3)
Response transformations: Mussel data 24(1)
Danish twins 25(4)
Scatterplot matrices 29(3)
Construction 29(2)
Example 31(1)
Regression graphics in the 1920s 32(5)
Ezekiel's successive approximations 32(2)
Bean's graphic method 34(3)
Discussion 37(3)
Problems 38(2)
Constructing 3D Scatterplots 40(7)
Getting an impression of 3D 40(2)
Depth cuing 42(1)
Scaling 43(1)
Orthogonalization 44(3)
Problems 46(1)
Interpreting 3D Scatterplots 47(31)
Haystacks 47(2)
Structural dimensionality 49(2)
One predictor 49(1)
Two predictors 50(1)
Many predictors 51(1)
One-dimensional structure 51(4)
Two-dimensional structure 55(3)
Removing linear trends 55(1)
Identifying semiparametric regression functions 56(2)
Assessing structural dimensionality 58(5)
A visual metaphor for structural dimension 59(1)
A first method for deciding d = 1 or 2 59(2)
Natural rubber 61(2)
Assessment methods 63(15)
Using independence 64(1)
Using uncorrelated 2D views 65(2)
Uncorrelated 2D views: Haystack data 67(2)
Intraslice residuals 69(2)
Intraslice orthogonalization 71(1)
Mussels again 72(1)
Discussion 73(1)
Problems 74(4)
Binary Response Variables 78(23)
One predictor 78(1)
Two predictors 79(7)
Checking 0D structure 82(1)
Checking 1D structure 82(2)
Comparison with previous checking methods 84(1)
Exploiting the binary response 85(1)
Illustrations 86(5)
Australian Institute of Sport 86(3)
Kyphosis data 89(2)
Three predictors 91(3)
Checking 1D structure 91(2)
Kyphosis data again 93(1)
Visualizing a logistic model 94(7)
Conditionally normal predictors 95(3)
Other predictor distributions 98(1)
Problems 99(2)
Dimension-Reduction Subspaces 101(19)
Overview 101(2)
Dimension-reduction subspaces 103(2)
Central subspaces 105(3)
Guaranteeing S_{y|x} by constraining 108(4)
the distribution of x 108(3)
the distribution of y | x 111(1)
Importance of central subspaces 112(2)
h-Level response plots 114(6)
Problems 117(3)
Graphical Regression 120(23)
Introduction to graphical regression 120(4)
Capturing S_{y|x1} 124(3)
Example: Linear regression 125(1)
Example: S_{y|x1} = S(η1), but S_{y|x2} ≠ S(η2) 126(1)
Forcing S_{y|x1} ⊂ S(η1) 127(7)
Location regressions for the predictors 128(1)
Elliptically contoured distributions 129(2)
Elliptically contoured predictors 131(3)
Improving resolution 134(3)
Forcing S_{y|x1} = S(η1) 137(3)
Example: x1 independent of x2, but S_{y|x1} ≠ S(η1) 137(1)
Conditions for S_{y|x1} = S(η1) 137(2)
Marginal consistency assumption 139(1)
Visual fitting with h-level response plots 140(3)
Problems 142(1)
Getting Numerical Help 143(16)
Fitting with linear kernels 143(4)
Isomerization data 145(1)
Using the Li-Duan Proposition 146(1)
Quadratic kernels 147(3)
The predictor distribution 150(3)
Reweighting for elliptical contours 153(6)
Voronoi weights 154(1)
Target distribution 155(1)
Modifying the predictors 156(2)
Problems 158(1)
Graphical Regression Studies 159(28)
Naphthalene data 159(16)
Naphthoquinone, Y_N 161(9)
Phthalic anhydride, Y_P 170(5)
Wheat protein 175(4)
Reaction yield 179(5)
Discussion 184(3)
Problems 184(3)
Inverse Regression Graphics 187(16)
Inverse regression function 187(9)
Mean checking condition 191(2)
Mean checking condition: Wheat protein 193(1)
Mean checking condition: Mussel data 194(2)
Inverse variance function 196(7)
Variance checking condition 199(1)
Variance checking condition: Ethanol data 200(1)
Problems 201(2)
Sliced Inverse Regression 203(21)
Inverse regression subspace 203(1)
SIR 204(2)
Asymptotic distribution of Λ_d 206(7)
Overview 206(2)
The general case 208(2)
Distribution of Λ_d with constraints 210(3)
SIR: Mussel data 213(3)
Minneapolis schools 216(4)
Discussion 220(4)
Problems 222(2)
Principal Hessian Directions 224(30)
Incorporating residuals 225(2)
Connecting S_{e|z} and S_{ezz} when 227(4)
E(z | ρᵀz) = P_ρ z 227(3)
E(z | ρᵀz) = P_ρ z and Var(z | ρᵀz) = Q_ρ 230(1)
z is normally distributed 231(1)
Estimation and testing 231(7)
Asymptotic distribution of Δ_k 232(3)
An algorithm for inference on k 235(1)
Asymptotic distribution of Δ_k with constraints 236(2)
Testing e independent of z 238(1)
pHd: Reaction yield 238(5)
OLS and SIR 239(1)
pHd test results 240(1)
Subtracting β 241(2)
Using stronger assumptions 243(1)
pHd: Mussel data 243(5)
pHd test results 244(2)
Simulating the response 246(1)
Using Voronoi weights 246(2)
pHd: Haystacks 248(1)
Discussion 249(5)
pHd with the response 249(1)
Additional developments 250(1)
Problems 251(3)
Studying Predictor Effects 254(18)
Introduction to net-effect plots 254(5)
Natural rubber: Net-effect plots 255(1)
Joint normality 256(1)
Slicing 257(2)
Reducing brushing dimensions 259(1)
Distributional indices 259(7)
Example 260(2)
Location dependence 262(2)
Post-model net-effect plots 264(1)
Bivariate SIR 265(1)
Global net-effect plots 266(6)
Tar 268(1)
Minneapolis schools again 269(1)
Problems 270(2)
Predictor Transformations 272(31)
CERES plots 273(6)
Motivation 273(1)
Estimating α1 274(2)
Example 276(3)
CERES plots when E(x1 | x2) is 279(4)
Constant 279(1)
Linear in x2 280(1)
Quadratic in x2 281(2)
CERES plots in practice 283(7)
Highly dependent predictors 284(1)
Using many CERES plots 285(4)
Transforming more than one predictor 289(1)
Big Mac data 290(4)
Added-variable plots 294(2)
Environmental contamination 296(7)
Assessing relative importance 297(1)
Data analysis 298(4)
Problems 302(1)
Graphics for Model Assessment 303(26)
Residual plots 304(9)
Rationale 304(1)
Isomerization data 305(1)
Using residuals in graphical regression 306(3)
pHd 309(1)
Residual plots: Kyphosis data 309(3)
Interpreting residual plots 312(1)
Assessing model adequacy 313(16)
Marginal regression functions 314(2)
Marginal variance functions 316(1)
Marginal model plots 317(2)
Isomerization data again 319(3)
Reaction yield data 322(1)
Tomato tops 322(3)
Marginal model plots: Kyphosis data 325(2)
Problems 327(2)
Bibliography 329(10)
Author Index 339(4)
Subject Index 343


R. DENNIS COOK is Professor, Department of Applied Statistics, University of Minnesota. He is the coauthor of An Introduction to Regression Graphics and numerous articles on regression and experimental design. He received his PhD in statistics from Kansas State University and is a Fellow of the American Statistical Association and the Institute of Mathematical Statistics.