List of partial statistical tables  xi
Preface  xiii
|
1 Introduction to statistics and simple descriptive statistics  1
1.1 Statistics and scientific enquiry  1
1.2 …  3
1.2.1 Variables and constants  3
1.2.2 Scales of measurement  4
1.2.3 Accuracy and precision  6
1.2.4 Independent and dependent variables  6
1.2.5 Control and experimental groups  7
1.2.6 Samples and statistics, populations and parameters. Descriptive and inferential statistics. A few words about sampling  8
1.3 …  9
1.4 Chapter 1 key concepts  12
…  12
|
2 The first step in data analysis: summarizing and displaying data. Computing descriptive statistics  13
2.1 Frequency distributions  13
2.1.1 Frequency distributions of discontinuous numeric and qualitative variables  13
2.1.2 Frequency distributions of continuous numeric variables  15
2.1.3 Stem-and-leaf displays of data  17
2.2 …  18
2.2.1 Bar graphs and pie charts  19
2.2.2 …  21
2.2.3 …  21
2.2.4 …  21
2.3 Descriptive statistics. Measures of central tendency and dispersion  25
2.3.1 Measures of central tendency  26
2.3.2 Measures of variation  29
2.4 Chapter 2 key concepts  39
…  40
…  40
|
3 Probability and statistics  42
3.1 Random sampling and probability distributions  43
3.2 The probability distribution of qualitative and discontinuous numeric variables  44
3.3 The binomial distribution  46
3.4 The Poisson distribution  48
3.5 …  53
3.6 The probability distribution of continuous variables  57
3.6.1 z scores and the standard normal distribution (SND)  63
3.6.2 Percentile ranks and percentiles  71
3.6.3 The probability distribution of sample means  73
3.6.4 Is my bell shape normal?  77
3.7 Chapter 3 key concepts  78
…  79
…  80
|
4 Hypothesis testing and estimation  83
4.1 Different approaches to hypothesis testing and estimation  83
4.1.1 The classical significance testing approach  83
4.1.2 The maximum likelihood approach  84
4.1.3 The Bayesian approach  84
4.2 …  84
4.2.1 Confidence limits and confidence intervals  85
4.2.2 …  89
4.3 …  90
4.3.1 The principles of hypothesis testing  90
4.3.2 Errors and power in hypothesis testing  93
4.3.3 Hypothesis tests using z scores  98
4.3.4 One- and two-tailed hypothesis tests  100
4.3.5 Assumptions of statistical tests  101
4.3.6 Hypothesis testing with the t distribution  103
4.3.7 Hypothesis tests using t scores  104
4.3.8 Reporting hypothesis tests  105
4.3.9 The classical significance testing approach. A conclusion  106
4.4 Chapter 4 key concepts  106
…  107
|
5 The difference between two means  108
5.1 …  108
5.1.1 Assumptions of the un-paired t test  112
5.2 The comparison of a single observation with the mean of a sample  116
5.3 …  117
5.3.1 Assumptions of the paired t test  119
5.4 Chapter 5 key concepts  120
…  120
…  121
|
6 The analysis of variance (ANOVA)  122
6.1 Model I and model II ANOVA  122
6.2 Model I, one-way ANOVA. Introduction and nomenclature  123
…  131
…  132
…  133
6.5 Model I, two-way ANOVA  135
6.6 …  143
6.7 Chapter 6 key concepts  144
…  145
…  145
|
7 Non-parametric tests for the comparison of samples  146
7.1 …  147
7.2 The Mann-Whitney U test for a two-sample un-matched design  148
7.3 The Kruskal-Wallis test for a one-way, model I ANOVA design  153
7.4 The Wilcoxon signed-ranks test for a two-sample paired design  159
7.5 Chapter 7 key concepts  164
…  164
…  164
|
8 The analysis of frequencies  166
8.1 The χ² test for goodness-of-fit  166
8.2 The Kolmogorov-Smirnov one-sample test  170
8.3 The χ² test for independence of variables  172
8.4 Yates' correction for continuity  175
8.5 The likelihood ratio test (the G test)  176
8.6 …  178
8.7 The McNemar test for a matched design  183
8.8 Tests of goodness-of-fit and independence of variables. Conclusion  184
8.9 The odds ratio (OR): measuring the degree of the association between two discrete variables  185
8.10 The relative risk (RR): measuring the degree of the association between two discrete variables  188
8.11 Chapter 8 key concepts  190
…  190
…  191
|
|
9 …  193
9.1 The Pearson product-moment correlation  193
9.2 Non-parametric tests of correlation  199
9.2.1 The Spearman correlation coefficient r_s  199
9.2.2 Kendall's coefficient of rank correlation - tau (τ)  202
9.3 Chapter 9 key concepts  208
…  208
|
10 Simple linear regression  209
10.1 An overview of regression analysis  210
10.2 Regression analysis step-by-step  214
10.2.1 The data are plotted and inspected to detect violations of the linearity and homoscedasticity assumptions  214
10.2.2 The relation between the X and the Y is described mathematically with an equation  215
10.2.3 The regression analysis is expressed as an analysis of the variance of Y  215
10.2.4 The null hypothesis that the parametric value of the slope is not statistically different from 0 is tested  217
10.2.5 The regression equation is used to predict values of Y  217
10.2.6 Lack of fit is assessed  219
10.2.7 The residuals are analyzed  221
10.3 Transformations in regression analysis  225
10.4 Chapter 10 key concepts  232
…  232
…  232
|
11 Advanced topics in regression analysis  234
11.1 The multiple regression model  234
11.1.1 The problem of multicollinearity/collinearity  235
11.1.2 The algebraic computation of the multiple regression equation  236
11.1.3 An overview of multiple-regression-model building  240
11.1.4 Dummy independent variables  247
11.2 An overview of logistic regression  251
11.3 Writing up your results  255
11.4 Chapter 11 key concepts  255
…  256
…  256

References  257
Index  260