
Sociocognitive Foundations of Educational Measurement [Paperback]

Robert J. Mislevy (Educational Testing Service, Princeton, New Jersey, USA)
  • Format: Paperback / softback, 438 pages, height x width: 254x178 mm, weight: 771 g; 41 tables (black and white), 70 line drawings (black and white), 1 halftone (black and white), 71 illustrations (black and white)
  • Publication date: 06-Mar-2018
  • Publisher: Routledge
  • ISBN-10: 0415716977
  • ISBN-13: 9780415716970

Several key developments challenge the field of educational measurement today: demands for tests at larger scales with higher stakes, an improved understanding of how people develop capabilities, and new technologies for interactive digital assessments. Sociocognitive Foundations of Educational Measurement integrates new developments in educational measurement and educational psychology in order to provide researchers, testing professionals, and students with an innovative sociocognitive perspective on assessment. This comprehensive volume begins with a broad explanation of the sociocognitive perspective and the foundations of assessment, then provides a series of focused applications to major topics such as assessment arguments, validity, fairness, interactive assessment, and a conception of "measurement" in educational assessment. Classical test theory, item response theory, categorical models, mixture models, cognitive diagnosis models, and Bayesian networks are explored from the resulting perspective. Ideal for specialists in these areas, graduate students, developers, and scholars in both educational measurement and fields that contribute to a sociocognitive perspective, this book consolidates nearly a decade of research into a fresh perspective on educational measurement.

Reviews

"This volume is a conceptual and methodological tour de force for the field of educational measurement. In weaving together theory, research, and practice across the diverse disciplines of sociocognition, measurement, and assessment, Robert J. Mislevy has provided a vision for the future of educational measurement. This is the synthesis of ideas, examples, and implications that many of us have hoped he would write."

James W. Pellegrino, Co-Director of the Learning Sciences Research Institute, University of Illinois at Chicago, USA

"Robert J. Mislevythe founder of Evidence-Centered Design and the leading expert on assessment todayoffers here a profound and definitive approach to assessment that finally comes fully to terms with the social and situated nature of learning and assessment."

James Paul Gee, Mary Lou Fulton Professor of Literacy Studies and Regents Professor at Arizona State University, USA

"There are those who study the situated context of human learning and those who develop the empirical tools by which that learning is assessed. Were there ever two tribes more at odds? But now we have common ground. In Sociocognitive Foundations of Educational Measurement, distinguished psychometrician Robert J. Mislevy proves an expert guide in explaining commonalities between context and evaluation. From the use of familiar assessments like limited-response tests to next-generation gaming, his situated perspective has wide applications for wise use. Innovative and welcoming on every page, this is a phenomenal book for teachers, researchers, administrator, and policy makers."

Norbert Elliot, Professor Emeritus of English, New Jersey Institute of Technology, USA

"Sociocognitive Foundations of Educational Measurement [ is] a sweeping new volume from Robert Mislevy that defies genre conventions, as Mislevy himself has done over his remarkable career. Sociocognitive Foundations is the sort of book that seeks not just to contribute to ongoing conversations but to open up entirely new ones, and to give us the vocabulary and transdisciplinary conceptual schemes with which to have them. In particular, Mislevys volume addresses psychometric modeling, validity, philosophy of measurement, and (of course) sociocognitive theory, and while there are extant sources with more to say about each of these topics individually, we know of no other source that puts these fields into dialogue as vigorously as is done here."

Psychometrika

"Mislevy has extensively articulated ways in which sociocultural insights can inform the definition of the targets of measurement, characterize threats to arguments for the interpretation and use of test scores, and provide means for better matching test methods and score interpretations with respect to differences among individuals. . . . Mislevys framework for measurement is comprehensive and boundary-pushing."

Language Assessment Quarterly

"Robert Mislevy writes convincingly about ways in which measurement models can be in the service of claims about human cognition, inextricably embedded within social structures and life experiences. . . . It is essential reading for imagining and creating a different present and future of educational measurement."

Journal of Educational Measurement

Contents

Preface xiii
Acknowledgments xvii
1 Where We Are Going, and Why, and How 1
1.1 Introduction 1
1.2 LCS Patterns Across People and Resources Within People 2
1.3 A Sketch of the Articulation 4
1.4 Model-Based Reasoning 9
1.5 From Situated Action to Measurement-Model Variables 14
1.6 Looking Ahead 16
2 A Sociocognitive Perspective 21
2.1 Overview 21
2.2 Complex Adaptive Systems 21
2.3 Patterns Within Individuals 23
2.4 Patterns Across Individuals 29
2.5 A Gedanken Representation, With Implications for Assessment 33
2.6 Examples 36
2.7 Reflections 41
3 The Structure of Assessment Arguments 46
3.1 Overview 46
3.2 Psychological Perspectives 46
3.3 The Assessment Design/Interpretation Argument 47
3.4 The Assessment-Use Argument 60
3.5 Reflections 66
4 A Sociocognitive Perspective on Design/Interpretation Arguments 69
4.1 Overview 69
4.2 Assessments Are Practices 71
4.3 What Claims, If Any, Might We Make About Individuals? 71
4.4 Constructs, Warrants, Backing, and Alternative Explanations 73
4.5 Patterns at Many Levels 79
4.6 What Makes Tasks Difficult? 83
4.7 Characterizing Task Situations 86
4.8 Characterizing Performances 87
4.9 What Can This Person Be Thinking? 90
4.10 Applying the Interpretation Argument to Multiple Examinees 96
4.11 Reflections 99
5 A Sociocognitive Perspective on Assessment-Use Arguments 105
5.1 Overview 105
5.2 Acting in the Assessment and Criterion Situations 105
5.3 Two Examples 112
5.4 Behavioral Assessment-Use Arguments 114
5.5 Trait Assessment-Use Arguments 116
5.6 Trait Within Social/Behavioral Domain Assessment-Use Arguments 122
5.7 Information-Processing Assessment-Use Arguments 125
5.8 Applying the Use Argument to Multiple Examinees 128
5.9 Reflections 129
6 Meaning in Measurement Models 133
6.1 Overview 133
6.2 Connecting Measurement Models With Assessment Arguments 134
6.3 The "As If" Pivot 136
6.4 Background for the Force Concept Inventory Example 139
6.5 Classical Test Theory 142
6.6 A Model for a "Resources" Narrative Space 151
6.7 Reflections 156
7 Probability-Based Reasoning in Measurement Models 162
7.1 Overview 162
7.2 A Subjectivist-Bayesian Perspective on Model-Based Reasoning 162
7.3 Additional Background for Hydrive 164
7.4 Concepts in Probability-Based Reasoning 165
7.5 Working With Probability Models 179
7.6 Reflections 190
8 Measurement Concepts 194
8.1 Overview 194
8.2 Reliability 195
8.3 Validity 201
8.4 Comparability 209
8.5 What Are True Scores, Latent Variables, and Measurement Error? 212
8.6 Reflections 214
9 A Conditional Sense of Fairness 218
9.1 Overview 218
9.2 Marginal and Conditional Inference 220
9.3 Conditioning Evaluation Processes on Information About Students 220
9.4 Conditioning Task Situations on Information About Students 226
9.5 Reflections 241
10 Measurement Models and Fairness 245
10.1 Overview 245
10.2 The Rasch Model for Dichotomous Items 245
10.3 Person-Fit Analyses 249
10.4 Differential Item Functioning 251
10.5 Reflections 263
11 Item Response Theory I: Item-Level Models 268
11.1 Overview 268
11.2 Some Antecedents 269
11.3 Standardized Tests 273
11.4 Item-Level Response Models 280
11.5 Reflections 285
12 Item Response Theory II: Sociocognitive Perspectives 289
12.1 Overview 289
12.2 A Sociocognitive View of Responding to Items 289
12.3 Examples 292
12.4 Sociocognitive Clines 298
12.5 Analytic Approaches to IRT From a Sociocognitive Perspective 304
12.6 Reflections 310
13 Item Response Theory III: Measurement 313
13.1 Overview 313
13.2 A Closer Look at Measurement 313
13.3 Rasch Measurement 321
13.4 Incorporating Cognitive Theory Into IRT 326
13.5 So, Is It Measurement? 334
13.6 Reflections 342
14 Generalizability Theory 348
14.1 Overview 348
14.2 A Sociocognitive Perspective on Generalizability Theory 348
14.3 Modeling Rater Effects 357
14.4 Reflections 363
15 Cognitive Diagnosis Models 367
15.1 Overview 367
15.2 The Basic Idea 367
15.3 Mixed-Number Subtraction 370
15.4 A Hybrid Model 375
15.5 A Measurement Model for a Conditional Sense of Fairness 379
15.6 Reflections 381
16 Simulation-Based Assessment 385
16.1 Overview 385
16.2 A Brief History of Evidence-Bearing Opportunities 385
16.3 Arguments for Assessments With Contingent Data 390
16.4 Evidence Identification 395
16.5 Modular Assembly of Measurement-Model Components 401
16.6 Benefits of Measurement Modeling 409
16.7 Reflections 410
17 Our Story So Far 415
17.1 Overview 415
17.2 Where We Have Arrived 415
17.3 Frames for Thinking About Assessment 418
17.4 Reflections 425
Index 428
Robert J. Mislevy is Frederic M. Lord Chair in Measurement and Statistics at Educational Testing Service. He is Professor Emeritus of Measurement, Statistics, and Evaluation with affiliations in Second Language Acquisition and Survey Methods at the University of Maryland, College Park, USA.