Module 30: Booklet Designs in Large-Scale Assessments

In most large-scale assessments of student achievement, several broad content domains are tested. Because more items are needed to cover these domains than can be presented to each individual student in the limited testing time, multiple test forms, or booklets, are used to distribute the items among students. Constructing an appropriate booklet design is a complex and challenging endeavor with far-reaching implications for data calibration and score reporting. This module describes the construction of booklet designs as the task of allocating items to booklets under context-specific constraints. Several types of experimental designs that can be used as booklet designs are presented. The theoretical properties and construction principles for each type of design are discussed and illustrated with examples. Finally, the evaluation of booklet designs is described, and future directions for researching, teaching, and reporting on booklet designs for large-scale assessments of student achievement are identified.

Keywords: booklet design, educational survey, experimental design, item response theory, IRT, large-scale assessments, measurement
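To make the idea of using an experimental design as a booklet design concrete, one widely known design family is the balanced incomplete block design (BIBD), in which item clusters are rotated across booklets so that every pair of clusters appears together in the same number of booklets. The following Python sketch is not taken from the module itself; it is a minimal, self-contained illustration, with hypothetical function names and an illustrative choice of seven clusters, that constructs a (7, 3, 1)-BIBD by cyclically shifting the base block {0, 1, 3} modulo 7 and then checks the balance property.

from itertools import combinations

def cyclic_bibd(v, base_block):
    # Each booklet is a cyclic shift of the base block of cluster indices.
    return [sorted((c + shift) % v for c in base_block) for shift in range(v)]

def pair_counts(booklets, v):
    # Count how often each pair of clusters shares a booklet.
    counts = {pair: 0 for pair in combinations(range(v), 2)}
    for booklet in booklets:
        for pair in combinations(booklet, 2):
            counts[pair] += 1
    return counts

if __name__ == "__main__":
    V = 7              # number of item clusters (and of booklets)
    BASE = (0, 1, 3)   # perfect difference set mod 7 -> (7, 3, 1)-BIBD
    booklets = cyclic_bibd(V, BASE)
    for i, booklet in enumerate(booklets, start=1):
        print(f"Booklet {i}: clusters {booklet}")
    counts = pair_counts(booklets, V)
    assert set(counts.values()) == {1}, "every pair should co-occur exactly once"
    print("Balance check passed: each pair of clusters shares exactly one booklet.")

Operational booklet designs must typically satisfy further constraints, such as balancing the positions of clusters within booklets and linking across content domains, which is why the module frames booklet construction as a constrained allocation problem rather than a single canonical design.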

Andre A. Rupp

Research Director

Dr. Rupp is Research Director of the Integrated Scoring Research (iSCORE) group in the Psychometric Analysis and Research area at the Educational Testing Service (ETS) in Princeton, New Jersey. He currently leads a research team whose work focuses on evidentiary reasoning for digitally delivered performance-based assessments, specifically evaluations of human scoring processes and automated scoring systems for written, spoken, and multimodal performances. He has published widely on a variety of educational measurement topics, including applications of evidence-centered design, cognitive diagnostic measurement, and automated scoring, often with a didactic and conceptual synthesis approach. Notable larger volumes include Diagnostic Measurement: Theory, Methods, and Applications (2010, co-written with Jonathan Templin and Robert Henson), which won the Significant Contribution to Educational Measurement and Research Methodology award from AERA Division D in 2012, and the Handbook of Cognition and Assessment: Frameworks, Methodologies, and Applications (2016, co-edited with Jacqueline P. Leighton), which won the Outstanding Contribution to Practice award from the associated AERA SIG. He is currently working on a co-edited Handbook of Automated Scoring (with Duanli Yan and Peter Foltz). He is a reviewer for many well-known measurement journals and is the lead editor and developer of the ITEMS portal for NCME (2016-2019). He has extensive teaching experience from his prior positions in academia, most recently as a tenured associate professor in the Quantitative Methodology Program in the Department of Human Development and Quantitative Methodology at the University of Maryland in College Park, Maryland.

Andreas Frey

Full Professor of Educational Research Methods, Friedrich Schiller University Jena, Germany

Dr. Frey's research interests include:

  • Empirical educational research
  • Innovative pedagogical-psychological diagnostics
  • Computerized Adaptive Testing (CAT)
  • Item Response Theory

Johannes Hartig

Research Professor, Educational Quality and Evaluation, German Institute for International Educational Research (DIPF)

Johannes Hartig works in the Educational Quality and Evaluation department at the German Institute for International Educational Research (DIPF). His research spans educational psychology, psychometrics, and differential psychology.

Module 30: An NCME Instructional Module on Booklet Designs in Large-Scale Assessments of Student Achievement: Theory and Practice
Open to download resource.