ACCESS IS FREE! 

Quickly create a new user account in the Elevate learning management system (this website) to access any of the modules.

You do NOT have to be an NCME member to do this! 

Welcome to the ITEMS Portal!

The ITEMS portal is your entry point into a world of learning in educational measurement and assessment. Its centerpiece is the collection of ITEMS modules: short, self-contained lessons with supporting resources that facilitate self-guided learning, team-based discussion, and professional instruction. In addition, you can explore a network of other professional organizations, stay up to date via news feeds, and contribute to the success of the portal. Enjoy your learning experience!

New Digital Modules:

All-access Pass
The all-access pass provides immediate access to ALL print and digital modules in the portal by "registering" you for each one and displaying them as a single collection.
Digital Module 01: Reliability in Classical Test Theory
In this digital ITEMS module, Dr. Charlie Lewis and Dr. Michael Chajewski provide a two-part introduction to the topic of reliability from the perspective of classical test theory (CTT). Keywords: classical test theory, CTT, congeneric, KR-20, KR-21, Cronbach’s alpha, Pearson correlation, reliability, Spearman-Brown formula, parallel, tau-equivalent, test-retest, validity
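For readers who want a quick hands-on preview, the central coefficient in this module can be computed in a few lines of R. This sketch is not part of the module's materials, and the data are simulated purely for illustration:

    # Simulated 0/1 item responses (200 persons, 10 items), illustrative only
    set.seed(123)
    ability <- rnorm(200)
    X <- sapply(1:10, function(j) as.numeric(ability + rnorm(200) > 0))

    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance);
    # for dichotomous items such as these it reduces to KR-20
    k <- ncol(X)
    alpha <- k / (k - 1) * (1 - sum(apply(X, 2, var)) / var(rowSums(X)))
    alpha

The psych package's alpha() function returns the same estimate along with item-level diagnostics.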
Digital Module 02: Scale Reliability in Structural Equation Modeling
In this digital ITEMS module, Dr. Greg Hancock and Dr. Ji An provide an overview of scale reliability from the perspective of structural equation modeling (SEM) and address some of the limitations of Cronbach’s α. Keywords: congeneric, Cronbach’s alpha, reliability, scale reliability, SEM, structural equation modeling, McDonald’s omega, model fit, parallel, tau-equivalent
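As a rough illustration of the SEM-based approach (not the module's own code), McDonald's omega for a one-factor congeneric model can be computed with the lavaan package. The data frame mydata and the item names y1-y5 below are placeholders:

    # Hypothetical five-item congeneric (one-factor) model
    library(lavaan)
    model <- 'f =~ y1 + y2 + y3 + y4 + y5'
    fit <- cfa(model, data = mydata, std.lv = TRUE)  # factor variance fixed to 1

    # omega = (sum of loadings)^2 / [(sum of loadings)^2 + sum of residual variances]
    pe <- parameterEstimates(fit)
    loadings  <- pe$est[pe$op == "=~"]
    resid_var <- pe$est[pe$op == "~~" & pe$lhs == pe$rhs & pe$lhs != "f"]
    sum(loadings)^2 / (sum(loadings)^2 + sum(resid_var))

The semTools package automates this and related reliability computations for fitted lavaan models.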
Digital Module 03: Nonparametric Item Response Theory
In this digital ITEMS module, Dr. Stefanie Wind introduces the framework of nonparametric item response theory (IRT), in particular Mokken scaling, which can be used to evaluate fundamental measurement properties with less strict assumptions than parametric IRT models. Keywords: double monotonicity model, DMM, item response theory, IRT, Mokken scaling, monotone homogeneity model, multilevel modeling, mokken package, nonparametric IRT, R, rater effects
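As a small taste of what the module covers (this sketch is not taken from it), the mokken package in R implements these analyses; the acl example data used below ship with the package:

    library(mokken)
    data(acl)                       # example data included in the mokken package
    X <- acl[, 1:10]                # first 10 items, purely for illustration

    coefH(X)                        # scalability coefficients (Loevinger's H)
    summary(check.monotonicity(X))  # checks monotonicity (monotone homogeneity model)
    aisp(X)                         # automated item selection into Mokken scales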
Digital Module 04: Diagnostic Measurement Checklists
In this digital ITEMS module, Dr. Natacha Carragher, Dr. Jonathan Templin, and colleagues provide a didactic overview of the specification, estimation, evaluation, and interpretation steps for diagnostic measurement / classification models (DCMs), organized around checklists for practitioners. A library of macros and supporting files for Excel, SAS, and Mplus is provided, along with video tutorials for key practices. Keywords: attributes, checklists, diagnostic measurement, diagnostic classification models, DCM, Excel, log-linear cognitive diagnosis modeling framework, LCDM, Mplus, Q-matrix, model fit, SAS
Digital Module 05: The G-DINA Framework
In this digital ITEMS module, Dr. Wenchao Ma and Dr. Jimmy de la Torre introduce the G-DINA model, which is a general framework for specifying, estimating, and evaluating a wide variety of cognitive diagnosis models for the purpose of diagnostic measurement. Keywords: cognitive diagnosis models, CDM, diagnostic classification models, DCM, diagnostic measurement, GDINA, G-DINA framework, GDINA package, model fit, model comparison, Q-matrix, validation
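For a hands-on preview (again, not the module's own materials), the GDINA package can fit the model to the simulated example data bundled with it:

    library(GDINA)
    dat <- sim10GDINA$simdat   # simulated responses shipped with the package
    Q   <- sim10GDINA$simQ     # corresponding Q-matrix

    fit <- GDINA(dat = dat, Q = Q, model = "GDINA")
    summary(fit)               # estimation summary
    personparm(fit)            # estimated attribute profiles per examinee
    modelfit(fit)              # absolute model fit statistics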
Digital Module 06: Posterior Predictive Model Checking
In this digital ITEMS module, Dr. Allison Ames and Aaron Myers discuss the most common Bayesian approach to model-data fit evaluation, which is called Posterior Predictive Model Checking (PPMC), for simple linear regression and item response theory models. Keywords: Bayesian inference, simple linear regression, item response theory, IRT, model fit, posterior predictive model checking, PPMC, Bayes theorem, Yen’s Q3, item fit
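The logic of PPMC can be previewed with a deliberately simplified sketch for simple linear regression. Note the shortcuts: it uses a normal approximation to the posterior of the coefficients and plugs in the estimated error SD, whereas the module develops the full Bayesian treatment:

    set.seed(1)
    n <- 100
    x <- runif(n)
    y <- 1 + 2 * x + rnorm(n, sd = 0.5)   # simulated data
    fit <- lm(y ~ x)

    library(MASS)
    betas <- mvrnorm(1000, mu = coef(fit), Sigma = vcov(fit))  # approx. posterior draws
    sigma_hat <- summary(fit)$sigma

    # Discrepancy measure: SD of the outcome, observed vs. replicated data sets
    T_obs <- sd(y)
    T_rep <- apply(betas, 1, function(b) sd(b[1] + b[2] * x + rnorm(n, sd = sigma_hat)))
    mean(T_rep >= T_obs)   # posterior predictive p-value; values near 0 or 1 signal misfit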
Digital Module 07: Subscores - Evaluation & Reporting
In this digital ITEMS module, Dr. Sandip Sinharay reviews the status quo on the reporting of subscores, including how they are used in operational reporting, what professional standards they need to meet, and how their psychometric properties can be evaluated. Keywords: diagnostic scores, disattenuation, DETECT, DIMTEST, factor analysis, multidimensional item response theory (MIRT), proportional reduction in mean squared error (PRMSE), reliability, subscores
Digital Module 08: Foundations of Operational Item Analysis
In this digital ITEMS module, Dr. Hanwook Yoo and Dr. Ronald K. Hambleton provide an accessible overview of operational item analysis approaches for dichotomously scored items within the frameworks of classical test theory and item response theory. Keywords: classical test theory, CTT, corrections, difficulty, discrimination, distractors, item analysis, item response theory, operations, R Shiny, TAP, test development
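The core CTT item statistics the module covers can be illustrated in a few lines of R (simulated data, not the module's materials):

    # Simulated 0/1 responses: 300 persons, 12 items of increasing difficulty
    set.seed(42)
    theta <- rnorm(300)
    X <- sapply(seq(-1.5, 1.5, length.out = 12),
                function(b) as.numeric(theta - b + rnorm(300) > 0))

    p_values <- colMeans(X)   # classical difficulty: proportion answering correctly
    total    <- rowSums(X)
    # Corrected item-total discrimination: correlate each item with the
    # total score computed from the remaining items
    discrim <- sapply(seq_len(ncol(X)), function(j) cor(X[, j], total - X[, j]))
    round(cbind(difficulty = p_values, discrimination = discrim), 2)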
Digital Module 09: Sociocognitive Assessment for Diverse Populations
In this digital ITEMS module, Dr. Robert (Bob) Mislevy and Dr. Maria Elena Oliveri introduce and illustrate a sociocognitive perspective on educational measurement, which focuses on a variety of design and implementation considerations for creating fair and valid assessments for learners from diverse populations with diverse sociocultural experiences. Keywords: assessment design, Bayesian statistics, cross-cultural assessment, diverse populations, educational measurement, evidence-centered design, fairness, international assessments, prototype, reliability, sociocognitive assessment, validity
Digital Module 10: Rasch Measurement Theory
In this digital ITEMS module, Dr. Jue Wang and Dr. George Engelhard Jr. describe the Rasch measurement framework for the construction and evaluation of new measures and scales, and demonstrate the estimation of core models with the Shiny_ERMA and Winsteps programs. Keywords: invariance, item fit, item response theory, IRT, person fit, model fit, multi-faceted Rasch model, objective measurement, R, Rasch measurement, Shiny_ERMA, Winsteps
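The module itself demonstrates Shiny_ERMA and Winsteps; as a rough open-source parallel (not covered in the module), the eRm package in R fits the same dichotomous Rasch model:

    library(eRm)
    data(raschdat1)              # example dichotomous data shipped with eRm
    fit <- RM(raschdat1)         # conditional ML estimation of item difficulties
    summary(fit)

    pp <- person.parameter(fit)  # person ability estimates
    itemfit(pp)                  # item infit/outfit statistics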


More Info:

Learn more about the mission of ITEMS here.

Find answers to common questions about ITEMS here.

Check out the latest news about ITEMS here.
