Quickly create a new user account for the Elevate learning management system (this website) to access any of the modules.

You do NOT have to be an NCME member to do this! 

Welcome to the ITEMS Portal!

The ITEMS portal is your entry point into a world of learning in educational measurement and assessment. Its centerpiece is the collection of ITEMS modules: short, self-contained lessons with supporting resources that facilitate self-guided learning, team-based discussion, and professional instruction. In addition, you can explore a network of other professional organizations, stay up-to-date via news feeds, and contribute to the success of the portal. Enjoy your learning experience!

New Videos: Anatomy of Measurement

Video 1: Educational Measurement

Short animated video overview of the field of educational measurement in a nontechnical and fun way. [7 Minutes]

Video 2: Reliability, Validity, and Fairness

Short animated video explaining reliability, validity, and fairness in a nontechnical and fun way. [7 Minutes]

Video 3: Standardized Test Development

Short animated video describing the lifecycle of standardized test development in a nontechnical and fun way. [10 Minutes]

Video 4: Standardized Test Scoring

Short animated video explaining how standardized tests are scored in a nontechnical and fun way. [5 Minutes]

New Digital Modules:

All-access Pass
This pass provides immediate access to ALL print and digital modules in the portal by "registering" you for each one and displaying them as a single collection under the pass.
All-access Pass (PRINT ONLY)
This provides access to a ZIP folder with all 45 previously published print modules.
Digital Module 12: Think-aloud Interviews and Cognitive Labs
In this digital ITEMS module, Dr. Jacqueline Leighton and Dr. Blair Lehman review differences between think-aloud interviews to measure problem-solving processes and cognitive labs to measure comprehension processes, and illustrate both traditional and modern data-collection methods. Keywords: ABC tool, cognitive laboratory, cog lab, cognition, cognitive model, interrater agreement, kappa, probe, rubric, thematic analysis, think-aloud interview, verbal report
Digital Module 13: Simulation Studies in IRT
In this digital ITEMS module, Dr. Brian Leventhal and Dr. Allison Ames provide an overview of Monte Carlo simulation studies (MCSS) in item response theory (IRT). MCSS are used for a variety of reasons, one of the most compelling being that they apply where analytic solutions are impractical or nonexistent: researchers can specify and manipulate an array of parameter values and experimental conditions (e.g., sample size, test length, and test characteristics). Keywords: bias, bi-factor model, estimation, graded response model, item response theory, mean squared error, Monte Carlo, simulation, standard error, two-parameter logistic model
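The basic MCSS recipe described above can be illustrated with a minimal sketch (not material from the module itself): repeatedly simulate item responses under a Rasch (1PL) model and evaluate a deliberately crude difficulty estimator via bias and mean squared error. The estimator, sample sizes, and parameter values are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_rasch(n_persons, b, rng):
    # theta ~ N(0, 1); P(X=1 | theta) = 1 / (1 + exp(-(theta - b)))
    theta = rng.standard_normal(n_persons)
    p = 1.0 / (1.0 + np.exp(-(theta - b)))
    return (rng.random(n_persons) < p).astype(int)

def naive_difficulty(x):
    # Crude estimator: minus the logit of the observed proportion correct.
    # It ignores the latent-trait variance, so the MCSS should reveal bias.
    pbar = np.clip(x.mean(), 1e-6, 1 - 1e-6)
    return -np.log(pbar / (1 - pbar))

true_b = 0.5          # true item difficulty (manipulated condition)
n_reps = 500          # Monte Carlo replications
estimates = np.array([naive_difficulty(simulate_rasch(200, true_b, rng))
                      for _ in range(n_reps)])

# Standard MCSS outcome measures: bias and mean squared error
bias = estimates.mean() - true_b
mse = np.mean((estimates - true_b) ** 2)
print(f"bias = {bias:.3f}, MSE = {mse:.3f}")
```

Varying `true_b`, the sample size, or the trait distribution across conditions is the same design logic the module applies to more realistic IRT estimators.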
Digital Module 14: Planning and Conducting Standard Setting
In this digital ITEMS module, Dr. Michael B. Bunch provides an in-depth, step-by-step look at how standard setting is done. It focuses not on any specific procedure or methodology (e.g., modified Angoff, bookmark, body of work) but on the practical tasks that must be completed for any standard-setting activity. Keywords: achievement level descriptor, certification and licensure, cut score, feedback, interquartile range, performance level descriptor, score reporting, standard setting, panelist, vertical articulation
Digital Module 15: Accessibility of Educational Assessments
In this digital ITEMS module, Dr. Ketterlin Geller and her colleagues provide an introduction to accessibility of educational assessments. They discuss the legal basis for accessibility in K-12 and higher education organizations and describe how test and item design features, as well as examinee characteristics, affect the role that accessibility plays in evaluating test validity during test development and operational deployment. Keywords: accessibility, accommodations, examinee characteristics, fairness, higher education, K-12 education, item design, legal guidelines, test development, universal design
Digital Module 16: Longitudinal Data Analysis
In this digital ITEMS module, Dr. Jeffrey Harring and Ms. Tessa Johnson introduce the linear mixed effects (LME) model as a flexible general framework for simultaneously modeling continuous repeated-measures data with a scientifically defensible function that adequately summarizes both individual change and the average response. Keywords: fixed effect, linear mixed effects models, longitudinal data analysis, multilevel models, population-average, random effect, regression, subject-specific, trajectory
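The subject-specific versus population-average distinction mentioned above can be sketched in a toy example (a NumPy illustration with invented parameter values, not material from the module): simulate growth data with random intercepts and random slopes, fit each subject's trajectory separately, and note that the average of the subject-specific fits tracks the population-level fixed effects.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate longitudinal data: y_ij = (b0 + u0_i) + (b1 + u1_i) * t_j + e_ij
n_subjects = 50
times = np.arange(5.0)                    # 5 measurement occasions
beta0, beta1 = 10.0, 2.0                  # fixed effects (population average)
u0 = rng.normal(0.0, 1.5, n_subjects)     # random intercepts (subject-specific)
u1 = rng.normal(0.0, 0.5, n_subjects)     # random slopes (subject-specific)
y = ((beta0 + u0)[:, None]
     + (beta1 + u1)[:, None] * times
     + rng.normal(0.0, 1.0, (n_subjects, times.size)))

# Subject-specific trajectories: per-subject least-squares fits
X = np.column_stack([np.ones_like(times), times])
coefs = np.linalg.lstsq(X, y.T, rcond=None)[0]   # shape (2, n_subjects)

# Averaging the individual fits recovers the population-average trend
print(f"mean intercept = {coefs[0].mean():.2f} (true {beta0})")
print(f"mean slope     = {coefs[1].mean():.2f} (true {beta1})")
```

A full LME analysis would estimate the fixed effects and the random-effect variances jointly (e.g., by maximum likelihood) rather than fitting subjects one at a time; this sketch only shows why both the individual and average views matter.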
Digital Module 17: Data Visualizations
In this digital ITEMS module, Nikole Gregg and Dr. Brian Leventhal discuss strategies to ensure data visualizations achieve graphical excellence. The instructors review key literature, discuss strategies for enhancing graphical presentation, and provide an introduction to the Graph Template Language (GTL) in SAS to illustrate how elementary components can be used to make efficient, effective, and accurate graphics for a variety of audiences. Keywords: data visualization, graphical excellence, Graph Template Language, SAS
Digital Module 18: Automated Scoring
In this digital ITEMS module, Dr. Sue Lottridge, Amy Burkhardt, and Dr. Michelle Boyer provide an overview of automated scoring. They discuss automated scoring from a number of perspectives and provide two data examples: one focused on training and evaluating an automated scoring engine, and one focused on the impact of rater error on predicted scores. Keywords: automated scoring, hand-scoring, machine learning, natural language processing, constructed response items

More Info:


Learn more about the mission of ITEMS here.


Find answers to common questions about ITEMS here.


Check out the latest news about ITEMS here.

Access all 45 print modules and other digital modules via our module library.