Change-Point Analysis with Joint Modeling of Response Times and Item Response Data

Quality checks for issues such as speeded response behavior are crucial for ensuring the test fairness and score validity of large-scale, high-stakes educational assessments. The high stakes of such assessments also give students a stronger incentive to engage in item harvesting, i.e., memorizing items in order to disclose them to future test takers. As item-level response time data become increasingly available, collateral information from such data can be used alongside the item responses to support checks for item preknowledge and to set appropriate test time limits.

Building upon previous change-point analysis methods (Shao, Li & Cheng, 2016; Shao & Cheng, 2016), joint modeling of item responses and response times is applied to detect various types of aberrant response behavior and to estimate the point at which the behavior changes. Given a candidate point of change, a likelihood-ratio test statistic (Wang & Weiss, 2018) is computed under a hierarchical modeling framework (Molenaar, Tuerlinckx & van der Maas, 2015). By maximizing over all possible change points, a new test statistic is obtained and evaluated empirically.
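As a concrete illustration of this scan, the sketch below computes a maximum likelihood-ratio change-point statistic under one simple instance of the hierarchical framework: a Rasch model for the responses and a lognormal model for the response times, with item parameters treated as known. The function names, simulated data, and simplifying assumptions are illustrative only and should not be read as the implementation used in this work.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def joint_loglik(params, u, logt, b, beta, sigma):
    """Joint log-likelihood of responses u and log response times logt for
    one examinee: Rasch model for correctness, lognormal model for times
    (log T_i ~ N(beta_i - tau, sigma^2)), conditionally independent given
    ability theta and speed tau."""
    theta, tau = params
    p = np.clip(expit(theta - b), 1e-12, 1 - 1e-12)
    ll_resp = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
    resid = logt - (beta - tau)
    ll_time = np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                     - resid**2 / (2 * sigma**2))
    return ll_resp + ll_time

def fit_mle(u, logt, b, beta, sigma):
    """Maximized joint log-likelihood over (theta, tau) for one segment."""
    res = minimize(lambda q: -joint_loglik(q, u, logt, b, beta, sigma),
                   x0=np.zeros(2), method="Nelder-Mead")
    return -res.fun

def max_lrt_change_point(u, logt, b, beta, sigma, min_seg=5):
    """For each candidate change point k, compare a single (theta, tau)
    fit on all items against separate fits on items 1..k and k+1..n;
    return the maximizing k and the maximum likelihood-ratio statistic."""
    n = len(u)
    ll_null = fit_mle(u, logt, b, beta, sigma)
    best_k, best_stat = None, -np.inf
    for k in range(min_seg, n - min_seg + 1):
        ll_pre = fit_mle(u[:k], logt[:k], b[:k], beta[:k], sigma)
        ll_post = fit_mle(u[k:], logt[k:], b[k:], beta[k:], sigma)
        stat = 2.0 * (ll_pre + ll_post - ll_null)
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

# Illustrative simulation: 40 items, speeded behavior after item 25
# (ability drops, speed increases); item parameters are treated as known.
rng = np.random.default_rng(0)
n, k_true = 40, 25
b = rng.normal(0.0, 1.0, n)            # item difficulties
beta = rng.normal(4.0, 0.3, n)         # time intensities
sigma = 0.4
theta = np.where(np.arange(n) < k_true, 0.5, -1.5)
tau = np.where(np.arange(n) < k_true, 0.0, 1.5)
u = rng.binomial(1, expit(theta - b))
logt = rng.normal(beta - tau, sigma)

k_hat, stat = max_lrt_change_point(u, logt, b, beta, sigma)
print(f"estimated change point: {k_hat}, max LRT statistic: {stat:.2f}")
```

Restricting candidate change points so that at least min_seg items fall on each side keeps the segment-wise maximum likelihood estimates stable; since the maximum over candidate points does not follow a standard chi-square distribution, critical values would in practice be obtained by simulation.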

This method shows promise for producing more accurate estimates of the point of change than previous methods, especially when latent cognitive ability and working speed are strongly correlated. Results and implications for the interpretation of scores are discussed.


Work accepted for presentation at the 2020 National Council on Measurement in Education (NCME) annual meeting. Please contact the author for further details.