Detecting Careless Responses Using Response Time Data

Date:

In this study, using survey data from a high school sample, we explore item-level response times, response sequences, and response styles captured in process and metadata, and build on previously recommended methods for detecting careless response behavior.

Survey data often suffer from inattentive or careless responding, which lowers data quality and reduces sample sizes once affected cases are excluded. On online survey platforms, response times and click-through data have become more readily available, although applications of such data are still limited.

Beyond item responses and response times, click-through data reveal the sequence in which response options were selected, whether respondents changed their answers, and whether items were answered in the expected order.
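To make this concrete, the sketch below illustrates one way such click-through events might be processed. The log format (respondent, item, selected option, timestamp), the rule for deriving item-level response times, and the expected-order check are illustrative assumptions, not the schema of any particular survey platform.

    # Minimal sketch (hypothetical log schema): derive item-level response times,
    # answer changes, and out-of-order responding from click-through events.
    from collections import defaultdict

    # Each event: (respondent_id, item_id, selected_option, timestamp_in_seconds)
    events = [
        ("r1", "q1", "agree", 10.2),
        ("r1", "q2", "disagree", 14.8),
        ("r1", "q2", "agree", 16.1),   # answer change on q2
        ("r1", "q4", "agree", 18.0),   # q4 answered before q3
        ("r1", "q3", "neutral", 21.5),
    ]
    expected_order = ["q1", "q2", "q3", "q4"]

    def summarize(events, expected_order):
        by_item = defaultdict(list)
        first_click_order = []
        for _, item, option, ts in events:
            if item not in by_item:
                first_click_order.append(item)
            by_item[item].append((ts, option))

        # Number of answer changes per item (clicks beyond the first).
        answer_changes = {item: len(clicks) - 1 for item, clicks in by_item.items()}

        # Item response time approximated as the gap between first clicks on
        # consecutive items (a simplifying assumption; platforms may log item
        # onset separately).
        first_ts = {item: clicks[0][0] for item, clicks in by_item.items()}
        ordered = sorted(first_ts.items(), key=lambda kv: kv[1])
        response_times = {
            item: ts - ordered[i - 1][1] if i > 0 else None
            for i, (item, ts) in enumerate(ordered)
        }

        # Did the respondent deviate from the expected item order?
        out_of_order = first_click_order != [q for q in expected_order if q in first_click_order]
        return {"answer_changes": answer_changes,
                "response_times": response_times,
                "out_of_order": out_of_order}

    print(summarize(events, expected_order))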

Careless response behavior has been associated with a tendency to speed through surveys, for example by straightlining or selecting responses at random (Wood, Harms, Lowman, & DeSimone, 2017; Zhang & Conrad, 2014). Additionally, changes in answer choice may indicate intentional engagement with the survey, while an unexpected item order may reveal careless response behavior.
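As an illustration, the following sketch computes two of the simple flags just mentioned: a longstring (straightlining) index and a speeding flag based on median item response time. The two-second threshold and the example data are assumptions for demonstration only.

    # Minimal sketch: a longstring (straightlining) index and a speeding flag.
    def longstring(responses):
        """Length of the longest run of identical consecutive responses."""
        longest, run = 1, 1
        for prev, curr in zip(responses, responses[1:]):
            run = run + 1 if curr == prev else 1
            longest = max(longest, run)
        return longest

    def speeding_flag(item_times, seconds_per_item=2.0):
        """Flag respondents whose median item response time is implausibly fast."""
        times = sorted(item_times)
        n = len(times)
        median = times[n // 2] if n % 2 else (times[n // 2 - 1] + times[n // 2]) / 2
        return median < seconds_per_item

    responses = [4, 4, 4, 4, 4, 3, 4, 4]    # mostly identical answers
    item_times = [1.1, 0.9, 1.4, 1.0, 1.2]  # seconds spent per item

    print(longstring(responses))      # 5
    print(speeding_flag(item_times))  # True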

Using the high school survey data, we explore item-level response times and compare them to established detection methods such as consistency indices, outlier detection, and directed items (e.g., "Please select 'agree' for this item") (Meade & Craig, 2012; Shao & Cheng, 2016). Main findings and future directions for the use of survey log data are discussed.
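For comparison, a sketch of the benchmark methods named above follows: a within-person even-odd consistency index, Mahalanobis-distance outlier screening, and a directed-item check. The simulated data, the scale layout, the coding of "agree" as 4, and the cutoffs are illustrative assumptions, not the study's actual analysis.

    # Minimal sketch of common careless-response benchmarks, on simulated data.
    import numpy as np

    rng = np.random.default_rng(0)
    n_scales, items_per_scale = 5, 4
    X = rng.integers(1, 6, size=(100, n_scales * items_per_scale)).astype(float)

    def even_odd_consistency(row):
        """Within-person correlation of odd- and even-item scale means."""
        odd_means, even_means = [], []
        for s in range(n_scales):
            scale = row[s * items_per_scale:(s + 1) * items_per_scale]
            odd_means.append(scale[::2].mean())
            even_means.append(scale[1::2].mean())
        if np.std(odd_means) == 0 or np.std(even_means) == 0:
            return 0.0  # correlation undefined; treated as non-informative (assumption)
        return np.corrcoef(odd_means, even_means)[0, 1]

    consistency = np.array([even_odd_consistency(row) for row in X])

    # Mahalanobis distance from the centroid; the top 5% are flagged as outliers.
    centered = X - X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    md = np.sqrt(np.einsum("ij,jk,ik->i", centered, cov_inv, centered))
    outlier_flag = md > np.quantile(md, 0.95)

    # Directed item: suppose item 10 instructed respondents to select 'agree' (coded 4).
    attention_fail = X[:, 10] != 4

    print(consistency.mean().round(3), outlier_flag.sum(), attention_fail.sum())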


Presented as a conference talk at the 2020 International Meeting of the Psychometric Society (IMPS). Please contact the author for further details.