Publication Date

2022

Document Type

Thesis

Committee Members

Nathan Bowling, Ph.D. (Advisor); Anthony Gibson, Ph.D. (Committee Member); Corey Miller, Ph.D. (Committee Member)

Degree Name

Master of Science (MS)

Abstract

Research has demonstrated that careless responding (CR) threatens the construct validity of measures (see Huang et al., 2015; Wise & Kong, 2005). Researchers have developed and studied many measurement approaches to capture CR in surveys, with different survey measures compensating for the practical or empirical limitations of others. This body of work is distinct from CR research on ability tests because ability tests are fundamentally different from surveys. Within ability tests, CR research has focused only on response time and self-report measures of CR, both of which carry limitations. The former is inflexible because the index requires item-level response time information and therefore cannot be used with pen-and-paper tests or with online tests that lack such information; the latter is plagued by theoretical and empirical shortcomings. Thus, the purpose of my study was to identify a comparably valid and more flexible approach by testing how well five survey CR measurement approaches (the infrequency, instructed-response, consistency, and self-report approaches, and long-string analysis) capture CR in tests. In a sample of 291 undergraduate students, I found strong support for using the infrequency approach to assess careless responding, weak support for the instructed-response approach and long-string analysis, and no support for the self-report or consistency approaches.

Page Count

198

Department or Program

Department of Psychology

Year Degree Awarded

2022

ORCID ID

0000-0001-5366-5303

