Dissertation Defense of Dana Denick

Event Date: February 13, 2015
Time: 3:00 pm
Location: ARMS 1028

Difficulty as a Concept Inventory Design Consideration: An Exploratory Study of the Concept Assessment Tool for Statics (CATS)

The ability of engineering students to apply mathematical, scientific, and engineering knowledge to real-life problems depends greatly on developing deep conceptual knowledge that structures and relates the meaning of underlying principles. Concept inventories have emerged as a class of assessment typically developed for use in higher-education science and engineering courses. Concept Inventories (CIs) are multiple-choice tests designed to assess students’ conceptual understanding within a specific content domain. For example, the CI explored in this study, the Concept Assessment Tool for Statics (CATS), is intended to measure students’ understanding of the concepts underlying the domain of engineering statics. High-quality, reliable CIs may be used as formative and summative assessments and help address the need for measures of conceptual understanding.

One component of assessment quality can be established through psychometric evaluation. Prior research has applied multiple measurement models, including classical test theory and item response theory, to derive psychometric parameters that characterize student performance on CATS. Common to these approaches is the calculation of item difficulty, a parameter used to distinguish which items are more difficult than others. The purpose of this dissertation study is to provide context and description of what makes some CI items more difficult than others within the content area of statics, based on students’ reasoning in response to CATS items. Specifically, the research question guiding this study is: how does student reasoning in response to CATS items explain variance in item difficulty across assessment questions?
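As a point of reference for the item-difficulty parameter mentioned above, under classical test theory an item's difficulty is conventionally estimated as the proportion of respondents who answer it correctly (a higher value means an easier item). The sketch below illustrates that standard calculation; the response data are hypothetical and are not drawn from the CATS dataset.

```python
def item_difficulty(responses):
    """Classical test theory difficulty index per item.

    responses: list of per-student score lists (1 = correct, 0 = incorrect),
    one column per item. Returns the proportion correct for each item.
    """
    n_students = len(responses)
    n_items = len(responses[0])
    return [sum(student[i] for student in responses) / n_students
            for i in range(n_items)]

# Hypothetical scores: five students, three items
scores = [
    [1, 0, 1],
    [1, 0, 0],
    [1, 1, 0],
    [0, 0, 1],
    [1, 0, 0],
]
print(item_difficulty(scores))  # [0.8, 0.2, 0.4] -> item 2 is hardest
```

Item response theory models difficulty differently (as a latent trait parameter rather than a sample proportion), which is one reason the prior studies cited above report differing characterizations of the same items.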

Think-aloud interviews were conducted in combination with a content analysis of selected CATS items. Thematic analysis was performed on interview transcripts and on CATS development and evaluation documentation. Two themes emerged as possible explanations for why some CATS items are more difficult than others: (1) a Direction of Problem Solving theme describes the direction of reasoning required or used to respond to CATS items, and may also describe students’ reasoning in response to determinate and indeterminate multiple-choice problems; and (2) a Distractor Attractiveness theme describes problematic reasoning that is targeted and observed in argumentation for incorrect CATS responses. The findings from this study have implications for the interpretation of CATS performance and for the consideration of difficulty in concept inventory design.