
Panel Review: The Quantum Chemistry Concept Inventory (QCCI)

(Post last updated March 1, 2023)

Review panel summary 
The Quantum Chemistry Concept Inventory (QCCI) was designed to assess students’ conceptual understanding of topics commonly taught in the quantum section of undergraduate physical chemistry courses [1]. While the QCCI includes 14 multiple-choice items, one item was identified as having two possible correct answers. The results reported in the original publication [1] therefore include only the 13 remaining items, and a subsequent usage paper [2] administered the 13-item version. The QCCI has been administered as a pre-/post-course assessment at a variety of institutions across the United States and at a large public Canadian university [1]. Pre-/post-course data collected using the QCCI have been investigated using normalized gains analysis. Students performed better after completing the course of interest on every item except Item 1, with the largest gains occurring on items related to content that is typically addressed for the first time in physical chemistry courses. In addition, the QCCI has been administered as a post-course-only assessment as part of a larger study [2].
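For readers unfamiliar with the analysis referenced above, the sketch below shows how a normalized (Hake) gain is typically computed from pre-/post-course scores. The function name and example values are hypothetical and do not reproduce data from the QCCI studies.

```python
def normalized_gain(pre: float, post: float) -> float:
    """Hake's normalized gain: the fraction of possible improvement achieved.

    pre, post: average proportions correct (0.0-1.0) on the same assessment
    administered before and after instruction.
    """
    if pre >= 1.0:
        raise ValueError("Pre-course score is at ceiling; gain is undefined.")
    return (post - pre) / (1.0 - pre)

# Illustrative (made-up) class averages for a 13-item inventory:
print(normalized_gain(pre=0.45, post=0.70))  # ~0.45, often called a "medium" gain
```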

Evidence based on test content was provided during the development of the QCCI. The developers reported that literature on student misconceptions related to topics commonly taught in the quantum section of physical chemistry was used to inform item development [1], although how the literature informed the items and distracters was not discussed. Additionally, the pilot version of the QCCI was developed and edited by professors and postdoctoral students [1], although the content areas and teaching experiences of these individuals were not described.

The item difficulty and item discrimination of data generated by the QCCI were also examined. Pre-course data collected with the QCCI showed item difficulty values within the acceptable cutoff range for all but five items, while post-course data showed item difficulty values within the acceptable range for all items [1]. Regarding item discrimination, the QCCI showed acceptable values for Ferguson’s delta, demonstrating that the measure can distinguish students’ understanding across the full range of scores [1]. Additionally, the point biserial correlation coefficient was used to quantify the correlation between students’ scores on an item and their total scores on the instrument. All items except Item 8 fell above the designated cutoff for the point biserial correlation coefficient [1].
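As a rough illustration of the classical test theory statistics named above, the sketch below computes item difficulty, point biserial correlations, and Ferguson’s delta from a dichotomously scored response matrix. The response data are randomly generated placeholders, and the calculation follows the standard definitions of these statistics; none of this code comes from the QCCI studies.

```python
import numpy as np

# Hypothetical 0/1 response matrix: rows = students, columns = items.
rng = np.random.default_rng(0)
responses = (rng.random((200, 13)) < 0.6).astype(int)

n_students, n_items = responses.shape
totals = responses.sum(axis=1)

# Item difficulty: proportion of students answering each item correctly
# (the acceptable range cited in this review is 0.3-0.8).
difficulty = responses.mean(axis=0)

# Point biserial correlation: Pearson r between each dichotomous item
# score and the total score (cutoff cited in this review: >= 0.2).
point_biserial = np.array(
    [np.corrcoef(responses[:, j], totals)[0, 1] for j in range(n_items)]
)

# Ferguson's delta: discrimination across the full score distribution
# (cutoff cited in this review: > 0.9). f holds the frequency of each
# possible total score, 0 through n_items.
f = np.bincount(totals, minlength=n_items + 1)
delta = ((n_items + 1) * (n_students**2 - np.sum(f**2))) / (
    n_items * n_students**2
)
```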

Recommendations for use 
The QCCI has been used to assess students’ conceptual understanding of topics commonly taught in the quantum section of undergraduate physical chemistry courses. Evidence based on test content, item difficulty, and item discrimination has been published, although the details supporting the evidence based on test content are limited [1]. While the developers of the QCCI suggested that the quantitative data collected in their study indicated that the assessment successfully elicited student misconceptions within its multiple-choice format, without evidence based on response process it is impossible to know whether the distracters are actually representative of students’ misconceptions. For this reason, future users are encouraged to provide evidence based on response process for data collected with this assessment. This type of validity evidence would greatly strengthen the inferences that can be made from QCCI data.

Details from panel review 
The developers of the QCCI designed the measure with a “focus on content coverage” [1], although how the topics of interest were identified, and how prevalent their coverage was in the courses used for data collection, was not explicitly described. Of the incorrect multiple-choice options (distracters) included in the QCCI, most (all but five) attracted 10% or more of the students who completed the pre-test, while fewer (all but 13) attracted 10% or more of the students who completed the post-test [1]. The developers suggested that this difference may stem from students being better able to avoid misconceptions after a semester of instruction. Conversely, on six items, post-test distracters were chosen by more than 25% of students [1]. In this case, the developers suggested that a semester of instruction does not completely dispel some students’ misconceptions or remedy their original lack of understanding. However, without evidence based on response process, it is difficult to know whether these data were actually indicative of students’ misconceptions or whether students selected or avoided distracters for other reasons.

The developers of the QCCI gathered and reported evidence based on test content, item difficulty, and item discrimination [1]. Regarding evidence based on test content, how the literature on student misconceptions was used in item development was not described. Additionally, it is unclear what feedback the professors and postdoctoral students provided on the items, and no description of their content areas and/or teaching experiences was included. Finally, the developers of the QCCI noted that results from the pilot study informed minor changes within existing items and the addition of two items relating to spectroscopy [1]. As the 12 original pilot items were not provided, and details on the data and feedback that informed changes to the pilot were not included, it is unknown how or why these items were edited. For item difficulty, pre-course data showed values within the acceptable cutoff range of 0.3–0.8 for all but five items (Items 3, 4, 5, 8, and 12), while post-course data showed values within the acceptable range for all items [1]. That five items fell outside the acceptable difficulty range in the pre-course data may not be problematic, as students who have not yet received instruction on a topic may reasonably find items on that topic more difficult. Turning to item discrimination, the QCCI showed acceptable values for Ferguson’s delta (> 0.9) and, for all items except Item 8, the point biserial correlation coefficient (≥ 0.2) [1]. While this method of investigating item discrimination is appropriate for the sample sizes reported in the study, future users of the assessment with access to larger sample sizes may be interested in determining item-level discrimination using upper and lower quartiles to investigate how well QCCI items differentiate between high and low performers, as in the sketch below.
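The quartile-based approach suggested above might look like the following sketch; the function name and the exact 25% split are illustrative choices, not prescriptions from [1].

```python
import numpy as np

def discrimination_index(responses: np.ndarray, item: int) -> float:
    """Upper-minus-lower discrimination index for one dichotomous item.

    responses: 0/1 matrix with rows = students, columns = items.
    Returns the proportion correct among the top quartile of students
    (by total score) minus the proportion correct among the bottom quartile.
    """
    totals = responses.sum(axis=1)
    order = np.argsort(totals)          # students sorted low to high
    quarter = max(1, len(order) // 4)   # size of each quartile group
    lower = responses[order[:quarter], item].mean()
    upper = responses[order[-quarter:], item].mean()
    return upper - lower
```

Values near zero indicate that an item does not separate high and low performers; rules of thumb in the measurement literature often treat indices of roughly 0.3 or higher as acceptable.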

A later study investigated a possible correlation between students’ total scores on the QCCI and their performance on a quantum chemical model rating task; no significant correlation was found [2]. As the researchers who conducted the investigation did not provide theoretical support for why a correlation between these scores might exist, the investigation does not provide evidence based on relations to other variables. Additionally, while a total score for the QCCI was reported in this case, many concept inventory developers suggest that outcomes for these types of assessments be analyzed at the item level only.

References

[1] Dick-Pérez, M., Luxford, C.J., Windus, T.L., & Holme, T. (2016). A Quantum Chemistry Concept Inventory for Physical Chemistry Classes. J. Chem. Educ., 93(4), 605–612.

[2] Muniz, M.N., Crickmore, C., Kirsch, J., & Beck, J.P. (2018). Upper-division chemistry students’ navigation and use of quantum chemical models. Chem. Educ. Res. Pract., 19, 767–782.