
Chemical Concepts Inventory

CCI (Mulford & Robinson, 2002)

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Mulford, D.R., & Robinson, W.R.

    Original publication
    • Mulford, D.R., & Robinson, W.R. (2002). An inventory for alternate conceptions among first-semester general chemistry students. Journal of Chemical Education, 79(6), 739–744.

    Year original instrument was published: 2002
    Inventory
    Number of items: 22
    Number of versions/translations: 9
    Cited implementations: 19
    Language
    • English
    Country: United States, South Africa, Australia
    Format
    • Multiple Choice
    Intended population(s)
    • Students
    • Undergraduate
    • Teachers
    • Middle School
    • Unknown
    • High School
    Domain
    • Cognitive
    Topic
    • General Chemistry
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument's validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Similarly, if evidence is lacking, that does not necessarily mean the instrument is “less valid,” just that such evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insight into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; the lack of a checkmark here implies an article that administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Evidence categories tracked across the 19 citing publications:

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item function
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Chemical Concepts Inventory (2002)

    (Post last updated June 1, 2021)

    Review panel summary

    The Mulford & Robinson Chemical Concepts Inventory (CCI) is a multiple-choice instrument designed to assess the alternate conceptions of students in high school or first-semester college chemistry [1]. The inventory contains 22 multiple-choice items, half of which are two-tiered. The CCI has been evaluated with students enrolled in general chemistry (including engineering students [4]) at a variety of institutions in the U.S. [1–6]. The reliability of the data collected with this instrument was established using test-retest reliability and single-administration reliability [1, 2]. The validity of the data collected with this instrument has been investigated through expert evaluation, response-process interviews, and relations to other variables [1, 3–6]. Conflicting evidence about the quality of the CCI items, as evaluated through response-process interviews, was reported across studies [1, 3, 6]. In addition, one study reported that certain topics are underrepresented: 13 of the 22 items assess two general chemistry topics (conservation of mass-matter and phase change), while the remaining 9 items assess unique topics, which presents additional threats to validity [1].

    Recommendations for use

    In its current form, it is unclear whether the CCI is measuring alternate conceptions of students in high school or first-semester college chemistry. Despite this concern, no modifications or improvements have been made to the CCI items. If using the instrument as developed, users are cautioned to conduct their own examination of the validity and reliability evidence for the data generated. Future work may explore the utility of changing the CCI items that showed threats to response-process validity [3], as well as adding items to improve topic coverage and examining how these items function within the item sets.

    Details from panel review

    An extensive literature search was carried out by the developers to identify misconceptions associated with the CCI topics and to inform the writing of the items [1]. Expert evaluation of the instrument, as well as internal structure validity, was explored with four expert chemical education researchers, who believed that all the questions were at an appropriate level for students taking a general chemistry course [1]. Additional validity evidence was obtained from 30 chemistry instructors and 8 chemistry education graduate students who completed a content analysis of the CCI items [3].

    Response-process interviews were conducted with 8 students and showed that the students interpreted the questions on the CCI as intended [1]. However, response-process interviews with 25 students in a different study, conducted on a subset of the CCI items, identified issues with some of the items (e.g., some items may be measuring recall instead of conceptual understanding) [3]. Similar findings were made by Karch et al., who also questioned what is being measured by some of the CCI items [6]. No work has been done to revise and reevaluate the problematic items. This raises concerns that, in its current form, the CCI is not measuring what it is intended to measure. The threat to validity is further exacerbated by the issue of topic underrepresentation. Specifically, 13 of the 22 items assess two general chemistry topics (conservation of mass-matter and phase change), whereas the remaining 9 items assess unique topics [2]. Although some of the items on the CCI do have some evidence to support the validity of students’ responses, there is not enough consistency among items covering an individual topic to make inferences about students’ alternate conceptions across general chemistry content. The instrument developers sought to “measure the extent of entering student’s alternate conceptions about topics found in the first semester of many traditional general chemistry courses” (p. 739, ref 1). There is a need for multiple items to measure students’ alternate conceptions about an individual concept or topic area.

    Relations to other variables were established across several studies [4-6]. Cracolice and Busby established correlations between CCI scores and scientific reasoning ability, intelligence, attitudes toward chemistry, as well as ACS exam scores [4]. Wheeler et al. established correlations between higher CCI scores and programming skills, solving a chemistry problem, and solving a chemistry problem in Mathematica [5]. Finally, Karch et al. used eye-tracking to characterize the mental tasks that students used when completing some of the CCI items and how those mental tasks elicited changing levels of cognitive load [6].

    Two sources of reliability evidence were assessed for the CCI. Test-retest reliability was assessed with a subset of students who were given a second posttest two weeks after the first posttest. Results showed good test-retest reproducibility (Pearson correlation = 0.79, posttest). Single-administration reliability was evaluated across two studies, with each reporting coefficient alpha values between 0.70 and 0.79 [1, 2]. Rasch person reliability was also reported [2].
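
    For readers who want to check comparable estimates on their own response data, the sketch below shows the standard calculations, assuming a students-by-items matrix of dichotomous (0/1) scores. The data, sample sizes, and function names are hypothetical and are not drawn from the studies cited above.

        import numpy as np

        def cronbach_alpha(scores: np.ndarray) -> float:
            # Coefficient (Cronbach's) alpha for a students x items matrix of 0/1 scores.
            k = scores.shape[1]                          # number of items (22 for the CCI)
            item_vars = scores.var(axis=0, ddof=1)       # variance of each item
            total_var = scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # Hypothetical data: 0/1 scores for 200 students on 22 items, two administrations.
        rng = np.random.default_rng(0)
        posttest = (rng.random((200, 22)) > 0.5).astype(int)
        retest = (rng.random((200, 22)) > 0.5).astype(int)

        alpha = cronbach_alpha(posttest)
        # Test-retest reliability: Pearson correlation between total scores on the two administrations.
        r = np.corrcoef(posttest.sum(axis=1), retest.sum(axis=1))[0, 1]
        print(f"alpha = {alpha:.2f}, test-retest r = {r:.2f}")

    With real data the two administrations would come from the same students, as in the two-week retest described above; the random matrices here only illustrate the computation, not realistic values.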

    Three studies reported item difficulty and item discrimination values [1, 2, 7]. Barbera identified that 8 items on the CCI have difficulty and discrimination values outside of the recommended ranges for a concept inventory [2]. A similar result was reported by Reimer et al., who stated that nearly half of the CCI questions fall outside the desired item difficulty and item discrimination window [7]. The Rasch model showed poor fit for only one item in both the pretest and posttest data; this result indicates that the data from this item do not fit the model's expectation that an item is easier for higher-ability students. This finding does not indicate whether the problem is with the item itself or with the actual knowledge of the students [2].
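
    The difficulty and discrimination values discussed above follow the usual classical test theory definitions: difficulty is the proportion of students answering an item correctly, and discrimination is commonly computed as the corrected item-total (point-biserial) correlation, although other indices exist. A minimal sketch of these two computations follows; the function name and data are hypothetical, and the specific cut-off values used by Barbera and by Reimer et al. are not reproduced here.

        import numpy as np

        def item_statistics(scores: np.ndarray):
            # Classical item difficulty and discrimination for a students x items 0/1 matrix.
            difficulty = scores.mean(axis=0)      # proportion correct for each item
            total = scores.sum(axis=1)            # each student's total score
            discrimination = np.empty(scores.shape[1])
            for j in range(scores.shape[1]):
                rest = total - scores[:, j]       # total score excluding the item itself
                discrimination[j] = np.corrcoef(scores[:, j], rest)[0, 1]
            return difficulty, discrimination

        # Hypothetical usage with the same kind of 0/1 score matrix as above.
        rng = np.random.default_rng(1)
        scores = (rng.random((200, 22)) > 0.5).astype(int)
        p, d = item_statistics(scores)

    Items with very high or very low difficulty, or with low discrimination, are the kind flagged in those studies as falling outside the recommended window for a concept inventory.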

    References

    [1] Mulford, D. R., & Robinson, W. R. (2002). An inventory for alternate conceptions among first-semester general chemistry students. Journal of Chemical Education, 79(6), 739–744. https://doi.org/10.1021/ed079p739

    [2] Barbera, J. (2013). A psychometric analysis of the Chemical Concepts Inventory. Journal of Chemical Education, 90(5), 546–553. https://doi.org/10.1021/ed3004353

    [3] Schwartz, P., & Barbera, J. (2014). Evaluating the content and response process validity of data from the Chemical Concepts Inventory. Journal of Chemical Education, 91(5), 630–640. https://doi.org/10.1021/ed400716p

    [4] Cracolice, M. S., & Busby, B. D. (2015). Preparation for college general chemistry: more than just a matter of content knowledge acquisition. Journal of Chemical Education, 92(11), 1790–1797. https://doi.org/10.1021/acs.jchemed.5b00146

    [5] Wheeler, L. B., Chiu, J. L., & Grisham, C. M. (2016). Computational methods in general chemistry: perceptions of programming, prior experience, and student outcomes. Journal of College Science Teaching, 45(3), 83–91.

    [6] Karch, J. M., Valles, J. C. G., & Sevian, H. (2019). Looking into the black box: using gaze and pupillometric data to probe how cognitive load changes with mental tasks. Journal of Chemical Education, 96(5), 830–840. https://doi.org/10.1021/acs.jchemed.9b00014

    [7] Reimer, L. C., Leslie, J. M., Bidwell, S. L., Isborn, C. M., Lair, D., Menke, E., Stokes, B. J., & Hratchian, H. P. (2019). Aiming toward an effective Hispanic-serving chemistry curriculum. In Growing Diverse STEM Communities: Methodology, Impact, and Evidence. ACS Symposium Series; American Chemical Society: Washington, DC.

    Versions
    Listed below are all versions and modifications that were based on this instrument or on which this instrument was based.
    Instrument has been modified in:
    Authors
    • Mayer, K.

    • Jurisevic, M., Glazar, S., Pucko, C.R., & Devetak, I.

    • Lawrie, G.A., Schultz, M., & Wright, A.H.

    • Potgieter, M., Davidowitz, B., & Venter, E.

    • Potgieter, M., Rogan, J.M., & Howie, S.

    • Wood, C., & Breyfogle, B.

    • Regan, A., Childs, P., & Hayes, S.

    • Halakova, Z., & Proksa, M.

    Citations
    Listed below is all literature that develops, implements, modifies, or references the instrument.
    1. Mulford, D.R., & Robinson, W.R. (2002). An inventory for alternate conceptions among first-semester general chemistry students. Journal of Chemical Education, 79(6), 739–744.

    2. Kruse, R.A., & Roehrig, G.H. (2005). A comparison study: Assessing teachers' conceptions with the chemistry concepts inventory. Journal of Chemical Education, 82(8), 1246-1250.

    3. Conceptual Questions and Challenge Problems. chemedx.org/JCEDLib/QBank/collection/CQandChP/index.html

    4. Barker, J. G. (2012). Effect of instructional methodologies on student achievement: Modeling instruction vs. traditional instruction. Master's thesis.

    5. Marais, F., & Combrinck, S. (2009). An approach to dealing with the difficulties undergraduate chemistry students experience with stoichiometry. South African Journal of Chemistry, 62(1), 88-96.

    6. Reimer, L. C., Leslie, J. M., Bidwell, S. L., Isborn, C. M., Lair, D., Menke, E., ... & Hratchian, H. P. (2019). Aiming toward an Effective Hispanic-Serving Chemistry Curriculum. In Growing Diverse STEM Communities: Methodology, Impact, and Evidence (pp. 49-66). American Chemical Society.

    7. Cacciatore, K.L., & Sevian, H. (2009). Incrementally approaching an inquiry lab curriculum: Can changing a single laboratory experiment improve student performance in general chemistry?. Journal of Chemical Education, 86(4), 498-505.

    8. Karch, J.M., Garcia Valles, J.C., & Sevian, H. (2019). Looking into the Black Box: Using Gaze and Pupillometric Data to Probe How Cognitive Load Changes with Mental Tasks. Journal of Chemical Education, 96(5), 830-840.

    9. Van Dusen, B., Turner, M., Langdon, L., & Otero, V. (2016). Learning assistant supported student outcomes (LASSO). Colorado Learning Assistant Program Implementation Guide, 1-64.

    10. Nurrenbern, S. C., & Robinson, W. R. (1998). Conceptual questions and challenge problems. Journal of Chemical Education, 75(11), 1502.

    11. Stanich, C.A., Pelch, M.A., Theobald, E.J., & Freeman, S. (2018). A new approach to supplementary instruction narrows achievement and affect gaps for underrepresented minorities, first-generation students, and women. Chemistry Education Research and Practice.

    12. Barbera, J., & VandenPlas, J. R. (2011). All assessment materials are not created equal: The myths about instrument development, validity, and reliability. In D. Bunce (Ed.), Investigating Classroom Myths through Research on Teaching and Learning: ACS Symposium Series (Vol. 1074, pp. 177–193). Washington, D.C.: American Chemical Society. doi.org/10.1021/bk-2011-1074.ch011

    13. Wheeler, L. B., Chiu, J. L., & Grisham, C. M. (2016). Computational Methods in General Chemistry: Perceptions of Programming, Prior Experience, and Student Outcomes. Journal of College Science Teaching, 45(3).

    14. Schwartz, P., & Barbera, J. (2014). Evaluating the content and response process validity of data from the chemical concepts inventory. Journal of Chemical Education, 91(5), 630-640.

    15. Barbera, J. (2013). A psychometric analysis of the chemical concepts inventory. Journal of Chemical Education, 90(5), 546-553.

    16. Geelan, D., & Mukherjee, M. (2010, May). Measuring the effectiveness of computer-based scientific visualisations for conceptual development in Australian chemistry classrooms. In Global Learn (pp. 3536-3545). Association for the Advancement of Computing in Education (AACE).

    17. Reimer, L.C., Leslie, J.M., Bidwell, S.L., Isborn, C.M., Lair, D., Menke, E., Stokes, B.J., & Hratchian, H.P. (2019). Aiming toward an Effective Hispanic-Serving Chemistry Curriculum. https://doi.org/10.1021/bk-2019-1328.ch004

    18. Pentecost, T.C., & Barbera, J. (2013). Measuring learning gains in chemical education: A comparison of two methods. Journal of Chemical Education, 90(7), 839-845.

    19. Cracolice, M.S., & Busby, B.D. (2015). Preparation for College General Chemistry: More than Just a Matter of Content Knowledge Acquisition. Journal of Chemical Education, 92(11), 1790-1797.