
Chemistry Concept Inventory

CCI (Krause et al., 2004)

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Krause, S., Birk, J., Bauer, R., Jenkins, B., & Pavelich, M. J.

    • Pavelich, M., Jenkins, B., Birk, J., Bauer, R., & Krause, S.

    Original publication
    • Krause, S., Birk, J., Bauer, R., Jenkins, B., & Pavelich, M. J. (2004, October). Development, testing, and application of a chemistry concept inventory. In 34th Annual Frontiers in Education, 2004. FIE 2004. (pp. T1G-1). IEEE.

    • Pavelich, M., Jenkins, B., Birk, J., Bauer, R., & Krause, S. (2004, June). Development of a chemistry concept inventory for use in chemistry, materials and other engineering courses. In Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition. Paper (Vol. 1907).

    Year original instrument was published 2004
    Inventory
    Number of items 40
    Number of versions/translations 1
    Cited implementations 2
    Language
    • English
    Country United States
    Format
    • Multiple Choice
    Intended population(s)
    • Students
    • Undergraduate
    Domain
    • Cognitive
    Topic
    • General Chemistry
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence related to the instrument's validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Likewise, a lack of evidence does not necessarily mean the instrument is “less valid,” only that such evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insight into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates in which paper(s) the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence intended to evaluate the performance of the instrument; the lack of a checkmark here implies that the article administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications: 1 2

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item function
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Chemistry Concept Inventory (Chemistry 2004)

    (Post last updated June 1, 2021)

    Review panel summary

    The Krause et al. Chemistry Concept Inventory (CCI) is a 40-item multiple-choice test designed to assess student understanding of general chemistry topics. The instrument includes two items for each of 10 first-semester and 10 second-semester general chemistry topics. The CCI has been evaluated with university students enrolled in a general chemistry course sequence in the U.S. [1, 2]. The literature to date provides insufficient evidence that the final version of the CCI generates valid data, and very few sources of validity evidence have been found. Evidence based on relations to other variables was established by identifying a correlation between CCI scores and final course grades, as well as by identifying significant differences in gains between students taught general chemistry by different instructors. Coefficient alpha was used to estimate single-administration reliability; the values ranged from 0.47 to 0.68 for administrations of the final version of the instrument [1, 2]. It was unclear, however, whether these values were associated with the pre- or post-administration of the instrument.
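    For context, coefficient alpha is a single-administration estimate of internal consistency computed from the per-item variances and the variance of the total scores. The Python sketch below is illustrative only and is not taken from the original studies; the function name, the 0/1 scoring, and the simulated data are assumptions.

        import numpy as np

        def cronbach_alpha(scores):
            """Coefficient (Cronbach's) alpha for a single administration.

            scores: (n_students, n_items) array of item scores,
                    e.g. 0/1 for incorrect/correct on the 40 CCI items.
            """
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]                         # number of items
            item_vars = scores.var(axis=0, ddof=1)      # per-item variances
            total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        # Illustrative usage with simulated correlated 0/1 responses
        # (100 hypothetical students, 40 items):
        rng = np.random.default_rng(0)
        ability = rng.normal(size=(100, 1))                  # latent student ability
        probs = 1 / (1 + np.exp(-(ability - rng.normal(size=(1, 40)))))
        sim = (rng.random((100, 40)) < probs).astype(int)
        print(f"alpha = {cronbach_alpha(sim):.2f}")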

    Recommendations for use

    Limited evidence for validity and reliability was established with the data from the studies describing how the instrument was developed and evaluated [1, 2]. If using the instrument as developed, users are cautioned to conduct their own examination of the evidence of validity and reliability for the data they generate.

    Details from panel review

    Additional aspects of validity and reliability have been investigated for the data generated by the pilot administrations of the CCI [1]. Chemistry and engineering faculty selected topics for the instrument from both semesters of the general chemistry course sequence, which provides some evidence of internal structure validity. The developers carried out a literature search to identify misconceptions and distractors for the CCI items. A discrimination index and a difficulty index were used to evaluate the items in the first version of the instrument, but the development studies did not present results for these analyses or describe how the results informed changes to the pilot version [1, 2]. Additionally, eleven students were interviewed about 7 of the 40 questions on a pilot version to establish response process validity for part of the instrument. The information from the interviews was used to modify and improve the items, and the modified wording was tested with teaching assistants. More robust evidence for response process validity could be obtained by conducting interviews with students on all 40 items, as well as by testing the revised items with the population for which the instrument is intended (i.e., general chemistry students). No response process interviews have been conducted to evaluate the items in the final version of the CCI.
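    The development papers do not report the exact formulas used for the difficulty and discrimination indices; a common classical-test-theory formulation is sketched below under that assumption (Python; the function names and 0/1 scoring are illustrative, not from [1, 2]).

        import numpy as np

        def item_difficulty(scores):
            """Classical difficulty index: proportion of students answering
            each item correctly, for an (n_students, n_items) 0/1 matrix."""
            return np.asarray(scores, dtype=float).mean(axis=0)

        def item_discrimination(scores):
            """Corrected item-total correlation: how well each item separates
            high- and low-scoring students, excluding the item from the total."""
            scores = np.asarray(scores, dtype=float)
            total = scores.sum(axis=1)
            disc = np.empty(scores.shape[1])
            for j in range(scores.shape[1]):
                rest = total - scores[:, j]   # total score without item j
                disc[j] = np.corrcoef(scores[:, j], rest)[0, 1]
            return disc

    Under this formulation, difficulty near 0 or 1 flags items nearly everyone misses or answers correctly, and low or negative discrimination flags items that do not track overall performance.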

    References

    [1] Krause, S., Birk, J., Bauer, R., Jenkins, B., & Pavelich, M. J. (2004, October). Development, testing, and application of a chemistry concept inventory. 34th ASEE/IEEE Frontiers in Education Conference, Savannah, GA, October 20-23, T1G-1-5.

    [2] Pavelich, M., Jenkins, B., Birk, J., Bauer, R., & Krause, S. (2004, June). Development of a chemistry concept inventory for use in chemistry, materials and other engineering courses. Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition, 9.427.1-8.

    Versions
    This instrument has not been modified, nor was it created based on an existing instrument.
    Citations
    Listed below is all literature that develops, implements, modifies, or references the instrument.
    1. Krause, S., Birk, J., Bauer, R., Jenkins, B., & Pavelich, M. J. (2004, October). Development, testing, and application of a chemistry concept inventory. In 34th Annual Frontiers in Education, 2004. FIE 2004. (pp. T1G-1). IEEE.

    2. Pavelich, M., Jenkins, B., Birk, J., Bauer, R., & Krause, S. (2004, June). Development of a chemistry concept inventory for use in chemistry, materials and other engineering courses. In Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition. Paper (Vol. 1907).