
Covalent Bonding And Structure Diagnostic Instrument

CBSC

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Peterson, R.F., Treagust, D.F., & Garnett, P.

    Original publication
    • Peterson, R., Treagust, D., & Garnett, P. (1986). Identification of secondary students' misconceptions of covalent bonding and structure concepts using a diagnostic instrument. Research in Science Education, 16(1), 40-48.

    • Peterson, R.F., Treagust, D.F., & Garnett, P. (1989). Development and application of a diagnostic instrument to evaluate grade-11 and -12 students' concepts of covalent bonding and structure following a course of instruction. Journal of Research in Science Teaching, 26(4), 301-314.

    Year original instrument was published: 1986, 1989
    Inventory
    Number of items: 15
    Number of versions/translations: 1
    Cited implementations: 5
    Language
    • English
    Country: Australia, United States
    Format
    • Multiple Choice
    Intended population(s)
    • Students
    • High School
    • Secondary School
    • Undergraduate
    • Graduate
    • Faculty
    • Tertiary
    Domain
    • Cognitive
    Topic
    • Bonding
    • Chemical Structure
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument’s validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Similarly, if evidence is lacking, that does not necessarily mean the instrument is “less valid,” just that the evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insights into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; lack of a checkmark here implies an article that administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability, but is commonly reported when evaluating instruments; please see the Glossary for term definitions

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item function
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Covalent Bonding and Structure Diagnostic Instrument

    (Post last updated July 21, 2021)

    Review panel summary

    The Covalent Bonding and Structure Diagnostic Instrument (CBSC) is a fifteen-item, two-tiered multiple-choice instrument designed to measure students’ conceptual knowledge of chemical bonding. The two-tiered multiple-choice format is intended to measure students’ content knowledge (tier 1) and reasoning (tier 2) about chemical bonding. Answer options for reasoning questions (tier 2) are intended to be used to identify student misconceptions about chemical bonding.

    The CBSC has been evaluated for use in high school (secondary) level chemistry settings in Australia. The developers of the CBSC have provided some evidence of validity and reliability of data generated using the instrument [1, 2]. Tier 1 (content) items were written based on a set of propositional statements related to covalent bonding, which were reviewed by chemistry teachers; all final items (tier 1 + tier 2) were also reviewed by chemistry educators. Together, these iterations of review and feedback from chemistry educators provide test content validity evidence, specifically that items on the CBSC align with experts’ understanding of the intended construct, chemical bonding [1, 2]. The answer options for tier 2 (reasoning) items were derived from student interviews and open-ended versions of the CBSC; analysis of interviews and student responses to open-ended prompts informed the iterative revision of the CBSC, providing some evidence of response process validity [1, 2]. Though the developers suggest that items may be grouped according to content area (e.g., items on bond polarity, items on intermolecular forces), no internal structure validity evidence is provided supporting the use of subsets of items or subscores derived from subsets of items [1, 2]. Coefficient alpha was used to estimate the single administration reliability of the CBSC (reported value α = .73) [1, 2]; a single value for alpha was reported, as opposed to reporting alpha at the subscale level. The developers [1, 2] and other users [3] observed that the prevalence of misconceptions about chemical bonding decreased with additional chemistry coursework (from grade 11 to grade 12 [1, 2]; from general chemistry students to chemistry graduate students and faculty [3]), providing some validity support in regard to relations with other variables.
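
    For readers less familiar with the statistic, the sketch below shows one common way to compute coefficient (Cronbach's) alpha from a respondents-by-items score matrix. It is a minimal illustration only: the simulated 0/1 responses, sample size, and function name are assumptions of this sketch and do not reproduce the developers' analysis or data [1, 2].

    ```python
    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """Coefficient (Cronbach's) alpha for a respondents-by-items score matrix."""
        k = scores.shape[1]                          # number of items (15 for the CBSC)
        item_vars = scores.var(axis=0, ddof=1)       # variance of each item across respondents
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Illustrative only: random, uncorrelated 0/1 responses for 200 hypothetical students
    # on 15 items; such data yield alpha near zero, unlike real, correlated responses.
    rng = np.random.default_rng(0)
    responses = rng.integers(0, 2, size=(200, 15))
    print(f"alpha = {cronbach_alpha(responses):.2f}")
    ```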

    Recommendations for use

    The CBSC instrument was intended to measure students’ content knowledge (tier 1) and reasoning (tier 2) about chemical bonding [1, 2]. While developers suggest that items may be grouped into subsets of items related to six topic areas (bond polarity, molecular shape, molecular polarity, intermolecular forces, lattices, and the octet rule), there is no evidence supporting the use of subscales in the CBSC or sub-scores derived from those subscales. Users interested in CBSC results by the item groupings are encouraged to investigate their data for dimensionality.
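
    As one starting point for such an investigation, the sketch below examines the eigenvalues of the inter-item correlation matrix, a crude check on whether responses look essentially unidimensional before any sub-scores are computed. The simulated data and function name are assumptions of this sketch; a fuller analysis (e.g., factor analysis of real CBSC responses) would be needed to support subscale use.

    ```python
    import numpy as np

    def interitem_eigenvalues(scores: np.ndarray) -> np.ndarray:
        """Eigenvalues of the inter-item correlation matrix, largest first.

        A single dominant eigenvalue is consistent with an essentially
        unidimensional instrument; several comparably large eigenvalues
        suggest the item groupings merit closer inspection before sub-scoring.
        """
        corr = np.corrcoef(scores, rowvar=False)    # correlations between items
        return np.linalg.eigvalsh(corr)[::-1]       # eigvalsh returns ascending order

    # Illustrative only: random 0/1 responses stand in for real CBSC data.
    rng = np.random.default_rng(1)
    responses = rng.integers(0, 2, size=(200, 15))
    print(interitem_eigenvalues(responses).round(2))
    ```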

    Some evidence supports the use of CBSC for measurement of students’ knowledge related to bonding in secondary chemistry settings (in Australia) [1, 2]. The CBSC has also been used in high school, undergraduate, and graduate chemistry settings in the USA and administered to chemistry faculty, though there is limited evidence for the validity and reliability of data generated using the instrument in these settings [3]. Across studies by both the developers [1, 2] and secondary users [3], there is inconsistency in scoring and reporting; specifically, developers and users report scores by tier in some cases and combine scores elsewhere. Users are encouraged to justify their specific scoring procedures based on the purposes of their testing and/or research questions.
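
    To make the scoring ambiguity concrete, the sketch below contrasts two conventions that appear in two-tier testing: counting an item correct only when both tiers are answered correctly, versus scoring the content and reasoning tiers separately. The answer key and responses are hypothetical, and neither convention is presented here as the developers' prescribed procedure.

    ```python
    # Hypothetical key and one student's responses for three two-tier items;
    # each entry is (tier-1 content answer, tier-2 reasoning answer).
    key     = [("A", "2"), ("C", "3"), ("B", "4")]
    student = [("A", "2"), ("C", "1"), ("B", "4")]

    # Convention 1: an item counts as correct only if both tiers match the key.
    both_tiers = sum(resp == ans for resp, ans in zip(student, key))

    # Convention 2: score the content tier and the reasoning tier separately.
    tier1 = sum(resp[0] == ans[0] for resp, ans in zip(student, key))
    tier2 = sum(resp[1] == ans[1] for resp, ans in zip(student, key))

    print(both_tiers, tier1, tier2)   # -> 2 3 2
    ```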

    Generally, the performance of the instrument and its items would benefit from a fuller investigation, particularly in settings dissimilar to that for which the instrument was developed (Australian secondary schools).

    Details from panel review

    The developers of the CBSC provide some evidence to support the use of the instrument as described, including subjecting the instrument to review and iterative refinement based on both feedback from chemistry educators (test content validity) and analysis of interviews and written responses from students (response process validity). As evidence of single administration reliability, the developers report a single coefficient alpha value (0.73), as opposed to reporting coefficient alpha at the subscale level [1, 2]. Both the developers [1, 2] and a secondary study [3] investigated the prevalence of incorrect answer choices (misconceptions) for different groups (relations with other variables). In [1, 2], students in grade 12 performed better on the CBSC than students in grade 11, as indicated by the proportion of students selecting the correct response option; prevalence of specific misconceptions also generally decreased with students’ grade level. In one study [3], the authors report a linear relation between Years of [Chemistry] Study and score on the CBSC (tier 1 and tier 2 analyzed separately).

    However, the reviewers believe that additional validity and reliability evidence is warranted to support the use of the CBSC. Item groupings seem to be derived from the developers’ understanding of related concepts (or perhaps that of the chemistry educators who participated in the feedback process). For instance, while the developers suggest that subgroups of items are intended to elicit student ideas and knowledge related to six conceptual areas (bond polarity, molecular shape, polarity of molecules, intermolecular forces, lattices, octet rule), no evidence is provided to indicate whether 1) the subscales do or do not measure distinct constructs or 2) the subscales may or may not be considered and scored separately; the authors never explicitly suggest that subgroups should be scored separately but do report results according to subgroup. Secondary users [3] report using subscales of items because “the other three [subscales] did not provide reliable results”; no validity evidence is provided to support this use of subscales.

    Across studies by both the developers [1, 2] and secondary users [3], there is inconsistency in scoring and reporting; specifically, tier 1 and tier 2 are sometimes scored and reported separately. However, scores on both tiers were considered together, for instance, when computing coefficient alpha [1, 2]. Additional evidence or justification to support an approach to scoring and data interpretation may be warranted.

    While some evidence supporting the validity and reliability of data generated using the CBSC in secondary school settings in Australia was provided by the instrument’s developers [1, 2], more evidence is needed to support use in other, dissimilar settings, such as university settings in the USA.

    References

    [1] Peterson, R., Treagust, D., & Garnett, P. (1986). Identification of secondary students’ misconceptions of covalent bonding and structure concepts using a diagnostic instrument. Research in Science Education, 16(1), 40–48. https://doi.org/10.1007/BF02356816

    [2] Peterson, R.F., Treagust, D.F., & Garnett, P. (1989). Development and application of a diagnostic instrument to evaluate grade-11 and -12 students' concepts of covalent bonding and structure following a course of instruction. Journal of Research in Science Teaching, 26(4), 301-314. https://doi.org/10.1002/tea.3660260404

    [3] Birk, J. P., & Kurtz, M. J. (1999). Effect of Experience on Retention and Elimination of Misconceptions about Molecular Structure and Bonding. Journal of Chemical Education, 76(1), 124. https://doi.org/10.1021/ed076p124

    Versions
    This instrument has not been modified nor was it created based on an existing instrument.
    Citations
    Listed below is all literature that develops, implements, modifies, or references the instrument.
    1. Peterson, R., Treagust, D., & Garnett, P. (1986). Identification of secondary students' misconceptions of covalent bonding and structure concepts using a diagnostic instrument. Research in Science Education, 16(1), 40-48.

    2. Peterson, R.F., Treagust, D.F., & Garnett, P. (1989). Development and application of a diagnostic instrument to evaluate grade-11 and -12 students' concepts of covalent bonding and structure following a course of instruction. Journal of Research in Science Teaching, 26(4), 301-314.

    3. Treagust, D. (1986). Evaluating students' misconceptions by means of diagnostic multiple choice items. Research in Science Education, 16(1), 199-207.

    4. Treagust, D.F. (1988). Development and use of diagnostic tests to evaluate students’ misconceptions in science. International Journal of Science Education, 10(2), 159-169.

    5. Birk, J.P., & Kurtz, M.J. (1999). Effect of experience on retention and elimination of misconceptions about molecular structure and bonding. Journal of Chemical Education, 76(1), 124-128.