Bonding Representations Inventory

BRI

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Luxford, C.J., & Bretz, S.L.

    Original publication
    • Luxford, C.J., & Bretz, S.L. (2014). Development of the bonding representations inventory to identify student misconceptions about covalent and ionic bonding representations. Journal of Chemical Education, 91(3), 312-320.

    Year original instrument was published 2014
    Inventory
    Number of items 23
    Number of versions/translations 1
    Cited implementations 1
    Language
    • English
    Country United States
    Format
    • Multiple Choice
    Intended population(s)
    • Students
    • High School
    • Undergraduate
    Domain
    • Cognitive
    Topic
    • Bonding
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence related to the instrument’s validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Similarly, a lack of evidence does not necessarily mean the instrument is “less valid,” only that such evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insights into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; lack of a checkmark here implies an article that administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability, but are commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications: 1

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item function
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Bonding Representations Inventory (BRI)

    (Post last updated 09 June 2023)

    Review panel summary   
    The Bonding Representations Inventory (BRI) consists of 23 multiple choice items (7 one-tiered items and 8 two-tiered item pairs) [1]. This concept inventory was designed to measure students’ ‘misconceptions’ about bonding and bonding representations related to four themes: periodic trends, electrostatic interactions, the octet rule, and surface features [1]. The BRI has been administered in high-school level, advanced placement, and undergraduate general chemistry courses at various high schools and post-secondary institutions across the United States. At each institution, the BRI was administered both post-instruction and post-assessment of content related to bonding and Lewis structures [1].

    Evidence based on test content was collected during development of the BRI. The generation of items and distracters was informed by interview data collected from students enrolled in high school chemistry or undergraduate general chemistry. Additionally, once the developers had created an alpha version of the concept inventory, each of the items was reviewed by a panel of experts (five general chemistry instructors). Evidence based on response process was collected for the alpha version of the BRI. This evidence was collected via cognitive interviews with two students enrolled in an undergraduate general chemistry course, although descriptions of these interviews were not provided by the developers. Expert reviews, cognitive interviews, and item analysis of data were all used to inform the modification or removal of items from the alpha version of the BRI, resulting in the final version of the concept inventory.

    Evidence of single-administration reliability was assessed through the calculation of coefficient alpha for each course level in which BRI data were collected (i.e., high school, advanced placement, and undergraduate general chemistry). The alpha value for the advanced placement students’ data was found to be acceptable, while the alpha values for the high school and undergraduate general chemistry data were lower than the typically recommended cutoff for the alpha metric. While the developers chose to report alpha values for their data, they acknowledged that alpha may be an inappropriate indicator of reliability for concept inventories, “owing to the fragmented knowledge being measured” [1].
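    For readers less familiar with the alpha metric: for a dichotomously scored instrument like the BRI, coefficient alpha is computed from the individual item variances and the variance of the total scores. A minimal sketch follows; the score matrix is a hypothetical placeholder, not the published BRI data:

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Coefficient (Cronbach's) alpha for a respondents-by-items score matrix.

        scores: 2D array, rows = respondents, columns = items (0/1 for the BRI).
        """
        scores = np.asarray(scores, dtype=float)
        n_items = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)      # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
        return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical 0/1 responses: 5 students, 3 items.
    responses = [[1, 1, 0],
                 [1, 0, 0],
                 [0, 1, 1],
                 [1, 1, 1],
                 [0, 0, 0]]
    alpha = cronbach_alpha(responses)
    ```

    Low alpha values like those reported for the high school data can arise when item responses do not covary strongly, which is the developers’ point about fragmented knowledge: alpha presumes the items measure a single coherent construct.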

    Evidence based on relation to other variables was assessed by comparing the performance of high school students, advanced placement students, and undergraduate general chemistry students on the BRI. The developers of the instrument proposed that “[b]ecause the BRI was administered to students in various stages of understanding about the construct of bonding, it would be expected that students enrolled in higher-level courses ought to perform better on the inventory.” Kruskal-Wallis H tests were used to compare the relative rankings of the data and significant effects were found for class type on total BRI scores. Additionally, post-hoc Mann-Whitney U tests revealed that both the undergraduate general chemistry and the advanced placement students significantly outperformed the high school students, and the undergraduate general chemistry students performed significantly better than the advanced placement students, as theorized by the developers.
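    The analysis pipeline described above — an omnibus Kruskal-Wallis H test on the three groups followed by post-hoc pairwise Mann-Whitney U tests — can be sketched with SciPy. The score lists below are hypothetical placeholders, not the published data:

    ```python
    from scipy import stats

    # Hypothetical total BRI scores for three course levels (illustrative only).
    high_school = [5, 7, 6, 8, 4, 6, 5]
    advanced_placement = [9, 11, 8, 10, 12, 9, 10]
    general_chemistry = [13, 12, 14, 11, 15, 13, 12]

    # Omnibus test: do the three groups differ in their score distributions?
    h_stat, p_omnibus = stats.kruskal(high_school, advanced_placement, general_chemistry)

    # Post-hoc pairwise comparison; in practice a multiple-comparison
    # correction (e.g., Bonferroni) would be applied to these p-values.
    u_stat, p_hs_ap = stats.mannwhitneyu(high_school, advanced_placement)
    ```

    Both tests are rank-based, so they compare relative orderings of scores rather than means, which is appropriate when total scores are not assumed to be normally distributed.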

    Item difficulty and item discrimination of the data generated by the BRI were also examined. Data collected from high school, advanced placement, and undergraduate general chemistry students showed that item difficulty fell within the acceptable cutoff range for all but eight, three, and five items, respectively. Overall, no items were uniformly difficult or uniformly easy across all three groups of students. Regarding discrimination, data collected from the three aforementioned groups showed acceptable values for Ferguson’s delta, which indicates that the measure can distinguish students’ understanding across the full range of scores. Additionally, item discrimination was investigated to assess how well each item could discriminate between the top-performing students and the lowest-performing students. Data collected from high school, advanced placement, and undergraduate general chemistry students showed that item discrimination fell above the acceptable cutoff for all but seven, three, and nine items, respectively. Like item difficulty, there were no items with poor item discrimination values across all three groups of students.

    Recommendations for use   
    The BRI has been used to assess students’ understanding of bonding and bonding representations in high school, advanced placement, and undergraduate general chemistry courses across the country. Additionally, the concept inventory has been used to identify differential levels of understanding among these groups.

    Future users of the concept inventory should be aware that validity evidence based on response process is currently limited for the BRI. When reporting on the limitations of their study, the developers of the BRI highlighted the fact that cognitive interviews intended to collect evidence based on response process were conducted with two undergraduate general chemistry students only and did not include high school and advanced placement students. As highlighted by the developers, future researchers who are interested in using the BRI with these populations may want to conduct additional interviews to determine if these students hold “additional misconceptions not measured by the inventory,” or if “the wording of the items confused [the] students” [1]. This type of validity evidence would greatly strengthen the inferences that can be made from BRI data.

    Details from panel review   
    Regarding evidence based on test content, it is unclear what feedback the expert panel of general chemistry instructors provided on the items, and how that feedback was used to inform the modification or removal of items from the alpha version of the BRI. Regarding evidence based on response process, it is similarly unclear how the cognitive interview data collected from the two undergraduate general chemistry students were used to inform the modification or removal of items from the alpha version of the BRI.

    For item difficulty, data collected from high school chemistry students showed that values fell within the acceptable cutoff range of 0.3-0.8 for all but eight items (Items 5, 6, 9, 11, 12, 13, 14, 18), data collected from advanced placement students showed that values fell within the acceptable cutoff range for all but three items (Items 1, 20, 21), and data collected from undergraduate general chemistry students showed that values fell within the acceptable cutoff range for all but five items (Items 1, 12, 18, 20, 21). Turning to discrimination, data collected from the three aforementioned groups showed acceptable values for Ferguson’s delta (> 0.9). Additionally, item-level discrimination was investigated to assess how well each item could discriminate between the top 27% of students and the lowest 27% of students. For item-level discrimination, data collected from high school chemistry students showed that values fell above the acceptable cutoff of 0.3 for all but seven items (Items 5, 6, 9, 11, 12, 13, 18), data collected from advanced placement students showed that values fell above the acceptable cutoff for all but three items (Items 1, 2, 20), and data collected from undergraduate general chemistry students showed that values fell above the acceptable cutoff for all but nine items (Items 1, 2, 9, 10, 12, 18, 19, 20, 21). The developers of the BRI highlighted that the “overall distributions of the discrimination−difficulty points shift across the three courses, suggesting that as students gain knowledge in chemistry, they [are] able to answer more items correctly. Hence, the corresponding shift in difficulty and higher discrimination indices [are] observed, suggesting that the students have less fragmented understanding” [1].
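    The classical test theory indices discussed above — item difficulty (proportion correct), item-level discrimination (difference in proportion correct between the top 27% and bottom 27% of scorers), and Ferguson’s delta (how evenly total scores spread over the possible range) — can all be computed from a respondents-by-items 0/1 score matrix. A minimal sketch, assuming dichotomous scoring:

    ```python
    import numpy as np

    def item_statistics(scores):
        """Classical test theory indices from a 0/1 respondents-by-items matrix."""
        scores = np.asarray(scores)
        n, k = scores.shape
        # Item difficulty: proportion of respondents answering each item correctly.
        difficulty = scores.mean(axis=0)
        # Item discrimination: difficulty in the top 27% of total scorers
        # minus difficulty in the bottom 27%.
        totals = scores.sum(axis=1)
        order = np.argsort(totals)
        g = max(1, int(round(0.27 * n)))
        discrimination = scores[order[-g:]].mean(axis=0) - scores[order[:g]].mean(axis=0)
        # Ferguson's delta: 1.0 means total scores are spread perfectly
        # evenly over the k + 1 possible values (0 through k).
        freq = np.bincount(totals, minlength=k + 1)
        delta = (n**2 - (freq**2).sum()) * (k + 1) / (n**2 * k)
        return difficulty, discrimination, delta

    # Hypothetical 0/1 responses: 4 students, 3 items (illustrative only).
    diff, disc, delta = item_statistics([[1, 1, 1],
                                         [1, 1, 0],
                                         [1, 0, 0],
                                         [0, 0, 0]])
    ```

    The 0.3-0.8 difficulty range, the 0.3 discrimination cutoff, and the 0.9 delta threshold quoted in the review are then simple comparisons against these computed values.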

    References

    [1] Luxford, C.J. & Bretz, S.L. (2014) Development of the Bonding Representations Inventory To Identify Student Misconceptions about Covalent and Ionic Bonding Representations. J. Chem. Educ. 91(3), 312-320.

    Versions
    This instrument has not been modified nor was it created based on an existing instrument.
    Citations
    Listed below is all literature that develops, implements, modifies, or references the instrument.
    1. Luxford, C.J., & Bretz, S.L. (2014). Development of the bonding representations inventory to identify student misconceptions about covalent and ionic bonding representations. Journal of Chemical Education, 91(3), 312-320.