
Chemistry and Biology Assessment for Biochemistry

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Villafañe, S. M., Loertscher, J., Minderhout, V., & Lewis, J. E.

    • Villafañe, S.M., Bailey, C.P., Loertscher, J., Minderhout, V., & Lewis, J.E.

    Original publication
    • Villafañe, S. M., Loertscher, J., Minderhout, V., & Lewis, J. E. (2011). Uncovering students' incorrect ideas about foundational concepts for biochemistry. Chemistry Education Research and Practice, 12(2), 210-218.

    • Villafañe, S.M., Bailey, C.P., Loertscher, J., Minderhout, V., & Lewis, J.E. (2011). Development and analysis of an instrument to assess student understanding of foundational concepts before biochemistry coursework. Biochemistry and Molecular Biology Education, 39(2), 102-109.

    Year original instrument was published 2011
    Inventory
    Number of items 24
    Number of versions/translations 1
    Cited implementations 7
    Language
    • English
    Country United States
    Format
    • Multiple Choice
    Intended population(s)
    • Students
    • Undergraduate
    Domain
    • Cognitive
    Topic
    • Biochemistry
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument's validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Similarly, if evidence is lacking, that does not necessarily mean the instrument is "less valid," just that the evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument's Review (next tab), if available, for better insights into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates in which paper(s) the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; lack of a checkmark here indicates an article that administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability, but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications: 1 2 3 4 5 6 7

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item functioning
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Chemistry & Biology Assessment for Biochemistry

    (Post last updated June 14, 2022)

    Review panel summary

    The Chemistry & Biology Assessment for Biochemistry (C&BA) is intended to assess foundational understanding of chemistry and biology topics that are viewed as prerequisite to biochemistry learning. The instrument consists of 24 items, divided into groups of three questions associated with each of eight topics: bond energy, free energy, London dispersion forces, pH/pKa, hydrogen bonding, alpha helix, amino acids, and protein function. It has been tested with undergraduate biochemistry students at a variety of US institutions, some of which are classified as Hispanic-serving institutions by the US Department of Education.

    The developers of the C&BA described the process of instrument development in great detail [1]. Of note is the attention given to content validity. The items were written by a team of PhD faculty members in chemistry, biology, and biochemistry. The multiple-choice items are highly structured so that, across all three items designed for each topic, the four answer options contain a single consistent correct idea and three common misconceptions. After an initial pool of items was written, the final selected items were revised again for consistency and clarity by the authors. The data produced by the 24-item instrument were assessed for internal structure validity using confirmatory factor analysis. The fit statistics used to determine whether the eight-factor solution matched the data were satisfactory. However, the panel was unable to fully assess the internal structure validity evidence because the complete model was not described. While cognitive interviews were described in a subsequent paper [2, supplementary information], it is unclear how the information from them may have been used to improve the items or the degree to which they provided supportive evidence of response process validity.

    Two publications [1, 2] reported single-administration reliability using coefficient alpha values for each of the eight subscales. These values ranged widely, and it is unclear whether they provide sufficient evidence for the reliability of the data.

    While difficulty and discrimination values were not reported by item, the low overall scores in all the studies examined [1-5] suggest that the items are quite difficult and therefore may be unlikely to discriminate well between high- and low-achieving students.
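    Because these statistics were not reported per item, the following is a minimal sketch, on fabricated data, of how the classical indices could be computed from a scored 0/1 response matrix; the array names and data are illustrative assumptions, not values from the reviewed studies.

    import numpy as np

    # Hypothetical scored responses: 200 students x 24 items (0 = wrong, 1 = right).
    rng = np.random.default_rng(0)
    responses = rng.integers(0, 2, size=(200, 24))

    # Classical difficulty: proportion of students answering each item correctly
    # (lower values indicate harder items).
    difficulty = responses.mean(axis=0)

    # Classical discrimination: correlation of each item with the rest-of-test
    # score (item-rest correlation).
    total = responses.sum(axis=1)
    discrimination = np.array([
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(responses.shape[1])
    ])

    print(difficulty.round(2), discrimination.round(2))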

    Recommendations for use

    The C&BA has been used and tested with undergraduate biochemistry students in the US and is only available by contacting the authors [1] directly.

    Traditionally, the C&BA is scored separately for each topic, with a student either getting all three items correct (scored as correct) or getting two or fewer items correct (scored as incorrect). When scored in this way, student scores tend to be quite low. Mean scores in study [1] on each topic were between 0.05 and 0.33, meaning that about 5-33% of students got all three questions right for a given topic. When the instrument was scored as a single scale, as in [5], the average score on the pre-test was 7/21 (33%) and 10/21 (48%) on the post-test. Thus, we recommend that instructors consider using the C&BA in ways that do not contribute to student grades, such as gauging how prepared students are for the biochemistry course or which topics might require additional review.
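    As a concrete illustration of this all-or-nothing topic scoring, the sketch below assumes a 0/1 response matrix whose consecutive triples of columns correspond to the eight topics; the function and variable names are hypothetical, not from the original papers.

    import numpy as np

    def topic_scores(responses, items_per_topic=3):
        # Returns a (students x topics) 0/1 matrix: 1 only when a student
        # answered every item within the topic correctly.
        n_students, n_items = responses.shape
        grouped = responses.reshape(n_students, n_items // items_per_topic,
                                    items_per_topic)
        return grouped.all(axis=2).astype(int)

    # Column means of the result correspond to the per-topic means (0.05-0.33)
    # reported in [1]; the demo data here are random placeholders.
    rng = np.random.default_rng(1)
    print(topic_scores(rng.integers(0, 2, size=(100, 24))).mean(axis=0))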
    Many of the studies that have used the C&BA have opted to use only the items associated with topics related to specific interventions, and so smaller subsets of questions (i.e., 12, 18, or 21 of the 24 total items) have been used for that purpose [2-5]. Because the internal structure validity of C&BA data was studied only for the 24-item instrument [1] and the 21-item instrument [2], further structural support for the abbreviated versions is recommended.

    Details from panel review

    No evidence of validity for relations to other variables has been published to date. Therefore, future studies could add to what is known about how student performance on C&BA items relates to other measures of similar content.

    The panel expressed several concerns about the confirmatory factor analysis that was performed and used to justify evidence of internal structure validity. The complete CFA models [1, 2] were not published, only the fit statistics for the final model, leaving the question of what the correlations among the factors might be and whether any correlated errors among items were included in the model. Two items (17 and 23) had very low non-significant factor loadings (<0.25). Therefore, they may not contribute much information about student understanding of the proposed construct they are associated with. Additionally, in Villafañe et al. [2, supplementary information] the CFA was performed for the pre- and post-test, but with different degrees of freedom, indicating that the model was not the same. This raised questions about measurement invariance and whether the instrument performs the same as a pre- vs. post-test. Although the fit statistics reported were good, there was discussion that using the WLSMV estimator can result in inflated fit statistics for CFA. Therefore, future studies might consider the use of more stringent cutoff values when using the WLSMV estimator.
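    Since the full models were never published, the following is only a hedged sketch of how an eight-factor CFA for the 24 items might be specified, using the third-party Python package semopy with its lavaan-style syntax; the item-to-topic assignments are assumptions, and the fit below uses semopy's default maximum-likelihood-type objective rather than WLSMV, so reproducing the estimator choice discussed above would require a dedicated SEM tool such as lavaan in R.

    import numpy as np
    import pandas as pd
    from semopy import Model, calc_stats

    # Assumed mapping of the 24 items onto the eight topic factors.
    model_desc = """
    BondEnergy   =~ i1 + i2 + i3
    FreeEnergy   =~ i4 + i5 + i6
    LondonForces =~ i7 + i8 + i9
    pH_pKa       =~ i10 + i11 + i12
    HydrogenBond =~ i13 + i14 + i15
    AlphaHelix   =~ i16 + i17 + i18
    AminoAcids   =~ i19 + i20 + i21
    ProteinFunc  =~ i22 + i23 + i24
    """

    # Placeholder binary data standing in for scored responses.
    rng = np.random.default_rng(2)
    data = pd.DataFrame(rng.integers(0, 2, size=(300, 24)),
                        columns=[f"i{k}" for k in range(1, 25)])

    model = Model(model_desc)
    model.fit(data)              # default ML-type objective, not WLSMV
    print(calc_stats(model).T)   # chi-square, CFI, TLI, RMSEA, etc.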

    In order to provide support that the instrument is sensitive to learning gains or changes in instruction, future researchers utilizing a pre-post test design are encouraged to evaluate the stability of the data in the absence of instructional interventions.

    With respect to single-administration reliability, the coefficient alpha values for the eight subscales ranged from 0.306 to 0.878 [1]. A follow-on report [2] provided coefficient alpha values for the entire 21-item pre-test (0.62) and post-test (0.67). The supplementary information for Villafañe et al. [2] also included coefficient alpha values for the seven subscales used, which ranged from 0.10 to 0.89. The variability of the alpha values, with some of them quite low, may be a concern, as they may not represent strong evidence of internal consistency reliability.
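    For reference, coefficient alpha for a subscale of k items is k/(k-1) x (1 - sum of item variances / variance of total scores); a minimal sketch on placeholder data follows, with names that are illustrative only.

    import numpy as np

    def cronbach_alpha(items):
        # items: (n_students x n_items) 0/1 score matrix for one subscale.
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # Random placeholder data for a single three-item topic; real responses
    # would be needed to reproduce the 0.10-0.89 range reported above.
    rng = np.random.default_rng(3)
    print(round(cronbach_alpha(rng.integers(0, 2, size=(150, 3))), 3))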

    References

    [1] Villafañe, S.M., Bailey, C.P., Loertscher, J., Minderhout, V., & Lewis, J.E. (2011). Development and analysis of an instrument to assess student understanding of foundational concepts before biochemistry coursework. Biochemistry and Molecular Biology Education, 39(2), 102-109. https://doi.org/10.1002/bmb.20464

    [2] Villafañe, S.M., Loertscher, J., Minderhout, V., & Lewis, J.E. (2011). Uncovering students’ incorrect ideas about foundational concepts for biochemistry. Chemistry Education Research and Practice, 12(2), 210-218. https://doi.org/10.1039/C1RP90026A

    [3] Taylor, A.T.S., Olofson, E.L., & Novak, W.R.P. (2017). Enhancing student retention of prerequisite knowledge through pre-class activities and in-class reinforcement. Biochemistry and Molecular Biology Education, 45(2), 97-104. https://doi.org/10.1002/bmb.20992

    [4] Kopecki-Fjetland, M.A., & Steffenson, M. (2021). Design and implementation of active learning strategies to enhance student understanding of foundational concepts in biochemistry. Biochemistry and Molecular Biology Education, 49(3), 446-456. https://doi.org/10.1002/bmb.21498

    [5] Miller, H.B., & Srougi, M.C. (2021). Growth mindset interventions improve academic performance but not mindset in biochemistry. Biochemistry and Molecular Biology Education, 49(5), 748-757. https://doi.org/10.1002/bmb.21556

    Versions
    Listed below are all versions and modifications that were based on this instrument, or on which this instrument was based.
    Name Authors
    • Xu, XY; Lewis, JE; Loertscher, J; Minderhout, V; Tienson, HL

    Citations
    Listed below is all literature that develops, implements, modifies, or references the instrument.
    1. Villafañe, S. M., Loertscher, J., Minderhout, V., & Lewis, J. E. (2011). Uncovering students' incorrect ideas about foundational concepts for biochemistry. Chemistry Education Research and Practice, 12(2), 210-218.

    2. Villafañe, S.M., Bailey, C.P., Loertscher, J., Minderhout, V., & Lewis, J.E. (2011). Development and analysis of an instrument to assess student understanding of foundational concepts before biochemistry coursework. Biochemistry and Molecular Biology Education, 39(2), 102-109.

    3. Kopecki-Fjetland, M.A., & Steffenson, M. (2021). Design and implementation of active learning strategies to enhance student understanding of foundational concepts in biochemistry. Biochemistry and Molecular Biology Education, 49(3), 446-456.

    4. Miller, H.B., & Srougi, M.C. (2021). Growth mindset interventions improve academic performance but not mindset in biochemistry. Biochemistry and Molecular Biology Education, 49(5), 748-757.

    5. Taylor, A.T.S., Olofson, E.L., & Novak, W.R.P. (2017). Enhancing student retention of prerequisite knowledge through pre-class activities and in-class reinforcement. Biochemistry and Molecular Biology Education, 45(2), 97-104.

    6. Grimes, C.L., & White, H.B. (2015). Passing the baton: Mentoring for adoption of active-learning pedagogies by research-active junior faculty. Biochemistry and Molecular Biology Education, 43(5), 345-357.

    7. Bailey, C.P., Minderhout, V., & Loertscher, J. (2012). Learning transferable skills in large lecture halls: Implementing a POGIL approach in biochemistry. Biochemistry and Molecular Biology Education, 40(1), 1-7.