
ACID I

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • McClary, L.M., & Bretz, S.L.

    Original publication
    • McClary, L.M., & Bretz, S.L. (2012). Development and assessment of a diagnostic tool to identify organic chemistry students' alternative conceptions related to acid strength. International Journal of Science Education, 34(15), 2317-2341.

    Year original instrument was published: 2012
    Inventory
    Number of items: 9
    Number of versions/translations: 1
    Cited implementations: 3
    Language
    • English
    Country: United States
    Format
    • Multiple Choice
    • Response Scale
    Intended population(s)
    • Students
    • Undergraduate
    Domain
    • Cognitive
    Topic
    • Acid-Base
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument’s validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Similarly, a lack of evidence does not necessarily mean the instrument is “less valid,” only that such evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insight into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence intended to evaluate the performance of the instrument; the lack of a checkmark here implies that an article administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications: 1 2 3

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item function
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: ACID I

    (Post last updated July 21, 2021)

    Review panel summary

    ACID I is a nine-item multiple-tier, multiple-choice concept inventory developed to identify organic chemistry students’ alternative conceptions about acids and acid strength. The items are divided into three sets of three: in the first item of each set, students are told which acid is more acidic and must select the best reason to explain this fact; in the second, students are asked to predict the trend in acid strength; and in the third, students must select the reason for that trend. Each of these three items is accompanied by an item that solicits the student’s confidence in their answer. This concept inventory has been evaluated with organic chemistry students at U.S. liberal arts [1, 2] and research [2, 3] institutions. Several aspects of validity and reliability have been assessed for the data generated by ACID I. Expert judgment, providing partial evidence for test content validity, was provided by two faculty teaching the courses where the instrument was administered [1, 3]; each included comments noting that ACID I would be a “fair” [1] or “adequate” [3] assessment of their students’ understanding. Student interviews (N=19) were used to provide evidence for response process validity [3]. The relation between the correctness of an item and confidence [1-3] offers some evidence based on relations with other variables, specifically by quantifying the difference in confidence between students who answered the items correctly and those who answered incorrectly. Additional evidence based on relations to other variables comes from pre/post scores, which demonstrate a relation between time and instruction [2, 3]. In terms of reliability, coefficient alpha has been used to estimate single administration reliability; the reported values range from 0.28 to 0.4 [1-3], indicating that additional evidence may need to be considered with regard to the reliability of the instrument. Evidence for item statistics was reported, including item difficulty [1, 2] and item discrimination [1, 2].
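    For context on the alpha values discussed above, coefficient (Cronbach’s) alpha for a single administration is computed from the per-item variances and the variance of total scores, alpha = k/(k-1) × (1 - sum of item variances / variance of totals). A minimal Python sketch follows; the score matrix is illustrative only and is not data from the cited studies.

        import numpy as np

        def cronbach_alpha(scores):
            # scores: (n_students x n_items) matrix of 0/1 item scores
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1)       # variance of each item
            total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        # Illustrative only: random 0/1 responses for 50 students on 9 items.
        # Independent random items yield alpha near zero, mimicking low consistency.
        rng = np.random.default_rng(0)
        demo = rng.integers(0, 2, size=(50, 9)).astype(float)
        print(f"alpha = {cronbach_alpha(demo):.2f}")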

    Recommendations for use

    The ACID I instrument is intended to identify alternative conceptions of acids and acid strengths for students in organic chemistry [1]. While some aspects of validity and reliability have been reported, in many instances the evidence provided is limited. Overall, when considering using this instrument, the panel suggests that the content and format of the items be evaluated in terms of their appropriateness for the intended use.

    Details from panel review

    ACID I developers gathered and reported several aspects of validity and reliability evidence for the instrument [1, 2]; however, some of the evidence is limited because insufficient details have been reported. For example, expert judgment (to provide evidence of test content validity) from the developers [1] and from another study [3] was gathered as feedback from the individual faculty member teaching each course. Typically, expert judgment is solicited from a panel of more than one expert, and their feedback is used to make modifications to the instrument. Thus, the instrument may benefit from additional experts evaluating its content. In terms of response process validity, data were gathered through student interviews [3]; however, the interviews were not presented as evidence of how the items function and whether they are interpreted as intended. In terms of single administration reliability, coefficient alpha was calculated to provide evidence for internal consistency. However, all of the reported values are very low, which can be interpreted as insufficient evidence for internal consistency in the instrument. The developers argued that the format and the content of the instrument could have contributed to these lower values: students’ conceptions of acid strength may be fragmented rather than coherent, so their responses are not consistent [1]. Item difficulty values were presented as an aggregate value [1], as ranges [2], and in a histogram [3]; however, they were not reported for each individual item. Item discrimination was reported as ranges [2], but limited detail is given about how it was calculated.
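    As a point of reference for the item statistics discussed above: classical item difficulty is the proportion of students answering an item correctly, and one common discrimination index is the corrected item-total (point-biserial) correlation. The cited papers do not state which discrimination formula was used, so the Python sketch below assumes the corrected point-biserial as a standard choice.

        import numpy as np

        def item_statistics(scores):
            # scores: (n_students x n_items) matrix of 0/1 item scores
            difficulty = scores.mean(axis=0)  # proportion correct per item
            totals = scores.sum(axis=1)
            discrimination = np.empty(scores.shape[1])
            for j in range(scores.shape[1]):
                rest = totals - scores[:, j]  # total score excluding item j
                discrimination[j] = np.corrcoef(scores[:, j], rest)[0, 1]
            return difficulty, discrimination

    Reporting these statistics per item, rather than only as aggregates or ranges, would address the limitation the panel notes.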

    References

    [1] McClary, L. M., & Bretz, S. L. (2012). Development and Assessment of A Diagnostic Tool to Identify Organic Chemistry Students’ Alternative Conceptions Related to Acid Strength. International Journal of Science Education, 34(15), 2317–2341. https://doi.org/10.1080/09500693.2012.684433

    [2] Bretz, S. L., & McClary, L. (2015). Students’ Understandings of Acid Strength: How Meaningful Is Reliability When Measuring Alternative Conceptions? Journal of Chemical Education, 92(2), 212–219. https://doi.org/10.1021/ed5005195

    [3] Shah, L., Rodriguez, C. A., Bartoli, M., & Rushton, G. T. (2018). Analysing the impact of a discussion-oriented curriculum on first-year general chemistry students’ conceptions of relative acidity. Chemistry Education Research and Practice, 19(2), 543–557. https://doi.org/10.1039/C7RP00154A

    Versions
    Listed below are all versions and modifications that were based on this instrument, or on which this instrument was based.
    Instrument is derived from:
    Authors: McClary, L., & Talanquer, V.

    Citations
    Listed below are all publications that develop, implement, modify, or reference the instrument.
    1. McClary, L.M., & Bretz, S.L. (2012). Development and assessment of a diagnostic tool to identify organic chemistry students' alternative conceptions related to acid strength. International Journal of Science Education, 34(15), 2317-2341.

    2. Bretz, S.L., & McClary, L. (2015). Students' understandings of acid strength: How meaningful is reliability when measuring alternative conceptions?. Journal of Chemical Education, 92(2), 212-219.

    3. Shah, L., Rodriguez, C.A., Bartoli, M., & Rushton, G.T. (2018). Analysing the impact of a discussion-oriented curriculum on first-year general chemistry students' conceptions of relative acidity. Chemistry Education Research and Practice, 19(2), 543-557.