Chemistry Expectations Survey

CHEMX

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Grove, N., & Bretz, S.L.

    Original publication
    • Grove, N., & Bretz, S.L. (2007). CHEMX: An instrument to assess students' cognitive expectations for learning chemistry. Journal of Chemical Education, 84(9), 1524-1529.

    Year original instrument was published 2007
    Inventory
    Number of items 47
    Number of versions/translations 1
    Cited implementations 3
    Language
    • English
    Country United States
    Format
    • Response Scale
    Intended population(s)
    • Students
    • Undergraduate
    • Faculty
    • Tertiary
    Domain
    • Affective
    • Behavioral
    Topic
    • General (Affective)
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument’s validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Similarly, if evidence is lacking, that does not necessarily mean the instrument is “less valid,” just that the evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insight into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; lack of a checkmark here implies an article that administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications: 1 2 3

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item functioning
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: The Chemistry Expectations Survey (CHEMX)

    (Post last updated June 24, 2022)

    Review panel summary

    The Chemistry Expectations Survey (CHEMX) is a 47-item Likert-scale instrument designed to measure student expectations about learning chemistry. It was developed based on the Maryland Physics Expectations Survey (MPEX), which measures expectations about learning physics. The CHEMX contains 25 items based on the MPEX and 22 items that were developed by the authors to incorporate additional aspects of chemistry. The CHEMX is described as having seven constructs: effort, concepts, math link, reality link, outcome, laboratory, and visualization. Evidence in support of test content validity was provided through the authors’ expert knowledge of chemistry, applied in developing the additional 22 items [1]. The internal structure of CHEMX data was explored through factor analysis. Seventeen items loaded onto three distinct factors (visualization, concepts, and reality link) [1]. The authors justified the internal structure of the remaining four subscales by their acceptable reliability estimates. Single-administration reliability was explored through Cronbach’s alpha and inter-item correlations; estimates were provided for both a total CHEMX score and for each subscale [1]. Relations to other variables were explored by comparing total CHEMX scores among faculty members of different chemistry disciplines. Chemistry education faculty had significantly different total scores compared to the other chemistry disciplines, and the authors aligned this result with the statement that “individuals who closely study student learning should score higher on an instrument designed to measure one dimension of that learning”. While this statement supports the use of CHEMX scores for comparing chemistry faculty, the panel felt that further justification would be needed to support total score comparisons among different student groups [1].
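
    For readers unfamiliar with the statistic, coefficient (Cronbach’s) alpha for a k-item scale is typically computed as shown below. This is the general form of the estimator, not necessarily the exact computation reported in [1].

    \[
    \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)
    \]

    where \(k\) is the number of items, \(\sigma^2_{Y_i}\) is the variance of item \(i\), and \(\sigma^2_X\) is the variance of the total score; values closer to 1 indicate greater internal consistency.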

    Recommendations for use

    The CHEMX was developed as a measure of student expectations about learning chemistry. Evidence for internal structure provided support only for the subscales of Visualization, Concepts, and Reality Link. Therefore, the panel recommends caution in interpreting data from the other subscales, as the factor analysis did not provide evidence for them [1]. While scores from each of the three subscales are supported, no internal structure validity evidence was provided to support the use of a CHEMX total score. Overall, the panel felt more evidence for the validity of the data generated by the CHEMX was warranted. Additional insights may be gained through explorations of test content and response process validity, as well as more detailed internal structure evidence.

    Details from panel review

    The authors provided some evidence for the validity and reliability of data collected by the CHEMX [1]. While the authors held expert knowledge of chemistry and cited Johnstone’s triangle as support for the newly developed items, the test content validity evidence was not evaluated beyond the research team. The validity evidence provided through relations with other variables only supports comparing chemistry faculty CHEMX scores, not those of the intended population (i.e., chemistry students). During the panel discussion of internal structure validity, it was unclear what type of factor analysis was performed, and no output from the analysis was provided. In addition, the authors collected data from both faculty and student participants, and it was unclear which dataset was used for the internal structure analysis. Finally, the authors found three factors through the factor analysis and justified the structure of the remaining a priori factors through single-administration reliability, which the panel felt was not sufficient evidence for the internal structure.

    References

    [1] Grove, N., & Bretz, S.L. (2007). CHEMX: An instrument to assess students’ cognitive expectations for learning chemistry. Journal of Chemical Education, 84(9), 1524-1529. https://doi.org/10.1021/ed084p1524

    [2] Mazzarone, K.M., & Grove, N.P. (2013). Understanding epistemological development in first- and second-year chemistry students. Journal of Chemical Education, 90(8), 968-975. https://doi.org/10.1021/ed300655s

    [3] Albrecht, B. (2014). Computational chemistry in the undergraduate laboratory: A mechanistic study of the Wittig reaction. Journal of Chemical Education, 91(12), 2182-2185. https://doi.org/10.1021/ed400008d

    Versions
    This instrument has not been modified, nor was it created based on an existing instrument.
    Citations
    Listed below are all publications that develop, implement, modify, or reference the instrument.
    1. Grove, N., & Bretz, S.L. (2007). CHEMX: An instrument to assess students' cognitive expectations for learning chemistry. Journal of Chemical Education, 84(9), 1524-1529.

    2. Mazzarone, K.M., & Grove, N.P. (2013). Understanding epistemological development in first- and second-year chemistry students. Journal of Chemical Education, 90(8), 968-975.

    3. Albrecht, B. (2014). Computational chemistry in the undergraduate laboratory: A mechanistic study of the Wittig reaction. Journal of Chemical Education, 91(12), 2182-2185.