
Academic Motivation Scale - Chemistry

AMS - Chem

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Liu, YJ; Ferrell, B; Barbera, J; Lewis, JE

    Original publication
    • Liu, Y., Ferrell, B., Barbera, J., & Lewis, J. E. (2017). Development and evaluation of a chemistry-specific version of the academic motivation scale (AMS-Chemistry). Chemistry Education Research and Practice, 18(1), 191-213.

    Year original instrument was published 2017
    Inventory
    Number of items 28
    Number of versions/translations 3
    Cited implementations 4
    Language
    • English
    Country United States
    Format
    • Response Scale
    Intended population(s)
    • Students
    • Undergraduate
    Domain
    • Affective
    Topic
    • Motivation
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument's validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept; it does not indicate the quality of that evidence. Similarly, if evidence is lacking, that does not necessarily mean the instrument is “less valid,” only that such evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insight into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates whether an article is the paper in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; lack of a checkmark here indicates an article that administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability, but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications: 1 2 3 4

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item function
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Academic Motivation Scale-Chemistry (AMS-Chemistry)

    (Post last updated June 23, 2022)

    Review panel summary

    The Academic Motivation Scale-Chemistry (AMS-Chemistry) is a 28-item, 7-factor Likert scale survey that was adapted to assess student motivation toward chemistry [1]. It has also been administered using a rank-sort response format and as a 21-item “short” version [4]. It has been evaluated with students enrolled in general chemistry [1, 4] and first semester organic chemistry [2, 3] in both traditional [1, 4] and flipped [2, 3] classroom environments at three universities in the United States. Several aspects of reliability and validity have been assessed for the data generated by the AMS-Chemistry. Test content evidence was established during initial development of the instrument by a review of the items by a panel of experts including the instrument authors and an educational psychologist [1]. After revision of the items, chemistry graduate students were added to the panel to provide additional commentary on the readability of the items and suitability of the items for the target audience [1]. Response process validity evidence was collected for the instrument through interviews with general chemistry students who were asked to complete the AMS-Chemistry instrument while explaining their reasoning for their answer choices [1]. The interviews demonstrated support for all but two items, which received minor modifications to wording to address subscale consistency. The correlational relationship between the subscales of the AMS-Chemistry and exam performance at multiple time points [1-3] offers validity evidence based on relations to other variables. Validity evidence based on the internal structure of collected Likert-scale data was established using confirmatory factor analysis [1-4]. Additional validity evidence for the rank-sort data was provided by comparing Likert-scale and rank-sort responses using cluster analysis [4]. Single administration reliability was estimated using Cronbach’s alpha for each of the seven subscales measured by the AMS-Chemistry [1-4].
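
    Of the evidence types summarized above, the single-administration reliability estimate is the most mechanical to reproduce. The following is a minimal sketch, not code from the cited papers, of how Cronbach's alpha could be computed for one four-item AMS-Chemistry subscale from item-level Likert responses; the data file and item column labels are hypothetical placeholders.

        # Minimal sketch (assumed workflow, not the authors' code): Cronbach's alpha
        # for one AMS-Chemistry subscale. The CSV file and column names are hypothetical.
        import pandas as pd

        def cronbach_alpha(items: pd.DataFrame) -> float:
            # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score)
            k = items.shape[1]
            item_variances = items.var(axis=0, ddof=1).sum()
            total_variance = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_variances / total_variance)

        responses = pd.read_csv("ams_chem_responses.csv")  # hypothetical item-level data
        # Each of the seven subscales has four items; these item labels are placeholders.
        subscale = responses[["im_know_1", "im_know_2", "im_know_3", "im_know_4"]]
        print(f"Cronbach's alpha = {cronbach_alpha(subscale):.2f}")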

    Recommendations for use

    According to the validity and reliability evidence reported in the literature, the AMS-Chemistry instrument can be used to measure student motivation across multiple classroom environments [1-4], at a single time point [1-4], or to investigate how student motivation changes throughout a course [1, 2]. However, evidence supporting measurement invariance across time points has not yet been established. The instrument can also be administered using a rank-sort response scale, to reduce the response-style bias that has been observed in Likert-scale instruments, or as a 21-item “short” version [4].

    Details from panel review

    The AMS-Chemistry was designed to measure student motivation towards chemistry. The validity and reliability evidence presented in the literature provides strong support for the intended use of this instrument in both traditional [1-4] and flipped classrooms [2, 3] with students enrolled in general chemistry and first-semester organic chemistry classes. Most importantly, no loss in fidelity of this instrument was observed for flipped or Peer-Led Team Learning (PLTL) classroom environments. The AMS-Chemistry measures motivation on seven subscales, including both intrinsic and extrinsic motivation factors as well as amotivation. Correlations between subscores on the seven scales and student performance at different semester time points [1-3] revealed results consistent with the original instrument development and self-determination theory across all classroom contexts [1-4].
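
    As an illustration of the relations-to-other-variables evidence described above, the sketch below correlates subscale scores with an exam score. This is an assumed analysis outline, not the published analysis; the data file, subscale score columns, and exam column are hypothetical.

        # Minimal sketch (assumed, not the authors' analysis): Pearson correlations
        # between the seven AMS-Chemistry subscale scores and exam performance.
        import pandas as pd

        df = pd.read_csv("ams_chem_with_exam_scores.csv")  # hypothetical merged data set
        subscale_columns = [
            "im_to_know", "im_accomplishment", "im_stimulation",  # intrinsic motivation
            "em_identified", "em_introjected", "em_external",     # extrinsic motivation
            "amotivation",
        ]  # placeholder names for the seven subscale score columns
        correlations = df[subscale_columns].corrwith(df["exam_1_score"])
        print(correlations.round(2))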

    References

    [1] Liu, Y., Ferrell, B., Barbera, J., & Lewis, J.E. (2017). Development and evaluation of a chemistry-specific version of the academic motivation scale (AMS-Chemistry). Chemistry Education Research and Practice, 18, 191-213. https://doi.org/10.1039/C6RP00200E

    [2] Liu, Y., Raker, J.R., & Lewis, J.E. (2018). Evaluating student motivation in organic chemistry courses: moving from a lecture-based to a flipped approach with peer-led team learning. Chemistry Education Research and Practice, 19, 251-264. https://doi.org/10.1039/C7RP00153C

    [3] Raker, J.R., Gibbons, R.E., & Cruz-Ramírez de Arellano, D. (2019). Development and evaluation of the organic chemistry-specific achievement emotions questionnaire (AEQ-OCHEM). Journal of Research in Science Teaching, 56(2), 163-183. https://doi.org/10.1002/tea.21474

    [4] Wang, Y., & Lewis, S.E. (2022). Towards a theoretically sound measure of chemistry students’ motivation; investigating rank–sort survey methodology to reduce response style bias. Chemistry Education Research and Practice, 23, 240-256. https://doi.org/10.1039/D1RP00206F

    Versions
    Listed below are all versions and modifications that were based on this instrument or that this instrument was based on.
    Instrument has been modified in:
    Name Authors
    • Wang, Y., & Lewis, S.E.

    • Reimer, L. C., Leslie, J. M., Bidwell, S. L., Isborn, C. M., Lair, D., Menke, E., Stokes, B. J., & Hratchian, H. P.

    Citations
    Listed below are all publications that develop, implement, modify, or reference the instrument.
    1. Liu, Y., Ferrell, B., Barbera, J., & Lewis, J. E. (2017). Development and evaluation of a chemistry-specific version of the academic motivation scale (AMS-Chemistry). Chemistry Education Research and Practice, 18(1), 191-213.

    2. Wang, Y., & Lewis, S.E. (2022). Towards a theoretically sound measure of chemistry students' motivation; Investigating rank-sort survey methodology to reduce response style bias. Chemistry Education Research and Practice, 23(1), 240-256.

    3. Raker, J.R., Gibbons, R.E., & Cruz-Ramírez de Arellano, D. (2019). Development and evaluation of the organic chemistry-specific achievement emotions questionnaire (AEQ-OCHEM). Journal of Research in Science Teaching, 56(2), 163-183.

    4. Liu, Y., Raker, J.R., & Lewis, J.E. (2018). Evaluating student motivation in organic chemistry courses: moving from a lecture-based to a flipped approach with peer-led team learning. Chemistry Education Research and Practice, 19(1), 251-264.