
Metacognitive Activities Inventory

MCA-I

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Cooper, M. M., & Sandi-Urena, S.

    Original publication
    • Cooper, M. M., & Sandi-Urena, S. (2009). Design and validation of an instrument to assess metacognitive skillfulness in chemistry problem solving. Journal of Chemical Education, 86(2), 240.

    Year original instrument was published 2009
    Inventory
    Number of items 27
    Number of versions/translations 1
    Cited implementations 9
    Language
    • English
    Country United States
    Format
    • Response Scale
    Intended population(s)
    • Students
    • Undergraduate
    • Graduate
    Domain
    Topic
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence related to the instrument's validity and reliability. These data are presented in the following table, which notes only the presence or absence of evidence for each concept; it does not indicate the quality of that evidence. Likewise, a lack of evidence does not necessarily mean the instrument is "less valid," only that such evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument's Review (next tab), if available, for better insight into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; the lack of a checkmark here indicates an article that administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish the reliability of data generated by the instrument (a worked sketch of one common reliability estimate follows this list); please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish the validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
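    As a rough illustration of the reliability category above, the following minimal sketch computes coefficient (Cronbach's) alpha in Python. The response matrix is a simulated placeholder in the MCAI's 27-item, 5-point format, not real data, and the helper function is an assumption of this page, not part of any CHIRAL tooling.

        import numpy as np

        def cronbach_alpha(responses):
            """Coefficient alpha for an (n_respondents, n_items) score matrix."""
            responses = np.asarray(responses, dtype=float)
            k = responses.shape[1]                          # number of items
            item_vars = responses.var(axis=0, ddof=1)       # per-item variances
            total_var = responses.sum(axis=1).var(ddof=1)   # variance of summed scores
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # Simulated placeholder: 200 respondents on a 27-item, 5-point scale,
        # generated from a single latent trait so that alpha comes out high.
        rng = np.random.default_rng(0)
        latent = rng.normal(size=(200, 1))
        scores = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(200, 27))), 1, 5)
        print(f"alpha = {cronbach_alpha(scores):.2f}")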
    Publications: 1 2 3 4 5 6 7 8 9

    General
    • Original development paper
    • Uses the instrument in data collection
    • Modified version of existing instrument
    • Evaluation of existing instrument

    Reliability
    • Test-retest reliability
    • Internal consistency
    • Coefficient (Cronbach's) alpha
    • McDonald's Omega
    • Inter-rater reliability
    • Person separation
    • Generalizability coefficients
    • Other reliability evidence

    Validity
    • Expert judgment
    • Response process
    • Factor analysis, IRT, Rasch analysis
    • Differential item functioning
    • Evidence based on relationships to other variables
    • Evidence based on consequences of testing
    • Other validity evidence

    Other information
    • Difficulty
    • Discrimination
    • Evidence based on fairness
    • Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Metacognitive Activities Inventory

    (Post last updated July 21, 2021)

    Review panel summary

    The Metacognitive Activities Inventory (MCAI) is a 27-item, 5-point Likert-scale instrument designed to assess students’ metacognitive skillfulness during chemistry problem solving. It has been evaluated with students enrolled in general chemistry [1, 2] and in an introductory engineering course [3] at two U.S. institutions. Several aspects of validity and reliability have been assessed for the data generated by the MCAI. Test content evidence was provided by faculty and graduate students at the development stage and by graduate and senior undergraduate students during pilot testing. During pilot testing, undergraduate students provided feedback on the clarity of the instrument, which was used to provide some evidence for response process validity [1]; however, the details provided to assess this evidence are limited. The relation between total MCAI scores and grade point average [1] provides some validity support based on relations with other variables, as does the relation between MCAI scores and problem-solving ability [2]. The developers of the MCAI provided information regarding internal structure validity, but the results of the factor analysis failed to show evidence for the intended internal structure of the instrument [1]. In terms of reliability, coefficient alpha has been used to estimate single-administration reliability for the whole instrument [1, 3].

    Recommendations for use

    The MCAI was designed to assess students’ metacognitive skillfulness during chemistry problem solving [1]. According to the validity and reliability evidence reported in the literature, the instrument can be used to assess metacognitive skills using a total score. However, in many instances the evidence provided is limited or lacks detail; it is therefore suggested that additional evidence be gathered, especially evidence of internal structure validity, to support the use of a total MCAI score and its relation to other measures such as performance.

    Details from panel review

    The MCAI developers gathered and reported some validity and reliability evidence for the instrument [1, 2]; however, some of this evidence is limited because few details were reported. For example, for internal structure validity, the developers could not show evidence for the intended factors. From theory, the developers expected items to group according to the components of metacognition: planning, evaluating, and monitoring; however, the factor analysis did not support this structure. The developers stated that the interdependency of metacognitive skills could affect this structure, and therefore total scores were reported. However, in the absence of evidence for a one-factor solution, more support is needed to warrant the use of a composite MCAI score (i.e., a total score). Another example of limited evidence concerns response process validity: student feedback on item clarity was collected [1], but it is unclear how this feedback was used in evaluating the items.
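    To make the internal structure check concrete, the sketch below shows the general shape of such an analysis using scikit-learn on simulated placeholder responses. It illustrates the technique only and is not a reproduction of the developers’ actual analysis.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        # Simulated placeholder for a (respondents x 27 items) MCAI response
        # matrix; a real check would use collected data.
        rng = np.random.default_rng(1)
        responses = rng.integers(1, 6, size=(300, 27)).astype(float)

        # Extract the three theorized factors (planning, monitoring, evaluating).
        # Support for the intended structure would show each item loading
        # strongly on exactly one factor; random data like this will not.
        fa = FactorAnalysis(n_components=3, rotation="varimax").fit(responses)
        loadings = fa.components_.T          # (27 items) x (3 factors)
        print(np.round(loadings[:5], 2))     # loadings for the first five items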

    In terms of relations to other variables, there is evidence that metacognitive skills, as measured by the MCAI, are related to performance [1] and to problem-solving skills [2]. In [2], item response theory (IRT) was used to estimate problem-solving ability, and the resulting ability estimates were significantly correlated with MCAI scores; the instrument has thus been examined using both classical test theory and item response theory.
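    A minimal sketch of this relations-to-other-variables check, using SciPy: the MCAI totals and external ability measure below are fabricated placeholders standing in for real data like that reported in [2].

        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(2)
        mcai_total = rng.integers(27, 136, size=150).astype(float)  # 27 items x 1-5 scale
        ability = 0.02 * mcai_total + rng.normal(size=150)          # placeholder external measure

        # A significant positive r would mirror the reported relation between
        # MCAI scores and problem-solving ability.
        r, p = pearsonr(mcai_total, ability)
        print(f"r = {r:.2f}, p = {p:.3g}")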

    References

    [1] Cooper, M. M., & Sandi-Urena, S. (2009). Design and Validation of an Instrument To Assess Metacognitive Skillfulness in Chemistry Problem Solving. Journal of Chemical Education, 86(2), 240. https://doi.org/10.1021/ed086p240

    [2] Cooper, M. M., Sandi-Urena, S., & Stevens, R. (2008). Reliable multi method assessment of metacognition use in chemistry problem solving. Chemistry Education Research and Practice, 9(1), 18–24. https://doi.org/10.1039/B801287N

    [3] Elliott, L., Aqlan, F., Zhao, R., & Janney, M. (2020). Assessment of Metacognitive Skills in Design and Manufacturing. 2020 ASEE Virtual Annual Conference Content Access Proceedings, 34191. https://doi.org/10.18260/1-2--34191

    Versions
    This instrument has not been modified, nor was it created based on an existing instrument.
    Citations
    Listed below is all literature that develops, implements, modifies, or references the instrument.
    1. Cooper, M. M., & Sandi-Urena, S. (2009). Design and validation of an instrument to assess metacognitive skillfulness in chemistry problem solving. Journal of Chemical Education, 86(2), 240.

    2. Sandi-Urena, S., Cooper, M.M., & Stevens, R.H. (2011). Enhancement of metacognition use and awareness by means of a collaborative intervention. International Journal of Science Education, 33(3), 323-340.

    3. Dianovsky, M.T., & Wink, D.J. (2012). Student learning through journal writing in a general education chemistry course for pre-elementary education majors. Science Education, 96(3), 543-565.

    4. Sanders, S., Thrill, C., & Winfield, L. L. (2021). Self-regulated learning in organic chemistry: A platform for promoting learner agency among women of African descent. Critical Conversations and the Academy: A National Symposium, 2022.

    5. van Opstal, M.T., & Daubenmire, P.L. (2015). Extending students’ practice of metacognitive regulation skills with the science writing heuristic. International Journal of Science Education, 37(7), 1089-1112.

    6. Elliott, L. J., Aqlan, F., Zhao, R., & Janney, M. (2020, June). Assessment of Metacognitive Skills in Design and Manufacturing. In ASEE Annual Conference proceedings.

    7. Cooper, M.M., Sandi-Urena, S., & Stevens, R. (2008). Reliable multi method assessment of metacognition use in chemistry problem solving. Chemistry Education Research and Practice, 9(1), 18-24.

    8. Underwood, S.M., Reyes-Gastelum, D., & Cooper, M.M. (2016). When do students recognize relationships between molecular structure and properties? A longitudinal comparison of the impact of traditional and transformed curricula. Chemistry Education Research and Practice.

    9. Dianovsky, M., & Wink, D. (2010). Student learning through journal writing in a natural science course for pre-elementary education majors.