
Modified Approaches And Study Skills Inventory For Students

M-ASSIST

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Bunce, D.M., Komperda, R., Schroeder, M.J., Dillner, D.K., Lin, S., Teichert, M.A., & Hartman, J.R.

    Original publication
    • Bunce, D.M., Komperda, R., Schroeder, M.J., Dillner, D.K., Lin, S., Teichert, M.A., & Hartman, J.R. (2017). Differential use of study approaches by students of different achievement levels. Journal of Chemical Education, 94(10), 1415-1424.

    Year original instrument was published 2017
    Inventory
    Number of items 12
    Number of versions/translations 1
    Cited implementations 4
    Language
    • English
    Country United States
    Format
    • Response Scale
    Intended population(s)
    • Students
    • Undergraduate
    • Teaching Assistants
    Domain
    Topic
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument’s validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Similarly, if evidence is lacking, that does not necessarily mean the instrument is “less valid,” just that the evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process and consult the instrument’s Review (next tab), if available, for better insights into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; lack of a checkmark here indicates an article that administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item function
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Modified Approaches and Study Skills Inventory for Students (M-ASSIST)

    (Post last updated June 24, 2022)

    Review panel summary

    The Modified Approaches and Study Skills Inventory for Students (M-ASSIST) is an assessment targeting students’ deep and surface study approaches using a five-point Likert response scale. The M-ASSIST consists of 12 items (6 items each for deep and surface approaches) and was developed based on the original ASSIST instrument with slight modifications to reflect American English [1]. The original authors claimed the ASSIST was based on the conceptual framework of deep and surface approaches to learning, but did not provide explicit content validity evidence to support this [1]. Internal structure validity evidence was provided by conducting a confirmatory factor analysis of a two-factor model for deep and surface study approaches, where a good fit was demonstrated [1]. Measurement invariance testing between students grouped by grades revealed an acceptable fit; therefore, the authors used the M-ASSIST to compare deep and surface learning between these groups. This testing provided evidence for true differences across groups rather than differences in instrument function [1]. Evidence for relations to other variables was presented in the comparison of deep and surface scores to students’ study skills/lecture habits [2].
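    For readers less familiar with this kind of analysis, below is a minimal sketch of how a two-factor confirmatory factor analysis of this sort could be specified in Python with the semopy package. The item names (deep1-deep6, surf1-surf6), the data file, and its column layout are hypothetical placeholders, not the actual M-ASSIST items or the authors' analysis.

        import pandas as pd
        import semopy

        # Assumed layout: one row per student, one 5-point Likert response per item column.
        data = pd.read_csv("responses.csv")

        # Two-factor measurement model: six placeholder items per latent study approach.
        model_desc = """
        Deep    =~ deep1 + deep2 + deep3 + deep4 + deep5 + deep6
        Surface =~ surf1 + surf2 + surf3 + surf4 + surf5 + surf6
        """

        model = semopy.Model(model_desc)
        model.fit(data)

        # Fit indices (e.g., CFI, RMSEA) are what a "good" or "acceptable" fit judgment rests on.
        print(semopy.calc_stats(model))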

    Recommendations for use

    The M-ASSIST was developed to measure students’ deep and surface study approaches and is not intended to be used as a total score. Rather, each subscale provides information about the respective study approach, based on the presented internal structure validity evidence [1]. Based on the measurement invariance evidence, the M-ASSIST has shown the ability to measure deep and surface learning approaches across students grouped by course grade [1]. While there is substantial evidence for the internal structure, additional evidence supporting test content and response process would provide further support for the data derived from the M-ASSIST.
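    As an illustration of the subscale-level use recommended above, the following sketch computes separate deep and surface scores rather than a single total. The column names are hypothetical placeholders, not the actual item labels.

        import pandas as pd

        responses = pd.read_csv("responses.csv")  # assumed: 5-point Likert response per item

        deep_items = [f"deep{i}" for i in range(1, 7)]  # placeholder deep-approach items
        surf_items = [f"surf{i}" for i in range(1, 7)]  # placeholder surface-approach items

        # Report the two subscales separately; no overall total score is formed.
        responses["deep_score"] = responses[deep_items].mean(axis=1)
        responses["surface_score"] = responses[surf_items].mean(axis=1)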

    Details from panel review

    The authors conducted an ANOVA and structured means modeling (SMM) to compare student study approaches across achievement groups based on course grades. Both methods for comparing groups were presented; SMM detected small group differences in the latent variable means, resulting in a better measurement of the constructs of interest (deep and surface study approaches) [1]. The deep subscale score was less sensitive than the surface score across student groups, based on the variation in subscale scores [1, 2]. As there was no discussion of the target population’s interpretation of the 12 items, there is a lack of evidence for response process validity. The panel found no reported reliability evidence for M-ASSIST data. In terms of relations to other variables, M-ASSIST deep and surface scores were compared to students’ study skills/lecture habits, revealing a significant positive correlation between deep scores and favorable study habits (e.g., preparing ahead of time, taking notes in lecture, etc.) [2]. Further, M-ASSIST deep and surface scores have been used in a regression analysis with other variables such as course grade, first-generation status, race/ethnicity, and gender. However, more theoretical support from the literature would help to justify the predictive capabilities of the regression models with these specific variables [2].
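    To make the group-comparison and correlation analyses described above concrete, here is a minimal sketch in Python using scipy. The column names (grade_group, deep_score, study_habit_index) and the data file are hypothetical placeholders, not the cited authors' variables or code.

        import pandas as pd
        from scipy import stats

        df = pd.read_csv("scores.csv")  # assumed: one row per student with subscale scores

        # One-way ANOVA: do mean deep-approach scores differ across course-grade groups?
        groups = [g["deep_score"].values for _, g in df.groupby("grade_group")]
        f_stat, p_anova = stats.f_oneway(*groups)
        print(f"ANOVA on deep scores: F = {f_stat:.2f}, p = {p_anova:.4f}")

        # Pearson correlation between deep-approach scores and a favorable-study-habits index.
        r, p_corr = stats.pearsonr(df["deep_score"], df["study_habit_index"])
        print(f"Deep score vs. study habits: r = {r:.2f}, p = {p_corr:.4f}")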

    References

    [1] Bunce, D.M., Komperda, R., Schroeder, M.J., Dillner, D.K., Lin, S., Teichert, M.A., & Hartman, J.R. (2017). Differential use of study approaches by students of different achievement levels. Journal of Chemical Education, 94(10), 1415-1424. https://doi.org/10.1021/acs.jchemed.7b00202

    [2] Atieh, E.L., York, D.M., & Muñiz, M.N. (2021). Beneath the surface: An investigation of general chemistry students’ study skills to predict course outcomes. Journal of Chemical Education, 98(2), 281-292. https://doi.org/10.1021/acs.jchemed.0c01074

    [3] Frey, R.F., McDaniel, M.A., Bunce, D.M., Cahill, M.J., & Perry, M.D. (2020). Using students’ concept-building tendencies to better characterize average-performing student learning and problem-solving approaches in general chemistry. CBE–Life Sciences Education, 19(3), ar42. https://doi.org/10.1187/cbe.19-11-0240

    Versions
    Listed below are all versions and modifications that were based on this instrument, or on which this instrument was based.
    Instrument is derived from:
    Name Authors
    • NA

    Citations
    Listed below is all literature that develops, implements, modifies, or references the instrument.
    1. Bunce, D.M., Komperda, R., Schroeder, M.J., Dillner, D.K., Lin, S., Teichert, M.A., & Hartman, J.R. (2017). Differential use of study approaches by students of different achievement levels. Journal of Chemical Education, 94(10), 1415-1424.

    2. Frey, R.F., McDaniel, M.A., Bunce, D.M., Cahill, M.J., & Perry, M.D. (2020). Using students’ concept-building tendencies to better characterize average-performing student learning and problem-solving approaches in general chemistry. CBE–Life Sciences Education, 19(3), ar42.

    3. Atieh, E.L., York, D.M., & Muñiz, M.N. (2021). Beneath the surface: An investigation of general chemistry students’ study skills to predict course outcomes. Journal of Chemical Education, 98(2), 281-292.

    4. Atieh, E. (2020). Characterizing the progression of knowledge, beliefs, and behaviors in peer instructors: an evaluation of the general chemistry teaching interns (Doctoral dissertation, Rutgers University-School of Graduate Studies).