
Panel Review: Motivated Strategies for Learning Questionnaire (MSLQ)

(Post last updated 28 December 2024)

Review panel summary   
The Motivated Strategies for Learning Questionnaire (MSLQ) is an 81-item questionnaire designed to assess students’ motivation and learning strategies and has been administered to students from a variety of disciplines [1,2], including undergraduate biology [3], general chemistry [6-8], and organic chemistry [9]. The MSLQ includes 15 subscales, which are designed to be used individually or in combination [1]. The survey is administered on a 7-point Likert-type scale from “not at all true of me” to “very true of me” [1,2,6,8,9], although it has also been administered using 5-point Likert-type scales from “not at all true” to “very true” [4] and from “strongly disagree” to “strongly agree” [5]. Scale items were developed based on established models of motivation and of learning and information processing [2], which provides some support for evidence based on test content. Data collected with the initial items were informally evaluated for evidence of validity and reliability, and the items were revised before publication. Evidence of internal structure validity for data collected with the published items has been provided through two confirmatory factor analyses (CFAs): one that included the motivation subscales (6-factor CFA) and one that included the cognitive and metacognitive learning strategy subscales (9-factor CFA) [2,3]. Single-factor CFAs have also been used to evaluate data collected with individual subscales [9]. Subscale scores have been created by calculating the mean of all item responses belonging to a single subscale [1,2]. Significant correlations with final course grade have been found for many individual item scores and subscale mean scores [1]. Additionally, some subscale means have shown significant correlations with the Achievement Emotions Questionnaire for Organic Chemistry (AEQ-OCHEM) subscales [9], with affective measures related to class, learning, and test taking (i.e., enjoyment, hope, pride, anger, anxiety, shame, hopelessness, and boredom) [5], and with students’ chemistry self-concept, math self-concept, emotional satisfaction, and intellectual accessibility [6]. Some subscale scores have also been used in cluster analyses, which resulted in clusters representing low, medium, and high affective groups and provide evidence based on relations to other variables [6,7]. Some subscales used to evaluate potential differences between test and control groups showed no significant differences between the groups, which provides some evidence related to the consequences of testing [3,8]. Single-administration reliability of data collected with the subscales has been estimated using coefficient alpha [1,2,4,5,9].
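
To make the scoring concrete, the sketch below (Python, with hypothetical item names and an illustrative item-to-subscale assignment rather than the published MSLQ key) shows how subscale scores are formed as the mean of a student’s responses to that subscale’s items on the 7-point scale; the published manual also reverse-scores some negatively worded items before averaging, which is omitted here [1,2].

```python
import pandas as pd

# Hypothetical wide-format responses: one row per student, one column per item,
# each response on the 7-point scale (1 = "not at all true of me",
# 7 = "very true of me"). These are placeholder data, not MSLQ results.
responses = pd.DataFrame({
    "q1": [5, 6, 3, 7],
    "q2": [4, 7, 2, 6],
    "q3": [6, 5, 4, 7],
    "q4": [3, 4, 2, 5],
})

# Illustrative item-to-subscale assignment (not the published MSLQ key).
subscale_key = {
    "self_efficacy": ["q1", "q2", "q3"],
    "test_anxiety": ["q4"],
}

# Subscale score = mean of a student's responses to that subscale's items [1,2].
subscale_scores = pd.DataFrame({
    name: responses[items].mean(axis=1)
    for name, items in subscale_key.items()
})
print(subscale_scores)
```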

Recommendations for use   
The intent of the MSLQ is to allow instructors to pick and choose the subscales of motivation and learning strategies that are relevant to what they want to measure for their population and environment [1]. While some validity and reliability evidence has been presented, there is currently limited evidence to support test content and response process validity for data collected with the MSLQ. Additionally, the evidence provided for internal structure validity suggests that, while data collected with individual subscales may show good data-model fit [9], the overall CFA models for motivation (6-factor correlated CFA) and learning strategies (9-factor correlated CFA) have not met the recommended cutoff values for good data-model fit [1,2]. Because MSLQ users can choose among the subscales to suit their needs, this panel suggests that validity and reliability evidence continue to be collected for data from individual and/or combined subscales of the MSLQ, both to further support the proposed factor structure and to show that the population of interest interprets the items and subscales as intended.

Details from panel review   
While the development of the MSLQ items spanned many years and included data from over 1,000 students [1,2], the published evidence supporting validity and reliability for data collected with the MSLQ is more limited. Although the items were developed using models of motivation and learning [2], additional evidence to support test content validity is absent. Multiple sources provide evidence related to the internal structure of data collected with the instrument. CFAs of data collected with the motivation subscales (6-factor CFA) and the learning strategies subscales (9-factor CFA) resulted in fit statistics that, while described as adequate by the developers, do not meet the recommended cutoff values for good data-model fit [1,2]. Most single-factor CFAs of individual subscales showed evidence of good data-model fit based on CFI and SRMR values [9].
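
As an illustration of how a single-subscale (single-factor) CFA of this kind might be run, the sketch below uses the semopy package in Python. The item names, the input file, and the choice of semopy are assumptions for illustration, not details taken from the cited studies, and the cutoff heuristics noted in the comments are commonly cited conventions rather than values stated in this review.

```python
import pandas as pd
import semopy

# Hypothetical item-level responses for one MSLQ subscale (columns q1..q5 are
# placeholders, not the published MSLQ item numbers).
data = pd.read_csv("mslq_self_efficacy_items.csv")  # assumed input file

# Single-factor CFA: all five items load on one latent construct.
model_desc = "SelfEfficacy =~ q1 + q2 + q3 + q4 + q5"
model = semopy.Model(model_desc)
model.fit(data)

# calc_stats reports common fit indices (chi-square, CFI, TLI, RMSEA, ...).
stats = semopy.calc_stats(model)
print(stats.T)

# Commonly cited heuristics (an assumption here, not taken from this review):
# CFI of roughly 0.95 or above and SRMR of roughly 0.08 or below are often
# read as good data-model fit. SRMR may not appear in calc_stats output and
# might need to be computed separately if that index is required.
```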

Correlations with final course grade were significant for a majority of item scores and subscale mean scores [1,2]. Correlations between some MSLQ subscales and other affective measures have also been reported. The task value subscale has been used as a measure of individual interest and showed significant correlations with situational interest measures [4]. The self-efficacy and task value subscales have also been found to correlate significantly with positive and negative affective measures (enjoyment, hope, pride, anger, anxiety, shame, hopelessness, and boredom) related to class, learning, and test environments [5]. The self-efficacy and test anxiety subscales have shown significant correlations with chemistry and math self-concept measures, emotional satisfaction, and intellectual accessibility [6]. MSLQ subscales have also correlated significantly with the subscales of the AEQ-OCHEM measure [9]. Some MSLQ subscales (i.e., task value, self-efficacy, metacognitive self-regulation, study environment, effort regulation, and peer learning) have been used to confirm that test and control groups were similar in those respects [3,8]. No evidence to support response process validity was presented. Evidence of reasonable single-administration reliability has been provided through coefficient alpha values ranging from 0.51 to 0.95 across the subscales [1,2,3,4,9].
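
For readers who want to run these kinds of checks on their own data, the sketch below (Python, using randomly generated placeholder responses rather than any data from the cited studies) computes coefficient alpha for one subscale and the Pearson correlation between the subscale mean score and final course grade.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Coefficient (Cronbach's) alpha for items scored in the same direction.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total score))
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Placeholder data: random 7-point responses for one five-item subscale and
# random final course grades, purely for illustration.
rng = np.random.default_rng(0)
items = pd.DataFrame(rng.integers(1, 8, size=(100, 5)),
                     columns=[f"q{i}" for i in range(1, 6)])
grades = pd.Series(rng.normal(80, 10, size=100), name="final_grade")

alpha = cronbach_alpha(items)
subscale_mean = items.mean(axis=1)                 # subscale score per student
r = subscale_mean.corr(grades)                     # Pearson correlation
print(f"alpha = {alpha:.2f}, r(subscale, grade) = {r:.2f}")
```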

References

[1] Pintrich, P.R., Smith, D.A.F., Garcia, T., & McKeachie, W.J. (1991). A Manual for the Use of the Motivated Strategies for Learning Questionnaire (MSLQ). National Center for Research to Improve Postsecondary Teaching and Learning, University of Michigan, Ann Arbor, MI.

[2] Pintrich, P.R., Smith, D.A.F., Garcia, T., & McKeachie, W.J. (1993). Reliability and Predictive Validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educ. and Psych. Meas., 53(3), 801-813.

[3] Feldon, D.F., Timmerman, B.C., Stowe, K.A., & Showman, R. (2010). Translating expertise into effective instruction: The impacts of cognitive task analysis (CTA) on lab report quality and student retention in the biological sciences. J. Res. Sci. Teach. 47(10), 1165-1185.

[4] Linnenbrink-Garcia, L.L., Durik, A.M., Conley, A.M., Barron, K.E., Tauer, J.M., Karabenick, S.A., & Harackiewicz, J.M. (2010). Measuring Situational Interest in Academic Domains. Educ. and Psych. Meas., 70(4), 647-671.

[5] Pekrun, R., Goetz, T., Frenzel, A.C., Barchfeld, P., & Perry, R.P. (2011). Measuring emotions in students’ learning and performance: The Achievement Emotions Questionnaire (AEQ). Contemp. Educ. Psych., 36, 36-48.

[6] Chan, J.Y.K., & Bauer, C.F. (2014). Identifying At-Risk Students in General Chemistry via Cluster Analysis of Affective Characteristics. J. Chem. Educ., 91(9), 1417-1425.

[7] Chan, J.Y.K., & Bauer, C.F. (2016). Learning and studying strategies used by general chemistry students with different affective characteristics. Chem. Educ. Res. & Pract., 17, 675-684.

[8] Underwood, S.M., Reyes-Gastelum, D., & Cooper, M.M. (2016). When do students recognize relationships between molecular structure and properties? A longitudinal comparison of the impact of traditional and transformed curricula. Chem. Educ. Res. & Pract., 17, 365-380.

[9] Raker, J.R., Gibbons, R.E., & Cruz-Ramirez de Arellano, D. (2018). Development and evaluation of the organic chemistry-specific achievement emotions questionnaire (AEQ-OCHEM). J. Res. Sci. Teach., 56(2), 163-183.