
Math-Up Skills Test

MUST

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Williamson, V.M., Walker, D.R., Chuu, E., Broadway, S., Mamiya, B., Powell, C.B., Shelton, G.R., Weber, R., Dabney, A.R., & Mason, D.

    Original publication
    • Williamson, V.M., Walker, D.R., Chuu, E., Broadway, S., Mamiya, B., Powell, C.B., Shelton, G.R., Weber, R., Dabney, A.R., & Mason, D. (2020). Impact of basic arithmetic skills on success in first-semester general chemistry. Chemistry Education Research and Practice, 21(1), 51-61. https://doi.org/10.1039/C9RP00077A

    Year original instrument was published 2020
    Inventory
    Number of items 20
    Number of versions/translations 1
    Cited implementations 5
    Language
    • English
    Country United States
    Format
    • Multiple Choice
    • Open Ended
    Intended population(s)
    • Students
    • Undergraduate
    Domain
    • Cognitive
    Topic
    • Mathematics
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument’s validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Similarly, if evidence is lacking, that does not necessarily mean the instrument is “less valid,” just that the evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insight into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; the lack of a checkmark here implies that an article administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications: 1 2 3 4 5

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item function
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Math-Up Skills Test (MUST)

    (Post last updated June 14, 2022)

    Review panel summary

    The original Math-Up Skills Test (MUST) is a 16-question, free-response math instrument developed to assess students’ ability to perform basic pre-college mathematical operations, covering the following topics: multiplication, division, fractions, scientific and exponential notation, logarithms, square roots, and balancing chemical equations [1]. After the pilot study, an updated version of the MUST was developed and expanded to 20 questions, adding the following topics: simplification of a complex fraction, division by zero, simplification of a mixed-operations fraction, and calculation of a fraction’s decimal equivalent [2]. In Texas, where the MUST was developed, some of these math topics are introduced as early as fourth grade and the remaining topics are covered by eleventh grade [1]. Students are given 15 minutes to complete the MUST in person without the use of a calculator. The MUST is graded by hand; each correct answer is awarded 1 point and incorrect answers receive no points. The updated version of the MUST and its answer key are readily available in the supplementary information of [2].
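
    To make the scoring scheme above concrete, the following is a minimal Python sketch of hand-scoring as described (1 point per correct free response, 0 otherwise, 20 items); the answer key and responses used here are hypothetical placeholders, not the actual MUST key.

        # Hypothetical sketch of the MUST scoring scheme described above:
        # 1 point per correct free-response answer, 0 otherwise, 20 items total.
        # The key below is a placeholder, not the actual MUST answer key.
        answer_key = {i: f"key_{i}" for i in range(1, 21)}

        def score_must(responses: dict[int, str], key: dict[int, str]) -> int:
            """Total score: one point for each response that exactly matches the key."""
            return sum(1 for item, correct in key.items()
                       if responses.get(item, "").strip() == correct)

        # A student answering items 1-12 correctly (placeholder responses) scores 12/20.
        student = {i: f"key_{i}" for i in range(1, 13)}
        print(score_must(student, answer_key))  # -> 12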

    There are two versions of the MUST. Both versions show good evidence of single-administration reliability. Test content validity evidence was provided by professors from multiple disciplines (mathematics, chemistry, chemistry education), and 100% agreement was reached on the correct answers to the items [2]. Validity evidence based on relations to other variables was repeatedly established with correlation and regression analyses [1-6].

    The MUST has been evaluated with students enrolled in general chemistry [1-6] and engineering courses [1] across various types of institutions (public and private universities and Hispanic-Serving Institutions (HSIs)), all in Texas, USA.

    Recommendations for use

    The MUST was developed to assess students’ ability to perform simple pre-college math problems. According to the validity evidence reported in the literature, data collected with this instrument show good evidence of relations to other variables. According to the reliability evidence reported in the literature, single-administration reliability based on KR-20/KR-21 and coefficient alpha values is consistently high across the analyzed studies [1-6]. Users should be aware that the instrument is open response and must be graded by hand, which may introduce grader errors or impose a time burden on graders.

    Since the instrument is a good predictor of success in general chemistry, instructors might consider using MUST scores (in conjunction with other information) for advising purposes. However, no evidence for the discrimination and difficulty of items on the instrument is provided. In the studies reviewed by the panel [1-6], the MUST was used only for observational or informational purposes, not to make advising/placement decisions or to restrict student access to coursework; therefore, the collection of validity evidence based on consequences of testing is recommended before using the MUST in high-stakes decisions. Users should be aware that the MUST was developed based on Texas’ K-12 math and chemistry standards and has only been used in Texas, albeit at a variety of institutions.

    Details from panel review

    The developers and users of the MUST reported some aspects of validity (relations to other variables). In particular, medium to strong correlations (r = 0.288-0.542) were reported between MUST scores and course grades in general chemistry [1-6] and engineering chemistry [1], and between MUST scores and quantitative literacy and quantitative reasoning (r = 0.60) [5]. In addition, a relation was found between MUST scores and Test of Logical Thinking (TOLT) scores: higher scores on one co-occurred with higher scores on the other, as did lower scores [4]. Reliability evidence (single-administration reliability) for both versions of the instrument, based on KR-20/KR-21 and coefficient alpha, is consistent across the papers analyzed (KR-21 = 0.821 to 0.855; coefficient alpha = 0.85) [1-6].
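
    For readers unfamiliar with the single-administration reliability indices cited above, the sketch below computes KR-21 and coefficient (Cronbach's) alpha from a simulated 0/1 item-response matrix; the formulas are the standard textbook definitions, and the simulated data are not drawn from any of the cited studies.

        # Single-administration reliability indices (KR-21 and coefficient alpha)
        # computed on simulated dichotomous data: rows = students, columns = items.
        # Standard formulas; the data below are made up for illustration only.
        import numpy as np

        rng = np.random.default_rng(0)
        scores = (rng.random((200, 20)) < 0.6).astype(float)  # 200 students x 20 items

        k = scores.shape[1]                      # number of items
        totals = scores.sum(axis=1)              # each student's total score
        mean, var = totals.mean(), totals.var(ddof=1)

        # KR-21 uses only the mean and variance of total scores
        # (it assumes all items are equally difficult).
        kr21 = (k / (k - 1)) * (1 - mean * (k - mean) / (k * var))

        # Coefficient (Cronbach's) alpha uses the individual item variances.
        alpha = (k / (k - 1)) * (1 - scores.var(axis=0, ddof=1).sum() / var)

        print(f"KR-21 = {kr21:.3f}, alpha = {alpha:.3f}")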

    Although no formal information is provided for the difficulty of items on the MUST, there is sufficient information to calculate difficulty from the mean scores given for each topic assessed on the MUST [1]. Regression analyses have demonstrated that MUST scores can predict success in general chemistry [1-4], especially when coupled with other variables (i.e., quantitative reasoning, quantitative literacy) [5] and demographic factors [6]. When the MUST was coupled with quantitative reasoning and quantitative literacy instruments, up to 50% of Chem I students and about 45% of Chem II students who would not succeed were identified [5]. It is also interesting to note that when the MUST was used with ACS exam and demographic variables, it provided an even more accurate prediction of success (83.4%) [6].
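
    As a concrete illustration of the difficulty calculation mentioned above: for dichotomously scored items, classical difficulty is simply the proportion of students answering correctly, so a reported mean topic score on a 0-1 scale can be read directly as a difficulty index. The topic names and values below are hypothetical placeholders, not the means reported in [1].

        # Classical item/topic difficulty reconstructed from mean scores, as described above.
        # Difficulty p = proportion correct (mean score on a 0-1 scale); lower p = harder.
        # The topics and values below are placeholders, not the figures from reference [1].
        topic_means = {"multiplication": 0.92, "fractions": 0.71, "logarithms": 0.38}

        for topic, p in topic_means.items():
            print(f"{topic}: difficulty index p = {p:.2f}")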

    Lastly, the MUST has been used at multiple institutions (R1, private, public, HSI), and it is encouraging that similar results were obtained across all studies. However, all of the studies were conducted in Texas [1-6]; therefore, the results cannot necessarily be generalized to the national level. While the similar results are encouraging, additional evidence supporting response-process validity would further strengthen the interpretation of MUST data.

    References
    [1] Albaladejo, J.D.P., Broadway, S., Mamiya, B., Petros, A., Powell, C.B., Shelton, G.R., Walker, D.R., Weber, R., Williamson, V.M., & Mason, D. (2018). ConfChem Conference on Mathematics in Undergraduate Chemistry Instruction: MUST-Know Pilot Study–Math Preparation Study from Texas. Journal of Chemical Education, 95(8), 1428-1429. https://doi.org/10.1021/acs.jchemed.8b00096

    [2] Williamson, V.M., Walker, D.R., Chuu, E.,Broadway, S., Mamiya, B., Powell, C.B., Shelton, G.R., Weber, R., Dabney, A.R., & Mason, D. (2020). Impact of basic arithmetic skills on success in first-semester general chemistry. Chemistry Education Research and Practice, 21(1), 51-61. https://doi.org/10.1039/C9RP00077A

    [3] Powell, C.B., Simpson, J., Williamson, V.M., Dubrovskiy, A., Walker, D.R., Jang, B., Shelton, G.R., & Mason, D. (2020). Impact of arithmetic automaticity on students’ success in second-semester general chemistry. Chemistry Education Research and Practice, 21(4), 1028-1041. https://doi.org/10.1039/D0RP00006J

    [4] Alivio, T.E.G., Howard, E., Mamiya, B., & Williamson, V.M. (2020). How does a math review impact a student’s arithmetic skills and performance in first-semester general chemistry? Journal of Science Education and Technology, 29, 703-712. https://doi.org/10.1007/s10956-020-09851-7

    [5] Shelton, G.R., Mamiya, B., Weber, R., Walker, D.R., Powell, C.B., Jang, B., Dubrovskiy, A.V., Villalta-Cerdas, A., & Mason, D. (2021). Early warning signals from automaticity diagnostic instruments for first- and second-semester general chemistry. Journal of Chemical Education, 98, 3061-3072. https://doi.org/10.1021/acs.jchemed.1c00714

    [6] Willis, W.K., Williamson, V.M., Chuu, E., & Dabney, A.R. (2022). The relationship between a student’s success in first-semester general chemistry and their mathematics fluency, profile, and performance on common questions. Journal of Science Education and Technology, 31, 1-15. https://doi.org/10.1007/s10956-021-09927-y

    Versions
    Listed below are all versions and modifications that were based on this instrument or on which this instrument was based.
    Instrument is derived from:
    Name Authors
    • Albaladejo, J.D.P., Broadway, S., Mamiya, B., Petros, A., Powell, C.B., Shelton, G.R., Walker, D.R., Weber, R., Williamson, V.M., & Mason, D.

    Citations
    Listed below are all literature that develop, implement, modify, or reference the instrument.
    1. Williamson, V.M., Walker, D.R., Chuu, E., Broadway, S., Mamiya, B., Powell, C.B., Shelton, G.R., Weber, R., Dabney, A.R., & Mason, D. (2020). Impact of basic arithmetic skills on success in first-semester general chemistry. Chemistry Education Research and Practice, 21(1), 51-61.

    2. Shelton, G.R., Mamiya, B., Weber, R., Walker, D.R., Powell, C.B., Jang, B., Dubrovskiy, A.V., Villalta-Cerdas, A., & Mason, D. (2021). Early Warning Signals from Automaticity Diagnostic Instruments for First- and Second-Semester General Chemistry. Journal of Chemical Education, 98, 3061-3072.

    3. Powell, C.B., Simpson, J., Williamson, V.M., Dubrovskiy, A., Walker, D.R., Jang, B., Shelton, G.R., & Mason, D. (2020). Impact of arithmetic automaticity on students' success in second-semester general chemistry. Chemistry Education Research and Practice, 21(4), 1028-1041.

    4. Willis, W.K., Williamson, V.M., Chuu, E., & Dabney, A.R. (2022). The Relationship Between a Student’s Success in First-Semester General Chemistry and Their Mathematics Fluency, Profile, and Performance on Common Questions. Journal of Science Education and Technology, 31, 1-15.

    5. Alivio, T.E.G., Howard, E., Mamiya, B., & Williamson, V.M. (2020). How Does a Math Review Impact a Student’s Arithmetic Skills and Performance in First-Semester General Chemistry? Journal of Science Education and Technology, 29(6), 703-712.