Panel Review: Scale Literacy Skills Test (SLST)

(Post last updated June 14, 2022)

Review panel summary

The Scale Literacy Skills Test (SLST) is a 45-item multiple-choice instrument designed to assess students' knowledge and understanding of several aspects of scale, including measurement and estimation, creating reliable scales, and applying conceptual anchors. Although little evidence of test content validity was provided, the developers report that the SLST was grounded in ideas from the literature, particularly the notion that scale literacy skills form along a stepwise trajectory from novice, to developing, and finally to expert [1]. The developers also drew on published instruments and interviews with students to create the items, although little information from these initial interviews is provided [1]. A later qualitative study reported 58 cognitive interviews in which response process validity evidence was gathered [3]; the interviews evaluated whether students gave correct or incorrect reasoning for their answer choices. This study, along with others, identified common misconceptions or alternate conceptions that students may hold [1-3]. The authors of several studies [1-3] presented good validity evidence based on relations to other variables, including correlations with grades in general and preparatory chemistry.

The developers of the SLST [1] provided KR-21 values as good evidence of single administration reliability. Subsequent studies [2, 3] did not report reliability values.

Item discrimination and item difficulty were reported in the original development paper as ranges rather than per item [1]. These ranges were very broad and included a negative item discrimination value, which typically indicates a problematic item or a scoring error. The average difficulty value was 0.435 for preparatory chemistry and 0.536 for general chemistry.

Recommendations for use

This instrument has been used with students in preparatory and general chemistry courses [1-3]. Additionally, a group of graduate students completed the instrument for analysis [1]; this group was intended to represent expert knowledge and to demonstrate the difference between novice and expert scores at a single time point [1]. Students in anatomy and physiology participated in cognitive interviews, but they were not participants in any of the quantitative analyses in that study [3]. The authors report that 90 minutes are required to complete the instrument.

The SLST has been used in connection with another instrument, the Scale Concept Inventory (SCI), and the two are often combined into a single Scale Literacy Score (SLS) [1, 2]. Each instrument can stand alone, and both have been shown to correlate significantly with measures of performance such as final exam scores [1-3].

Validity and reliability evidence was adequate, especially the response process and relations to other variables evidence. However, there is currently a lack of evidence of internal structure validity, which would help in understanding how the items relate to one another and would support the use of a total score.

The developers suggest that this instrument can be used to assess students' level of understanding of scale in a chemistry course. However, because the instrument is not available in the publications or the supplemental information, this claim was difficult to evaluate without a closer examination of the items.

Details from panel review

The authors of several papers provided rich data and evidence of validity based on relations to other variables by demonstrating significant correlations between SLST scores and final exam scores [1-3]. Additionally, the developers presented a correlation matrix of all the variables tested, including ACT Math scores, indicating significant correlations among these variables [1]. Correlation analyses between the SLST and the course final grade were also performed; the reported values are 0.550-0.606 for general chemistry and 0.332-0.466 for preparatory chemistry [1]. In addition, regression analyses were conducted to identify variables that predict final exam scores. The SLST combined with the SCI, as a weighted average the authors call the Scale Literacy Score (SLS), was found to be a significant predictor of final course score [1].

Additionally, the developers demonstrated appropriate reliability evidence for the SLST [1]. Beyond the validity evidence described above, single administration reliability was explored with a Kuder-Richardson 21 (KR-21) analysis, which yielded values of 0.62 and 0.70 for preparatory chemistry and general chemistry, respectively [1]. These values demonstrate satisfactory reliability evidence.
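For readers unfamiliar with the statistic, KR-21 is a closed-form reliability estimate for dichotomously scored tests that requires only the number of items, the mean total score, and the score variance. A minimal sketch follows, using invented total scores (the SLST raw data are not published, so these numbers are purely illustrative):

```python
def kr21(total_scores, n_items):
    """Kuder-Richardson formula 21 for a test scored 0/1 per item.

    Assumes all items are of roughly equal difficulty; uses the
    sample variance of students' total scores.
    """
    n = len(total_scores)
    mean = sum(total_scores) / n
    var = sum((x - mean) ** 2 for x in total_scores) / (n - 1)
    return (n_items / (n_items - 1)) * (
        1 - mean * (n_items - mean) / (n_items * var)
    )

# Hypothetical total scores for ten students on a 45-item test:
totals = [20, 25, 18, 30, 22, 27, 35, 15, 24, 29]
print(round(kr21(totals, 45), 3))  # → 0.708
```

Values in the 0.6-0.8 range, like those reported for the SLST, are generally read as acceptable for group-level comparisons in instructional research.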

The developers performed item difficulty and discrimination analyses; however, only the range of these values was provided [1]. While the average item difficulty and discrimination fell within the typical cutoff values, the extreme values were well outside them. For example, the authors report an item discrimination of -0.029, indicating that low-performing students chose the correct answer on that item disproportionately more often than high-performing students. The authors note that items with difficulty and discrimination outside the cutoff range were kept because they test misconceptions, but without access to the items and explicit per-item statistics, this aspect of the instrument is difficult to evaluate.
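To make the flagged statistic concrete, one common form of the discrimination index compares the proportion of correct answers in the top- and bottom-scoring groups (often the upper and lower 27% of students). The sketch below uses invented counts, not the SLST data; only the sign behavior mirrors the reported -0.029 item:

```python
def item_difficulty(n_correct, n_total):
    """Proportion of all students answering the item correctly (0 to 1)."""
    return n_correct / n_total

def discrimination_index(correct_upper, correct_lower, group_size):
    """Upper/lower-group discrimination index D = p_upper - p_lower.

    D ranges from -1 to 1; values near or below zero flag items where
    low scorers do as well as (or better than) high scorers.
    """
    return (correct_upper - correct_lower) / group_size

# Hypothetical upper/lower groups of 27 students each:
print(discrimination_index(22, 9, 27))   # healthy positive item
print(discrimination_index(7, 8, 27))    # slightly negative, problematic item
```

A negative D, like the -0.029 reported for one SLST item, means the lower group outperformed the upper group on that item, which is why such items usually receive closer scrutiny.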

While the authors mention having interviewed students and drawn on ideas from the literature to develop the instrument [1], neither is documented in the manuscript or the supplemental materials. Internal structure validity is not addressed in any of the publications; thus, the use of the instrument's score as a single total is not supported by statistical analyses.

References

[1] Gerlach, K., Trate, J., Blecking, A., Geissinger, P., & Murphy, K. (2014). Valid and reliable assessment to measure scale literacy of students in introductory college chemistry courses. Journal of Chemical Education, 91(10), 1538-1545. https://doi.org/10.1021/ed400471a

[2] Trate, J.M., Geissinger, P., Blecking, A., & Murphy, K.L. (2019). Integrating scale-themed instruction across the general chemistry curriculum. Journal of Chemical Education, 96(11), 2361-2370. https://doi.org/10.1021/acs.jchemed.9b00594

[3] Trate, J.M., Fisher, V., Blecking, A., Geissinger, P., & Murphy, K.L. (2019). Response process validity studies of the Scale Literacy Skills Test. Journal of Chemical Education, 96(7), 1351-1358. https://doi.org/10.1021/acs.jchemed.8b00990