Colorado Learning Attitudes About Science Survey For Use In Chemistry

CLASS-Chem

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Barbera, J., Adams, W.K., Wieman, C.E., & Perkins, K.K.

    Original publication
    • Barbera, J., Adams, W.K., Wieman, C.E., & Perkins, K.K. (2008). Modifying and validating the Colorado Learning Attitudes about Science Survey for use in chemistry. Journal of Chemical Education, 85(10), 1435-1439.

    Year original instrument was published 2008
    Inventory
    Number of items 50
    Number of versions/translations 3
    Cited implementations 13
    Language
    • English
    Country United States
    Format
    • Response Scale
    Intended population(s)
    • Students
    • Undergraduate
    • Teaching Assistants
    • High School
    Domain
    • Affective
    Topic
    • Attitude
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument’s validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Likewise, a lack of evidence does not necessarily mean the instrument is “less valid,” only that the evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insight into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; the lack of a checkmark here implies that the article administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications: 1 2 3 4 5 6 7 8 9 10 11 12 13

    General
    • Original development paper
    • Uses the instrument in data collection
    • Modified version of existing instrument
    • Evaluation of existing instrument

    Reliability
    • Test-retest reliability
    • Internal consistency
    • Coefficient (Cronbach's) alpha
    • McDonald's Omega
    • Inter-rater reliability
    • Person separation
    • Generalizability coefficients
    • Other reliability evidence

    Validity
    • Expert judgment
    • Response process
    • Factor analysis, IRT, Rasch analysis
    • Differential item function
    • Evidence based on relationships to other variables
    • Evidence based on consequences of testing
    • Other validity evidence

    Other information
    • Difficulty
    • Discrimination
    • Evidence based on fairness
    • Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or all evidence that appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Colorado Learning Attitudes about Science Survey (CLASS-Chem)

    (Post last updated June 23, 2022)

    Review panel summary

    The Colorado Learning Attitudes about Science Survey (CLASS-Chem) is a 50-item, 5-point Likert-scale survey designed to measure students’ beliefs about chemistry and the learning of chemistry. It has been evaluated in the U.S. with undergraduate students enrolled in a variety of chemistry courses, such as introductory chemistry [1], general chemistry I [1, 2], and organic chemistry I [1], as well as with undergraduate students serving as general chemistry peer instructors [3]. Several aspects of the reliability and validity of the data generated by the CLASS-Chem have been assessed. Test content evidence was established during initial development of the chemistry version of the survey by asking 50 chemistry faculty to take the survey and provide feedback on the statements [1]. Response process validity evidence was collected through student interviews [1, 3], in which participants were asked to explain their reasoning for a selected answer. The interviews allowed the authors to tune the survey to 50 items with unambiguous interpretations [1], although misinterpretations of several items were found in the unique sample of undergraduate students serving as peer instructors [3]. The ability of the instrument to capture differences between student populations expected to hold different beliefs (for example, chemistry majors vs. nonmajors) offers validity evidence based on relations to other variables [1]. Validity evidence based on the internal structure of the data, using confirmatory factor analysis (CFA), showed adequate model fit for the nine single-factor solutions proposed by the instrument authors [2]. However, a CFA of the correlated nine-factor model failed to converge because a large number of items overlap across multiple scales; this result implies that CLASS-Chem data cannot simultaneously measure all nine proposed subscales. A three-factor solution using only unique items from non-highly correlated factors showed adequate model fit [2]. Single-administration reliability was estimated using coefficient alpha for the overall CLASS-Chem survey [1] and for each of the nine subscales [2].
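    For reference alongside the reliability evidence above, a sketch of the standard formula for coefficient (Cronbach's) alpha follows; this is the conventional definition, not a claim about the exact procedure used in [1] or [2]:

    \[ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_{Y_i}^{2}}{\sigma_{X}^{2}}\right) \]

    where \(k\) is the number of items on the scale or subscale, \(\sigma_{Y_i}^{2}\) is the variance of responses to item \(i\), and \(\sigma_{X}^{2}\) is the variance of the total score. Values closer to 1 indicate stronger internal consistency from a single administration.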

    Recommendations for use

    The CLASS-Chem was developed with the intention of measuring students’ beliefs about chemistry and the learning of chemistry [1]. According to the validity and reliability evidence reported in the literature, there is support that the instrument data can be used to measure undergraduate chemistry students’ beliefs on the subscales of Personal Interest, Confidence, and Atomic-Molecular Perspective of Chemistry simultaneously when the full 50-item survey is administered [2]. Additional validity evidence supports administering the 7-item Conceptual Learning scale as a stand-alone instrument [2]. There is no evidence that any of the other eight scales can be used as stand-alone instruments or that data from the 50-item survey can be used to simultaneously measure all nine subscales. Caution should be used when administering this survey to other samples, as some evidence of item misinterpretation was seen in a sample of peer instructors [3].

    Details from panel review

    The CLASS-Chem developers [1] and other researchers [2, 3] have reported several aspects of validity and reliability evidence for the instrument. While the CLASS-Chem development study provided evidence in support of test content validity and response process validity [1], further analysis showed some additional aspects of validity and reliability to be lacking. One technique used in the development of the CLASS-Chem, reduced-basis factor analysis [1], is described as a method for incorporating the eleven chemistry-specific items into an optimized factor structure. However, this method allowed individual items to be placed on more than one subscale, thereby complicating the meaning of subscale scores that share cross-placed items. Confirmatory factor analysis of the theorized nine-factor solution failed to converge because many items of the 50-item survey overlap across multiple subcategories [2]. This lack of discriminant validity, coupled with many items lacking a theoretical basis for factor inclusion [2], threatens validity evidence based on the internal structure of the instrument. No studies in the literature have been able to reproduce evidence based on the internal structure of the 50-item instrument [1, 2]. While a high correlation between students’ responses across different semesters was offered in support of the test-retest reliability of the instrument [1], this type of reliability is typically determined with the same sample of respondents.
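    To make the panel's final point concrete, test-retest reliability is conventionally the correlation between two administrations paired within the same respondents (a sketch of the standard definition, not the computation reported in [1]):

    \[ r_{\text{test-retest}} = \operatorname{corr}\left(X^{(1)}, X^{(2)}\right) \]

    where \(X^{(1)}\) and \(X^{(2)}\) are the same students’ scores at the first and second administrations. Correlating responses collected in different semesters without pairing the same respondents, as the panel notes above, does not estimate this quantity.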

    References

    [1] Barbera, J., Adams, W.K., Wieman, C.E., & Perkins, K.K. (2008). Modifying and validating the Colorado Learning Attitudes about Science Survey for use in chemistry. Journal of Chemical Education, 85(10), 1435-1439. https://doi.org/10.1021/ed085p1435

    [2] Heredia, K., & Lewis, J.E. (2012). A psychometric evaluation of the Colorado Learning Attitudes about Science Survey for use in chemistry. Journal of Chemical Education, 89(4), 436-441. https://doi.org/10.1021/ed100590t

    [3] Atieh, E.L., & York, D.M. (2020). Through the looking CLASS: When peer leader learning attitudes are not what they seem. Journal of Chemical Education, 97(8), 2078-2090. https://doi.org/10.1021/acs.jchemed.0c00129

    Versions
    Listed below are all versions and modifications that were based on this instrument or that this instrument was based on.
    Instrument is derived from:
    Authors
    • Adams, W.K., Perkins, K.K., Podolefsky, N.S., Dubson, M., Finkelstein, N.D., & Wieman, C.E.

    Instrument has been modified in:
    Authors
    • Bueno, P. M.

    • Boesdorfer, S.B., Baldwin, E., & Lieberum, K.A.

    • Vincent-Ruz, P., Binning, K., Schunn, C.D., & Grabowski, J.

    Citations
    Listed below are all publications that develop, implement, modify, or reference the instrument.
    1. Barbera, J., Adams, W.K., Wieman, C.E., & Perkins, K.K. (2008). Modifying and validating the Colorado Learning Attitudes about Science Survey for use in chemistry. Journal of Chemical Education, 85(10), 1435-1439.

    2. Barker, J.G. (2012). Effect of instructional methodologies on student achievement: Modeling instruction vs. traditional instruction. Master's thesis.

    3. Atieh, E. (2020). Characterizing the progression of knowledge, beliefs, and behaviors in peer instructors: an evaluation of the general chemistry teaching interns (Doctoral dissertation, Rutgers University-School of Graduate Studies).

    4. Sommers, A.S., Miller, A.W., Gift, A.D., Richter-Egger, D.L., Darr, J.P., & Cutucache, C.E. (2021). CURE disrupted! Takeaways from a CURE without a wet-lab experience. Journal of Chemical Education, 98(2), 357-367.

    5. Allen, G., Guzman-Alvarez, A., Molinaro, M., & Larsen, D. (2015). Assessing the impact and efficacy of the open-access ChemWiki textbook project. Educause Learning Initiative Brief, 1-8.

    6. Allen, G., Guzman-Alvarez, A., Smith, A., Gamage, A., Molinaro, M., & Larsen, D.S. (2015). Evaluating the effectiveness of the open-access ChemWiki resource as a replacement for traditional general chemistry textbooks. Chemistry Education Research and Practice.

    7. Kiste, A.L., Scott, G.E., Bukenberger, J., Markmann, M., & Moore, J. (2017). An examination of student outcomes in studio chemistry. Chemistry Education Research and Practice, 18(1), 233-249.

    8. Heredia, K., & Lewis, J.E. (2012). A psychometric evaluation of the Colorado Learning Attitudes about Science Survey for use in chemistry. Journal of Chemical Education, 89(4), 436-441.

    9. Phillips, K.E., & Grose-Fifer, J. (2011). A performance enhanced interactive learning workshop model as a supplement for organic chemistry instruction. Journal of College Science Teaching, 40(3), 90.

    10. Schaller, C.P., Graham, K.J., Johnson, B.J., Jakubowski, H.V., McKenna, A.G., McIntee, E.J., Jones, T.N., Fazal, M.A., & Peterson, A.A. (2015). Chemical structure and properties: A modified atoms-first, one-semester introductory chemistry course. Journal of Chemical Education.

    11. Kennerly, W.W., Frederick, K.A., & Sheppard, K. (2020). General chemistry in just one semester for all majors. Journal of Chemical Education, 97(5), 1295-1302.

    12. Atieh, E.L., & York, D.M. (2020). Through the looking CLASS: When peer leader learning attitudes are not what they seem. Journal of Chemical Education, 97(8), 2078-2090.

    13. Ray, S. (2015). The effects of project-based science on students' attitudes and understanding of high school chemistry.