
Chemistry Self-Concept Inventory

CSCI

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Bauer, C.F.

    Original publication
    • Bauer, C.F. (2005). Beyond "student attitudes": Chemistry self-concept inventory for assessment of the affective component of student learning. Journal of Chemical Education, 82(12), 1864-1870.

    Year original instrument was published: 2005
    Inventory
    Number of items: 40
    Number of versions/translations: 2
    Cited implementations: 13
    Language
    • English
    Country: United States, United Kingdom
    Format
    • Response Scale
    Intended population(s)
    • Students
    • Undergraduate
    • Secondary School
    • High School
    Domain
    • Affective
    Topic
    • Self-Concept
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument’s validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Similarly, if evidence is lacking, that does not necessarily mean the instrument is “less valid,” just that the evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process and consult the instrument’s Review (next tab), if available, for better insights into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence intended to evaluate the performance of the instrument; lack of a checkmark here means the article administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications reviewed: 1–13 (see the Citations tab). The table indicates, for each publication, whether the following types of evidence are present:

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item function
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Chemistry Self-Concept Inventory

    (Post last updated July 21, 2021)

    Review panel summary

    The CSCI was developed to measure a student’s “self-concept as a learner of chemistry” [1]. It consists of 40 items to which students respond using a seven-point Likert scale ranging from “very inaccurate of me” to “very accurate of me”. Many of the items derive from a previously published instrument, the Self Description Questionnaire III (SDQIII), with items modified and added to address self-concept in chemistry specifically [1]. Student response data have been collected from multiple courses, including university general chemistry [1-4], one-semester nursing chemistry [7], and four variations of high school chemistry [6].

    During the development of the instrument, exploratory factor analysis identified five subscales [1]. Using data collected from high school students, an exploratory factor analysis suggested that a six-factor solution was more appropriate [6]. These results point to conflicting evidence regarding internal structure validity. Comparisons of the scale scores of general chemistry students (non-majors) to chemistry majors [1], and of high school students in different chemistry courses [6], have shown the expected differences in self-concept among the groups. Student scores on two of the subscales have been used as input variables in cluster analysis to establish student groups, and membership in these groups has been found to be related to course performance [2, 3, 6]. This provides some validity support with respect to relations with other variables. Coefficient alpha has been used to estimate the single-administration reliability of the subscales [1, 2, 4, 5, 7]. Across studies, the Math and Chemistry Self-Concept subscales consistently have higher alpha values (0.89 - 0.93) than the others (0.25 - 0.80). Evidence of suitable test-retest reliability has been reported [1, 5].

    Recommendations for use

    The CSCI has seen limited use in university general chemistry and high school chemistry courses. The conflicting evidence for internal structure found in the literature leads to the recommendation that users analyze data collected using the instrument for further evidence in support of internal structure. While some evidence of internal structure has been presented based on exploratory factor analysis, future users are encouraged to consider moving to a confirmatory factor analysis framework to explore the fit of an existing structure, thereby providing additional data about the suggested factor structures. In cases where a user cannot collect sufficient data to investigate the internal structure, the Math and Chemistry subscales have demonstrated the most stability, and they could potentially be used separately.
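
    For users who collect their own CSCI data and want to follow this recommendation, the sketch below illustrates one way a confirmatory factor analysis could be set up. It assumes item-level responses in a pandas DataFrame and the Python semopy package; the factor names and item assignments are hypothetical placeholders, not the published CSCI subscale structure.

```python
# A minimal sketch of a confirmatory factor analysis on CSCI-style responses,
# assuming one column per item (e.g. "q1" ... "q40") and one row per student.
# The factor names and item groupings below are illustrative placeholders.
import pandas as pd
import semopy

# Hypothetical measurement model in lavaan-style syntax; replace the item
# lists with the subscale assignments reported for the instrument.
MODEL_DESC = """
MathSC =~ q1 + q2 + q3
ChemSC =~ q4 + q5 + q6
"""

def fit_cfa(responses: pd.DataFrame):
    """Fit the hypothesized structure; return parameter estimates and fit indices."""
    model = semopy.Model(MODEL_DESC)
    model.fit(responses)                     # maximum likelihood estimation by default
    estimates = model.inspect()              # factor loadings, variances, covariances
    fit_indices = semopy.calc_stats(model)   # CFI, TLI, RMSEA, and related statistics
    return estimates, fit_indices
```

    Fit indices such as CFI and RMSEA from the fitted model can then be weighed against common cutoffs when judging whether a previously suggested factor structure is supported by new data.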

    Details from panel review

    Principal components analysis was used to identify five factors in a development study, with two items not associating with any of the factors. The five factors of self-concept were labeled Mathematics, Chemistry, Academic, Academic Enjoyment, and Creativity [1]. Two of the factors were in good agreement with factors found in the SDQIII, while three were different. When data from high school students were analyzed using principal axis factoring, a different factor structure was found, with some resemblance to the factor structure of the original SDQIII [5]. In a study with high school students [6], three items were removed after the initial exploratory factor analysis, based on an analysis of response distributions, but the authors did not address whether the fit improved. The evidence for the internal structure of the entire five-subscale instrument is therefore limited and conflicting. There is evidence that the Math and Chemistry Self-Concept subscales are robust.
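
    As a point of reference, the following sketch shows how an exploratory factor analysis of this kind might be run in Python, assuming item-level responses in a pandas DataFrame and the factor_analyzer package. The number of factors and the rotation are illustrative choices, and the package's default extraction method differs from the principal axis factoring used in [5].

```python
# A minimal sketch of an exploratory factor analysis on item-level responses
# (one column per item, one row per student). n_factors and the rotation are
# illustrative; they should be justified against the data at hand.
import pandas as pd
from factor_analyzer import FactorAnalyzer

def explore_structure(responses: pd.DataFrame, n_factors: int = 5):
    """Run an EFA and return the loading matrix and the eigenvalues."""
    efa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin")
    efa.fit(responses)
    loadings = pd.DataFrame(efa.loadings_, index=responses.columns)
    eigenvalues, _ = efa.get_eigenvalues()   # eigenvalues of the correlation matrix
    return loadings, eigenvalues
```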

    Two studies have collected data that compared the scale scores of students expected to have different levels of self-concept [1, 5]. One study with college students found that the self-concept scores of chemistry majors were higher than those of non-chemistry majors in a general chemistry class [1]. Expected patterns of self-concept scores were also found among high school students enrolled in different levels of chemistry courses: self-concept scores increased from non-college prep chemistry, to college prep chemistry, to honors chemistry, to Advanced Placement chemistry students [5]. In this same study [5], the scores on the subscales were found to be correlated with scores on a chemistry misconceptions assessment. Another study used the increase in subscale scores (pre to post) as evidence of the success of a course redesign [7]. The Math and Chemistry subscale scores have also been used as input variables in cluster analysis [3, 6]. These reports provide limited evidence for validity with respect to relations with other variables.
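
    The sketch below illustrates the general idea of using subscale scores as cluster-analysis inputs. It assumes per-student subscale means in a pandas DataFrame with hypothetical column names and uses scikit-learn's k-means implementation; the cited studies applied their own clustering procedures, so this is only one common choice.

```python
# A minimal sketch of clustering students on two subscale scores, assuming a
# DataFrame with hypothetical columns "math_sc" and "chem_sc" (subscale means).
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def cluster_students(scores: pd.DataFrame, n_clusters: int = 3) -> pd.Series:
    """Assign each student to a cluster based on standardized subscale scores."""
    X = StandardScaler().fit_transform(scores[["math_sc", "chem_sc"]])
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
    return pd.Series(labels, index=scores.index, name="cluster")
```

    Cluster membership produced this way can then be examined against course performance, as in the cited reports, to look for relationships between self-concept profiles and outcomes.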

    Single-administration reliability evidence has been provided in the form of coefficient alpha values for each subscale. The coefficient alpha values for the Math and Chemistry Self-Concept subscales have been consistently higher than those for the other three subscales. Test-retest reliability has been evaluated in two ways. In the development process, the instrument was given twice, one week apart, at the beginning of the semester, with correlation values ranging from 0.64 to 0.90 [1]. Another study compared scores two months apart by examining the scatter plot and reported an R² value of 0.6267 [5].
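
    Both reliability checks described here are straightforward to reproduce on new data. The sketch below, which assumes item responses and matched administration scores held in pandas objects with hypothetical column selections, computes coefficient alpha for one subscale and a Pearson test-retest correlation.

```python
# A minimal sketch of the two reliability checks, using only numpy, pandas,
# and scipy. Pass the items of a single subscale (one column per item, one
# row per student) to cronbach_alpha, and matched per-student scores from two
# administrations to test_retest_r.
import numpy as np
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Coefficient (Cronbach's) alpha for one subscale."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()       # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)         # variance of total scores
    return (k / (k - 1)) * (1.0 - item_var / total_var)

def test_retest_r(time1: pd.Series, time2: pd.Series) -> float:
    """Pearson correlation between matched scores from two administrations."""
    r, _ = stats.pearsonr(time1, time2)
    return r
```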

    References

    [1] Bauer, C. F. (2005). Beyond “Student Attitudes”: Chemistry Self-Concept Inventory for Assessment of the Affective Component of Student Learning. Journal of Chemical Education, 82(12), 1864. https://doi.org/10.1021/ed082p1864

    [2] Lewis, S. E., Shaw, J. L., Heitz, J. O., & Webster, G. H. (2009). Attitude Counts: Self-Concept and Success in General Chemistry. Journal of Chemical Education, 86(6), 744. https://doi.org/10.1021/ed086p744

    [3] Chan, J. Y. K., & Bauer, C. F. (2014). Identifying At-Risk Students in General Chemistry via Cluster Analysis of Affective Characteristics. Journal of Chemical Education, 91(9), 1417–1425. https://doi.org/10.1021/ed500170x

    [4] Chan, J. Y. K., & Bauer, C. F. (2015). Effect of peer-led team learning (PLTL) on student achievement, attitude, and self-concept in college general chemistry in randomized and quasi experimental designs. Journal of Research in Science Teaching, 52(3), 319–346. https://doi.org/10.1002/tea.21197

    [5] Nielsen, S. E., & Yezierski, E. (2015). Exploring the Structure and Function of the Chemistry Self-Concept Inventory with High School Chemistry Students. Journal of Chemical Education, 92(11), 1782–1789. https://doi.org/10.1021/acs.jchemed.5b00302

    [6] Nielsen, S. E., & Yezierski, E. J. (2016). Beyond academic tracking: Using cluster analysis and self-organizing maps to investigate secondary students’ chemistry self-concept. Chemistry Education Research and Practice, 17(4), 711–722. https://doi.org/10.1039/C6RP00058D

    [7] Smith, A. L., Paddock, J. R., Vaughan, J. M., & Parkin, D. W. (2018). Promoting Nursing Students’ Chemistry Success in a Collegiate Active Learning Environment: “If I Have Hope, I Will Try Harder.” Journal of Chemical Education, 95(11), 1929–1938. https://doi.org/10.1021/acs.jchemed.8b00201

     

    Versions
    Listed below are all versions and modifications that were based on this instrument or that this instrument was based on.
    Instrument has been modified in:
    Authors
    • Ross, J., Lai, C., & Nunez, L.

    • Vincent-Ruz, P., Binning, K., Schunn, C.D., & Grabowski, J.

    Citations
    Listed below are all literature that develop, implement, modify, or reference the instrument.
    1. Bauer, C.F. (2005). Beyond "student attitudes": Chemistry self-concept inventory for assessment of the affective component of student learning. Journal of Chemical Education, 82(12), 1864-1870.

    2. Nielsen, S.E., & Yezierski, E.J. (2016). Beyond academic tracking: Using cluster analysis and self-organizing maps to investigate secondary students' chemistry self-concept. Chemistry Education Research and Practice, 17(4), 711-722.

    3. Chan, J.Y.K., & Bauer, C.F. (2015). Effect of Peer-Led Team Learning (PLTL) on student achievement, attitude, and self-concept in college general chemistry in randomized and quasi experimental designs. Journal of Research in Science Teaching, 52(3), 319-346.

    4. Chan, J.Y.K., & Bauer, C.F. (2014). Identifying at-risk students in general chemistry via cluster analysis of affective characteristics. Journal of Chemical Education, 91(9), 1417-1425.

    5. Lewis, S.E., Shaw, J.L., Heitz, J.O., & Webster, G.H. (2009). Attitude counts: Self-concept and success in general chemistry. Journal of Chemical Education, 86(6), 744-749.

    6. Chan, J.Y.K., & Bauer, C.F. (2016). Learning and studying strategies used by general chemistry students with different affective characteristics. Chemistry Education Research and Practice, 17(4), 675-684.

    7. Shaw, A. J., Harrison, T. G., Croker, S. J., Medley, M., Sellou, L., Shallcross, K. L., & Williams, S. (2010). University-School Partnerships: On the Impact on Students of Summer Schools (for School Students Aged 17-18) Run by Bristol ChemLabs. Acta Didac

    8. Shibley, I., Amaral, K. E., Shank, J. D., & Shibley, L. R. (2011). Designing a blended course: Using ADDIE to guide instructional design. Journal of College Science Teaching, 40(6).

    9. Smith, A.L., Paddock, J.R., Vaughan, J.M., & Parkin, D.W. (2018). Promoting Nursing Students' Chemistry Success in a Collegiate Active Learning Environment: "If I Have Hope, I Will Try Harder". Journal of Chemical Education, 95(11), 1929-1938.

    10. Zhao, N., Wardeska, J. G., McGuire, S. Y., & Cook, E. (2014). Metacognition: An effective tool to promote success in college science learning. Journal of College Science Teaching, 43(4), 48-54.

    11. Bauer, C.F. (2008). Attitude towards chemistry: A semantic differential instrument for assessing curriculum impacts. Journal of Chemical Education, 85(10), 1440-1445.

    12. Nielsen, S.E., & Yezierski, E. (2015). Exploring the Structure and Function of the Chemistry Self-Concept Inventory with High School Chemistry Students. Journal of Chemical Education, 92(11), 1782-1789.

    13. Augustine, B. H., Miller, H. B., Knippenberg, M. T., & Augustine, R. G. (2019). Strategies, Techniques, and Impact of Transitional Preparatory Courses for At-Risk Students in General Chemistry. In Enhancing Retention in Introductory Chemistry Courses: Teaching Practices and Assessments (pp. 15-47). American Chemical Society.