
Chemistry Attitudes and Experiences Questionnaire

CAEQ

    Overview
    Listed below is general information about the instrument.
    Summary
    Original author(s)
    • Coll, R. K., Dalgety, J., & Salter, D.

    • Dalgety, J., Coll, R.K., & Jones, A.

    Original publication
    • Coll, R. K., Dalgety, J., & Salter, D. (2002). The development of the Chemistry Attitudes and Experiences Questionnaire (CAEQ). Chemistry Education: Research and Practice in Europe, 3, 19–32.

    • Dalgety, J., Coll, R.K., & Jones, A. (2003). Development of chemistry attitudes and experiences questionnaire (CAEQ). Journal of Research in Science Teaching, 40(7), 649–668.

    Year original instrument was published: 2002, 2003
    Inventory
    Number of items: 75
    Number of versions/translations: 5
    Cited implementations: 9
    Language
    • English
    • Unknown
    Country: New Zealand, United States, Australia, Qatar, Indonesia
    Format
    • Response Scale
    Intended population(s)
    • Students
    • Undergraduate
    Domain
    • Affective
    • Behavioral
    Topic
    • Attitude
    Evidence
    The CHIRAL team carefully combs through every reference that cites this instrument and pulls all evidence that relates to the instrument’s validity and reliability. These data are presented in the following table, which simply notes the presence or absence of evidence related to each concept but does not indicate the quality of that evidence. Likewise, a lack of evidence does not necessarily mean the instrument is “less valid,” just that the evidence was not presented in the literature. Learn more about this process by viewing the CHIRAL Process, and consult the instrument’s Review (next tab), if available, for better insight into the usability of this instrument.

    Information in the table is given in four different categories:
    1. General - information about how each article used the instrument:
      • Original development paper - indicates the paper(s) in which the instrument was initially developed
      • Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
      • Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
      • Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; lack of a checkmark here implies an article that administered the instrument but did not evaluate the instrument itself
    2. Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
    3. Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
    4. Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
    Publications 1–18 (evidence table columns; the per-publication checkmarks are not reproduced here)

    General

    Original development paper
    Uses the instrument in data collection
    Modified version of existing instrument
    Evaluation of existing instrument

    Reliability

    Test-retest reliability
    Internal consistency
    Coefficient (Cronbach's) alpha
    McDonald's Omega
    Inter-rater reliability
    Person separation
    Generalizability coefficients
    Other reliability evidence

    Validity

    Expert judgment
    Response process
    Factor analysis, IRT, Rasch analysis
    Differential item functioning
    Evidence based on relationships to other variables
    Evidence based on consequences of testing
    Other validity evidence

    Other information

    Difficulty
    Discrimination
    Evidence based on fairness
    Other general evidence
    Review
    DISCLAIMER: The evidence supporting the validity and reliability of the data summarized below is for use of this assessment instrument within the reported settings and populations. The continued collection and evaluation of validity and reliability evidence, in both similar and dissimilar contexts, is encouraged and will support the chemistry education community’s ongoing understanding of this instrument and its limitations.
    This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
    If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.

    Panel Review: Chemistry Attitudes and Experiences Questionnaire

    (Post last updated July 21, 2021)

    Review panel summary

    The Chemistry Attitudes and Experiences Questionnaire (CAEQ) was developed to measure New Zealand first-year university chemistry students’ attitude toward chemistry, chemistry self-efficacy, and learning experiences [1]. The instrument has three corresponding scales; each, with the exception of the chemistry self-efficacy scale, has subscales. The first two scales use a 7-point semantic differential format and the third uses a 5-point Likert scale. The final instrument has 69 items.

    Several aspects of the reliability and validity of CAEQ data have been investigated. During the development process, semi-structured interviews with faculty and graduate students were used to gather evidence for test content validity [1]. Student interviews were used to investigate readability, which constitutes limited evidence for response process validity [1]. Two studies utilizing exploratory factor analysis have provided some evidence for internal structure validity [1,2]. Expected statistically significant differences between chemistry majors and non-chemistry majors were found on all subscales, providing evidence that supports validity with regard to relations to other variables [2]. Values of coefficient alpha for the various subscales have been reported as evidence of single administration reliability [1-3].
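
    Coefficient (Cronbach's) alpha, cited above as the single administration reliability evidence for the CAEQ, is computed from the item variances and the variance of the summed scale score. As a point of reference only, the short Python sketch below shows the standard calculation; the function and the response matrix are illustrative and are not drawn from any of the cited studies.

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """Coefficient alpha for a (respondents x items) response matrix.

            alpha = k / (k - 1) * (1 - sum of item variances / variance of total score)
            """
            k = items.shape[1]                         # number of items in the scale
            item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
            total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        # Hypothetical data: 5 respondents answering a 4-item, 7-point subscale
        responses = np.array([
            [6, 5, 6, 7],
            [4, 4, 5, 4],
            [7, 6, 6, 6],
            [3, 4, 3, 4],
            [5, 5, 6, 5],
        ])
        print(f"alpha = {cronbach_alpha(responses):.2f}")

    Values near 1 indicate highly consistent responses across a scale's items; the 0.56 to 0.96 range reported for the CAEQ scales and subscales (see the panel details below) spans weak to very strong internal consistency.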

    Recommendations for use

    The CAEQ was developed using a rigorous, theory-based process, which suggests it has the potential to be a useful measure of affective components. However, only two studies (both done in New Zealand) report data about aspects of validity. The only study done in the USA did not investigate the validity of the data generated using the instrument and reported only evidence of single administration reliability, coefficient alpha [3]. Therefore, CAEQ users within other educational systems are especially encouraged to analyze data collected using the instrument for further evidence of validity and reliability within broader populations.

    Details from panel review

    The development of the CAEQ relied on a theoretical basis, the Theory of Planned Behavior, and interviews with faculty and graduate students [1]. An item pool for the pilot version was sourced from previously reported instruments and items written by chemistry faculty. A panel of experts (chemistry faculty and graduate students) participated in semi-structured interviews intended to inform the development of items, scales, and subscales. These experts ultimately selected the instrument's items from a large pool of potential items. Additionally, an expert in teaching students who speak English as a second language (ESL) screened the items for readability [1]. While interviews with students from the intended population explored the readability of the items, providing limited evidence for response process validity, additional response process validity evidence collected using cognitive interviews to investigate students’ option-choice reasoning would further expand support for the items.

    Based on the results of a pilot study, items were removed and a “final” version was created. A validation study using this final version was done, and exploratory factor analysis on the final version resulted in the subscales of the self-efficacy factor collapsing into one scale [1]. In this same study, the authors reported the calculation of discriminant validity coefficients, a measure of the uniqueness of each scale, as evidence that the scales measure distinct constructs [1]. They report values below the target of 0.30, but it is unclear how these values were calculated. A second study repeated the exploratory factor analysis and reported “similar factor structures [to the original study] with some items removed” [2]. The explanation and discussion of which items were removed and of the cross-loading of several items is unclear; therefore, more evidence is needed regarding internal structure validity for the CAEQ. When comparing scale scores for chemistry and non-chemistry majors, the anticipated significant differences were observed [2]. This provides limited evidence for validity with respect to relations to other variables. Single administration reliability evidence, coefficient alpha values, has been reported in three studies, with values from 0.56 to 0.96 for the scales and subscales [1-3].
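
    For readers who wish to replicate the internal structure analyses described above on their own CAEQ data, the sketch below shows a minimal exploratory factor analysis in Python. It assumes the third-party factor_analyzer package and a hypothetical CSV of item-level responses; the three-factor solution and promax rotation are illustrative choices mirroring the instrument's three scales, not the settings documented in the cited studies.

        # Minimal EFA sketch; assumes `pip install factor-analyzer pandas`.
        import pandas as pd
        from factor_analyzer import FactorAnalyzer

        # Hypothetical file: one row per respondent, one column per CAEQ item
        responses = pd.read_csv("caeq_responses.csv")

        # Three factors mirror the instrument's three scales; an oblique
        # rotation (promax) lets the factors correlate, as affective
        # constructs typically do.
        fa = FactorAnalyzer(n_factors=3, rotation="promax")
        fa.fit(responses)

        loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
        print(loadings.round(2))         # inspect for cross-loading items
        print(fa.get_factor_variance())  # variance explained per factor

    Cross-loading items, such as those flagged in the second study [2], would appear in this output as rows with sizable loadings on more than one factor.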

    References

    [1] Dalgety, J., Coll, R.K., & Jones, A. (2003). Development of chemistry attitudes and experiences questionnaire (CAEQ). Journal of Research in Science Teaching, 40(7), 649–668. https://doi.org/10.1002/tea.10103

    [2] Dalgety, J., & Coll, R. K. (2006). The influence of first-year chemistry students’ learning experiences on their educational choices. Assessment & Evaluation in Higher Education, 31(3), 303–328. https://doi.org/10.1080/02602930500352931

    [3] Tomasik, J. H., LeCaptain, D., Murphy, S., Martin, M., Knight, R. M., Harke, M. A., Burke, R., Beck, K., & Acevedo-Polakovich, I. D. (2014). Island Explorations: Discovering Effects of Environmental Research-Based Lab Activities on Analytical Chemistry Students. Journal of Chemical Education, 91(11), 1887–1894. https://doi.org/10.1021/ed5000313

    Versions
    Listed below are all versions and modifications that are based on this instrument, or that this instrument is based on.
    Instrument is derived from:
    Authors
    • Fraser, B. J.

    Instrument has been modified in:
    Authors
    • Avargil, S., Kohen, Z., & Dori, Y.J.

    • Calik, M., & Cobern, W.W.

    • Jurisevic, M., Glazar, S., Pucko, C.R., & Devetak, I.

    • Fakayode, S.O., King, A.G., Yakubu, M., Mohammed, A.K., & Pollard, D.A.

    • Calik, M., Ultay, N., Kolomuc, A., & Aytar, A.

    Citations
    Listed below is all literature that develops, implements, modifies, or references the instrument.
    1. Coll, R. K., Dalgety, J., & Salter, D. (2002). The development of the Chemistry Attitudes and Experiences Questionnaire (CAEQ). Chemistry Education: Research and Practice in Europe, 3, 19–32.

    2. Dalgety, J., Coll, R.K., & Jones, A. (2003). Development of chemistry attitudes and experiences questionnaire (CAEQ). Journal of Research in Science Teaching, 40(7), 649–668.

    3. Mataka, L.M., & Kowalske, M.G. (2015). The influence of PBL on students' self-efficacy beliefs in chemistry. Chemistry Education Research and Practice, 16(4), 929-938.

    4. Villafañe, S.M., Garcia, C.A., & Lewis, J.E. (2014). Exploring diverse students' trends in chemistry self-efficacy throughout a semester of college-level preparatory chemistry. Chemistry Education Research and Practice, 15(2), 114-127.

    5. Griep, M. A., Stains, M., & Velasco, J. (2018). Coordination of the Chemistry REU Program at the University of Nebraska–Lincoln. In Best Practices for Chemistry REU Programs (pp. 139–156). American Chemical Society.

    6. Chase, A., Pakhira, D., & Stains, M. (2013). Implementing process-oriented, guided-inquiry learning for the first time: Adaptations and short-term impacts on students’ attitude and performance. Journal of Chemical Education, 90(4), 409-416.

    7. Vishnumolakala, V.R., Southam, D.C., Treagust, D.F., Mocerino, M., & Qureshi, S. (2017). Students' attitudes, self-efficacy and experiences in a modified process-oriented guided inquiry learning undergraduate chemistry classroom. Chemistry Education Research and Practice.

    8. Vishnumolakala, V. R., Qureshi, S. S., Treagust, D. F., Mocerino, M., Southam, D. C., & Ojeil, J. (2018). Longitudinal impact of process-oriented guided inquiry learning on the attitudes, self-efficacy and experiences of pre-medical chemistry students. QS

    9. Wahyudiati, D., Rohaeti, E., Irwanto, Wiyarsi, A., & Sumardi, L. (2020). Attitudes toward chemistry, self-efficacy, and learning experiences of pre-service chemistry teachers: Grade level and gender differences. International Journal of Instruction, 13(1).