Panel Review: Chemistry Attitudes and Experiences Questionnaire

(Post last updated July 21, 2021)

Review panel summary

The Chemistry Attitudes and Experiences Questionnaire (CAEQ) was developed to measure New Zealand first-year university chemistry students’ attitude toward chemistry, chemistry self-efficacy, and learning experiences [1]. The instrument has three scales (attitude toward chemistry, chemistry self-efficacy, and learning experiences); each of these, with the exception of the chemistry self-efficacy scale, has subscales. The first two scales use a 7-point semantic differential format and the third uses a 5-point Likert scale. The final instrument has 69 items.

Several aspects of reliability and validity of the CAEQ have been investigated. During the development process, semi-structured interviews with faculty and graduate students were used to gather evidence for test content validity [1]. Student interviews were used to investigate readability, which constitutes limited evidence for response process validity [1]. Two studies utilizing exploratory factor analysis have provided some evidence for internal structure validity [1,2]. Expected statistically significant differences between chemistry majors and non-chemistry majors were found on all subscales, providing evidence that supports validity with regard to relations to other variables [2]. Values of coefficient alpha for the various subscales have been reported as evidence of single administration reliability [1-3].

Recommendations for use

The CAEQ was developed using a rigorous, theory-based process, which suggests it has the potential to be a useful measure of affective components. However, only two studies (both conducted in New Zealand) report data about aspects of validity. The only study conducted in the USA did not investigate the validity of the data generated using the instrument and reported only evidence of single administration reliability, coefficient alpha [3]. Therefore, CAEQ users within other educational systems are especially encouraged to analyze data collected using the instrument for further evidence of validity and reliability within broader populations.
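Users who take up this recommendation could, for example, report coefficient alpha for each scale from their own administration. The sketch below is a minimal illustration of that calculation, not an analysis from the CAEQ studies; the data frame layout (one row per student, one column per item of a single scale) and the simulated responses are assumptions made here for demonstration.

import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Coefficient (Cronbach's) alpha for one scale.

    items: one row per respondent, one column per item of the scale.
    alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)
    """
    items = items.dropna()                          # listwise deletion, for simplicity
    k = items.shape[1]                              # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical demonstration: 100 simulated respondents whose four item scores
# share a common underlying trait, so alpha should come out well above zero.
rng = np.random.default_rng(0)
trait = rng.normal(4, 1.2, size=100)
demo = pd.DataFrame(
    {f"item{i}": np.clip(np.round(trait + rng.normal(0, 0.8, size=100)), 1, 7)
     for i in range(1, 5)}
)
print(f"alpha = {cronbach_alpha(demo):.2f}")

Values computed this way for each CAEQ scale or subscale can then be compared against those reported in the original studies [1-3].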

Details from panel review

The development of the CAEQ relied on a theoretical basis, the Theory of Planned Behavior, and on interviews with faculty and graduate students [1]. An item pool for the pilot version was sourced from previously reported instruments and from items written by chemistry faculty. A panel of experts (chemistry faculty and graduate students) participated in semi-structured interviews intended to inform the development of items, scales, and subscales. These experts ultimately selected the instrument's items from a large pool of potential items. Additionally, an expert in teaching English as a second language (ESL) students screened the items for readability [1]. Interviews with students from the intended population explored the readability of the items, providing limited evidence for response process validity; additional response process evidence, collected through cognitive interviews investigating how students reason through their option choices, would further strengthen support for the items.

Based on the results of a pilot study, items were removed and a “final” version was created. A validation study using this final version was conducted, and exploratory factor analysis on the final version resulted in the subscales of the self-efficacy factor collapsing into one scale [1]. In this same study, the authors reported the calculation of discriminant validity coefficients, a measure of the uniqueness of each scale, as evidence that the scales measure distinct constructs [1]. They reported values below the target of 0.30, but it is unclear how these values were calculated. A second study repeated the exploratory factor analysis and reported “similar factor structures [to the original study] with some items removed” [2]. The explanation of which items were removed and the discussion of several cross-loading items are unclear; therefore, more evidence regarding internal structure validity is needed for the CAEQ. When comparing scale scores for chemistry and non-chemistry majors, the anticipated significant differences were observed [2]. This provides limited evidence for validity with respect to relations to other variables. Single administration reliability evidence, in the form of coefficient alpha values, has been reported in three studies, with values ranging from 0.56 to 0.96 for the scales and subscales [1-3].
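For users gathering further internal structure evidence of the kind described above, the sketch below shows one way to examine whether item responses group into the expected factors. It is only an illustration under assumptions made here: the scikit-learn FactorAnalysis estimator with varimax rotation is not the extraction or rotation method reported in [1] or [2], and the responses data frame, file name, and three-factor solution are hypothetical.

import pandas as pd
from sklearn.decomposition import FactorAnalysis

def efa_loadings(responses: pd.DataFrame, n_factors: int) -> pd.DataFrame:
    """Exploratory factor analysis of item-level responses.

    responses: one row per student, one column per CAEQ item.
    Returns an item-by-factor loading matrix; items are expected to load
    mainly on the factor matching their intended scale, and comparably
    sized loadings in two columns would flag a cross-loading item.
    """
    fa = FactorAnalysis(n_components=n_factors,
                        rotation="varimax",   # rotation requires scikit-learn >= 0.24
                        random_state=0)
    fa.fit(responses.dropna())
    return pd.DataFrame(fa.components_.T,
                        index=responses.columns,
                        columns=[f"factor{i + 1}" for i in range(n_factors)])

# Usage (hypothetical file and column names):
# responses = pd.read_csv("caeq_item_responses.csv")
# print(efa_loadings(responses, n_factors=3).round(2))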

References

[1] Dalgety, J., Coll, R. K., & Jones, A. (2003). Development of chemistry attitudes and experiences questionnaire (CAEQ). Journal of Research in Science Teaching, 40, 649–668. https://doi.org/10.1002/tea.10103

[2] Dalgety, J., & Coll, R. K. (2006). The influence of first-year chemistry students’ learning experiences on their educational choices. Assessment & Evaluation in Higher Education, 31(3), 303–328. https://doi.org/10.1080/02602930500352931

[3] Tomasik, J. H., LeCaptain, D., Murphy, S., Martin, M., Knight, R. M., Harke, M. A., Burke, R., Beck, K., & Acevedo-Polakovich, I. D. (2014). Island explorations: Discovering effects of environmental research-based lab activities on analytical chemistry students. Journal of Chemical Education, 91(11), 1887–1894. https://doi.org/10.1021/ed5000313