OVERVIEW
Summary | |
---|---|
Original author(s) | Ferrell, B., & Barbera, J. |
Original publication | Chemistry Education Research and Practice |
Year original instrument was published | 2015 |
Inventory | |
Number of items | 15 |
Number of versions/translations | 1 |
Cited implementations | 3 |
Language | English |
Country | United States |
Format | Likert-type scale |
Intended population(s) | Undergraduate chemistry students |
Domain | |
Topic | Interest |
EVIDENCE
Information in the table is given in four different categories:
- General - information about how each article used the instrument:
  - Original development paper - indicates the paper(s) in which the instrument was initially developed
  - Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
  - Modified version of existing instrument - indicates whether an article modified a prior version of this instrument
  - Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; the lack of a checkmark here implies an article that administered the instrument but did not evaluate the instrument itself
- Reliability - information about the evidence presented to establish the reliability of data generated by the instrument; please see the Glossary for term definitions
- Validity - information about the evidence presented to establish the validity of data generated by the instrument; please see the Glossary for term definitions
- Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
Publications: | 1 | 2 | 3 |
---|---|---|---|
General | | | |
Original development paper | ✔ | | |
Uses the instrument in data collection | ✔ | ✔ | ✔ |
Modified version of existing instrument | ✔ | | |
Evaluation of existing instrument | ✔ | ✔ | ✔ |
Reliability | | | |
Test-retest reliability | | | |
Internal consistency | | | |
Coefficient (Cronbach's) alpha | ✔ | ✔ | |
McDonald's Omega | | | ✔ |
Inter-rater reliability | | | |
Person separation | | | |
Generalizability coefficients | | | |
Other reliability evidence | | | |
Validity | | | |
Expert judgment | | | |
Response process | ✔ | | ✔ |
Factor analysis, IRT, Rasch analysis | ✔ | | ✔ |
Differential item function | | | ✔ |
Evidence based on relationships to other variables | ✔ | ✔ | ✔ |
Evidence based on consequences of testing | | | |
Other validity evidence | | | |
Other information | | | |
Difficulty | | | |
Discrimination | | | |
Evidence based on fairness | | | |
Other general evidence | | | |
REVIEW
This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.
Panel Review: Initial and Maintained Interest in Chemistry
(Post last updated June 23, 2022)
Review panel summary
The Initial and Maintained Interest in Chemistry scales are a pair of Likert-type scales designed to measure students’ initial (seven items on two subscales) and maintained (eight items on two subscales) interest over two time points, typically at the beginning of the course and in its latter half, respectively. The scales were created by modifying an existing measure of interest, using theoretical underpinnings of interest theory from educational psychology, to specifically measure undergraduate student interest in chemistry. Each scale measures two components of interest – feeling-related and value-related. The instrument has been evaluated with undergraduate chemistry students in general chemistry lecture [2] and laboratory courses [1, 3]. Response process validity evidence was generated by interviewing undergraduate students, who provided insight into their reasoning for their answer choices and feedback regarding the readability of items during the instrument development study [1] and in subsequent use at a different institution [3]. The instrument developers provided evidence of internal structure validity using confirmatory factor analysis: a 2-factor model showed adequate fit for both the initial interest and maintained interest scales in both the development study [1] and in a later study [3]. Validity evidence based on relations to other variables is provided by the observation that chemistry majors had higher feeling- and value-related interest scores than non-majors [1]. In another study, maintained feeling-related interest was positively and significantly correlated with final course grade [2]. Coefficient alpha [1, 2] and McDonald’s omega [3] have been used to estimate single-administration reliability. Invariance testing indicated that, in at least one investigation [3], the scales were supported for use in comparing interest scores for students in virtual and in-person general chemistry laboratory course settings.
Recommendations for use
The Initial and Maintained Interest in Chemistry scales are a pair of scales that can be used to measure students’ interest in chemistry at two time points. Validity and reliability evidence supports their use with undergraduate general chemistry students [1-3]. Considering that evidence supports both 1- and 2-factor models, it is suggested that users continue to investigate the factor structure of their own data during analysis [1, 3].
Details from panel review
An additional analysis of relations to other variables was completed by the instrument developers, who hypothesized and explored relations between scores on an adapted version of the Cognitive scale of the College Chemistry Self-Efficacy Scale and interest in chemistry through multiple regression and path analysis [2]. Regarding internal structure evidence, prior literature and response process interviews suggested that both 1-factor and 2-factor models should be considered. The development paper found poor fit for the 1-factor model and adequate fit for the 2-factor model after confirmatory factor analysis [1]. A follow-up study found that a 2-factor model had acceptable fit (CFI, 0.98; RMSEA, 0.07; SRMR, 0.03) [3]. The 1-factor model was also tested and had acceptable fit (though with a high RMSEA); that study is the only one so far to present evidence supporting a 1-factor model for this instrument [3]. Single-administration reliability was estimated in all studies. All coefficient alpha values were above 0.79 in the development study and in a second study at the same institution [1, 2]. A second institution used the instrument and reported a McDonald’s omega of 0.88 for the initial feeling-interest subscale [3].
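For readers unfamiliar with the single-administration reliability estimate reported above, coefficient (Cronbach's) alpha can be computed directly from an items-by-respondents response matrix. The sketch below is illustrative only: the data are simulated 5-point Likert responses (seven items, matching the initial interest scale's item count), not data from any of the reviewed studies.

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient (Cronbach's) alpha for a response matrix
    of shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated data (NOT from the reviewed studies): a shared latent
# "interest" factor induces the inter-item correlation that alpha reflects.
rng = np.random.default_rng(0)
interest = rng.normal(size=(200, 1))                      # latent trait
noise = rng.normal(scale=0.7, size=(200, 7))              # item-specific noise
responses = np.clip(np.rint(3 + interest + noise), 1, 5)  # 7 items, 1-5 scale
print(round(cronbach_alpha(responses), 2))
```

Because the simulated items share a strong common factor, the resulting alpha lands well above the 0.79 floor reported for these scales; uncorrelated items would drive it toward zero.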
References
[1] Ferrell, B., & Barbera, J. (2015). Analysis of students’ self-efficacy, interest, and effort beliefs in general chemistry. Chemistry Education Research and Practice, 16(2), 318-337. https://doi.org/10.1039/C4RP00152D
[2] Ferrell, B., Phillips, M.M., & Barbera, J. (2016). Connecting achievement motivation to performance in general chemistry. Chemistry Education Research and Practice, 17(4), 1054-1066. https://doi.org/10.1039/C6RP00148C
[3] Hensen, C., & Barbera, J. (2019). Assessing affective differences between a virtual general chemistry experiment and a similar hands-on experiment. Journal of Chemical Education, 96(10), 2097-2108. https://doi.org/10.1021/acs.jchemed.9b00561
VERSIONS
Name | Authors |
---|---|
Measure Of Chemistry Identity | |
CITATIONS
Ferrell, B., & Barbera, J. (2015). Analysis of students' self-efficacy, interest, and effort beliefs in general chemistry. Chemistry Education Research and Practice, 16(2), 318-337.
Ferrell, B., Phillips, M.M., & Barbera, J. (2016). Connecting achievement motivation to performance in general chemistry. Chemistry Education Research and Practice, 17(4), 1054-1066.
Hensen, C., & Barbera, J. (2019). Assessing affective differences between a virtual general chemistry experiment and a similar hands-on experiment. Journal of Chemical Education, 96(10), 2097-2108.