OVERVIEW
Summary | |
---|---|
Original author(s) | Ferrell & Barbera |
Original publication | Ferrell, B., & Barbera, J. (2015). Analysis of students' self-efficacy, interest, and effort beliefs in general chemistry. Chemistry Education Research and Practice, 16(2), 318-337. |
Year original instrument was published | 2015 |
Inventory | |
Number of items | 9 |
Number of versions/translations | 1 |
Cited implementations | 2 |
Language | English |
Country | United States |
Format | Self-report, 5-point Likert scale |
Intended population(s) | First-year undergraduate chemistry students |
Domain | Motivation |
Topic | Effort beliefs |
EVIDENCE
Information in the table is given in four different categories:
- General - information about how each article used the instrument:
  - Original development paper - indicates the paper(s) in which the instrument was initially developed
  - Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
  - Modified version of existing instrument - indicates whether an article modified a prior version of this instrument
  - Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; the lack of a checkmark here implies that the article administered the instrument but did not evaluate the instrument itself
- Reliability - information about the evidence presented to establish the reliability of data generated by the instrument; please see the Glossary for term definitions
- Validity - information about the evidence presented to establish the validity of data generated by the instrument; please see the Glossary for term definitions
- Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
Publications: | 1 | 2 |
---|---|---|
General | | |
Original development paper | ✔ | |
Uses the instrument in data collection | ✔ | ✔ |
Modified version of existing instrument | ✔ | |
Evaluation of existing instrument | ✔ | ✔ |
Reliability | | |
Test-retest reliability | | |
Internal consistency | | |
Coefficient (Cronbach's) alpha | ✔ | ✔ |
McDonald's Omega | | |
Inter-rater reliability | | |
Person separation | | |
Generalizability coefficients | | |
Other reliability evidence | | |
Validity | | |
Expert judgment | | |
Response process | ✔ | |
Factor analysis, IRT, Rasch analysis | ✔ | |
Differential item function | | |
Evidence based on relationships to other variables | ✔ | ✔ |
Evidence based on consequences of testing | | |
Other validity evidence | | |
Other information | | |
Difficulty | | |
Discrimination | | |
Evidence based on fairness | | |
Other general evidence | | |
REVIEW
This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or all evidence that appears on the Evidence tab.
If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.
Panel Review: Effort Beliefs in Chemistry
(Post last updated June 23, 2022)
Review panel summary
The Effort Beliefs in Chemistry instrument consists of 6 items scored on a 5-point Likert scale. It is designed to measure the degree to which students believe their effort will lead to positive outcomes in their chemistry courses. It has been used at one institution with students recruited from first-year undergraduate chemistry lecture [2] and laboratory [1] courses. The validity and reliability of the data generated using the instrument have been examined by the original authors. Response process validity evidence was collected by interviewing undergraduate students, who commented on the readability of the items and their reasoning for their answer choices [1]. Effort belief scores were found to be positively correlated with final grade percentage, which provides some evidence of validity based on relations to other variables [1]. Internal structure validity evidence is provided by the results of a confirmatory factor analysis, which support a one-factor model [1]. Single-administration reliability was estimated using coefficient alpha [1, 2].
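For reference, coefficient (Cronbach's) alpha for a k-item scale is defined as

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_{\text{total}}^{2}}\right)$$

where $\sigma_i^{2}$ is the variance of item $i$ and $\sigma_{\text{total}}^{2}$ is the variance of the summed scale score. This is the standard definition of the statistic, not a value or formula reported in the cited studies.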
Recommendations for use
Based on attributional theory of motivation, in which effort is tied to conceptions of ability, the Effort Beliefs in Chemistry instrument is designed to measure the degree to which students believe their effort will lead to positive outcomes in their chemistry course. Validity and reliability evidence support the use of the instrument to measure this aspect of motivation for undergraduate students in first-year lecture and laboratory courses [1, 2].
Details from panel review
The instrument was originally designed and administered as a 9-item scale, but three items were dropped based on factor analysis results (i.e., low factor loadings). The decision to drop two of these three items was also supported by response process interview data and by the modification indices for those items. It is therefore recommended that the instrument be used in its revised 6-item form.
Effort belief scores were found to be positively correlated with self-efficacy, interest, and course performance [2]. However, in a multiple regression analysis, effort beliefs scores did not account for a significant amount of variance in course grade [2]. Single administration reliability for the revised 6-item scale was estimated using coefficient alpha, which was reported as above 0.77 for all time points including cross-validation in the development study [1]. A follow-up study reported coefficient alpha values of 0.68 at time 1 and 0.83 at time 2 [2].
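As an illustration of how a single-administration reliability estimate like those reported above can be obtained, the sketch below computes coefficient alpha from item-level data using pandas/numpy. It is a minimal, hypothetical example on simulated 6-item, 5-point Likert responses, not code or data from the cited studies.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Coefficient alpha for item-level scores (rows = respondents, columns = items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Simulated responses: 200 students x 6 items on a 1-5 Likert scale
# (hypothetical data for illustration only; not from [1] or [2]).
rng = np.random.default_rng(seed=42)
latent = rng.normal(size=200)  # a single underlying "effort belief" trait
responses = pd.DataFrame({
    f"item_{i}": np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=200)), 1, 5)
    for i in range(1, 7)
})
print(f"coefficient alpha = {cronbach_alpha(responses):.2f}")
```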
References
[1] Ferrell, B., & Barbera, J. (2015). Analysis of students’ self-efficacy, interest, and effort beliefs in general chemistry. Chemistry Education Research and Practice, 16(2), 318-337. https://doi.org/10.1039/C4RP00152D
[2] Ferrell, B., Phillips, M.M., & Barbera, J. (2016). Connecting achievement motivation to performance in general chemistry. Chemistry Education Research and Practice, 17(4), 1054-1066. https://doi.org/10.1039/C6RP00148C
VERSIONS
Name | Authors |
---|---|
Effort Beliefs in Chemistry | Ferrell & Barbera |
CITATIONS
Ferrell, B., & Barbera, J. (2015). Analysis of students' self-efficacy, interest, and effort beliefs in general chemistry. Chemistry Education Research and Practice, 16(2), 318-337. https://doi.org/10.1039/C4RP00152D
Ferrell, B., Phillips, M. M., & Barbera, J. (2016). Connecting achievement motivation to performance in general chemistry. Chemistry Education Research and Practice, 17(4), 1054-1066. https://doi.org/10.1039/C6RP00148C