TCI
OVERVIEW
Summary | |
---|---|
Original author(s) | Wren, D., & Barbera, J. |
Original publication | Journal of Chemical Education |
Year original instrument was published | 2013 |
Inventory | |
Number of items | 10 |
Number of versions/translations | 1 |
Cited implementations | 2 |
Language | English |
Country | United States |
Format | Multiple choice |
Intended population(s) | General (introductory) chemistry students |
Domain | Chemistry |
Topic | Thermochemistry |
EVIDENCE
Information in the table is given in four different categories:
- General - information about how each article used the instrument:
  - Original development paper - indicates in which paper(s) the instrument was initially developed
  - Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
  - Modified version of existing instrument - indicates whether an article modified a prior version of this instrument
  - Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; the lack of a checkmark here implies an article that administered the instrument but did not evaluate the instrument itself
- Reliability - information about the evidence presented to establish the reliability of data generated by the instrument; please see the Glossary for term definitions
- Validity - information about the evidence presented to establish the validity of data generated by the instrument; please see the Glossary for term definitions
- Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
Publications: | 1 | 2 |
---|---|---|
General | | |
Original development paper | ✔ | |
Uses the instrument in data collection | | |
Modified version of existing instrument | | |
Evaluation of existing instrument | ✔ | ✔ |
Reliability | | |
Test-retest reliability | | |
Internal consistency | | |
Coefficient (Cronbach's) alpha | ✔ | |
McDonald's Omega | | |
Inter-rater reliability | | |
Person separation | | |
Generalizability coefficients | | |
Other reliability evidence | | |
Validity | | |
Expert judgment | ✔ | |
Response process | ✔ | ✔ |
Factor analysis, IRT, Rasch analysis | ✔ | |
Differential item function | | |
Evidence based on relationships to other variables | ✔ | |
Evidence based on consequences of testing | | |
Other validity evidence | | |
Other information | | |
Difficulty | ✔ | |
Discrimination | | |
Evidence based on fairness | | |
Other general evidence | | |
REVIEW
This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or that which appears on the Evidence tab.
If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.
Panel Review: Thermochemistry Concept Inventory (TCI)
(Post last updated June 24, 2022)
Review panel summary
The Thermochemistry Concept Inventory (TCI) is a 10-item concept inventory designed to assess students’ understanding of thermochemistry concepts typically introduced in general (introductory) chemistry courses. Various pieces of reliability and validity evidence have been collected for this assessment. Test content validity evidence was provided by surveying chemistry faculty about important topics in thermochemistry. Further, interviews with faculty focused on topic coverage, item feedback, and consensus on correct answers [1]. Student interviews were also conducted to elicit common alternate conceptions, which were then used to generate item distractors [1]. Response process validity (RPV) evidence was provided through semi-structured interviews with students that focused on item wording, option plausibility, and whether student conceptions matched their selected response option. Students in an Honors general chemistry course outperformed students in the other general chemistry courses, which can be considered evidence based on relationships to other variables. Sufficient evidence for internal structure validity and single-administration reliability was presented using Rasch analysis, suggesting that most items can appropriately discriminate among students within this range of abilities [2].
Recommendations for use
The TCI was developed to assess students’ understanding of thermochemistry in general (introductory) chemistry. Evidence supporting test content and response process validity is published [1]. Internal structure validity supports the use of a total TCI score, with the exception of one item (Item K) that did not have acceptable outfit statistics in the Rasch model analysis. Therefore, this item should not be included when evaluating the total score in a summative assessment [2]. The TCI items functioned equally well (aside from Item K) when administered as a formative or summative assessment, and in lab or lecture settings [2].
Details from panel review
Internal structure evidence was provided by evaluating unidimensionality and local independence through principal component analysis. Fit statistics of the Rasch model also indicated an acceptable fit for most items [2]. For single-administration reliability evidence, Cronbach’s alpha was below the commonly accepted value. However, the authors note that alpha may not be the most appropriate statistic for evaluating the TCI and provide a discussion of reliability at the item level. The TCI authors discuss the acceptable fit statistics of the Rasch model (except for Item K) as a proxy for evidence of reliability, and the option probability curves as a way to evaluate the reliability of responses across the range of student abilities [2]. The authors also provided further response process validity evidence through option probability curves from the Rasch analysis, which revealed the items’ ability to discriminate among students across a range of performance levels.
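For readers who want to reproduce a single-administration reliability estimate on their own item-response data, Cronbach's alpha can be computed directly from a respondents-by-items score matrix. The following is a minimal illustrative sketch, not code from the TCI papers; the function name and toy data are assumptions.

```python
import numpy as np

def cronbach_alpha(scores) -> float:
    """Cronbach's alpha for a score matrix (rows = respondents, cols = items).

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                           # number of items
    item_vars = scores.var(axis=0, ddof=1)        # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical dichotomous (0/1) responses from five respondents to three items:
demo = np.array([
    [1, 1, 1],
    [0, 0, 1],
    [1, 1, 0],
    [0, 0, 0],
    [1, 0, 1],
])
alpha = cronbach_alpha(demo)
```

Note that alpha assumes the items form a single scale, which is one reason the TCI authors supplement it with item-level Rasch fit statistics.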
References
[1] Wren, D., & Barbera, J. (2013). Gathering evidence for validity during the design, development, and qualitative evaluation of Thermochemistry Concept Inventory items. Journal of Chemical Education, 90(12), 1590-1601. https://doi.org/10.1021/ed400384g
[2] Wren, D., & Barbera, J. (2014). Psychometric analysis of the thermochemistry concept inventory. Chemistry Education Research and Practice, 15(3), 380-390. https://doi.org/10.1039/C3RP00170A
VERSIONS
CITATIONS
Wren, D., & Barbera, J. (2013). Gathering evidence for validity during the design, development, and qualitative evaluation of Thermochemistry Concept Inventory items. Journal of Chemical Education, 90(12), 1590-1601.
Wren, D., & Barbera, J. (2014). Psychometric analysis of the thermochemistry concept inventory. Chemistry Education Research and Practice, 15(3), 380-390.