RTOP (Reformed Teaching Observation Protocol)
OVERVIEW
| Summary | |
|---|---|
| Original author(s) | Sawada, Piburn, Judson, Turley, Falconer, Benford, & Bloom |
| Original publication | Sawada et al. (2002); Piburn et al. (2000) |
| Year original instrument was published | 2002, 2000 |
| Inventory | |
| Number of items | 25 |
| Number of versions/translations | 3 |
| Cited implementations | 8 |
| Language | English |
| Country | United States |
| Format | Observation protocol |
| Intended population(s) | |
| Domain | |
| Topic | |
EVIDENCE
Information in the table is given in four categories:
- General - information about how each article used the instrument:
  - Original development paper - indicates the paper(s) in which the instrument was initially developed
  - Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
  - Modified version of existing instrument - indicates whether an article modified a prior version of this instrument
  - Evaluation of existing instrument - indicates whether an article explicitly provides evidence intended to evaluate the performance of the instrument; the lack of a checkmark here indicates that an article administered the instrument but did not evaluate the instrument itself
- Reliability - information about the evidence presented to establish the reliability of data generated by the instrument; please see the Glossary for term definitions (two commonly reported reliability statistics are sketched after this list)
- Validity - information about the evidence presented to establish the validity of data generated by the instrument; please see the Glossary for term definitions
- Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
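For reference only, and not as a description of any particular study's analysis, the two reliability statistics most often checked in the table below can be summarized by their standard formulas. Cohen's kappa is shown purely as one common choice of inter-rater statistic; the specific agreement measure used varies by study.

```latex
% Coefficient (Cronbach's) alpha for a k-item scale:
% \sigma^2_{Y_i} is the variance of item i, \sigma^2_X is the variance of the total score.
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_{X}}\right)

% Cohen's kappa, one common inter-rater agreement statistic for two raters
% (an illustrative example; the cited studies may report other agreement measures):
% p_o is the observed proportion of agreement, p_e is the agreement expected by chance.
\kappa = \frac{p_o - p_e}{1 - p_e}
```

For both statistics, values closer to 1 indicate more consistent measurement.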
| Publications: | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| General | | | | | | | | | | | | | | | | |
| Original development paper | ✔ | ✔ | ✔ | ✔ | | | | | | | | | | | | |
| Uses the instrument in data collection | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | | |
| Modified version of existing instrument | ✔ | ✔ | | | | | | | | | | | | | | |
| Evaluation of existing instrument | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | | | | | | | | |
| Reliability | | | | | | | | | | | | | | | | |
| Test-retest reliability | | | | | | | | | | | | | | | | |
| Internal consistency | | | | | | | | | | | | | | | | |
| Coefficient (Cronbach's) alpha | ✔ | ✔ | ✔ | ✔ | | | | | | | | | | | | |
| McDonald's Omega | | | | | | | | | | | | | | | | |
| Inter-rater reliability | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | | | | | | |
| Person separation | | | | | | | | | | | | | | | | |
| Generalizability coefficients | | | | | | | | | | | | | | | | |
| Other reliability evidence | | | | | | | | | | | | | | | | |
| Validity | | | | | | | | | | | | | | | | |
| Expert judgment | | | | | | | | | | | | | | | | |
| Response process | | | | | | | | | | | | | | | | |
| Factor analysis, IRT, Rasch analysis | ✔ | ✔ | ✔ | ✔ | | | | | | | | | | | | |
| Differential item function | | | | | | | | | | | | | | | | |
| Evidence based on relationships to other variables | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | | | | | | | | | | |
| Evidence based on consequences of testing | | | | | | | | | | | | | | | | |
| Other validity evidence | | | | | | | | | | | | | | | | |
| Other information | | | | | | | | | | | | | | | | |
| Difficulty | | | | | | | | | | | | | | | | |
| Discrimination | | | | | | | | | | | | | | | | |
| Evidence based on fairness | | | | | | | | | | | | | | | | |
| Other general evidence | | | | | | | | | | | | | | | | |
REVIEW
VERSIONS
| Name | Authors |
|---|---|
| Postsecondary Instructional Practices Survey | |
| Electronic Quality Of Inquiry Protocol | |
| Korean Teaching Observation Protocol | |
CITATIONS
Sawada, D., Piburn, M. D., Judson, E., Turley, J., Falconer, K., Benford, R., & Bloom, I. (2002). Measuring reform practices in science and mathematics classrooms: The reformed teaching observation protocol. School Science and Mathematics, 102(6), 245-253.
Piburn, M., Sawada, D., Turley, J., Falconer, K., Benford, R., Bloom, I., & Judson, E. (2000). Reformed teaching observation protocol (RTOP) reference manual. Tempe, Arizona: Arizona Collaborative for Excellence in the Preparation of Teachers.
Philipp, S.B., Johnson, D.K., & Yezierski, E.J. (2014). Development of a protocol to evaluate the use of representations in secondary chemistry instruction. Chemistry Education Research and Practice, 15(4), 777-786.
Herrington, D. G., Yezierski, E. J., Luxford, K. M., & Luxford, C. J. (2011). Target inquiry: Changing chemistry high school teachers' classroom practices and knowledge and beliefs about inquiry instruction. Chemistry Education Research and Practice, 12(1), 74-84.
Jacobs, C. L., Martin, S. N., & Otieno, T. C. (2008). A Science Lesson Plan Analysis Instrument for formative and summative program evaluation of a teacher education program. Science Education, 92(6), 1096–1126. doi.org/10.1002/sce.20277
Yezierski, E.J., & Herrington, D.G. (2011). Improving practice with target inquiry: High school chemistry teacher professional development that works. Chemistry Education Research and Practice, 12(3), 344-354.
Lund, T. J., Pilarz, M., Velasco, J. B., Chakraverty, D., Rosploch, K., Undersander, M., & Stains, M. (2015). The best of both worlds: Building on the COPUS and RTOP observation protocols to easily and reliably measure various levels of reformed instructional practice. CBE—Life Sciences Education, 14(2), ar18.
Park, S., Jang, J.-Y., Chen, Y.-C., & Jung, J. (2010). Is Pedagogical Content Knowledge (PCK) Necessary for Reformed Science Teaching?: Evidence from an Empirical Study. Research in Science Education, 41(2), 245-260.