VPCS
OVERVIEW
| Summary | |
|---|---|
| Original author(s) | Oliver-Hoyo, M., & Sloan, C. |
| Original publication | Journal of Research in Science Teaching |
| Year original instrument was published | 2014 |
| Inventory | |
| Number of items | 33 |
| Number of versions/translations | 1 |
| Cited implementations | 1 |
| Language | English |
| Country | United States |
| Format | Multiple choice |
| Intended population(s) | Undergraduate chemistry students |
| Domain | Chemistry |
| Topic | Visual-perceptual skills |
EVIDENCE
Information in the table is given in four different categories:
- General - information about how each article used the instrument:
- Original development paper - indicates whether an article is one in which the instrument was initially developed
- Uses the instrument in data collection - indicates whether an article administered the instrument and collected responses
- Modified version of existing instrument - indicates whether an article has modified a prior version of this instrument
- Evaluation of existing instrument - indicates whether an article explicitly provides evidence that attempts to evaluate the performance of the instrument; the lack of a checkmark here implies that an article administered the instrument but did not evaluate the instrument itself
- Reliability - information about the evidence presented to establish reliability of data generated by the instrument; please see the Glossary for term definitions
- Validity - information about the evidence presented to establish validity of data generated by the instrument; please see the Glossary for term definitions
- Other Information - information that may or may not directly relate to the evidence for validity and reliability but is commonly reported when evaluating instruments; please see the Glossary for term definitions
| Publications: | 1 |
|---|---|
| General | |
| Original development paper | ✔ |
| Uses the instrument in data collection | |
| Modified version of existing instrument | ✔ |
| Evaluation of existing instrument | ✔ |
| Reliability | |
| Test-retest reliability | |
| Internal consistency | ✔ |
| Coefficient (Cronbach's) alpha | ✔ |
| McDonald's Omega | |
| Inter-rater reliability | |
| Person separation | |
| Generalizability coefficients | |
| Other reliability evidence | |
| Validity | |
| Expert judgment | ✔ |
| Response process | ✔ |
| Factor analysis, IRT, Rasch analysis | ✔ |
| Differential item functioning | |
| Evidence based on relationships to other variables | |
| Evidence based on consequences of testing | |
| Other validity evidence | |
| Other information | |
| Difficulty | ✔ |
| Discrimination | ✔ |
| Evidence based on fairness | |
| Other general evidence | |
REVIEW
This review was generated by a CHIRAL review panel. Each CHIRAL review panel consists of multiple experts who first individually review the citations of the assessment instrument listed on this page for evidence in support of the validity and reliability of the data generated by the instrument. Panels then meet to discuss the evidence and summarize their opinions in the review posted in this tab. These reviews summarize only the evidence that was discussed during the panel, which may not represent all evidence available in the published literature or all evidence that appears on the Evidence tab.
If you feel that evidence is missing from this review, or that something was documented in error, please use the CHIRAL Feedback page.
Panel Review: Visual-Perceptual Chemistry Specific (VPCS)
(Post last updated July 21, 2021)
Review panel summary
The Visual-Perceptual Chemistry Specific (VPCS) instrument is a 33-item, multiple-choice instrument designed to measure visual-perceptual skills in chemistry contexts. The instrument’s design was informed by extant literature (including existing instruments) on domain-general visual-perceptual skills and modified to include 1) skills relevant to chemistry students and 2) chemistry-specific representations. It has been evaluated with students enrolled in undergraduate courses ranging from general chemistry to advanced courses (physical, inorganic, and computational chemistry) at a single university [1].
The VPCS went through three iterations, which included refinement and reduction of the number of items and the implementation of a 45-minute time limit. Item and instrument revisions were incorporated based on peer review by faculty and graduate students (test content validity evidence) and feedback from undergraduate students via think-aloud interviews (response process validity evidence). The developers investigated internal structure validity using both exploratory and confirmatory factor analysis, which differed in their outcomes; more evidence is therefore needed pertaining to the internal structure validity of data generated using the VPCS.
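For context, exploratory and confirmatory factor analysis both fit (schematically) the common factor model

$$\mathbf{x} = \boldsymbol{\mu} + \boldsymbol{\Lambda}\mathbf{f} + \boldsymbol{\varepsilon},$$

where $\mathbf{x}$ is the vector of item scores, $\mathbf{f}$ the latent factors (here, visual-perceptual skills), $\boldsymbol{\Lambda}$ the matrix of item loadings, and $\boldsymbol{\varepsilon}$ item-specific error. Confirmatory factor analysis fixes the pattern of $\boldsymbol{\Lambda}$ in advance (here, according to the theorized skills), whereas exploratory factor analysis estimates it freely; when the two disagree, as reported in [1], the theorized structure does not match the structure recovered from the data.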
Finally, coefficient alpha and Kuder-Richardson Formula 20 were reported as estimates of the single administration reliability of the instrument; the authors report a single value for coefficient alpha (0.6555) and for Kuder-Richardson Formula 20 (0.6337) [1], as opposed to values for each of the three scales (factors) identified, which would have been more aligned with currently recommended practice. The developers also used item response theory (IRT) to investigate item-level characteristics, including the difficulty and discrimination of items [1].
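For reference, for a scale of $k$ items with item score variances $\sigma_i^2$ and total score variance $\sigma_X^2$, coefficient alpha is

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_X^2}\right),$$

and Kuder-Richardson Formula 20 is the special case for dichotomously scored items, with $\sigma_i^2 = p_i(1-p_i)$, where $p_i$ is the proportion of correct responses to item $i$. Both quantify the internal consistency of a single scale from a single administration, which is why per-scale estimates are recommended when an instrument is interpreted as measuring several distinct skills.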
Recommendations for use
The VPCS is designed to measure visual-perceptual skills in chemistry contexts. It has only been used in a single university context, with students enrolled in multiple university-level chemistry courses (general, inorganic, physical, computational) [1]. There is limited evidence to support the notion that data generated using the VPCS can be interpreted/scored according to either the suggested three-factor model found using exploratory factor analysis (three scales representing distinct visual-perceptual skills) or the theoretically based eight skills suggested by the authors. Additionally, the methods used to estimate the single administration reliability of data generated using the VPCS do not adhere to currently recommended best practices (estimation by scale). The developers did not report investigating item or instrument function across student groups (i.e., students in different courses). Overall, VPCS users are encouraged to analyze data collected using the instrument for further evidence of validity and reliability.
Details from panel review
The development of the VPCS was based on extant literature/theory of visual-perceptual skills, and items were derived from domain-general instruments, modified to measure chemistry-specific visual-perceptual skills. The version of the VPCS presented in [1] is the result of multiple instrument revisions, which included the revision of items based on faculty feedback and student interviews, a reduction in the total number of items, the implementation of a 45-minute time limit, and the inclusion of “categorical questions” to elicit demographic information, chemistry coursework, and whether students used a modeling kit. Using the final version of the VPCS, data were collected from 978 students in multiple courses over three academic years at a single institution [1]. There is no evidence that the developers investigated item or instrument function across demographic groups or courses.
Initially, the developers tested (using confirmatory factor analysis) an eight-factor model based on extant literature related to visual-perceptual skills; this model was not sufficiently supported by the data. Using exploratory factor analysis, the authors ultimately identified a three-factor model, that is, three groups of items within the 33-item instrument. However, many items overlap across factors while other items are not strongly associated with any of the three factors, which complicates interpretation of VPCS data. Additionally, the three-factor model is not well aligned with the theory used to develop the items [1]. The evidence for the internal structure of VPCS data is limited, and users are encouraged to investigate further. The developers report a single value for coefficient alpha and Kuder-Richardson Formula 20 (estimates of single administration reliability), as opposed to reporting reliability coefficients by scale.
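As a minimal sketch of the per-scale reliability estimation the panel recommends, the following Python snippet computes coefficient alpha separately for each scale. The response matrix is simulated and the item-to-factor assignment is invented for illustration (the published assignment differs and includes cross-loading items); for dichotomously scored items this estimate coincides with Kuder-Richardson Formula 20.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for an (n_respondents x k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Placeholder data: 978 simulated respondents x 33 dichotomous (0/1) items.
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(978, 33))

# Hypothetical item-to-factor assignment, NOT the one reported in [1].
scales = {"factor_1": range(0, 11),
          "factor_2": range(11, 22),
          "factor_3": range(22, 33)}

for name, cols in scales.items():
    print(name, round(cronbach_alpha(responses[:, list(cols)]), 3))
```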
The developers used item response theory to investigate item-level characteristics and report difficulty and discrimination values by item [1]. However, item response theory as commonly applied assumes the measured construct is unidimensional, while the developers suggest that VPCS data are multidimensional (i.e., measure multiple visual-perceptual skills). This ambiguity regarding dimensionality calls the item response theory results into question.
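To illustrate the panel's concern, consider the two-parameter logistic (2PL) model, a standard IRT formulation (shown here for illustration only; [1] may use a different variant), in which the probability that respondent $j$ answers item $i$ correctly is

$$P(X_{ij} = 1 \mid \theta_j) = \frac{1}{1 + e^{-a_i(\theta_j - b_i)}},$$

where $b_i$ is the item's difficulty and $a_i$ its discrimination. The model conditions on a single latent trait $\theta_j$; if VPCS responses instead reflect several distinct visual-perceptual skills, parameter estimates obtained under this unidimensionality assumption may be distorted.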
References
[1] Oliver-Hoyo, M., & Sloan, C. (2014). The development of a Visual-Perceptual Chemistry Specific (VPCS) assessment tool. Journal of Research in Science Teaching, 51(8), 963-981. https://doi.org/10.1002/tea.21154
VERSIONS
CITATIONS
Oliver-Hoyo, M., & Sloan, C. (2014). The development of a Visual-Perceptual Chemistry Specific (VPCS) assessment tool. Journal of Research in Science Teaching, 51(8), 963-981. https://doi.org/10.1002/tea.21154