Whether online proctoring will impact test scores is a question often raised by test takers, testing organizations, and other stakeholders in the testing process. With the sudden shift to virtual learning and working environments during the pandemic, online testing has followed suit, becoming a common mode of delivery even in high-stakes settings such as college admissions, professional certification, and occupational licensing.
There are many ways to address this question. As Chief Science Officer, I encourage testing organizations to consider the psychometric quality of their scores and, when multiple modes of proctoring are used, the comparability of scores across modes.
The candidate experience is also important to monitor, particularly factors related to the testing environment and delivery system. And of course, test security should be monitored to identify any breaches that might undermine the validity and fairness of the test, such as cheating, content compromise, or collusion.
At PSI, we have been conducting test security analytics for many years to monitor and assure the measurement quality of the high-stakes exams we administer, in keeping with professional testing standards.1 Over the years we have developed analytic methods, including a proprietary measure for detecting potential cheating.2
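As a generic illustration of the kind of statistical flag that security analytics can produce, the sketch below flags test takers who finish unusually fast yet score very high, a pattern that may warrant review. This is a hypothetical example written for this article; it is not PSI's proprietary measure, and the time and score thresholds are arbitrary assumptions.

```python
# Generic security-analytics illustration: flag test takers whose testing
# time is far below the group norm while their score is very high.
# Hypothetical example only; not PSI's proprietary detection measure.
import math

def z_scores(values):
    """Standardize values to z-scores (mean 0, SD 1, using n-1)."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return [(v - mean) / sd for v in values]

def flag_fast_high_scorers(times_min, scores_pct, z_cut=-2.0, score_cut=80.0):
    """Return indices where time z-score is below z_cut and score is high."""
    tz = z_scores(times_min)
    return [i for i, (z, s) in enumerate(zip(tz, scores_pct))
            if z < z_cut and s >= score_cut]

# Simulated data: minutes to finish and percent-correct scores.
times = [92, 88, 95, 90, 31, 89, 93, 87, 91, 94]
scores = [71, 65, 80, 74, 96, 69, 77, 72, 75, 78]

print(flag_fast_high_scorers(times, scores))  # candidate 4 finished far faster
```

In practice such a flag would be one input among many, reviewed alongside proctor observations and other indices rather than treated as proof of misconduct.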
In 2017, we published the first comparative study of high-stakes credentialing exams administered and proctored in a test center versus online proctoring at a remote kiosk.3 The results were encouraging: strong measurement quality and score reliability, comparable average test scores across modes, favorable test-taker experiences, and test scores unrelated to ratings of the testing experience.
As organizations have rapidly adopted online proctored testing, we have had additional opportunities to study high-stakes credentialing programs that deliver the same exams both in test centers with onsite proctors and online to test takers at home with remote proctors – so-called mixed-mode programs. This has provided a unique opportunity to examine measurement quality and any potential impact associated with online proctoring.
For example, we recently studied six different occupational licensing and professional certification exams that had used onsite proctoring in test centers and remote online proctoring of tests taken at home during the first half of 2020.
The results of the study showed that:
Test takers scored the same, on average, whether at home with online proctoring or in a test center with onsite proctoring.
Test scores obtained under both the online and onsite proctoring conditions were psychometrically sound, with high levels of score precision.
Test takers rated online proctored exams favorably, and ratings of testing conditions were unrelated to both proctoring mode and exam performance.
The online proctored exams were secure and did not result in any higher levels of cheating or compromised content than onsite proctoring.
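To make the first finding concrete, a mode-comparability check of this kind typically compares mean scores between proctoring groups and reports an effect size. The sketch below uses Welch's t statistic and Cohen's d on simulated scores; the data and thresholds are illustrative assumptions, not the study's actual analysis.

```python
# Illustrative mode-comparability check: Welch's t and Cohen's d for
# online- vs onsite-proctored groups. Scores are simulated, not study data.
import math
import random

def _mean_var(x):
    n = len(x)
    m = sum(x) / n
    v = sum((xi - m) ** 2 for xi in x) / (n - 1)
    return m, v, n

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, va, na = _mean_var(a)
    mb, vb, nb = _mean_var(b)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

def cohens_d(a, b):
    """Standardized mean difference using the pooled SD."""
    ma, va, na = _mean_var(a)
    mb, vb, nb = _mean_var(b)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

random.seed(42)
online = [random.gauss(75, 10) for _ in range(200)]  # at home, remote proctor
onsite = [random.gauss(75, 10) for _ in range(200)]  # test center, onsite

t = welch_t(online, onsite)
d = cohens_d(online, onsite)
print(f"t = {t:.2f}, d = {d:.2f}")  # values near zero suggest comparable means
```

A negligible effect size (conventionally |d| below about 0.2) is the kind of result that supports the "scored the same, on average" conclusion above.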
As we look to the future of test delivery, it seems clear that online proctoring is here to stay. It will continue to evolve as new methods and technologies advance, especially with the advent of AI-assisted proctoring and new approaches to secure test design and delivery.
In the meantime, we plan to continue our research to help inform practice and thought leadership in the testing industry. Research will continue to explore key measurement considerations across test delivery and proctoring modes for different examination programs, remote proctoring systems, and protocols. The data from the large-scale switch to online testing brought about by COVID-19 will enable us to add to this body of evidence.
As for now, the current findings lend encouraging support to the use of well-developed and well-administered online proctoring.
1. AERA, APA, & NCME (2014). Standards for Educational and Psychological Testing.
2. Hurtz, G., & Weiner, J. (2019). Analysis of test-taker profiles across a suite of statistical indices for detecting the presence and impact of cheating. Journal of Applied Testing Technology.
3. Weiner, J., & Hurtz, G. (2017). A comparative study of online remote proctored versus onsite proctored high-stakes exams. Journal of Applied Testing Technology.