Matthew West

Computerized exam reviews: In-person and individualized feedback to students after a computerized exam

W. L. Chang, M. West, C. Zilles, D. Mussulman, and C. Sacris

in Proceedings of the 2020 American Society for Engineering Education Virtual Annual Conference (ASEE 2020), 2020.

Computerized testing centers are a promising new technology for running exams in large (200+ students) courses, eliminating many logistical burdens of pencil-and-paper exams such as scheduling conflict exams, grading efficiently and consistently, and providing timely feedback to students. Computerized testing also dramatically shortens the cycle between student work and assessment feedback, and it enables frequent testing and second-chance testing in large courses, both of which have been shown to lead to significant improvements in learning outcomes. However, a significant source of student dissatisfaction with computerized testing is that numerical-answer questions are typically graded solely on the correctness of the final answer. The two major concerns reported by students are: (1) limited access to the assessment and corresponding learning opportunities post-assessment, and (2) the lack of partial credit for correct solution procedures with incorrect final answers. To address these concerns, a large public Midwestern university has developed a new exam-review service that provides in-person feedback to students after the completion of computerized exams, with the option of human-assigned partial credit for a correct solution procedure. These review sessions are hosted in the computerized testing facility to preserve the integrity of exam problems for future use. In these sessions, students can go through their scratch work, collected at the end of the exam, and any program code they wrote to solve problems, under the guidance of a course staff member. The sessions are student-guided, with course staff present to assist in identifying conceptual errors. In this paper, we present the design of the review system in a large-scale computerized testing facility, including the scheduling logistics, software support, course staff training, and guidance to students. Detailed student-usage data are reported, including survey data on student affect and changes in learning outcomes after review sessions. We focus on the extent to which review sessions enable practice in a guided environment, personalized learning experiences in large courses, and increased exposure to learning opportunities.

DOI: 10.18260/1-2--34321

Full text: ChWeZiMuSa2020.pdf