Matthew West

Reducing difficulty variance in randomized assessments

P. Sud, M. West, and C. Zilles

in Proceedings of the 126th American Society for Engineering Education Annual Conference and Exposition (ASEE 2019), Paper ID #25513, 2019.

When exams are run asynchronously (i.e., students take them at different times), a student can potentially gain an advantage by receiving information about the exam from someone who took it earlier. Generating random exams from pools of problems mitigates this advantage, but can introduce unfairness if the problems in a given pool differ significantly in difficulty. In this paper, we present an algorithm that takes a collection of problem pools and historical data on student performance on these problems and produces exams with reduced variance of difficulty (relative to naive random selection) while maintaining sufficient variation between exams to ensure security. Specifically, for a synthetic example exam, we can roughly halve the standard deviation of generated assessment difficulty levels with negligible effects on cheating cost functions (e.g., entropy-based measures of diversity).
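The core idea can be illustrated with a small sketch. The pool data, the greedy compensating draw, and the weighting parameter below are all illustrative assumptions, not the paper's actual algorithm: each pool holds hypothetical historical difficulty estimates, and each draw is weighted to keep the exam's running difficulty close to the running sum of pool means, which shrinks the spread of total difficulty relative to uniform selection while still sampling every problem with nonzero probability.

```python
import math
import random
import statistics

# Hypothetical data: each pool lists historical difficulty estimates
# (e.g., mean fraction of points lost) for its candidate problems.
pools = [
    [0.10, 0.20, 0.30],
    [0.15, 0.25, 0.40],
    [0.05, 0.20, 0.35],
]
pool_means = [sum(p) / len(p) for p in pools]

def naive_exam(rng):
    """Naive generation: uniform random choice from each pool."""
    return [rng.choice(p) for p in pools]

def balanced_exam(rng, beta=10.0):
    """Illustrative variance-reduced generation: weight each candidate
    so the running total stays near the running sum of pool means.
    beta (an assumed tuning knob) trades variance reduction against
    selection diversity; beta=0 recovers uniform selection."""
    total, target = 0.0, 0.0
    exam = []
    for pool, mean in zip(pools, pool_means):
        target += mean
        weights = [math.exp(-beta * abs(total + d - target)) for d in pool]
        d = rng.choices(pool, weights=weights)[0]
        exam.append(d)
        total += d
    return exam

rng = random.Random(0)
std_naive = statistics.stdev(sum(naive_exam(rng)) for _ in range(5000))
std_balanced = statistics.stdev(sum(balanced_exam(rng)) for _ in range(5000))
print(f"naive std: {std_naive:.3f}, balanced std: {std_balanced:.3f}")
```

Running the sketch shows the balanced generator producing a noticeably smaller standard deviation of total exam difficulty than the naive one, mirroring the paper's reported reduction on its synthetic example.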

External link: https://peer.asee.org/33228

Full text: SuWeZi2019.pdf