What is it about?

The use of student learning data to predict educational outcomes has been widely studied, both in terms of model performance and fairness. One example of such predictive models is the Early Warning System (EWS), which identifies students at risk of dropping out. An EWS can make predictions continuously, from first enrollment until years into a degree program, so that support arrives in time. However, changes in student composition and in students' learning trajectories can alter both the performance and the group fairness of predictions over time. Using a nationwide higher education dataset, we examine how the fairness of a dropout prediction model changes at various points along the academic calendar. Our findings reveal that fairness is not static but evolves over time: the largest differences in AUC across groups occur 12 months after enrollment, a common evaluation point for dropout EWS. We discuss implications for the continued assessment of fairness in predictive algorithms in education.
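The kind of analysis described above can be sketched in a few lines: compute an AUC per demographic group at each evaluation point, then track the gap between the best- and worst-served group over time. This is a hypothetical illustration, not the authors' code; the group labels, scores, and evaluation months below are fabricated toy data.

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney statistic: the probability that a random
    positive (dropout) example outranks a random negative one."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        return float("nan")
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def auc_gap_over_time(snapshots):
    """snapshots: {month: {group: (labels, scores)}}.
    Returns, per evaluation point, the largest pairwise AUC
    difference between groups (one simple notion of unfairness)."""
    gaps = {}
    for month, groups in snapshots.items():
        aucs = [auc(y, s) for y, s in groups.values()]
        gaps[month] = max(aucs) - min(aucs)
    return gaps

# Toy data: two groups scored at 6 and 12 months after enrollment.
snapshots = {
    6:  {"A": ([1, 0, 1, 0], [0.9, 0.2, 0.8, 0.3]),
         "B": ([1, 0, 1, 0], [0.7, 0.4, 0.6, 0.65])},
    12: {"A": ([1, 0, 1, 0], [0.9, 0.1, 0.8, 0.2]),
         "B": ([1, 0, 1, 0], [0.5, 0.6, 0.4, 0.3])},
}
print(auc_gap_over_time(snapshots))  # → {6: 0.25, 12: 0.5}
```

In this toy run the AUC gap widens from 6 to 12 months, mirroring the paper's point that a fairness audit at a single time point can miss how disparities evolve.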


Read the Original

This page is a summary of: Fairness Over Time: A Nationwide Study of Evolving Bias in Dropout Prediction, July 2025, ACM (Association for Computing Machinery), DOI: 10.1145/3698205.3733933.
