
Human versus Machine: Do college advisors outperform a machine-learning algorithm in predicting student enrollment?

Prediction algorithms are used across public policy domains to aid in the identification of at-risk individuals and to guide service provision or resource allocation. While a growing body of research has investigated concerns about algorithmic bias, much less research has compared algorithmically driven targeting to the counterfactual: human prediction. We compare algorithmic and human predictions in the context of a national college advising program, focusing in particular on predicting the college enrollment quality of high-achieving, lower-income students. College advisors slightly outperform a prediction algorithm; however, the advisors' greater accuracy is concentrated among students with whom they had more interactions. The algorithm achieved similar accuracy among students lower in the distribution of interactions, despite advisors having substantially more information. We find no evidence that the advisors or the algorithm exhibit bias against vulnerable populations. Our results suggest that, especially at scale, algorithms have the potential to provide efficient, accurate, and unbiased predictions to target scarce social services and resources.
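The core comparison in the abstract, advisor versus algorithm accuracy, overall and stratified by how often an advisor interacted with a student, can be illustrated with a minimal sketch. All data, thresholds, and field names below are invented for illustration; they are not drawn from the paper.

```python
# Hypothetical sketch: compare advisor vs. algorithm prediction accuracy,
# overall and by advisor-student interaction level.

def accuracy(predictions, outcomes):
    """Share of binary predictions that match observed outcomes."""
    matches = sum(p == o for p, o in zip(predictions, outcomes))
    return matches / len(outcomes)

# Each record: (advisor_prediction, algorithm_prediction, actual_outcome, n_interactions)
# 1 = predicted/observed high-quality enrollment, 0 = not. Values are made up.
students = [
    (1, 1, 1, 12),
    (1, 0, 1, 10),
    (0, 0, 0, 2),
    (1, 1, 0, 1),
    (0, 1, 1, 3),
    (1, 1, 1, 8),
]

advisor = [s[0] for s in students]
algorithm = [s[1] for s in students]
actual = [s[2] for s in students]

print("Advisor accuracy:  ", accuracy(advisor, actual))
print("Algorithm accuracy:", accuracy(algorithm, actual))

# Stratify by interaction count (the threshold of 5 is arbitrary here)
for label, group in [
    ("high-interaction", [s for s in students if s[3] >= 5]),
    ("low-interaction", [s for s in students if s[3] < 5]),
]:
    adv = accuracy([s[0] for s in group], [s[2] for s in group])
    alg = accuracy([s[1] for s in group], [s[2] for s in group])
    print(f"{label}: advisor={adv:.2f}, algorithm={alg:.2f}")
```

The stratified loop mirrors the paper's finding that the accuracy gap concentrates among students with many advisor interactions; in a real analysis the predictions would come from advisor survey responses and a trained model rather than hand-entered lists.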

Keywords
machine learning, algorithms, data science, predictive analytics, college success, algorithmic bias, implicit bias
Document Object Identifier (DOI)
10.26300/gadf-ey53
EdWorkingPaper suggested citation:
Akmanchi, Suchitra, Kelli A. Bird, and Benjamin L. Castleman. (). Human versus Machine: Do college advisors outperform a machine-learning algorithm in predicting student enrollment? (EdWorkingPaper: -699). Retrieved from Annenberg Institute at Brown University: https://doi.org/10.26300/gadf-ey53
