
Can Automated Feedback Improve Teachers’ Uptake of Student Ideas? Evidence From a Randomized Controlled Trial In a Large-Scale Online Course

Providing consistent, individualized feedback to teachers is essential for improving instruction but can be prohibitively resource intensive in most educational contexts. We develop an automated tool based on natural language processing to give teachers feedback on their uptake of student contributions, a high-leverage teaching practice that supports dialogic instruction and makes students feel heard. We conduct a randomized controlled trial as part of an online computer science course, Code in Place (n=1,136 instructors), to evaluate the effectiveness of the feedback tool. We find that the tool improves instructors’ uptake of student contributions by 27% and present suggestive evidence that our tool also improves students’ satisfaction with the course and assignment completion. These results demonstrate the promise of our tool to complement existing efforts in teachers’ professional development.

Digital Object Identifier (DOI)
10.26300/thn9-wh86

This EdWorkingPaper is published in:

Demszky, D., Liu, J., Hill, H.C., Jurafsky, D., & Piech, C. (2023). Can Automated Feedback Improve Teachers’ Uptake of Student Ideas? Evidence From a Randomized Controlled Trial In a Large-Scale Online Course. Educational Evaluation and Policy Analysis. https://doi.org/10.3102/01623737231169270

EdWorkingPaper suggested citation:

Demszky, Dorottya, Jing Liu, Heather C. Hill, Dan Jurafsky, and Chris Piech. (). Can Automated Feedback Improve Teachers’ Uptake of Student Ideas? Evidence From a Randomized Controlled Trial In a Large-Scale Online Course. (EdWorkingPaper: 21-483). Retrieved from Annenberg Institute at Brown University: https://doi.org/10.26300/thn9-wh86
