

Jackie Eunjung Relyea, Patrick Rich, James S. Kim, Joshua B. Gilbert.

The current study examined the impact of COVID-19 on the reading achievement growth of Grade 3-5 students in a large urban school district in the U.S. and whether the impact differed by students’ demographic characteristics and instructional modality. Specifically, using administrative data from the school district, we investigated the extent to which students made gains in reading during the 2020-2021 school year relative to the pre-COVID-19 typical school year of 2018-2019. We further examined whether the effects of students’ instructional modality on reading growth varied by demographic characteristics. Overall, students had lower average reading achievement gains over the 9-month 2020-2021 school year than over the 2018-2019 school year, with learning loss effect sizes of 0.54, 0.27, and 0.28 standard deviation units for Grades 3, 4, and 5, respectively. Substantially reduced reading gains were observed among Grade 3 students, students from high-poverty backgrounds, English learners, and students with reading disabilities. Additionally, findings indicate that among students with similar demographic characteristics, higher-achieving students tended to choose the fully remote instruction option, while lower-achieving students tended to opt for in-person instruction at the beginning of the 2020-2021 school year. However, students who received in-person instruction were more likely to demonstrate continuous growth in reading over the school year, whereas initially higher-achieving students who received remote instruction showed stagnation or decline, particularly in the spring 2021 semester. Our findings support the notion that in-person schooling during the pandemic may serve as an equalizer for lower-achieving students, particularly those from historically marginalized or vulnerable student populations.


Jackie Eunjung Relyea, James S. Kim, Patrick Rich.

The current study replicated and extended previous findings on a content-integrated literacy intervention, focusing on its effectiveness for first- and second-grade English learners’ (N = 1,314) reading comprehension, writing, vocabulary knowledge, and oral proficiency. Statistically significant findings were replicated for science and social studies vocabulary knowledge (ES = .51 and .53, respectively) and argumentative writing (ES = .27 and .41, respectively). Furthermore, the treatment group outperformed the control group on reading (ES = .08) and listening comprehension (ES = .14). Vocabulary knowledge and oral proficiency mediated treatment effects on reading comprehension, whereas only oral proficiency mediated effects on writing. The findings replicate main effects on vocabulary knowledge and writing, while extending previous research by highlighting the mechanisms underlying improved reading comprehension and writing.


Reagan Mozer, Luke W. Miratrix, Jackie Eunjung Relyea, James S. Kim.

In a randomized trial that collects text as an outcome, traditional approaches for assessing treatment impact require that each document first be manually coded for constructs of interest by human raters. An impact analysis can then be conducted to compare treatment and control groups, using the hand-coded scores as a measured outcome. This process is both time- and labor-intensive, which creates a persistent barrier to large-scale assessments of text. Furthermore, enriching one’s understanding of a found impact on text outcomes via secondary analyses can be difficult without additional scoring efforts. Machine-based text analytic and data mining tools offer one potential avenue for facilitating research in this domain. For instance, we could augment a traditional impact analysis that examines a single human-coded outcome with a suite of automatically generated secondary outcomes. By analyzing impacts across a wide array of text-based features, we can then explore what an overall change signifies in terms of how the text has evolved due to treatment. In this paper, we propose several different methods for supplementary analysis in this spirit. We then present a case study of using these methods to enrich an evaluation of a classroom intervention on young children’s writing. We argue that our rich array of findings moves us from “it worked” to “it worked because” by revealing how observed improvements in writing were likely due, in part, to students having learned to marshal evidence and speak with more authority. Relying exclusively on human scoring, by contrast, is a lost opportunity.
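The core idea can be sketched in a few lines of code. The following is a minimal illustration, not the authors’ implementation: the feature set, helper names, and example documents are all hypothetical. It computes a small suite of automatically generated text features for each document and reports the treatment-control difference in group means for every feature, which is the kind of secondary outcome suite the abstract describes.

```python
# Illustrative sketch only: automatically generated text features used as
# secondary outcomes alongside a human-coded primary outcome. The feature
# definitions and data below are hypothetical, chosen for simplicity.

def text_features(doc):
    """Extract simple machine-computable features from one document."""
    words = doc.lower().split()
    n = len(words)
    return {
        "word_count": n,
        "unique_word_ratio": len(set(words)) / n if n else 0.0,
        "avg_word_length": sum(len(w) for w in words) / n if n else 0.0,
    }

def feature_impacts(treatment_docs, control_docs):
    """Mean treatment-control difference for each automatic feature."""
    def group_means(docs):
        feats = [text_features(d) for d in docs]
        return {k: sum(f[k] for f in feats) / len(feats) for k in feats[0]}
    t, c = group_means(treatment_docs), group_means(control_docs)
    return {k: t[k] - c[k] for k in t}

# Hypothetical student writing samples for the two experimental groups.
treatment = [
    "the evidence in the passage clearly shows the claim is true",
    "according to the text the author argues with strong evidence",
]
control = ["i think it is good", "it was fun"]

impacts = feature_impacts(treatment, control)
```

In practice, each per-feature difference would be paired with an appropriate test or adjustment for multiple comparisons; the point of the sketch is only that a wide array of cheap automatic features can sit beside a single expensive human-coded score.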
