Lindsay C. Page
Award displacement occurs when one type of financial aid award directly contributes to a change in the quantity of another award. We explore whether postsecondary institutions displaced awards in response to the Pittsburgh Promise scholarship by capitalizing on the doubling of the maximum Promise amount in 2012. We use de-identified student-level data on each Promise recipient’s actual cost of attendance, grants, and scholarships, as well as demographic and academic characteristics from school district administrative files, to examine whether and how components of students’ financial aid packages and total costs of attendance changed after the Promise award increase. To account for overall trends in pricing and financial aid, we compare Promise recipients to the average first-time, full-time freshman entering the same institutions in the same year, as reported by the Integrated Postsecondary Education Data System (IPEDS). With these two data sources, we assess differences in costs and awards between Promise students and their peers, on average, and examine whether and in what ways these differences changed after the increase in Promise funding. We refer to this strategy as a “quasi-difference-in-differences” design. We do not find evidence that institutions responded to the Promise increase with aid reductions.
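As a rough illustration of the contrast this design implies (not the paper's actual code or data), the sketch below regresses an aid outcome on a Promise indicator, a post-2012 indicator, and their interaction; the interaction term plays the role of the quasi-difference-in-differences estimate. All variable names and the simulated data are hypothetical.

```python
# Hypothetical quasi-difference-in-differences sketch: Promise recipients
# vs. the IPEDS average freshman at the same institutions, before and
# after the 2012 award increase. Column names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "promise": rng.integers(0, 2, n),    # 1 = Promise recipient
    "post2012": rng.integers(0, 2, n),   # 1 = cohort after the increase
})
# Simulated institutional grant aid (no displacement built in here)
df["inst_grant"] = (6000 + 500 * df["promise"] - 200 * df["post2012"]
                    + rng.normal(0, 1000, n))

# The coefficient on promise:post2012 estimates whether institutions
# changed aid for Promise students after the increase.
model = smf.ols("inst_grant ~ promise * post2012", data=df).fit(cov_type="HC1")
print(model.summary().tables[1])
```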
Despite documented benefits to college completion, more than a third of students who initially enroll in college do not ultimately earn a credential. Completing college requires students to navigate both institutional administrative tasks (e.g., registering for classes) and academic tasks within courses (e.g., completing homework). In postsecondary education, several promising interventions have shown that text-based outreach and communication can be a low-cost, easy-to-implement, and effective strategy for supporting administrative task navigation. In this paper, we report on two randomized controlled trials testing the effect of a text-based chatbot with artificial intelligence (AI) capability on students’ academic task navigation in introductory courses (political science and economics). We find the academic chatbot significantly shifted students’ final grades, increasing the likelihood that students received a course grade of B or higher by 5-6 percentage points and reducing the likelihood that students dropped the course.
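Because assignment was randomized, a simple regression of the outcome on the treatment indicator identifies the effect. A minimal sketch follows, with invented variable names and simulated data rather than the study's:

```python
# Illustrative RCT analysis: linear probability model of earning a B or
# higher on a randomized chatbot-treatment indicator.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1200
df = pd.DataFrame({"chatbot": rng.integers(0, 2, n)})
# Simulate roughly a 5 percentage point treatment effect
p = 0.60 + 0.05 * df["chatbot"]
df["b_or_higher"] = rng.binomial(1, p)

# With random assignment, the coefficient on `chatbot` is the average
# treatment effect on the probability of earning a B or higher.
lpm = smf.ols("b_or_higher ~ chatbot", data=df).fit(cov_type="HC1")
print(lpm.params["chatbot"], lpm.bse["chatbot"])
```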
College success requires students to engage with their institution academically and administratively. Missteps with required administrative processes can threaten student persistence and success. Through two experimental studies, we assessed the effectiveness of an artificially intelligent text-based chatbot that provided proactive outreach and support to college students to navigate administrative processes and use campus resources. In both the two-year and four-year college context, outreach was most effective when focused on discrete administrative processes, such as filing financial aid forms or registering for courses, that were acute and time-sensitive and for which outreach could be targeted to those for whom it was relevant. In the context of replicating studies to better inform policy and programmatic decision-making, we draw three core lessons regarding the effective use of nudge-type efforts to promote college success.
Policymakers periodically consider using student assignment policies to improve educational outcomes by altering the socioeconomic and academic skill composition of schools. We exploit the quasi-random reassignment of students across schools in the Wake County Public School System to estimate the academic and behavioral effects of being reassigned to a different school and, separately, of shifts in peer characteristics. We rule out all but substantively small effects of transitioning to a different school as a result of reassignment on test scores, course grades, and chronic absenteeism. In contrast, increasing the achievement levels of students' peers improves students' math and ELA test scores but harms their ELA course grades. Test score benefits accrue primarily to students from higher-income families, though students with lower family income or lower prior performance still benefit. Our results suggest that student assignment policies that relocate students to avoid the over-concentration of lower-achieving students or those from lower-income families can accomplish equity goals (despite important caveats), although these reassignments may reduce achievement for students from higher-income backgrounds.
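One way to separate the two effects the paper distinguishes is to enter a reassignment indicator and a peer-achievement measure jointly, conditioning on prior achievement. The sketch below is a generic illustration under that assumption; names and data are hypothetical, not the paper's specification.

```python
# Hypothetical sketch: effect of being reassigned vs. effect of shifts
# in peer achievement, estimated jointly.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "reassigned": rng.integers(0, 2, n),      # moved schools under the plan
    "peer_achievement": rng.normal(0, 1, n),  # std. mean of peers' prior scores
    "prior_score": rng.normal(0, 1, n),
})
# Simulate a null reassignment effect and a positive peer effect,
# mirroring the pattern the abstract reports for math scores.
df["math_score"] = (0.0 * df["reassigned"] + 0.1 * df["peer_achievement"]
                    + 0.6 * df["prior_score"] + rng.normal(0, 1, n))

fit = smf.ols("math_score ~ reassigned + peer_achievement + prior_score",
              data=df).fit(cov_type="HC1")
print(fit.summary().tables[1])
```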
We investigate whether and how Achieve Atlanta’s college scholarship and associated services impact college enrollment, persistence, and graduation among Atlanta Public School graduates experiencing low household income. Qualifying for the scholarship of up to $5,000/year does not meaningfully change college enrollment among those near the high school GPA eligibility thresholds. However, scholarship receipt does have large and statistically significant effects on early college persistence (14%) that continue through BA degree completion within four years (22%). We discuss how the criteria of place-based programs that support economically disadvantaged students may influence results for different types of students.
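A common way to study outcomes "near an eligibility threshold" like this is a local linear regression discontinuity around the GPA cutoff. The sketch below is generic, not the paper's specification; the 3.0 cutoff, bandwidth, and data are all invented for illustration.

```python
# Hypothetical regression discontinuity around a GPA eligibility cutoff.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 4000
gpa = rng.uniform(2.0, 4.0, n)
eligible = (gpa >= 3.0).astype(int)
# Simulate no enrollment jump at the cutoff, consistent with the
# abstract's finding on enrollment.
enrolled = rng.binomial(1, 0.45 + 0.05 * (gpa - 3.0))

df = pd.DataFrame({"gpa": gpa, "eligible": eligible, "enrolled": enrolled})
df["centered"] = df["gpa"] - 3.0

# Local linear RD within a +/- 0.5 GPA bandwidth, with slopes allowed
# to differ on each side of the cutoff.
local = df[df["centered"].abs() <= 0.5]
rd = smf.ols("enrolled ~ eligible * centered", data=local).fit(cov_type="HC1")
print(rd.params["eligible"])  # estimated jump in enrollment at the threshold
```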
We explore the role of defaults and choice architecture in student loan decision-making, experimentally testing the impact of pre-populating either decline or accept decisions compared to an active-choice decision with no pre-population. We demonstrate that the default choice presented does influence student loan borrowing decisions. Specifically, compared to active choice, students presented with a pre-populated decline decision were almost five percent less likely to accept all packaged loans and borrowed between 4.6 and 4.8 percent less in federal educational loans. The reductions in borrowing appear to be concentrated within unsubsidized loans, with those assigned to the opt-in condition borrowing 8.3 percent less in unsubsidized loans. These changes in borrowing did not induce substitution towards private or Parent PLUS loans, nor did they negatively impact enrollment, academic performance, or on-campus work outcomes in the same academic year.
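With students randomized across default conditions, the analysis reduces to comparing borrowing across arms against the active-choice reference. A minimal sketch, with hypothetical condition labels and simulated amounts:

```python
# Illustrative comparison of borrowing across randomized default
# conditions: active choice, pre-populated accept, pre-populated decline.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 3000
cond = rng.choice(["active", "prefill_accept", "prefill_decline"], size=n)
base = rng.normal(5500, 1500, n)
# Simulate ~5% lower borrowing under the pre-populated decline default
amount = base * np.where(cond == "prefill_decline", 0.95, 1.0)

df = pd.DataFrame({"condition": cond, "borrowed": amount})
# Active choice is the reference category; coefficients are dollar
# differences in borrowing relative to it.
fit = smf.ols("borrowed ~ C(condition, Treatment('active'))", data=df).fit()
print(fit.params)
```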
Verification is a federally mandated process that requires selected students to further attest that the information reported on their FAFSA is accurate and complete. In this brief, we estimate the institutional costs of administering the FAFSA verification mandate and consider variation in costs by institution type and sector. Using data from 2014, we estimate that compliance costs to institutions in that year totaled nearly $500 million, with the burden falling disproportionately on public institutions, and on community colleges in particular. Specifically, we estimate that 22% of an average community college’s financial aid office operating budget is devoted to verification procedures, compared to 15% at public four-year institutions. Our analysis is timely, given that rates of FAFSA verification have increased in recent years.
Clustered observational studies (COSs) are a critical analytic tool for educational effectiveness research. We present a design framework for the development and critique of COSs. The framework is built on the counterfactual model for causal inference and promotes the concept of designing COSs that emulate the targeted randomized trial that would have been conducted were it feasible. We emphasize the key role of understanding the assignment mechanism in study design. We review methods for statistical adjustment and highlight a recently developed form of matching designed specifically for COSs. We review how regression models can be profitably combined with matching and note best practices for estimating statistical uncertainty. Finally, we review how sensitivity analyses can determine whether conclusions are sensitive to bias from potential unobserved confounders. We demonstrate concepts with an evaluation of a summer school reading intervention in Wake County, North Carolina.
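In the spirit of this design-first framework, a natural first step in a COS is checking baseline balance at the cluster level before any outcome analysis. The sketch below computes standardized mean differences across treated and control schools; all names and data are hypothetical.

```python
# Hypothetical cluster-level balance check for a clustered observational
# study of a school reading program.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
schools = pd.DataFrame({
    "treated": rng.integers(0, 2, 60),         # offered the reading program
    "baseline_reading": rng.normal(0, 1, 60),  # school mean prior score
    "frl_share": rng.uniform(0.2, 0.9, 60),    # free/reduced-price lunch share
})

# Standardized mean differences; values near zero suggest treated and
# control schools look comparable on observables before adjustment.
for cov in ["baseline_reading", "frl_share"]:
    t = schools.loc[schools.treated == 1, cov]
    c = schools.loc[schools.treated == 0, cov]
    print(cov, round((t.mean() - c.mean()) / schools[cov].std(), 3))
```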
Many interventions in education occur in settings where treatments are applied to groups. For example, a reading intervention may be implemented for all students in some schools and withheld from students in other schools. When such treatments are non-randomly allocated, outcomes across the treated and control groups may differ due to the treatment or due to baseline differences between groups. When this is the case, researchers can use statistical adjustment to make treated and control groups similar in terms of observed characteristics. Recent work in statistics has developed matching methods designed for contexts where treatments are clustered. This form of matching, known as multilevel matching, may be well suited to many education applications where treatments are assigned to schools. In this article, we provide an extensive evaluation of multilevel matching and compare it to multilevel regression modeling. We evaluate multilevel matching methods in two ways. First, we use these matching methods to recover treatment effect estimates from three clustered randomized trials using a within-study comparison design. Second, we conduct a simulation study. We find evidence that generally favors an analytic approach to statistical adjustment that combines multilevel matching with regression adjustment. We conclude with an empirical application.
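A rough sketch of the two-step approach the evaluation favors appears below: match treated clusters (schools) to similar control clusters on aggregated covariates, then apply regression adjustment within the matched sample. This uses generic nearest-neighbor matching rather than the specialized multilevel matching algorithms the paper studies, and every name and number is hypothetical.

```python
# Hypothetical cluster-level matching followed by regression adjustment.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.spatial.distance import cdist

rng = np.random.default_rng(6)
schools = pd.DataFrame({
    "treated": np.repeat([1, 0], [20, 40]),
    "baseline": rng.normal(0, 1, 60),  # school mean prior achievement
    "enroll": rng.normal(0, 1, 60),    # standardized enrollment
    "outcome": rng.normal(0, 1, 60),   # school mean post-test
})
schools.loc[schools.treated == 1, "outcome"] += 0.3  # built-in effect of 0.3

t = schools[schools.treated == 1]
c = schools[schools.treated == 0]

# Step 1: 1:1 nearest-neighbor matching of treated to control schools on
# school-level covariates (with replacement, for simplicity).
d = cdist(t[["baseline", "enroll"]], c[["baseline", "enroll"]])
matched = pd.concat([t, c.iloc[d.argmin(axis=1)]]).reset_index(drop=True)

# Step 2: regression adjustment within the matched sample.
fit = smf.ols("outcome ~ treated + baseline + enroll", data=matched).fit()
print(fit.params["treated"])  # should be near the built-in 0.3
```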
We examine through a field experiment whether outreach and support provided through an AI-enabled chatbot can reduce summer melt and improve first-year college enrollment at a four-year university and at a community college. At the four-year college, the chatbot increased overall success with navigating financial aid processes, such that student take-up of educational loans increased by four percentage points. This financial aid effect was concentrated among would-be first-generation college-goers, for whom loan acceptances increased by eight percentage points. In addition, the outreach increased first-generation students’ success with course registration and fall semester enrollment each by three percentage points. For the community college, where the randomized experiment could not be robustly implemented due to limited cell phone number information, we present a qualitative analysis of organizational readiness for chatbot implementation. Together, our findings suggest that proactive outreach to students is likely to be most successful when targeted to those who may be struggling (for example, in keeping up with required administrative tasks). Yet such targeting requires university systems to have ready access to, and the ability to make use of, their administrative data.