
Quasi-Experimental Evaluation of Alternative Sample Selection Corrections

We use a natural experiment to evaluate the performance of sample selection correction methods. In 2007, Michigan began requiring that all students take a college entrance exam, increasing the exam-taking rate from 64% to 99%. We apply different selection correction methods, using different sets of predictors, to the pre-policy exam score data. We then compare the corrected data to the complete post-policy exam score data as a benchmark. We find that performance is sensitive to the choice of predictors, but not to the choice of selection correction method. Using stronger predictors such as lagged test scores yields more accurate results, but simple parametric methods and less restrictive semiparametric methods yield similar results for any set of predictors. We conclude that gains in this setting from less restrictive econometric methods are small relative to gains from richer data. This suggests that empirical researchers using selection correction methods should focus more on the predictive power of covariates than on robustness across modeling choices.

Digital Object Identifier (DOI)
10.26300/vp0z-hp31
EdWorkingPaper suggested citation:
Garlick, Robert, and Joshua Hyman. (). Quasi-Experimental Evaluation of Alternative Sample Selection Corrections. (EdWorkingPaper: -37). Retrieved from Annenberg Institute at Brown University: https://doi.org/10.26300/vp0z-hp31
