Jake Anders

Sam Sims, Jake Anders, Matthew Inglis, Hugues Lortie-Forgues, Ben Styles, Ben Weidmann.

Over the last twenty years, education researchers have increasingly conducted randomised experiments with the goal of informing the decisions of educators and policymakers. Such experiments have generally employed broad, consequential, standardised outcome measures in the hope that this would allow decision-makers to compare the effectiveness of different approaches. However, a combination of small effect sizes, wide confidence intervals, and treatment effect heterogeneity means that researchers have largely failed to achieve this goal. We argue that quasi-experimental methods and multi-site trials will often be superior for informing educators’ decisions on the grounds that they can achieve greater precision and better address heterogeneity. Experimental research remains valuable in applied education research. However, it should primarily be used to test theoretical models, which can in turn inform educators’ mental models, rather than attempting to directly inform decision making. Since comparable effect size estimates are not of interest when testing educational theory, researchers can and should improve the power of theory-informing experiments by using more closely aligned (i.e., valid) outcome measures. We argue that this approach would reduce wasteful research spending and make the research that does go ahead more statistically informative, thus improving the return on investment in educational research.



Sam Sims, Harry Fletcher-Wood, Alison O’Mara-Eves, Sarah Cottingham, Claire Stansfield, Josh Goodrich, Jo Van Herwegen, Jake Anders.

Multiple meta-analyses have now documented small positive effects of teacher professional development (PD) on pupil test scores. However, the field lacks any validated explanatory account of what differentiates more from less effective in-service training. As a result, researchers have little in the way of advice for those tasked with designing or commissioning better PD. We set out to remedy this by developing a new theory of effective PD based on combinations of causally active components targeted at developing teachers’ insights, goals, techniques, and practice. We test two important implications of the theory using a systematic review and meta-analysis of 104 randomized controlled trials, finding qualified support for our framework. While further research is required to test and refine the theory, we argue that it presents an important step forward in being able to offer actionable advice to those responsible for improving teacher PD.
