Also in this Issue
Science Informing Policy. Can it Work?
By Merry Bullock, PhD
The dream of every science policy wonk is that the best science will inform the ways that decision makers frame issues, consider solutions, decide on programs and implement outcomes. Such a goal underlies the What Works Clearinghouse (WWC), a new initiative to provide evidence for education intervention and policy. The WWC is the brainchild of a broad collaboration, including the Department of Education, the founders of the Campbell Collaboration (an organization devoted to soliciting and disseminating systematic reviews of the effectiveness of behavioral and social interventions on societal issues), and others. Funded by the Department of Education, the WWC has begun modestly with systematic reviews on two topics: peer-assisted learning strategies and middle school mathematics programs. But the long-term goal is to provide evidence on a broad range of intervention topics, from character education to adult literacy.
How this endeavor fares, the issues it raises, and the product it releases will have a strong impact on the education and science communities. If it works well, it will be a model for one strategy to get science to policy makers; if it doesn't, it will be an example of how battles within the science and education communities over methodology, values, training and tradition interact with large-scale, top-down mandates to influence a complex policy and research system. In either case, the WWC and the responses to its goals and methods make an interesting case study, one that raises issues about the very definition of science, research and outcome, and that has the science community abuzz with both praise and criticism.
Modeled on the Cochrane Collaboration, which collects and disseminates systematic reviews on medical interventions, the idea behind the WWC is to provide policy makers -- those who decide about curricula and other education issues -- with a resource that will give them "the best available scientific evidence" about which potential interventions work and which do not. Pretty much everyone would agree that is a good idea. But there is hearty disagreement among researchers on the rest of the equation -- including the definitions of "best scientific evidence," systematic review, and scientific research.
At one end of a continuum are those who hold randomized controlled trial (RCT) designs as the ultimate gold standard for providing valid evidence, because this is the only design that can control for bias. At the other end are those who claim that relying on RCTs will not provide viable information -- either because it is practically impossible to achieve random assignment in school settings, because experimental designs will miss the complexities of school-based behavior, or because such information will be incomplete. This is a classic debate between those who require the experimental design that allows the most unambiguous inferences about causation (RCTs) and those who require designs that mirror the complexity of the phenomenon studied. The WWC's compromise position is to include "comparison studies that use carefully matched groups and 'regression discontinuity designs,' which are experiments that use a cutoff point to separate comparison groups and to statistically account for differences between groups." What the WWC does not include are case studies, surveys, studies that rely on pre- and post-test data, and descriptive reports -- exclusions that have been criticized by some in the educational research community, who argue that much educational policy is based on just such research because it is the only kind that can reasonably be collected.
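The cutoff-based logic behind a regression discontinuity design can be illustrated with a small simulation. This is a minimal sketch using invented data -- a hypothetical tutoring program assigned to students below a test-score cutoff -- and the variable names, effect size, and cutoff are assumptions for illustration, not drawn from any WWC review:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000
cutoff = 50.0                              # hypothetical assignment cutoff
score = rng.uniform(0, 100, n)             # "running" variable (e.g., a test score)
treated = (score < cutoff).astype(float)   # students below the cutoff get tutoring
true_effect = 5.0                          # assumed effect, for demonstration only

# Outcome varies smoothly with score, plus a jump at the cutoff for treated students
outcome = 0.3 * score + true_effect * treated + rng.normal(0.0, 2.0, n)

# Regress outcome on an intercept, the score centered at the cutoff, and the
# treatment indicator; the indicator's coefficient estimates the jump at the
# cutoff, which (under the design's assumptions) is the treatment effect.
centered = score - cutoff
X = np.column_stack([np.ones(n), centered, treated])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
estimated_effect = coef[2]
print(round(estimated_effect, 1))
```

The design's leverage comes from the cutoff itself: students just above and just below it are assumed comparable, so the discontinuity in outcomes at the threshold is attributed to the intervention rather than to preexisting differences.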
Presently, the WWC welcomes input and suggestions for specific interventions related to "current WWC Reports, studies or study citations on the effects of educational interventions ..., and/or nominations for other interventions, studies, or future topics that you would like to see considered for review by the WWC." You can also nominate a specific intervention, such as a particular curriculum, rather than an entire class of interventions, such as math curricula. Suggestions can be made on the WWC website.
In addition, the WWC is creating a "registry of evaluators" -- an online database of individuals or organizations who conduct research on the effects of replicable educational interventions. The registry will be used to help schools, school districts, and educational program developers identify potential evaluators (and presumably will provide a source of intervention evaluations as a basis for reviews of what works). See the WWC website for further information.