
Beginning this July, 11 medical journals, including the New England Journal of Medicine, will publish only those clinical studies that were registered in a publicly available database before they were conducted. With this requirement, the journals' editors hope to keep clinical-intervention sponsors--such as pharmaceutical companies--from suppressing negative results. The move may also make studies that find no post-intervention improvement more readily accessible to both researchers and the public.

Might psychological journals follow suit and require the registration of behavioral-intervention trials? Some psychologists hope so, including members of the Campbell Collaboration--a nonprofit organization that maintains a registry of randomized behavioral intervention trials.

Since forming four years ago, the collaboration has grown from 80 individual members to more than 500, and its trial registration database currently houses about 14,000 abstracts describing randomized trials in the behavioral and social sciences. What differentiates it from other databases, such as the psychological abstract repository PsycINFO and its counterpart in education, the Education Resources Information Center, is that the Campbell registry includes both published and unpublished trials, and even contains abstracts of studies that are in progress. That means that null results and positive results will be equally represented, say Campbell supporters.

Therefore, the Campbell database could serve as an exhaustive resource for people conducting literature reviews and meta-analyses of behavioral interventions, says psychologist Hannah Rothstein, PhD, a committee leader in the Campbell Collaboration and professor of management at the Zicklin School of Business, part of the City University of New York's Baruch College. Such a resource could allow people reviewing the effectiveness of interventions in education, social welfare and criminology--for example, "scared straight" programs for juvenile offenders--to weigh a thorough body of evidence, the same goal many medical journal editors are pursuing by requiring the public registration of pharmaceutical trials.

"One of the things I believe strongly in is not to waste data you've collected from human participants," says Rothstein. "If you have a trial you never published or registered, basically you have wasted those people's time."

By synthesizing the results from many studies, the Campbell Collaboration hopes to inform policy-makers of what works, what doesn't work, and ultimately, which programs should be funded, says Campbell Collaboration founder Robert Boruch, PhD, director of the University of Pennsylvania's Center for Research and Evaluation of Social Policy.

"Our aim is to pump out reviews and get them into the hands of the people who can use them," says Boruch.

A rigorous process

So far, the collaboration has pumped out six reviews of interventions, and 40 more are slated for publication within the next few years, Boruch says. The reviews have been slow in coming in part because of an exhaustive search and peer-review process, he notes.

For example, Campbell reviewers attempt to unearth information on every randomized trial on a topic--both published and unpublished--and then justify to their colleagues which ones will be excluded from further analysis because of flaws in experimental design. Finding all of these studies can be time-consuming, as researchers comb the Campbell trial database, the Internet and lists of federally funded projects. A group of statistical experts then approves the specific calculations, and even the software programs, employed in the review's meta-analysis.
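To give a flavor of the calculations such a panel vets, here is a minimal sketch of the fixed-effect, inverse-variance approach at the core of many meta-analyses, written in Python. The effect sizes and variances below are hypothetical, not drawn from any Campbell review:

    import math

    # (effect size, variance) pairs for five hypothetical trials
    studies = [(0.30, 0.04), (0.10, 0.02), (-0.05, 0.01),
               (0.45, 0.09), (0.00, 0.03)]

    # Weight each study by the inverse of its variance, so larger,
    # more precise trials count for more in the pooled estimate.
    weights = [1.0 / var for _, var in studies]
    pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate

    print(f"Pooled effect size: {pooled:.3f}")
    print(f"95% CI: [{pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f}]")

In practice, reviewers must also weigh choices this sketch glosses over, such as whether a fixed-effect or random-effects model better fits the variation among trials--one reason the collaboration subjects the calculations themselves to expert approval.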

And after the paper is drafted, the principal investigator sends it out for review to a group that includes the researchers whose studies the paper critiques.

"We are producing these reviews with science in mind, but also with policy in mind," says Rothstein. "We don't want people attacking the review on the basis of method."

A lack of funding for researchers conducting these intensive reviews may also slow the process, as academics must give priority to funded research that will be published in peer-reviewed journals, Rothstein says. The Campbell Collaboration does not have funding for report writers, though some researchers find support from other sources.

"We are currently running an all-volunteer effort," says Boruch.

Instead of funding, what motivates report writers is the chance to work with experts on meta-analysis and information retrieval, says Julia Littell, PhD, a professor of social work at Bryn Mawr College near Philadelphia whose Campbell report on programs to help children with severe behavioral problems will be released next year. Specifically, she is investigating the effectiveness of Multisystemic Therapy (MST), a combination of traditional counseling and a group of interrelated behavioral programs targeting a child's family, school and community.

"Campbell gave me access to people like [Vanderbilt University psychologist and public policy professor] Mark Lipsey, [PhD,] who is a world expert on statistical analysis," says Littell. "He looked over my plans and made suggestions I would never have thought of myself."

Through Campbell Collaboration connections, Littell also learned about a database of clinical trials in England, and she unearthed an unpublished study that turned out to be central to her paper. The principal investigator, Alan Leschied, PhD, a professor of psychology at the University of Western Ontario, never published his trial of MST with 409 Canadian youths, probably because he found no significant results, says Littell.

However, evidence that a program doesn't work is just as important as positive findings, since both significant and null results contribute to the overall picture of an intervention's effectiveness, says Littell. And as a result of including the Canadian study and other unpublished research in her meta-analysis, Littell is finding that this well-established intervention might not be as effective as previously thought.

"The biggest and best studies show it doesn't work," says Littell. "Our preliminary findings show that a lot of experiments that have been held up as evidence for MST have problems with their design."

Targeting policy

Even though the Campbell Collaboration has not yet released Littell's review, government officials are already taking note of her preliminary findings, says Boruch. In fact, the Danish government has offered Littell a grant to continue reviewing MST trials as researchers finish them, and government officials in Sweden and Norway have requested copies of the final report.

"Julia Littell is making networks with [policy-makers] to get 'buy in' at the front end," says Boruch. "That's the way to have an impact--if you produce a report and just drop it on a person's desk the likelihood is low it will be used."

And informing public policy may motivate more researchers like Littell to produce reports through the Campbell Collaboration, Rothstein says.

"For the average journal article, the median number of citations is something like one," says Rothstein. "One of the rewards [for those conducting a Campbell review] is people have the chance to see their meta-analysis make a difference somewhere."

The Campbell Collaboration publishes its reviews online, but has not yet started actively publicizing its findings. However, international leaders--including Philip Davies, a British Cabinet official--attend the group's meetings and take note of its inroads, Boruch says.

"But we are still in the production process as opposed to putting a lot of emphasis on the use process," says Boruch. "Both are equally important, but we can't do everything at once."

Further Reading

To learn more about the Campbell Collaboration, including how to add a study to its trial registration database, visit its Web site at http://www.campbellcollaboration.org.
