Feature

After a fire in a mental institution, patients fled into the snowy night. One man, clad only in his shoes and underpants, wandered disoriented before seeking shelter on the porch of a house. There he died, nearly naked and blue from cold. When the homeowner discovered his body, her scream rang out in the morning air.

Psychologist Frank M. Dattilio, PhD, an Allentown, Pa., practitioner who’s also a clinical instructor in psychiatry at Harvard Medical School, is convinced that vignette will catch your attention. He also knows it can help you understand how some people, such as the homeowner who discovered the body, develop phobias about being traumatized. That’s why he included it in a manuscript about such phobias that he submitted to a journal. The editors disagreed, saying such writing was not scientific.

To Dattilio, that’s a mistake. One reason practicing psychologists don’t read the scientific literature is that it’s just not engaging, he says. More important, they don’t see how the empirical evidence relates to their own work. Now Dattilio and colleagues Daniel B. Fishman, PhD, and David J.A. Edwards, PhD, have proposed an alternative approach.

In a 2010 paper published in Psychotherapy Theory, Research, Practice, Training (Vol. 47, No. 4), they suggest a new paradigm that combines data from randomized controlled trials evaluating psychotherapeutic treatments with systematic case studies that help make research more relevant to psychologists’ practice.

The research paradigm that Dattilio and his colleagues propose is grounded in the idea that the kind of information randomized controlled trials provide doesn’t fully reflect the complexity of individual cases. Adding case studies complements the evidence from trials in ways that make it more useful for practitioners, they say.

Each method captures something different about psychotherapy’s reality, says Fishman, a clinical psychology professor at Rutgers and editor of the online, peer-reviewed, open-access journal Pragmatic Case Studies in Psychotherapy (PCSP).

“One facet has to do with broad statistical trends that are associated with clearly defined, structured therapy procedures,” he says. “The other has to do with the reality of the nitty-gritty that emerges when individuals come together in a transaction, with people bringing their humanity and their narrative histories into the mix.”

The authors propose a new gold standard for authoritative research on psychotherapy treatments: Instead of relying solely on randomized controlled trials, scientific studies should include randomized controlled trial data, a qualitative evaluation of the study’s implementation, a set of systematic case studies illustrating factors that contribute to or detract from the treatment’s effectiveness, and a synthesis of these three components.

Think that sounds like a lot more work and expense? Traditional studies that don’t relate to practice, and thus don’t get read by practitioners, are a more disheartening waste of time and effort, says Dattilio, since their findings never make it into practice. And, he says, since therapy sessions in randomized controlled trials are typically audio- or videotaped, writing up systematic case studies after a trial is over, including ones that illustrate both good and poor outcomes, is quite feasible. Besides, adds Fishman, online journals can now accommodate material beyond what fits in traditional print journals: rich qualitative information offering context about the range of individual variation in therapy.

The authors also acknowledge that case studies have to be more than the “my favorite case” approach that has given the methodology a bad reputation in scientific circles. To be taken seriously, says Fishman, case studies must be written in a structured, rigorous way. In addition to case studies, he adds, PCSP features articles focused on developing strategies and procedures for increasing the rigor of case studies. (The Sixth Edition of the Publication Manual of the American Psychological Association also contains information about how to write clinical case studies and how to protect patient confidentiality in such work.)

Other psychologists question the fundamental assumptions behind the authors’ proposal.

“I would challenge the authors to come up with really solid examples where a particular research question has been tackled with the elaborate strategy they propose and actually produced better results than a well-conducted, well-reported clinical trial,” says James C. Coyne, PhD, a psychology professor at the University of Pennsylvania School of Medicine.

Coyne also objects to the authors’ call for more qualitative research, explaining that the subjective nature of such work often renders it unhelpful. If two different groups of researchers analyze the same transcripts, tapes or data, he says, they’ll get entirely different results. That’s because they interpret what they’re seeing through their own theoretical perspectives. “They think they’re looking through a window,” he says, “but they’re only looking at a mirror.”

Similarly, he says, researchers often select case studies that reflect their own theoretical orientation rather than choosing cases objectively. The tendency to choose cases that are interesting rather than representative compounds the problem, he adds. “Journal editors like provocative case examples — man-bites-dog stories — precisely because they defy expectations,” he says.

While Coyne is sympathetic to the authors’ call to make the research literature more engaging and thus bridge the gap between research and practice, he thinks a better solution is simply to improve the way traditional studies are written.

“The message I give my students is, ‘Don’t be boring,’” says Coyne, who teaches scientific writing. “A well-done, highly cited scientific paper tells a story: It frames an interesting question and uses its introduction to create tension that makes the person want to read through to the discussion.”

While Dattilio, Edwards and Fishman acknowledge traditional criticisms of case studies, such as Coyne’s, they offer several counterarguments.

It’s simplistic to dismiss qualitative research as subjective, says Edwards, a psychology professor at Rhodes University in South Africa. “So-called objective research methods like those used in randomized controlled trials are not as objective as they look, because the human factor cannot be eliminated,” he says. “Any research method introduces bias in some direction, so the only way to overcome this is to combine methods that balance each other out.”

Edwards also points out that you can’t extrapolate from the mean score of a group to any single individual. “To treat individuals effectively, you need a different kind of theory that group comparison studies cannot produce,” he says.

In their Psychotherapy Theory, Research, Practice, Training article, the authors illustrate their approach with examples from PCSP of systematic case studies drawn from randomized controlled trials.

“We need to further implement and test this paradigm,” emphasizes Dattilio. “Then if we see that people are starting to pick up the literature, read it and use it because there are more case examples, that would speak for itself.”


Rebecca A. Clay is a writer in Washington, D.C.