An article published by the American Psychological Society (APS) and widely covered in the media has raised concerns about the quality and scientific training offered by clinical psychology programs. But the report was marred by inaccuracies, say APA officials and others in the psychological community, who worry that the distortions could turn away consumers who may need psychological services.

The article, "Current Status and Future Prospects of Clinical Psychology: Toward a Scientifically Principled Approach to Mental and Behavioral Health Care," appears in November's Psychological Science in the Public Interest. It claims psychology practitioners often fail to base their treatments on scientific evidence. The authors single out PsyD programs for criticism, claiming "their very nature and goals often are antithetical to science-based training," and that they "[train] future psychologists to value local knowledge over knowledge accumulated by conventional science."

The authors—Timothy B. Baker, PhD, Richard M. McFall, PhD, and Varda Shoham, PhD—primarily blame APA's accreditation system for not holding PsyD programs to a standard that emphasizes scientific training. They say a new accreditation system supported by APS, known as the Psychological Clinical Science Accreditation System (PCSAS), will better ensure clinicians' scientific grounding.

"The current interest in health care makes it very pressing that psychology be really clear in terms of who is doing the services and where they're coming from and what their perspectives are," McFall said in an interview with the Monitor. "We think it's a responsibility to be clear to the public about what [psychologists'] training is like. The models are not homogeneous; there are some sharp differences among them."

On Oct. 2, several news outlets, including Newsweek, Science, Nature and The Chronicle of Higher Education, published pieces discussing the journal article in ways that showed a misunderstanding of both psychology training and practice, say APA officials. For example, Newsweek reporter Sharon Begley reported that "millions of patients are receiving chaotic meditation therapy, facilitated communication, dolphin-assisted therapy [and] eye-movement desensitization." But practitioners in these areas usually aren't licensed psychologists, stresses APA Executive Director for Professional Practice Katherine Nordal, PhD, even though they may claim their practices are types of psychotherapy.

APA officials also found fault with the journal article for comparing the state of clinical psychology today to turn-of-the-20th-century medicine, when it was viewed by physicians "as a craft or an art." Many practitioners today, the journal authors wrote, eschew science in favor of their own pet treatments.

One of the studies the authors base that assessment on is a 2007 survey by University of Pennsylvania psychology graduate student Rebecca Stewart and her adviser, clinical psychologist Dianne Chambless, PhD (Journal of Clinical Psychology, Vol. 63, No. 3). The survey found that a non-random sample of independent practitioners reported they would rely more on clinical experience than current research when treating a hypothetical patient with a panic disorder.

Both the Chronicle of Higher Education and Science pieces repeated the APS journal article authors' characterization of these psychologists' views as anti-scientific. But that's an unfair characterization, says Steve Breckler, PhD, APA's executive director for science.

"If you look at the results carefully," he says, "they show that clinicians report relying on both clinical experience and the research evidence."

What's more, the same study found that the vast majority of practitioners were indeed selecting the evidence-based course of treatment for panic disorder, Breckler says.

"The study demonstrated that clinicians are doing precisely what the evidence supports," he says. "Isn't that what we are after?"

The point of the Journal of Clinical Psychology article was that, when given access to the latest scientific findings, psychologists embrace the science, Breckler says.

In fact, says University of Wisconsin–Madison counseling psychologist Bruce Wampold, PhD, the authors' criticism of PsyDs as being less cognizant of science than scientifically trained clinical psychologists is a red herring. "[The journal authors] have a very narrow view of what science should be," he says. "And if clinicians do not conform to their view of science, then they are being characterized as unscientific."

Wampold, who chronicled the roles that research evidence and the client's relationship with the practitioner and treatment play in psychotherapy in his book, "The Great Psychotherapy Debate" (Lawrence Erlbaum, 2001), says one interpretation of the evidence is that factors such as the therapeutic alliance—a combination of the bond between psychologist and patient and their agreement on the tasks and goals of therapy—matter far more to effective therapy than the choice of a particular treatment. Instead of recognizing the complexity of moving research out of the lab and into the clinic, he says, the journal article's authors have made PsyD programs, and more generally APA's accreditation system, scapegoats.

"The journal authors are correct in observing a disconnect between evidence being assessed and utilization of that evidence," adds Breckler. "But they're wrong in attributing it to an anti-science attitude among practitioners."

Nordal agrees, adding that Newsweek's report is especially off the mark because it went beyond an academic critique into a slam of psychotherapy.

Newsweek published Nordal's response to Begley's article on Oct. 12. In her letter to the editor, Nordal said, "The American Psychological Association ethics code dictates that psychologists base their clinical judgments on scientific and professional knowledge. Clinical guidelines for physicians recognize the psychotherapy we provide as an effective first-line treatment for depression due to a substantial body of supporting research."

She's concerned that distortions of the issue in the media could turn off people who could benefit from therapy if they think therapists don't use science.

"What if someone who had depression had picked up this article?" she asks. "What's the potential impact on consumers?"

On this point, McFall agrees that the media's handling of the issue has misrepresented both sides' positions and gone too far in portraying the issue as a conflict.

"What's really unfortunate, I think, is the way the press has tended to want to pick fights," McFall says. "That is not what we're trying to do. We're trying to promote a positive concept, we're not trying to take on a battle with anybody."

Breckler says that although the journal article's authors may have been disappointed with the media reports' tone, he's displeased with the way they promoted their views in the media in the first place.

"My biggest concern is that the article's themes made their way into the mainstream media," Breckler says. "To me, this reflects an organized effort to promulgate a poorly thought-out alternative accreditation system."

Nordal also took issue with the article's characterization of PsyD programs as being lax on science training. "You can't do a broad-brush on all of them," she says. "Some PsyD programs provide considerable research training."

In fact, points out Cynthia Belar, PhD, APA's executive director for education, APA's Guidelines and Principles for Accreditation of Programs in Professional Psychology explicitly state that science is foundational to the education and training of professional psychologists. All students, whether they're seeking a PhD or PsyD, are required to be competent in "the breadth of scientific psychology" and the "scientific, methodological and theoretical foundations of practice," including "training in empirically supported procedures," according to the document.

Many psychologists would agree that PsyD programs don't require the same scientific rigor as PhD programs. But, says David Barlow, PhD, the highly regarded scientist-practitioner at Boston University, it's not fair to accuse them as broadly as the APS journal article authors did.

"There's a lot of variability [in PsyD programs] that isn't necessarily recognized on either side," he says. "There are some excellent professional schools that are on the cutting edge of producing psychologists competent to deliver evidence-based psychological practice. And then there are some obviously who barely know what the word means."

Barlow says one of the main problems is that the current accreditation system is too rigid to allow practitioners and researchers to spend time developing their own particular skill sets. For example, psychologists looking to go into "big science"—such as research that incorporates large multi-site clinical trials, psychopathology work involving brain structure and function, and work that intersects with medicine and neurobiology—often aren't mentored in how to write grants or work across disciplines because so much of their training goes toward learning clinical skills, he says.

"The Commission on Accreditation, in attempting to be all things to all people, is having trouble bridging that gap given the extraordinarily wide range of programs out there," Barlow says.

But the solution might not be another level of accreditation, says John Kihlstrom, PhD, a cognitive science professor at the University of California, Berkeley. He says that although he believes there are problems with the current PsyD model, PCSAS probably wouldn't solve the perceived problems, and could even hurt psychology's reputation. The situation is analogous to training in medicine, he says, where physicians and medical researchers receive quite different training but their institutions aren't accredited differently.

"Medicine doesn't have two competing accreditation systems," Kihlstrom says. "The establishment of a second accreditation system can only give the appearance of a field in disarray."

Breckler doesn't blame the accreditation standards. Instead, he says, a more logical explanation is that practitioners don't have an easy, efficient way to access the latest scientific findings. Physicians, for example, frequently receive updates on the most recent drugs and treatments because the pharmaceutical industry pushes out the latest science, he says. Psychologists lack a similar infrastructure.

That could be changing, though, Breckler says. APA's Practice and Science directorates are collaborating to develop treatment guidelines for practitioners. To create a guideline, clinicians and research scientists would identify a focus area, such as depression, form an independent panel to review available treatments and outcome measures, then decide whether there's enough evidence to recommend one treatment over another under various circumstances.

Breckler stresses such guidelines would not be a cookbook. Clinicians would still need to rely on their expertise and experience to determine which treatment is best for which patients.

APA President-elect Carol Goodheart, EdD, a private practitioner in Princeton, N.J., says this last point is often at the root of conflicts between researchers and clinicians. In clinical settings, psychologists frequently have to tailor treatments to individual circumstances and make judgment calls as to which scientifically informed treatment will work best. There aren't always easy answers and "the almost exclusive focus on randomized controlled clinical trials in our disciplines often serves as an impediment to our reaching common ground," she says.

Treatment guidelines will help practitioners update their training to include the latest findings, which would not only improve treatment efficacy but also give consumers confidence that practitioners are relying on cutting-edge science, Breckler adds. That confidence would encourage people to seek out help when they need it, he says.

"In the end," Goodheart says, "it's important for all psychologists to remember that the goal of the clinical work we do is to improve people's lives."