Like nursing, medicine and other health-care disciplines, psychology is grappling with and helping to shape the evidence-based practice (EBP) movement, a public health agenda that calls on practitioners to use the best available scientific evidence as a basis for formulating treatments for individual clients. In cancer care, for example, EBP can mean informing patients about the most recent advances in chemotherapy and guiding them to the best type for their particular illness. In psychology, meanwhile, it can entail telling clients about the most effective phobia treatments and helping them pick one based on their preferences, personality traits and cultural attitudes.
"The public wants to know that the health professions are practicing based on the best evidence available," says APA President Ronald F. Levant, EdD, who took the helm as chair of an association-wide task force that is crafting an APA policy on the topic. "In addition to consumer demand, the marketplace will require evidence that our interventions work. It is vitally important that psychology help shape how evidence gets defined and how evidence-based practice evolves."
To achieve this end, psychology faces challenges similar to those of other professions, including the need to conduct trials on little-tested therapies and to otherwise demonstrate that its practices are well backed by data and improve relevant outcomes. In essence, this involves compiling and fleshing out existing research so it makes an effective case for widely used psychotherapy practices, members of the APA task force say.
"Any discipline that regards itself as a scientific discipline needs to pay attention to these kinds of issues," comments Steve Hollon, PhD, a task force member and longtime depression researcher at Vanderbilt University. In medicine, drugs must undergo rigorous trials before going to market; likewise, he says, psychology must show that its treatments work for the public and insurers to put stock in them. "It is incumbent on us to ensure that the treatments we offer have an empirical base," he says.
In addition, psychologists confront issues unique to their field, in particular providing forums for scientists and practitioners to work together to build consensus on what constitutes evidence and how best to collect it. Some research-oriented psychologists believe that only those psychological services that have been supported by randomized controlled trials (RCTs)--the most rigorous methodology for testing treatment efficacy--should be offered to the public. Meanwhile, many practice-oriented psychologists would like to see greater attention paid to contextual, qualitative research and observation, which they contend better captures the nature of the therapy experience.
That said, the field is the only mental health discipline that has championed the integration of science and practice since its inception, notes task force member and veteran anxiety researcher David H. Barlow, PhD, of Boston University. "Psychology has been ahead of the curve in setting the tone for EBP," he says.
Despite this, the movement toward accountability in health practice creates a need to further demonstrate and communicate the effectiveness of psychotherapy, says Levant. And while there is a growing body of evidence showing that psychotherapies work, some voices in the discipline express concern over the lack of a dividing line between interventions supported by sound science and those that have not been rigorously tested or have been shown to be harmful. The controversy, says Levant, highlights the fact that "this is an era of accountability, and now is the time for us to further develop our evidence base."
The mission is important to jump on, emphasizes University of Scranton clinical researcher John Norcross, PhD, a task force member, as it promises to shape the direction of the field: "What is designated as evidence-based," he notes, "will increasingly determine what therapies and tests are conducted, what is taught, what is researched and what is reimbursed."
Starting point: The IOM definition
Levant, a longtime practice advocate, launched the task force so the field would take a proactive and unified stance toward EBP--important, he believes, both because of the rifts caused by the recent controversy over the integrity of some psychological treatments and because many APA divisions were devising policies on EBP but the association as a whole had not yet developed its own statement. To this end, Levant charged the group with the basic goal of creating both an EBP policy statement and a public position paper that represent the views of the field and are targeted to external audiences such as policy-makers, insurers and the media.
The group--a widely representative mix of scientists, practitioners and policy experts (see box, this page)--met twice, once in October and once in January, and used as its starting point the most broadly accepted definition of EBP, set out in an influential 2001 Institute of Medicine (IOM) report: EBP is "the integration of best research evidence with clinical expertise and client values."
In its work, the group:
Considered how a broad range of evidence--everything from randomized controlled trials (RCTs) to clinicians' expert observations--should be integrated into psychology's definition of EBP;
Explicated the role of clinical expertise in treatment decision-making, including how practitioners should consider multiple streams of evidence and relevant research when they deal with clients; and
Examined how patient values, sociocultural factors, and other patient characteristics influence treatment acceptability and consumer choice and should inform treatment decisions.
The group's final document--now beginning a lengthy APA review process (see box, below)--combines these ideas into a succinct but detailed statement that fleshes out each area and includes an extra section on their successful integration in practice.
"The statement is a positive, forward-looking, affirmative yet richly nuanced model of EBP," says Barlow. "It should serve as a guide to other health care professions and policy-makers for further developing the EBP concept in health care."
Best research evidence
A key point of contention is the relative merit of RCTs compared with other kinds of research and observation. Some psychologists say RCTs constitute the most rigorous means of testing therapy; others say it is nearly impossible to fully capture the therapy experience in discrete, highly formulated studies. Also, current clinical research in psychology provides good evidence supporting the general benefits of therapy, but less on which specific forms of therapy work for whom and why. Finally, the stakes of the discussion are extremely high, task force member Norcross notes.
"If everyone came out and said that only a certain kind of therapy is evidence-based and therefore it is the only therapy that should be reimbursed, practitioners would start thinking they should only conduct that kind of therapy," says Norcross. Since many good treatments haven't been rigorously tested with RCTs, this outcome would severely curtail practice, he believes.
A major discussion entailed the extent to which RCTs should determine the direction of treatment, research and reimbursement. Commonly used in medicine and considered the "gold standard" of clinical trials, RCTs' greatest strength, task force members agreed, is their ability to show that a given intervention causes a given outcome by controlling for client characteristics and randomizing clients to different treatment conditions, thus allowing an "objective" assessment of a given treatment's effect. They also are considered to provide the most compelling basis for large-scale health-policy decisions. For example, RCTs were able to show that hormone replacement therapy (HRT) increased breast-cancer rates in users--a link that nonrandomized studies had missed. The reason, Hollon explains, is that mainly health-conscious women tended to join the initial nonrandomized HRT trials, so those early results were skewed toward healthier users.
But RCTs can be problematic in ways that make using other forms of evidence imperative as well, Norcross and others say. As some individual RCTs have been constructed, they can sacrifice external validity--the ability of a treatment to work in real-life settings--for internal validity, or research precision. "If your question is to causally ask, 'Does this treatment work?' then RCTs are fine," Norcross says. "But if the broader question is, 'What do I do with the client in front of me--a person with unique characteristics who likely won't respond well to a one-size-fits-all treatment?'--then we absolutely need to rely on the diversity of research methods." These include effectiveness studies, which establish the validity of treatments as they're practiced in the community; single-subject designs, which measure a client variable multiple times before and after an intervention; process-outcome studies, which link in-session behaviors to treatment outcome; and qualitative analyses, which can provide a more fine-grained look at what is occurring in therapy.
In addition, bias can plague RCTs at any point along the way, whether by the researcher, the funder or the publisher--though of course this problem is not limited to RCT research, task force members agreed. And the RCTs conducted to date have not been designed to examine subjective aspects of therapy such as the nuances of therapist-client communication or the therapeutic alliance--factors that research shows help determine therapy success, says Task Force Co-chair Carol Goodheart, PhD, an independent practitioner in Princeton, N.J. Process-outcome studies, which examine the quality of the therapy relationship, can help to fill this gap, she notes.
And, says Geoffrey M. Reed, PhD, APA's assistant executive director for professional development, little is known about how the findings from RCTs translate to everyday practice. Finally, RCTs have not yet been designed to appropriately capture global findings about psychotherapy, nor have they examined the therapies most widely used in the community, says Reed, who is working with the task force. Nevertheless, decades of research show that a wide variety of therapy approaches are more effective than no treatment at all; that the quality of the therapy relationship is more important than treatment type; and that psychotherapy is as effective as, or more effective than, medication at treating depression and anxiety, for instance, he says.
"Techniques account for a relatively small proportion of the variance in therapy outcome," says Reed, "yet that is what the fuss is all about." For example, RCTs have shown that cognitive behavioral therapy and interpersonal therapy work well in treating depression, he says, but both techniques were specifically formulated to be easy to test experimentally, so they are particularly well suited to clinical trials and therefore have been studied more often. And the fact that they succeed in treating depression, he adds, does not mean other commonly used therapies do not.
Looked at another way, though, RCTs have potential that should be more widely tapped, task force members say. Behavioral health researcher Karina Davidson, PhD, of Columbia University and a task force member, for instance, notes that RCT protocols have become more sophisticated over time and are far more likely to represent real clients and therapy than they did in the past. Because of improved RCT methodology, Davidson thinks the field would benefit from running RCTs of treatments that are more widely practiced, such as eclectic therapy--where practitioners use a range of approaches in treatment as deemed appropriate to the client--and psychodynamic therapy, both of which are used to treat clients with a wide range of common conditions, including depression and anxiety.
It could do this in the language of third-party payers, she believes, by designing a "best clinical judgment" study in which clients with anxiety, for example, are randomized to a best-clinical-judgment treatment condition or a wait-list control condition. Indeed, RCTs already have been shown to benefit the field. In a December 2004 article in the American Psychologist (Vol. 59, No. 9), Barlow notes that over the past several years, the country's leading medical journals--the Journal of the American Medical Association and The New England Journal of Medicine--"have run numerous studies meeting every current gold standard of evidence showing that psychological treatments are as good or better than alternative treatments such as drugs." In addition, data show that the treatments, tested on conditions as diverse as panic disorder and insomnia, last longer and are better liked than drug therapy, he notes.
More such work needs to be done, Davidson and others believe. "At present, we are not sufficiently shaping our field to initiate clinical trials that will be persuasive to government and reimbursement agencies," says Davidson, who is leading the effort to create a behavioral medicine field as part of The Cochrane Collaboration, an international voluntary organization that systematically reviews evidence for best practices in health care. "For the good of the public," she emphasizes, "we need to conduct studies on the range of our treatments so that effective treatments can be reimbursed."
If psychologists are not clear about the broader definition of EBP, they may find themselves vulnerable to insurance and managed-care companies who have an economic interest in using a narrow definition, adds APA's executive director for professional practice, Russ Newman, PhD, JD.
"Attempts to inappropriately limit and reduce which treatments are reimbursed are nothing new," says Newman. "We need to make sure that EBP doesn't become another weapon to be used in the service of putting profits before patients," he cautions.
Clinical expertise and client values

Clinical expertise and client values must also play strong roles in psychology's definition of EBP, since the best-researched treatments won't work unless clinicians apply them effectively and clients accept them, task force members say.
Hence, the task force honed a definition of clinical expertise by examining the literature on what differentiates experts from novices, says Levant.
"Clinical expertise cannot be dismissed as mere intuition," he says. In psychology, for example, data show that seasoned clinicians are better than younger professionals at making diagnoses and at modifying them when clients reveal information they were previously afraid to reveal, he says.
Clinical expertise also means the ability to choose and tailor treatments to a client's changing situation, says Goodheart--not to simply pick a treatment because research says it's effective.
"In the end, it comes down to that moment in the room where you have all of psychology's knowledge to draw upon, but you have to make an individual choice based on what you think will best help this person," she says. This perspective was incorporated into a definition of EBP to accurately portray psychology's strengths, she believes.
The "client values" portion of EBP is the newest area and consequently the one most in need of work, task force members agreed. For one, the IOM definition needs to be expanded "to include a whole range of client characteristics, including ethnicity and cultural variables, that don't get adequately represented by the term 'values,'" Reed says.
More importantly, the field must face the fact that it lacks research on many diverse client groups, including ethnic-minority groups, says task force member Nolan Zane, PhD, of the University of California-Davis, who coordinated the task force's client-values group. While psychology has done more research than other fields in this area and is better equipped to study diversity issues in treatment, there is relatively little research that specifically addresses the therapy needs of certain groups who historically have been underserved in mental health care, Zane says. And the research that does exist, he adds, suggests that some minority groups may not respond well to traditional therapies. His own research on Asian immigrants, for example, reveals that they are less satisfied, more anxious and angry, and more symptomatic after short-term treatment than other groups.
"We don't have best available evidence for many client groups," Zane says, "and researchers need to proactively design studies to include major samples of certain groups that the research suggests do not respond to many types of mental health treatment."
Future directions

Psychology's momentum and that of the larger EBP movement suggest directions for the field that extend beyond the work of the task force and may be addressed in future APA groups, task force members say.
One is to accept the reality that marketplace forces will play a major role in determining the direction of reimbursement and to strive to present psychology's case in language that insurers and policy-makers will understand, says Hollon. "We're kidding ourselves if we think the field alone is going to resolve the issue," he says.
Like Davidson, Hollon emphasizes the importance of testing tried-and-true therapies before it is too late. "If we have effective interventions and we don't have good data showing they're effective," he says, "they're going to die a slow death. And that would be a shame."
Task force participants also recognize the need to better translate research findings for clinicians, since there is simply too much research for clinicians to stay on top of by reading all of the relevant journals, says Reed. "We need to make data available to clinicians in a way that will be useful to them at the point of service," Reed believes.
Finally, the field must remember to consider a broad audience when making its case, says Reed. Ironically, the field can become so immersed in theoretical nuances when designing studies and discussing therapy progress that it misses the fact that the health-care system is looking for clear-cut data, he says.
"Psychologists will always want to see things in a more complex way, but we have to be effective in encouraging health-care policies that don't inappropriately restrict practice," Reed says. "That is where our challenge lies."
The APA Presidential Task Force on Evidence-Based Practice in Psychology's proposed policy statement on evidence-based practice is now posted online.