Conceptual Skills Needed for Evidence-Based Practice of Psychotherapy

“Evidence-based practice in psychology is the integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences” (APA Presidential Task Force on Evidence-Based Practice, 2006, p. 273).

An advanced graduate student therapy trainee recently expressed concern about the treatment of a difficult case seen in one of her placements. She was frustrated with a supervisor and torn between using her knowledge of the patient’s treatment history and family patterns versus following a different path suggested by a particular treatment manual. The frustration had been stirred up in the context of a group discussion about tailoring treatments to fit individual patients, and using the empirical literature to do so.

She asked: “But doesn’t the research literature say that fidelity to treatment will bring the best effects? A patient I’m seeing now doesn’t like the approach for specific reasons, and it also hasn’t worked for her in the past. But, how can I respond to my patient’s needs and still be evidence-based? Isn’t it unethical to deviate from the manual if it is empirically supported?” Her plan before this discussion was simply to comply with supervisory input to follow the manual, but without much hope for its success with this patient.

The questions asked by this psychotherapist-in-training point to several challenges we face as educators and supervisors in the age of evidence-based practice. On the one hand, we need to provide specific training in empirically supported interventions. On the other hand, we need to help therapists develop the conceptual tools necessary to continue integrating research findings into their clinical work, and to apply all these skills in a manner that takes into account individual client needs, preferences, and unique context (APA, 2006).

Reflecting our field’s current emphasis, the trainee mentioned above has been taught that empirically supported treatment packages (ESTs) represent the most ethical approach to treatment because of their proven track record in research (cf. Chambless & Crits-Christoph, 2006). She has even been told by some faculty advisors to steer clear of “non-EST” approaches. Given these directives, plus the time constraints around provision of therapy in graduate training, she has focused almost exclusively on learning ESTs. As a result, she has considerable skill implementing a number of treatment packages for specific disorders, and can cite their empirical basis in randomized controlled trials (RCTs) with accuracy.

Her skill set as a psychotherapist is still quite limited, however. While she is gaining skill with a few interventions developed for discrete diagnoses, she has received little encouragement to be aware of (much less think integratively about) the broader empirical literature or to identify principles that could help her more flexibly generalize and tailor her interventions (e.g., Castonguay & Beutler, 2006). When faced with clients whose needs do not easily fit the molds of the models she knows, she is at a loss.

As educators, we should not be pleased with this result. Without additional input, this young psychotherapist will go out into practice with a relatively rigid skill set of limited applicability. The frustration she already feels suggests she is at risk for eventual “burnout” as a practitioner.

Old and New Views of Evidence-Based Practice

Our trainee’s problems reflect tensions in our field over how best to weigh and apply research evidence. The primary view that has guided this young therapist’s education has held sway for roughly a decade and places emphasis on developing, testing, and disseminating treatment packages for discrete disorders (e.g., Gotham, 2006; McHugh & Barlow, 2010; Kazak et al., 2010).

A treatment qualifies as an EST based on successfully replicated, randomized controlled trial (RCT) studies (multiple single-case studies with strong research controls may also qualify for EST status; Chambless & Hollon, 1998). Lists of ESTs were initially compiled in an attempt to demonstrate that psychosocial treatments produced effects comparable to pharmacological interventions and should therefore receive research funding, training, and reimbursement in the era of managed care (APA Division of Clinical Psychology, 1995). An RCT answers a single question about psychotherapy very well: “Does therapy X have an effect on disorder Y, if all other factors are controlled?”

The information provided by an RCT directly addresses the needs of an administrator overseeing a large system of care who wishes to ensure that “on average” there will be a positive effect if a particular approach is implemented. In an RCT, treatments are usually applied to a single category of disorder by clinicians trained to a high level of adherence. Randomization is used to distribute pre-treatment characteristics such as personality type, age, gender, motivation, and prior treatment experience evenly across groups so that they are unlikely to be responsible for any group differences in outcome. Dissemination of an EST tends to flow logically from the same research design: psychotherapists are trained to adhere to the EST manual and apply it with patients having a particular disorder (McHugh & Barlow, 2010; Kazak et al., 2010), just as in the case of our frustrated trainee.

By contrast, “evidence-based practice of psychology” (EBPP) has been defined by an APA Presidential Task Force (2006) as invoking all available research methodologies and focusing treatment on individual clients:

“It is important to clarify the relation between EBPP and empirically supported treatments (ESTs). EBPP is the more comprehensive concept. ESTs start with a treatment and ask whether it works for a certain disorder or problem under specified circumstances. EBPP starts with the patient and asks what research evidence (including relevant results from RCTs) will assist the psychologist in achieving the best outcome. In addition, ESTs are specific psychological treatments that have been shown to be efficacious in controlled clinical trials, whereas EBPP encompasses a broader range of clinical activities (e.g., psychological assessment, case formulation, therapy relationships). As such, EBPP articulates a decision-making process for integrating multiple streams of research evidence—including but not limited to RCTs—into the intervention process.” (p. 273)

Ultimately, the APA application of EBPP requires a higher standard from therapists and educators, and is likely to be worth the effort if it allows therapists like our trainee to effectively answer the questions she poses and meet the needs of her client. In addition to training with discrete treatment packages and intervention “tool kits,” the most successful therapists will also be prepared with sufficient background and conceptual skills to integrate what is known from across the research literature, combine it with clinical expertise, and apply it in ways that are flexible and responsive to client characteristics.

Skills Needed for Successful EBPP

The “competencies movement” in psychology seeks to identify the skills and attitudes that need to be acquired for professional development (Fouad et al., 2009; Kaslow et al., 2009). Its focus is comprehensive and sees psychotherapy skill acquisition as unfolding across levels of graduate training and professional practice.

Competencies are divided into two broad classes: those that are “functional,” reflecting discrete domains of professional activity (assessment, intervention, consultation, research/evaluation, supervision, teaching, administration, and advocacy), and those that are “foundational,” cutting across functional domains (professionalism, reflective practice, knowledge of scientific methods and findings, relationship skills, sensitivity to individual differences and cultural diversity, attention to ethical and legal standards and policies, and ability to interface with interdisciplinary systems). We wish to draw particular attention to the foundational competencies that involve scientific method, and to recommend a particular kind of scientifically minded thinking style vital for evidence-based practice.

Scientific-Mindedness

By scientific-mindedness, we refer to a clinician’s willingness to engage in a process of inquiry that should involve not just consideration of the empirical literature, but also evidence available directly from clients. Ideally, the process begins with careful assessment that results in an individual case formulation, that is, a set of hypotheses about the sources and maintaining factors associated with an individual’s problems. Interventions are then selected in light of the relevant literature, and in consultation with the patient about his or her needs and preferences.

Ongoing evaluation of therapeutic impact then provides data about the effects of the intervention and can lead to flexible modification or a change in course as needed, in collaboration with the client. Lambert and colleagues (e.g., Slade, Lambert, Harmon, Smart, & Bailey, 2008) provide evidence that feedback from formal, ongoing monitoring of symptom states can improve outcome. To extend this logic, depending on the individual formulation of a client, relevant outcome data may also involve clients’ patterns of thinking, feeling, or relating with others, motivation for change, quality of the in-session relationship, and more. To summarize, the proposition here is that psychotherapists be trained in a manner that leads to a primary identity as a clinical scientist whose work places emphasis on generating and testing individual-level hypotheses about change, in a context of collaboration with clients and consultation with the empirical literature.

Critical Thinking and Integration

Critical thinking involves evaluating logic and weighing evidence. As applied to psychotherapy, it involves the ability to understand and evaluate published research results as well as to accurately assess the circumstances and experiences of individual clients. The complement to critical thinking is integrative ability: being able to pull together different studies and different strands of data, and to synthesize them into a specific hypothesis with associated plans of action.

Examples of integrative thinking include pulling assessment data together into a case formulation with clear implications for treatment, detecting areas of overlap and convergence between multiple treatment methods, and using clinical experience to inform treatment decisions. With critical thinking, clinicians learn to break problems into separate parts and to evaluate and analyze their underlying logic. Then, using integrative abilities, they shuttle in the opposite direction, synthesizing information and generating new hypotheses and possible solutions that respond to unique circumstances. Both skills are needed.

Supervisors and educators can model these thinking skills and invite the same from trainees in concrete ways. For example, problems presented by an individual client could be used to demonstrate and directly apply principles of evidence-based practice. Students could be assigned to scour the empirical database about some aspect of the client’s presentation. The contents of EST manuals and other relevant material would be reviewed with an eye toward finding specific interventions of relevance. Once this review has occurred, the underlying logic and evidentiary base for treatment would be taken into consideration, as would areas of potential convergence across multiple studies or schools of thought. What has been learned would then be applied mindfully and collaboratively with the specific case.

Optimally, supervisor and trainee would become engaged in an active, collaborative, evidence-based endeavor involving careful assessment, consultation with the empirical literature, hypothesis formation about useful interventions, and systematic evaluation of their impact for an individual case. Three key elements of EBPP are present in the foregoing suggestion: a primary focus on the individual through use of case conceptualization methods, active use of the existing evidence base, and exercise of EBPP as a process of decision-making and empirical inquiry. At first, the training model would be slow and resource-intensive, with a great deal of time spent focused on individual cases. With time and practice, the process can be abbreviated and tailored to training needs as clinical skills are effectively practiced and internalized.

Relationship Skills and EBPP

One of the more consistent findings in psychotherapy research, across many different treatments and disorders, is that a positive therapeutic relationship correlates with improved outcome (Horvath & Bedi, 2002; Wampold, 2001). The most studied aspect of the therapeutic relationship is the alliance, which consists of the affective bond between patient and therapist as well as their agreement about goals and the therapeutic tasks for reaching them. Resources are increasingly available that summarize empirical work on the alliance and provide specific training recommendations (e.g., Muran & Barber, 2010; Norcross, 2002). Evidence-based practice may be particularly well suited to enhance collaboration to the degree that it begins with a focus on the individual client, thereby planting the seeds for a strong alliance.

Final Comments

The approach outlined here suggests that the curriculum for psychology training needs to include greater emphasis on “foundational” competencies so that skilled intervention is learned and applied in the broader context of EBPP. Scientific-mindedness, critical thinking, integrative capacity, and relational skills all must be modeled and practiced across the curriculum so that they become part of the language and culture of evidence-based professional practice. We believe that a basic introduction to evidence-based practice should occur from the earliest phases of psychotherapy training, rather than being treated as an “advanced topic” to be learned only after diagnosis-specific interventions and ESTs have been mastered. Perhaps the easiest place to start implementing EBPP in training settings is simply to introduce the APA’s definition of evidence-based practice and encourage critical thought and discussion about its elements and implications, as recommended by Levant and Hasan (2008). An edited volume by Norcross, Beutler, and Levant (2006) also provides an excellent overview of the issues and challenges our field faces in integrating science and practice as the empirical database continues to grow.

Ultimately, our hope for future trainees is that they will continue to push and expand the boundaries of our current knowledge, improving client outcomes through a process of active engagement with the evidence base.

 

Ken Critchfield received his doctoral degree (Ph.D., 2002) in Clinical Psychology from the University of Utah, where he used interpersonal models to study psychotherapy processes and case formulation. From 2004 to 2014, Dr. Critchfield was director of research, and eventually became co-director, of the Interpersonal Reconstructive Therapy (IRT) clinic at the University of Utah Neuropsychiatric Institute. There, he worked closely with Dr. Lorna Smith Benjamin, creator of IRT, to operationalize and test the efficacy and process of change of IRT as applied with adults having severe and chronic psychiatric problems, often involving comorbid personality disorder and suicidality. He joined the faculty of the Department of Graduate Psychology at James Madison University in August 2014 and continues to work in close collaboration with Dr. Benjamin. He is a licensed clinical psychologist and now co-directs JMU’s Combined-Integrated Doctoral Program in Clinical and School Psychology.

Cite This Article

Critchfield, K. L., & Knox, S. (2010). Conceptual skills needed for evidence-based practice of psychotherapy. Psychotherapy Bulletin.

References

APA Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61, 271–285.

American Psychological Association Division of Clinical Psychology. (1995). Training in and dissemination of empirically-validated psychological treatments: Report and recommendations. The Clinical Psychologist, 48, 3–27.

Castonguay, L. G., & Beutler, L. E. (2006). Principles of therapeutic change that work. New York: Oxford University Press.

Chambless, D. L., & Crits-Christoph, P. (2006). What should be validated? The treatment method. In J. C. Norcross, L. E. Beutler, & R. F. Levant (Eds.), Evidence-based practice in mental health: Debate and dialogue on the fundamental questions (pp. 191-200). Washington, DC: American Psychological Association.

Chambless, D. L., & Hollon, S. D. (1998). Defining empirically supported therapies. Journal of Consulting and Clinical Psychology, 66(1), 7-18.

Fouad, N. A., Grus, C. L., Hatcher, R. L., Kaslow, N. J., Hutchings, P. S., Madson, M., et al. (2009). Competency benchmarks: A model for the understanding and measuring of competence in professional psychology across training levels. Training and Education in Professional Psychology, 4(Suppl.), S5–S26.

Gotham, H. J. (2006). Advancing the implementation of evidence-based practices into clinical practice: How do we get there from here? Professional Psychology: Research and Practice, 37, 606–613.

Horvath, A. O., & Bedi, R. P. (2002). The alliance. In J. C. Norcross (Ed.), Psychotherapy relationships that work: Therapist contributions and responsiveness to patients (pp. 37-69). New York, NY: Oxford University Press.

Kaslow, N. J., Grus, C. L., Campbell, L. F., Fouad, N. A., Hatcher, R. L., & Rodolfa, E. R. (2009). Competency Assessment Toolkit for professional psychology. Training and Education in Professional Psychology, 3, S27-S45. doi: 10.1037/a0015833

Kazak, A. E., Hoagwood, K., Weisz, J. R., Hood, K., Kratochwill, T. R., Vargas, L. A., & Banez, G. A. (2010). A meta-systems approach to evidence-based practice for children and adolescents. American Psychologist, 65(2), 85-97.

Levant, R. F., & Hasan, N. T. (2008). Evidence-based practice in psychology. Professional Psychology: Research and Practice, 39(6), 658-662.

McHugh, R. K., & Barlow, D. H. (2010). The dissemination and implementation of evidence-based psychological treatments: A review of current efforts. American Psychologist, 65(2), 73-84.

Muran, J. C., & Barber, J. P. (2010). The therapeutic alliance: An evidence-based approach to practice and training. New York: Guilford.

Norcross, J. C. (2002). Psychotherapy relationships that work: Therapist contributions and responsiveness to patients. New York: Oxford University Press.

Norcross, J. C., Beutler, L. E., & Levant, R. F. (2006). Evidence-based practice in mental health: Debate and dialogue on the fundamental questions. Washington, DC: American Psychological Association.

Slade, K., Lambert, M. J., Harmon, S. C., Smart, D. W., & Bailey, R. (2008). Improving psychotherapy outcome: The use of immediate electronic feedback and revised clinical support tools. Clinical Psychology & Psychotherapy, 15, 287-303. doi: 10.1002/cpp.594

Wampold, B. E. (2001). The great psychotherapy debate: Models, methods, and findings. Mahwah, NJ: Lawrence Erlbaum Associates.
