Psychotherapy Bulletin

Making Psychotherapy Scalable by Teaching Nonprofessionals to Deliver Treatment to Each Other

Author’s Note: We would like to thank the Society for the Advancement of Psychotherapy for support of this research via a 2015 Charles J. Gelso, PhD, Psychotherapy Research Grant. This research was also supported by funding from the National Institutes of Health (F31MH103927), a Society for Psychotherapy Research Small Grant, an American Psychological Association Dissertation Research Award, and the University of Massachusetts Amherst. Correspondence regarding this article should be addressed to Sam L. Bernecker, Department of Psychology, Harvard University, 1284 William James Hall, 33 Kirkland St., Cambridge, MA, 02138. Email: [email protected]

Clinical Impact Statement: This article describes a study that tested the efficacy of a new Crowdsourcing Mental Health intervention, designed to train laypeople in the use of basic psychotherapy skills. The preliminary results suggest that a brief online training may be a useful way to teach helping skills to the general public.

Decades of psychotherapy outcome research and countless meta-analyses show that psychotherapy works. Unfortunately, psychotherapy is a luxury afforded to few. Only a minority of people with mental illness receive treatment (Kessler et al., 2005), due to both attitudinal barriers (e.g., stigma, desire for self-reliance) and structural barriers (e.g., cost, provider availability; Mojtabai et al., 2011). And psychotherapy is not particularly efficient, typically involving an hour per week across multiple, if not many, weeks with a trained professional. To improve public psychological well-being, “disruptive innovations” are needed that make treatment more efficient by overcoming these barriers (Rotheram-Borus, Swendeman, & Chorpita, 2012).

The current project preliminarily tested one such innovation, funded partly by the Society for the Advancement of Psychotherapy through a Charles J. Gelso, PhD, Psychotherapy Research Grant. We developed an intervention, called “Crowdsourcing Mental Health” (CMH), in which individuals with some level of psychological distress seek a “partner” who is already known to them (e.g., friend or family member) to participate with them. This partner may or may not also be experiencing distress; regardless, each partner has the opportunity to both provide and receive care. Both dyad members take an online course that teaches “talking” and “listening” skills. The talking skills guide the speaker through the process of exploring a stressor and making a coping plan. The listening skills, which are the focus of this report, include a suite of active listening attitudes and behaviors, including mindfully attending, taking a nonjudgmental attitude, making reflections, asking open-ended questions, and avoiding attempts to influence the speaker. After completing the course, the partners meet weekly face-to-face to discuss current stressors, taking turns in client-like and therapist-like roles.

CMH was inspired by two sets of psychotherapy findings. First, nonprofessional mental health practitioners appear to be effective at delivering simple treatments, and at times may not perform any worse than professional caregivers (Berman & Norton, 1985; van Ginneken et al., 2013). Second, although there is clearly some benefit for sophisticated mental health interventions, simple supportive psychotherapy does appear to reduce symptoms, and results can be comparable to more complex interventions for those with lower symptom severity (Cuijpers et al., 2012; Cuijpers, van Straten, Andersson, & van Oppen, 2008). Thus, an efficient intervention in which laypeople provide only psychotherapy’s simplest elements could cause at least moderate improvement, which could have a substantial public health impact when disseminated widely.

CMH circumvents structural barriers to treatment-seeking because of its near-universal accessibility through a free online course. It also overcomes attitudinal barriers: We surveyed over 1,000 Internet users to assess public interest in the intervention and about 60% of respondents indicated that they would try CMH (Bernecker, Banschback, Santorelli, & Constantino, 2017). Importantly, interest in CMH was nearly as high among respondents who stated that they would not use psychotherapy or medication. Further, CMH was appealing across levels of psychological distress and demographic characteristics. Thus, peer-delivered psychotherapy could be a viable vehicle for disseminating psychotherapy ingredients to the public.

However, prior to the current study, there had been, to our knowledge, no rigorous test of whether nonprofessionals can learn psychotherapy skills from an online course, particularly one that lacks interaction with an instructor. Therefore, this study assessed whether nonprofessionals who completed the CMH course changed their observable helping behaviors in mock counseling sessions recorded before and after the course. Because this was a test of the course’s teaching efficacy, participants were not required to meet with each other to engage in the intervention after they completed the course, and we did not assess the mental health effects of repeated interactions with one’s partner while using the skills. However, as a proxy for psychological benefits, we measured the perceived helpfulness of the sessions.

Method

Participants. Thirty pairs of friends, roommates, romantic partners, or family members (60 individuals) were recruited “to learn ways to manage stress and feel closer to another person” via flyers, Web advertisements, and listserv announcements in western Massachusetts. Approximately two-thirds of the participants were local undergraduate students.

CMH Course. The CMH course design is based on behavior modeling training, a well-supported method for learning behavioral skills (including counseling skills) that consists of four components: instruction, modeling, practice, and feedback (Taylor, Russ-Eft, & Chan, 2005). In the course, each skill is introduced with an instructional video (slides with an audio lecture) and a written review exercise. Learners then watch videos of actors modeling the skill. Finally, participants practice engaging in the skill, then complete self-evaluation questionnaires as feedback on whether they engaged in the necessary behaviors. The types of practice increase in complexity in order to scaffold progress, from written exercises (e.g., typing replies to videos of actors), to telephone practice with a “mentor” (anyone who has already taken the course; for this study, research assistants served as mentors), to in-person practice with one’s partner. Including all exercises, the course takes approximately 20 to 25 hours to complete.

Procedure. Pairs were randomly assigned to an immediate training group or a waitlist group. Pairs in the immediate training group were video recorded completing a mock counseling session in the laboratory in which they were asked to take turns discussing a stressor in the way they ordinarily would. They then took the online course over a period of 4 to 8 weeks before returning to the lab to complete a second mock counseling session in which they were asked to take turns using the skills they learned in the course. The waitlist group was included to account for the possibility of changes due to repeated testing or other confounds, and participants in this condition completed two “pre-training” mock counseling sessions spaced approximately 4 weeks apart. The waitlist participants also completed a third session after taking the course, in the same format as the immediate training group, in order to increase the sample size available for testing pre- to post-training change.

Measures. Session transcripts were coded by trained research assistants (RAs), who were blind to group assignment and time point, and by the first author. Each sentence received a mutually exclusive code: restatement, open-ended question, closed-ended question, self-disclosure, miscellaneous sympathetic utterance, or other. In addition to its main category, any sentence could also be marked as “influencing” when it included an attempt to influence the speaker (e.g., through advice-giving or reassurance). Two RAs coded each session and resolved any disagreements through discussion with each other and, if necessary, the larger coder group. Participants were considered to have achieved full competence to deliver the intervention if they met six criteria (all of which were dichotomous cutoffs based on the coded behaviors).
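To illustrate how a coded transcript of this kind can be summarized, the sketch below tallies the proportion of sentences in each mutually exclusive category, plus the proportion carrying the non-exclusive “influencing” flag. The data structure and example sentences are our illustrative assumptions, not the study’s actual coding software.

```python
from collections import Counter

# Each coded sentence: (category, is_influencing). Data are invented for
# illustration, mirroring the coding scheme described above.
session = [
    ("restatement", False),
    ("open_question", False),
    ("closed_question", False),
    ("restatement", True),      # a restatement that also attempts to influence
    ("self_disclosure", False),
    ("other", False),
]

def code_proportions(coded_sentences):
    """Return the proportion of sentences in each mutually exclusive category,
    plus the proportion flagged as 'influencing' (a non-exclusive marker)."""
    n = len(coded_sentences)
    counts = Counter(category for category, _ in coded_sentences)
    props = {category: count / n for category, count in counts.items()}
    props["influencing"] = sum(flag for _, flag in coded_sentences) / n
    return props

props = code_proportions(session)
print(props["restatement"])   # 2 of the 6 sentences are restatements
```

Because “influencing” can co-occur with any main category, it is tallied separately rather than folded into the mutually exclusive counts.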

Additionally, participants rated the helpfulness of the mock CMH sessions with the CMH Session Reaction Scale (CSRS), a modified version of the Revised Session Reaction Scale (RSRS; Elliott, 1993), an instrument with which clients rate psychotherapy sessions. The CSRS has two subscales, task reactions (progress toward problem resolution through insight, emotional relief, or problem-solving; coefficient α = .86–.92) and relationship reactions (feeling understood by, connected to, and supported by one’s partner; coefficient α = .84–.95).
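For readers unfamiliar with coefficient α (Cronbach’s alpha), the sketch below computes it from scratch for a small items-by-respondents matrix; the toy ratings are invented for illustration and are not the study’s data.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    item_scores: a list of items, each a list of the same respondents' scores.
    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(total scores)).
    """
    k = len(item_scores)            # number of items
    n = len(item_scores[0])         # number of respondents

    def variance(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(variance(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))
```

For example, three items rated by four respondents, `[[3, 4, 5, 2], [2, 4, 5, 3], [3, 5, 4, 2]]`, yield α ≈ .89, in the range the subscales above report.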

Data analysis. For each outcome variable, we used Bayesian multilevel models to (1) estimate the amount of change in each behavior from pre- to post-course (aggregating across participants from both groups) and (2) test whether the amount of change between the first two sessions was different for the training group and the waitlist group. Multilevel modeling was necessary to account for clustering of data. A Bayesian approach was used because of its flexibility for modeling different forms of outcome variables, and because models would have been unidentified under frequentist maximum likelihood estimation. Due to space considerations, we present only the more easily-interpretable descriptive statistics and simply describe the conclusions implied by the Bayesian models (which will appear in a forthcoming article) in the text.
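As a sketch of what such a model might look like for a continuous outcome (the symbols and distributional choices here are our illustrative assumptions, not the study’s exact specification), a Bayesian multilevel model with a pre/post-training effect and a random intercept for dyad-level clustering could be written:

```latex
y_{ijt} = \beta_0 + \beta_1\,\mathrm{post}_{t} + u_j + \varepsilon_{ijt},
\qquad u_j \sim \mathcal{N}(0, \sigma_u^2),
\qquad \varepsilon_{ijt} \sim \mathcal{N}(0, \sigma_\varepsilon^2),
```

where $y_{ijt}$ is the outcome for participant $i$ in dyad $j$ at session $t$, $\mathrm{post}_t$ indicates a post-training session, $\beta_1$ is the pre- to post-course change, and $u_j$ absorbs dependence between partners in the same dyad. In a Bayesian fit, priors would be placed on $\beta_0$, $\beta_1$, $\sigma_u$, and $\sigma_\varepsilon$; count and proportion outcomes would swap the normal likelihood for an appropriate alternative.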

Results

Nine dyads (30% of participants) withdrew from the study prior to completion, primarily because one partner of each pair was concerned about time demands; sensitivity analyses suggest that attrition did not substantially bias these results.

Table 1 displays means, standard deviations, and within-person effect sizes for change in each outcome variable from before to after taking the course, aggregating across both the immediate and waitlist groups. There was evidence that participants changed their listening behaviors in the desired directions from pre- to post-training for most variables, and that this change was a consequence of taking the course. Specifically, the course caused a substantial decrease in the number of sentences uttered; a decrease in the proportion of sentences classified as attempts to influence the speaker, as self-focused talk, and as “other” (mostly off-topic); and an increase in the proportion of sentences classified as restatements. The effect of training on open-ended questions was less clear: There was evidence for change from before to after taking the course, but the magnitude was small enough that there was no “significant” difference between the training and waitlist groups. There was no evidence of an effect of training on closed-ended questions or miscellaneous sympathetic utterances. None of the participants were deemed competent to deliver CMH prior to taking the course, whereas 18 participants (30.0% of the full sample, 42.8% of completers) achieved competence after the course. In terms of perceived helpfulness, sessions that took place after training were viewed as more productive, but there was no change in how participants felt about their relationships with their partners.

Discussion

These results indicate that a massively scalable online course, at least one designed with evidence-based pedagogical techniques in mind, can substantially change nonprofessionals’ helping behaviors. For the most part, participants displayed strong active listening skills throughout post-training sessions, as measured through direct observation of objective criteria. Participants also reported that their post-training sessions were more productive in resolving their concerns, which suggests that the behaviors prescribed in the course could have positive effects on psychological well-being.

However, this course will require refinement before it can be launched for the general public. To decrease attrition, we plan to reduce its length and use participants’ qualitative feedback to make the course more engaging. Additionally, we had hoped that 80% of completers would achieve competence, but only about half of that proportion met all competence criteria. In examining the reasons for failure, we found that most participants missed only one criterion; in hindsight, that criterion may have been too stringent, given that their behaviors appeared qualitatively acceptable. The course’s only moderate success in promoting full competence may be, then, less of a concern than it initially appears, but we still plan to adjust the course to further improve learners’ post-training performance.

More broadly, by demonstrating that nonprofessionals can learn therapeutic skills from an online course, this study opens the door for a variety of strategies for using this vehicle to improve public health, increasing access to low-level care at essentially no cost. Versions of the course could be developed for different target populations, and lessons targeting specific symptoms or concerns could be tested and added. Courses like this could also be used to increase the efficiency of training for professional psychotherapists, counselors, social workers, and certified peer specialists.

Such large-scale, peer-delivered interventions also have the potential to advance psychotherapy research. Although perhaps the most rigorous method for investigating change mechanisms is to directly manipulate a putative mechanism, component treatment studies often produce null results (Bell, Marcus, & Goodlad, 2013), probably because the small effect of adding or removing a component is drowned out by noise in small samples. Once CMH or similar courses are launched to the general public, large component studies can be performed on the user base through “A/B testing,” in which users are randomized to different versions of the training course website. Thus, Internet-delivered peer trainings can eventually act as a “laboratory” to test psychotherapy change mechanisms.
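The “A/B testing” idea above can be sketched in a few lines: hash a user identifier to assign each visitor to one course version, so the same user always sees the same arm. The function name, user IDs, and variant labels are hypothetical, for illustration only.

```python
import hashlib

def assign_variant(user_id, variants=("course_a", "course_b")):
    """Deterministically assign a user to a course version.

    Hashing the user ID (rather than random draws at each visit) guarantees
    that a returning user always lands in the same experimental arm.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

With a large enough user base, comparing outcomes between arms defined this way gives the component-level contrasts that small trials lack the power to detect.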

Cite This Article

Bernecker, S. L., & Constantino, M. J. (2018). Making psychotherapy scalable by teaching nonprofessionals to deliver treatment to each other. Psychotherapy Bulletin, 53(1), 43-48.

References

Bell, E. C., Marcus, D. K., & Goodlad, J. K. (2013). Are the parts as good as the whole? A meta-analysis of component treatment studies. Journal of Consulting and Clinical Psychology, 81(4), 722-736. doi:10.1037/a0033004

Berman, J. S., & Norton, N. C. (1985). Does professional training make a therapist more effective? Psychological Bulletin, 98(2), 401-407. doi:10.1037/0033-2909.98.2.401

Bernecker, S. L., Banschback, K., Santorelli, G. D., & Constantino, M. J. (2017). A Web-disseminated self-help and peer support program could fill gaps in mental health care: Lessons from a consumer survey. JMIR Mental Health, 4(1), e5. doi:10.2196/mental.4751

Cuijpers, P., Driessen, E., Hollon, S. D., van Oppen, P., Barth, J., & Andersson, G. (2012). The efficacy of non-directive supportive therapy for adult depression: A meta-analysis. Clinical Psychology Review, 32(4), 280-291. doi:10.1016/j.cpr.2012.01.003

Cuijpers, P., van Straten, A., Andersson, G., & van Oppen, P. (2008). Psychotherapy for depression in adults: A meta-analysis of comparative outcome studies. Journal of Consulting and Clinical Psychology, 76(6), 909-922. doi:10.1037/a0013075

Elliott, R. (1993). The Revised Session Reaction Scale [Measurement instrument]. Toledo, OH: University of Toledo.

Kessler, R. C., Demler, O., Frank, R. G., Olfson, M., Pincus, H. A., Walters, E. E., … Zaslavsky, A. M. (2005). Prevalence and treatment of mental disorders, 1990 to 2003. New England Journal of Medicine, 352(24), 2515-2523. doi:10.1056/NEJMsa043266

Mojtabai, R., Olfson, M., Sampson, N. A., Jin, R., Druss, B., Wang, P. S., … Kessler, R. C. (2011). Barriers to mental health treatment: Results from the National Comorbidity Survey Replication. Psychological Medicine, 41(8), 1751-1761. doi:10.1017/S0033291710002291

Rotheram-Borus, M. J., Swendeman, D., & Chorpita, B. F. (2012). Disruptive innovations for designing and diffusing evidence-based interventions. American Psychologist, 67(6), 463-476. doi:10.1037/a0028180

Taylor, P. J., Russ-Eft, D. F., & Chan, D. W. L. (2005). A meta-analytic review of behavior modeling training. Journal of Applied Psychology, 90(4), 692-709. doi:10.1037/0021-9010.90.4.692

van Ginneken, N., Tharyan, P., Lewin, S., Rao, G. N., Meera, S. M., Pian, J., … Patel, V. (2013). Non-specialist health worker interventions for the care of mental, neurological and substance-abuse disorders in low- and middle-income countries. Cochrane Database of Systematic Reviews, (11). doi:10.1002/14651858.CD009149.pub2
