Psychotherapy Articles


Evaluating the Impact of Digital CBT Lesson Completion on Clinical Outcomes


This paper reports the latest results from a series of studies investigating predictors of outcomes for users of an online, self-guided Cognitive Behavioral Therapy (CBT) platform. Each disorder-specific module on the platform consists of lessons with educational slides that include interactive exercises. This study investigates the number of slides completed for each lesson and the associated improvement. Outcome questionnaires are completed at every lesson in the module. The results indicate minimal variability in the number of slides completed, as users generally complete a high percentage of the slides in each lesson. The resulting outcomes, reported as an effect size, were comparable to results from a large sample of adults receiving eight or fewer sessions of general outpatient psychotherapy.


Research into the clinical value of behavioral health products known as Digital Therapeutics (DTx) has matured over the past decade. A series of studies into one such digital product, Learn to Live, has demonstrated generally high levels of effectiveness compared to benchmarks, along with variability in results depending on the degree of personal support received by the user and the number of homework assignments completed (Brown & Jones, 2021; Brown & Jones, 2022a; Brown & Jones, 2022b; Brown & Jones, 2022c).

In addition to referencing published research, benchmarks have been established based on a large proprietary database of individuals in general outpatient psychotherapy generated by the ACORN Collaboration. Several thousand mental health providers have participated in the ACORN Collaboration over the past 15 years by submitting patient self-report measures of outcome at every session, resulting in what is arguably the largest database on mental health outcomes available for analysis by independent researchers.

The ACORN database provides the routine outpatient real-world psychotherapy sample outcome benchmark against which the Learn to Live results are compared (Brown et al., 2015a; Brown et al., 2015b; Brown et al., 2020; Mahon et al., 2023). The benchmark for the results of well-conducted clinical trials is derived from the comprehensive analysis by Wampold & Imel (2015).

Previous studies have documented how outcomes are favorably impacted by personal coaching (Brown & Jones, 2021), automated text message support (Brown & Jones, 2022a), and support from family and friends (“teammates”) who encourage program completion (Brown & Jones, 2022b). These studies found independent benefits from each form of support.

Having established that users improved outcomes significantly with each type of support, the next research question was how outcomes vary according to the utilization of platform resources. Learn to Live’s clinical modules present therapeutic exercises based on Cognitive Behavioral Therapy (CBT) for specific disorders. Completion rates for “homework” or practice exercises were studied, and higher levels of completion were associated with better outcomes (Brown & Jones, 2022c). The current study expands on how engagement in therapeutic work—or using platform resources—impacts clinical outcomes.

This study evaluates how outcomes vary by the extent to which each lesson in a clinical module is completed. Lessons are broken down into interactive screens, or "slides," with each lesson consisting of more than 20 slides. Slide completion measures the degree to which the user completed each lesson. The first hypothesis is that there is significant variation among users in the average number of slides completed for each lesson. The second hypothesis is that outcomes improve according to the number of slides completed. Outcomes are measured by pre-post change at the last lesson completed.

Description of the sample

The sample selected from the Learn to Live database includes all users who enrolled between 12/1/2022 and 6/30/2023. In addition, users had to complete at least two lessons (and the accompanying clinical measure) in a clinical module. A minimum of two scores is needed to calculate pre-post change and the Severity-Adjusted Effect Size (SAES), which has been used in all previous articles.

A total of 2,882 users met these selection criteria. Of these, 2,143 (74%) had intake scores in a clinical range, indicating a level of symptoms significantly higher than the general population. This ratio is typical of individuals seeking outpatient mental health care.


Learn to Live uses three different questionnaires depending on the lesson modules: depression (PHQ-9; Kroenke et al., 2001), social anxiety (SPIN-17; Connor et al., 2000), and general stress and worry (GAD-7; Spitzer et al., 2006). Consistent with the methodology used in meta-analyses of multiple studies using different questionnaires, the pre-post change is reported as an effect size, which in this case was calculated by dividing the pre-post change by the standard deviation of each questionnaire.
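The effect-size calculation described above can be sketched as follows. This is a minimal illustration with made-up scores, not study data; the platform's Severity-Adjusted Effect Size (SAES) additionally adjusts for intake severity, which is not shown here.

```python
# Minimal sketch of a pre-post effect size: mean change from intake to
# last completed lesson, divided by the questionnaire's standard
# deviation. The scores and SD below are hypothetical illustrations.

def effect_size(pre_scores, post_scores, questionnaire_sd):
    """Mean pre-post change divided by the instrument's SD."""
    changes = [pre - post for pre, post in zip(pre_scores, post_scores)]
    mean_change = sum(changes) / len(changes)
    return mean_change / questionnaire_sd

# Example: hypothetical PHQ-9 scores for three users (higher = worse).
pre = [15, 12, 18]
post = [9, 10, 11]
print(round(effect_size(pre, post, questionnaire_sd=6.0), 2))  # → 0.83
```

Requiring at least two scores per user, as in the sample selection above, is what makes the pre-post difference in this calculation defined.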

The current study includes a new statistic. A general linear model calculates a residual slide completion score at each lesson. This statistical method estimates how much each user's number of completed slides differs from the average at each lesson, thus enabling testing of the hypothesis that greater-than-average slide completion is associated with greater clinical change.
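The residual computation can be sketched as below. The paper does not specify the exact model, so this sketch assumes the simplest case, where lesson number is the only predictor and the general linear model reduces to per-lesson means; the slide counts are hypothetical.

```python
import numpy as np

# Hypothetical slide counts: rows = users, columns = lessons 1-3.
slides = np.array([
    [22, 21, 20],
    [25, 24, 23],
    [18, 17, 15],
], dtype=float)

# Average number of slides completed at each lesson, across all users.
lesson_means = slides.mean(axis=0)

# Residual slide completion: each user's deviation from that lesson's
# average (with lesson as the only predictor, the model's fitted value
# at each lesson is simply the lesson mean).
residuals = slides - lesson_means

# A user's average residual summarizes whether they completed more or
# fewer slides than typical, independent of how far they progressed.
user_scores = residuals.mean(axis=1)
print(user_scores)
```

Because the residuals at each lesson sum to zero across users by construction, a positive score flags above-average completion and a negative score below-average, which is what the correlational test against outcomes requires.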


The average effect size for the entire sample is 0.59, with users completing an average of 3.7 lessons comprising over 80 slides. However, these aggregate numbers tell only part of the story of how utilization and outcome vary. Users completing five or more lessons showed much more change, with an average effect size of 0.98. The benchmark from well-conducted clinical trials is an effect size of 0.8, comparable to results in the ACORN database for therapy clients completing five to eight sessions. Furthermore, while 27% of the users in the Learn to Live sample completed at least five lessons, an identical percentage of ACORN clients completed this number of sessions.

Table 1 provides the results for users in the Learn to Live sample. By comparison, Table 2 summarizes results for psychotherapy patients in the ACORN database. These results are broken down by completion levels and range from two to eight lessons or sessions.

Table 1: Learn to Live sample

Last lesson completed        2       3       4       5       6       7       8
N                          984     339     225     154      81      31     329
% of all users             46%     16%     10%      7%      4%      1%     15%
Effect size               0.35    0.51    0.73    0.87    0.90    1.27    1.02
Total slides completed   44.00   65.21   82.10  102.08  117.54  145.71  166.24

By way of comparison, Table 2 provides results for patients in the ACORN database ending therapy after completing two to eight sessions.

Table 2: ACORN database comparison sample

Last session completed       2       3       4       5       6       7       8
% of sample                  —     22%     14%     10%      7%      6%      4%
Effect size               0.58    0.76    0.93    0.89    0.94    0.94    0.97

The Learn to Live slide completion analysis suggests that users completed slightly fewer slides as they worked through the modules and completed more lessons. The average number of completed slides per lesson was over 22 during lessons two and three. However, this fell to an average of 20 for all subsequent lessons, a relatively minor decrease.

The research hypothesis was that users completing more slides per lesson would have better outcomes. However, this hypothesis must be rejected because there was no meaningful association between the number of slides completed and the outcome. The reason is telling: contrary to assumptions, the variability in the number of slides completed was insufficient to have predictive power.

Learn to Live users tended to complete virtually all slides in each lesson. This uniform diligence was unexpected. It may indicate that the interactive experience of working through these digital lessons is compelling, but of course, additional study is needed to understand this finding.

Discussion and implications for quality improvement

This study rejects the hypothesis that slide completion per lesson is associated with outcome, along with the assumption behind it: that there would be enough variance in slide completion to enable correlational analyses. Without this variability, the power of statistical modeling is severely limited. The goal was to supplement previous Learn to Live analyses regarding the impact of resource utilization on clinical outcomes. However, this study suggests that there may be something unique about using digital resources that needs further exploration.

This study’s finding is encouraging. Users find the slides useful throughout the lessons of a clinical module, and they tend to complete a high percentage of slides with each lesson. This finding’s implications relate to two fundamental issues for digital therapeutics products: enrollment and engagement.

DTx companies focus on increasing both enrollment and, once users are enrolled, engagement with the platform. If the findings here are corroborated by further study, enrollment is the more critical goal, and its challenge is the more manageable of the two. The reason is simple: because most digital health users complete the clinical modules without external support or encouragement, the resources themselves must be inherently engaging enough that people finish the lessons on their own. This is an essential dimension of the digital experience to understand.

Jeb Brown has been at the forefront of research into so-called "feedback-informed treatment." He is the founder and coordinator of the ACORN Collaboration. Prior to this, he held positions of leadership within United Behavioral Health and Aetna Health Plans.

Cite This Article

Brown, G. S., & Edward, J. (2023, December). Evaluating the impact of digital CBT lesson completion on clinical outcomes. Psychotherapy Bulletin, 59(1), 11-14.


Brown, G. S., Simon, A. E., Cameron, J., & Minami, T. (2015a). A Collaborative Outcome Resource Network (ACORN): Tools for increasing the value of psychotherapy. Psychotherapy, 52(4), 412–421.

Brown, G. S., Simon, A. E., & Minami, T. (2015b). Are you any good…as a therapist? [Web article]. Retrieved from

Brown, J., Jones, E., & Cazauvieilh, C. (2020). Effectiveness for online cognitive behavioral therapy versus outpatient treatment: A session by session analysis. Society for the Advancement of Psychotherapy. Retrieved from

Brown, J., & Jones, E. (2021). Impact of coaching on rates of utilization and clinical change for digital self-care modules based on cognitive behavioral therapy.

Brown, J., & Jones, E. (2022a). Improving results for digital therapeutics.

Brown, J., & Jones, E. (2022b). Improving results for digital therapeutics with social support.

Connor, K. M., Kobak, K. A., Churchill, L. E., Katzelnick, D., & Davidson, J. R. T. (2001). Mini-SPIN: A brief screening assessment for generalized social anxiety disorder. Depression and Anxiety, 14, 137-140.

Kroenke, K., & Spitzer, R. L. (2002). The PHQ-9: A new depression diagnostic and severity measure. Psychiatric Annals, 32(9), 509-515.

Mahon, D., Minami, T., & Brown, G. S. (2023). The variability of client, therapist and clinic in psychotherapy outcomes: A three-level hierarchical model. Counselling and Psychotherapy Research, 23, 761-769.

Spitzer, R. L., Kroenke, K., Williams, J. B., & Löwe, B. (2006). A brief measure for assessing generalized anxiety disorder: The GAD-7. Archives of Internal Medicine, 166(10), 1092-1097. doi:10.1001/archinte.166.10.1092

Wampold, B., & Imel, Z. (2015). The great psychotherapy debate: The evidence for what makes psychotherapy work (2nd ed.). Routledge. (First edition published 2001)

