Several years ago, Evette Ludman and I undertook a focus group study to learn about early dropout from psychotherapy. We invited health system members who had attended a first therapy visit for depression and then did not return. Only about one-third of people we invited agreed to speak with us, but that’s a pretty good success rate for focus group recruitment.
We soon learned, however, that the one-third of people who joined our focus group were not the people we needed to hear from. Many were veterans of long-term psychotherapy who had returned for a single “refresher” visit. Some had been seeing therapists in private practice (using other insurance) and scheduled a visit with a new therapist to explore other options. None were people starting treatment for depression who gave up after a single visit. Our first focus group turned into a hypothetical discussion of why some other people might give up on therapy after just one visit.
In retrospect, we should have realized that we wouldn’t learn much about giving up on psychotherapy from people who volunteer to join a focus group about psychotherapy. People living with depression who are frustrated or discouraged about treatment don’t tend to become motivated research volunteers. We probably should have published something about that experience, but I’m still waiting for someone to establish The Journal of Instructive Failure.
That instructive failure, however, did shape our subsequent research about outreach to increase engagement in mental health treatment. Outreach and engagement interventions have been a major focus of our research, but we don’t study engagement interventions among people who are already engaged. We aim to reach people who are disconnected, discouraged, and convinced that treatment has nothing to offer. Volunteering to participate in research to increase engagement in treatment should probably make someone ineligible for research on that topic. For example, if we hope to learn whether outreach to people at risk can reduce suicide attempts, we certainly shouldn’t limit our research to people who volunteer for a study of outreach to prevent suicide attempts.
If we hope to find those who have been lost, we’ll have to look outside of the bright light under the lamppost. So our studies of outreach or engagement interventions follow a “randomized encouragement” design. We identify people who appear to need services but are not receiving them. We randomly assign some people to receive extra outreach, such as messages and phone calls to offer support and problem-solve barriers to getting mental health care. The rest continue to receive their usual care.
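To make that design concrete, here is a minimal sketch of the assignment step in Python. The record fields, the flagging logic, and the fifty-fifty split are my own illustrative assumptions, not the actual MHRN study code or data schema.

```python
# Toy sketch of randomized encouragement: people flagged in health system
# records as having apparent unmet need are randomly split between extra
# outreach and usual care. Field names here are hypothetical.
import random

def assign_outreach(records, outreach_fraction=0.5, seed=42):
    """Randomly assign flagged people to the outreach arm or usual care."""
    rng = random.Random(seed)
    eligible = [r["person_id"] for r in records if r.get("flagged_unmet_need")]
    rng.shuffle(eligible)
    n_outreach = int(len(eligible) * outreach_fraction)
    return {
        "outreach": eligible[:n_outreach],    # messages and calls offering support
        "usual_care": eligible[n_outreach:],  # no extra contact
    }

# Example with made-up records: three flagged people, one not flagged.
records = [
    {"person_id": "A", "flagged_unmet_need": True},
    {"person_id": "B", "flagged_unmet_need": True},
    {"person_id": "C", "flagged_unmet_need": False},
    {"person_id": "D", "flagged_unmet_need": True},
]
print(assign_outreach(records))
```

The key point the sketch tries to capture is that eligibility comes from records, not from volunteering; no one has to opt in before randomization.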
That real-world research design answers the question we care about: Among people who appear to have unmet need, will implementing an outreach intervention increase engagement in treatment – and ultimately lead to better outcomes? That’s the design of our MHRN Suicide Prevention Outreach Trial, testing two outreach interventions for people at risk of suicidal behavior. And it’s the design of our MHRN pilot study of Automated Outreach to Prevent Depression Treatment Dropout, testing systematic outreach to people who appear to have discontinued medication or dropped out of psychotherapy.
That real-world randomized encouragement design does impose some requirements, but I think they are features rather than bugs. First, we must be able to identify people with unmet need before they ask us for help. That’s been a central focus of our MHRN research, including our recent research on predicting suicidal behavior. Second, we must be able to use health system records to assess any impact or benefit. Relying on traditional research interviews or surveys would take us back to the problem of assessing outreach or engagement among people who volunteer to participate in research interviews or surveys. Third, any benefit of an outreach or engagement intervention is diluted by the absence of benefit among those who do not participate. But that diluted effect is the true effect, if what we care about is the real-world effect of an outreach program.
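To put the dilution point in numbers, here is a toy calculation; the figures are invented, and it assumes no benefit for people the outreach never reaches.

```python
# Toy illustration of dilution: the numbers are made up, and it assumes
# zero benefit for people who are offered outreach but never engage.

def population_effect(uptake, effect_in_participants):
    """Real-world (intent-to-treat) effect when only a fraction engage."""
    return uptake * effect_in_participants

# If 30% of the outreach arm engages, and engagement yields a 20-point
# improvement, the population-level effect is 6 points.
print(population_effect(uptake=0.30, effect_in_participants=20.0))  # 6.0
```

That 6-point effect is much smaller than the 20-point effect among participants, but it is the honest estimate of what an outreach program delivers across everyone it is offered to.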
Outreach interventions might seem to work well right under the lamppost, but that’s not where people get lost or left out.
Greg Simon