A recent news article about the European Union’s new privacy rules prompted me to think more about population-based suicide prevention programs. Balancing caring outreach with respect for privacy is difficult.
Our health systems are in various stages of implementing systematic programs to identify people at high risk of self-harm or suicide. These programs are triggered by the standard depression questionnaires our patients complete before clinic visits. Whenever a patient reports frequent thoughts of death or self-harm, the treating clinician is expected to ask follow-up questions about suicide risk. That seems only natural; most patients who report thoughts of suicide would be unpleasantly surprised if no one bothered to ask.
Our Suicide Prevention Outreach Trial extends that caring outreach, but with an added layer of separation. Outreach specialists follow up on those visit questionnaires with an online message or phone call during the following week. Most people are grateful for that extra outreach, but a few respond, “Who are you? How do you know what I told my doctor in private?”
Our health systems are considering new programs with one more level of separation. Outreach messages or calls could be triggered by an algorithm that identifies risk from health records data. In other words, the program could be triggered by “something a computer found that I never even told my doctor.” The outreach calls or messages would come from a stranger (albeit a kind stranger) representing the health system. I suspect we’ll hear a few more complaints about invasions of privacy. But should that stop us from trying?
Of course, serving up personalized invitations based on machine learning algorithms is the core business model of social media companies. While invitations from social media apps usually ask us to buy products and services, social media superpowers can also be used for good. For example, Facebook (with help from our colleague Ursula Whiteside) has developed caring outreach interventions for people at risk for suicide. Those interventions were originally directed at people identified by other (human) Facebook users. The program can now be activated by algorithms that continuously monitor a range of data types: text or photos in Facebook posts, comments from friends, and even Facebook Live audio and video streams.
Except in Europe. European data protection rules strictly limit how personal data can be used or shared without explicit consent. Facebook worries that mining social media data to identify people at risk for self-harm would violate those rules. So Facebook’s algorithm-driven suicide prevention outreach won’t be implemented for European users. I think that’s unfortunate. But I have to acknowledge the legitimate view that it’s intrusive or creepy to mine social media data and flag people at risk for self-harm.
Whether outreach is supportive or intrusive depends, of course, on what we do when we reach out. In our outreach programs, we are clear about the boundaries. We reach out to express concern, offer support, suggest resources, and facilitate connection with care. If people ask to be left alone, we stop.
Nevertheless, some people are bothered by the fact that strangers are drawing conclusions based on information that’s usually private – especially conclusions about things (like suicidal thoughts) that are often stigmatized. Some people will be offended that we know (or think we know) they are at risk, regardless of what we do or don’t do about it.
If you have followed the discussion about European privacy regulations, you may recognize one rallying cry of the privacy advocates: “The right to be forgotten.” That slogan asserts a right to control – or even erase – any private information. But it feels a bit eerie when applied to suicide risk. Being forgotten sounds too much like the isolation or disconnectedness that increases risk for suicide. When it comes to suicide prevention, I’d prefer not to acknowledge that absolute European right to be forgotten. I’ll remain a perky (and possibly creepy) American, saying, “Excuse us, but we won’t forget about you!”
Greg Simon