SuicideTech: Enhancing Suicide Prevention and Postvention

As of 2019, the World Health Organization reported that close to 800,000 people die by suicide every year; that is one person every 40 seconds. Suicide is the third leading cause of death among 15-19-year-olds, and 79% of global suicides occur in low- and middle-income countries. Pesticide ingestion, hanging, and firearms are among the most common methods of suicide globally. The single most important risk factor for suicide is a previous suicide attempt.

Even as suicide remains a vexing public health challenge, many are unaware of the SuicideTech space: technologies designed for suicide prevention and postvention. A scan of this tech space turns up eight distinct, and often integrated, categories designed to address suicide. I was inspired to prepare this subcategory review by a podcast I recently recorded with The Center for Suicide Awareness.

The SuicideTech categories we identified include:

Text messaging is among the most prevalent digital interventions in this space. These services provide access to counsel and support between crisis centers and those who need help. Examples include Crisis Text Line, The Alex Project, and Lifeline Chat. Facebook and other social media also carry versions of these on their platforms. A good differentiating aspect of this category is its ubiquity and anonymity.

Crisis Text Line, one of the notable digital suicide interventions.

Social media is a relevant intervention point for suicide, especially because of its social nature: it is where we seek community and express the ups and downs of our lives. These platforms have provided tools that allow ready access to support services and prevention information, and they have the potential to passively detect behavioral patterns that point to a propensity for suicidal behavior. Examples include Facebook, YouTube, Instagram, Twitter, and Twitch. The major platforms have policies and guidance on what to do if you suspect that someone in your community is at risk of attempting suicide. A good differentiating aspect of this category is its facilitation of peer and community support.

Social media platforms are taking suicide seriously.

Digital content is less a tech than the information that tech transmits. We see an increasing number of suicide awareness and prevention campaigns on every digital channel imaginable, and in every digital format, from text and images to infographics, audio, and video. Content being what it is, it can always be more engaging and effective at easing users into getting the help they need. I have seen at least one such proposal from Lucas Chae, a UX designer who posted a suicide intervention prototype for search engines that goes beyond the standard suicide prevention hotline result we normally see. His proposal presents situational questions and suggestions that ease the user toward seeking help; a hypothetical sketch of such a flow appears below. Please let me know if you are aware of other novel examples.

Lucas Chae's proposal for a novel search engine response to suicide-related queries.
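
To make that concrete, here is a hypothetical sketch of a situational-question flow of the kind Chae's prototype suggests. The questions, answer options, and suggestions below are invented placeholders for illustration, not content from his actual design.

```python
# Hypothetical sketch of a situational-question flow; the questions, options,
# and suggestions are invented placeholders, not Lucas Chae's actual content.

FLOW = {
    "start": {
        "question": "What's weighing on you right now?",
        "options": {"I feel alone": "alone", "I'm overwhelmed": "overwhelmed"},
    },
    "alone": {
        "suggestion": "Talking to someone can help. Would you like to chat with a counselor?",
        "options": {"Yes": "connect_counselor", "Not yet": "self_help"},
    },
    "overwhelmed": {
        "suggestion": "A short grounding exercise may help, or you can talk to someone now.",
        "options": {"Try the exercise": "self_help", "Talk to someone": "connect_counselor"},
    },
    # "connect_counselor" and "self_help" would be terminal steps that surface
    # a crisis line, a chat hand-off, or coping content.
}

def next_step(state: str, choice: str) -> str:
    """Advance the flow based on the option the user selected."""
    return FLOW[state]["options"].get(choice, state)

print(next_step("start", "I feel alone"))  # -> "alone"
```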

Mobile apps in this space have been developed notably to aid safety planning against subsequent attempts. Examples include the MY3 app for consumers, which supports safety planning by designating three support contacts and putting a suicide safety plan in place; a minimal sketch of such a plan appears below. Additional apps include NotOK, Suicide Lifeguard, and Suicide Safety Plan. The U.S. Substance Abuse and Mental Health Services Administration (SAMHSA) has also produced the Suicide Safe mobile app for healthcare providers who care for patients at risk of suicide. Additionally, we see a number of mobile apps designed to deliver dialectical behavioral therapy (DBT), a form of cognitive behavioral therapy that is particularly effective for suicidal ideation. DBT examples include Calm Harm, DBT Diary Cards & Skill Coach, and DBT Coach. Efficacy evidence is mixed, as too many apps still contain potentially harmful information and underutilize the interactive and data capabilities of mobile tech. A good differentiating aspect of this category is its mobile convenience and immediacy.

MY3 app supporting suicide postvention.
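
To make the safety-planning idea concrete, here is a minimal sketch of how such a plan might be represented. The field names and the three-contact rule are assumptions modeled loosely on MY3's description above, not the app's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only: the fields below are assumptions modeled loosely on
# MY3-style safety planning, not the app's actual schema.

@dataclass
class SafetyPlan:
    warning_signs: List[str]       # personal signs that a crisis may be building
    coping_strategies: List[str]   # things the person can try on their own first
    support_contacts: List[str]    # the three designated people to reach out to
    crisis_lines: List[str] = field(
        default_factory=lambda: ["National Suicide Prevention Lifeline"]
    )

plan = SafetyPlan(
    warning_signs=["withdrawing from friends", "sleeplessness"],
    coping_strategies=["go for a walk", "breathing exercise"],
    support_contacts=["Sam", "Alex", "Jordan"],
)
assert len(plan.support_contacts) == 3  # MY3's premise: three named supporters
```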

Chatbots, also known as conversational agents, have been developed to provide prevention and safety planning resources that are on demand and sometimes anonymous. They are a near cousin of social media in which the "social" is with a robot, and they are also embedded in social media chat, as with Facebook (Woebot) and Twitter (BeALifeLine). They can also facilitate crisis triage to a human being; a minimal sketch of that hand-off appears below. Drawbacks include limited efficacy and safety research, deficits in empathy and human dialogue, weak risk assessment, and data security concerns. A good differentiating aspect of this category is its conversational context and mimicry, though as yet imperfect.

LifeLine chatbot triaging crises.
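
To illustrate the triage idea, here is a minimal sketch of how a chatbot might hand a conversation off to a human counselor. The keyword list and escalation logic are illustrative assumptions only; real systems rely on trained risk models and clinical review rather than simple keyword matching.

```python
# Minimal, illustrative sketch of chatbot-to-human crisis triage; not the logic
# of Woebot, BeALifeLine, or any real product.

CRISIS_PHRASES = {"suicide", "kill myself", "end my life", "self-harm"}

def triage(message: str) -> str:
    """Route a user message either to self-help content or to a human counselor."""
    text = message.lower()
    if any(phrase in text for phrase in CRISIS_PHRASES):
        # Hand off immediately so a human counselor can join the conversation.
        return "ESCALATE_TO_HUMAN"
    return "CONTINUE_SELF_HELP"

print(triage("I just feel a bit flat today"))        # CONTINUE_SELF_HELP
print(triage("I have been thinking about suicide"))  # ESCALATE_TO_HUMAN
```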

Videogames are an excellent example of what I call embedded mentaltech: a case where we do not rely on consumers coming to us, but instead go to where they are. Social media is another context of this sort, and with the rise of Fortnite, Twitch, and Discord, videogames are becoming social media. A good differentiating aspect of this category is providing support within the flow of an experience the player audience is already engaged in.

Virtual reality (VR) is a surprise to find in this space. The Black Dog Institute and the University of New South Wales are developing a VR application to address suicide prevention. Their VR experience, The Edge of the Present, is designed to provide hopeful and mood-altering virtualscapes with an eye to reducing suicidal ideation. A 2019 study conducted at The Big Anxiety event showed promise, though a long road to solid evidence lies ahead. We also found VR empathy training developed by Axon, a developer of first responder technology and training, to improve first responders' de-escalation interactions with people at risk of suicide. We further found that Samsung is doing VR work to address suicide in South Korea. A good differentiating aspect of this category is the provision of immersive simulations for training.

Axon develops VR simulations to train first responders in suicide de-escalation.

Detection and predictive algorithms are also being used to aid suicide prevention. The elemental inputs enabling this area include speech analysis, facial emotion recognition, motion tracking, natural language processing, mood monitoring, and electrodermal (skin) and brainwave detection. Proposals in this area include: a) mimicking a human clinician's mental state evaluation with algorithmic tech to yield a digital mental state evaluation that is nearly as good as a human's; b) combining mood-tracking app data and blood-based genetic biomarkers to predict suicidality in women (Levey, 2016); c) predicting suicide risk in soldier populations (Kessler, 2015); and d) developing continuous remote monitoring systems that use sensors, cloud computing, and big data to build personalized, real-time mental state and suicide risk models. A minimal illustration of one of these inputs, mood monitoring, appears below.
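
To ground one of those elemental inputs, here is a minimal sketch of mood monitoring that flags a sustained decline in self-reported mood scores. The window size and threshold are arbitrary illustrative values, not validated clinical cut-offs, and a real system would combine many signals under clinical oversight.

```python
from statistics import mean

# Illustrative mood-monitoring sketch: the 7-day window and 3.5 threshold are
# arbitrary assumptions for demonstration, not validated clinical cut-offs.

def sustained_decline(mood_scores, window=7, threshold=3.5):
    """Return True if the average of the most recent `window` scores (1-10) falls below `threshold`."""
    if len(mood_scores) < window:
        return False  # not enough data yet
    return mean(mood_scores[-window:]) < threshold

daily_moods = [6, 6, 5, 5, 4, 3, 3, 3, 2, 3]  # hypothetical daily self-reports
print(sustained_decline(daily_moods))          # True -> prompt a check-in or surface resources
```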

These possibilities would make anyone familiar with digital technology's sinister repercussions for privacy, confidentiality, and stigma cringe. Correspondingly, due vigilance and accountability in these areas MUST NOT be neglected or forgotten.

Going forward, we need deeper insight into the contexts of suicide attempts to understand the barriers that would keep tech from functioning effectively in a crisis, such as uncharged devices and unchecked notifications.

It is also critical that we understand that these technologies are intended to augment, not replace, professionals.

Finally, it is critical that those at risk of suicide be significantly engaged in the design and experience of SuicideTech.

We welcome your further comments and resource recommendations in response to this post.

And, by the way, please subscribe to our newsletter, participate in our Digital Stress Management Survey, and see more about our work at digitalmentalhealthproject.com.

Be well.

___________

Further supportive reading:

  1. Kessler RC, Warner CH, Ivany C, et al. Predicting suicides after psychiatric hospitalization in US Army soldiers: the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). JAMA Psychiatry. 2015;72(1):49-57.
  2. Levey DF, Niculescu EM, Le-Niculescu H, et al. Towards understanding and predicting suicidality in women: biomarkers and clinical risk assessment. Mol Psychiatry. 2016;21(6):768-785. doi:10.1038/mp.2016.31
  3. Vahabzadeh A, Sahin N, Kalali A. Digital suicide prevention: can technology become a game-changer? Innov Clin Neurosci. 2016;13(5-6):16-20.
  4. Witt K, Spittal MJ, Carter G, et al. Effectiveness of online and mobile telephone applications ('apps') for the self-management of suicidal ideation and self-harm: a systematic review and meta-analysis. BMC Psychiatry. 2017;17:297. https://doi.org/10.1186/s12888-017-1458-0

