The science of suicide prevention: Innovative technologies and ethical implications #IASP2019

Suicidal behaviour is a complex phenomenon to understand. With thousands of people tragically affected by suicide, there is an urgent need to understand it in order to develop effective interventions. Despite scientists’ efforts to comprehend and predict suicide, it has been suggested that our current capacity to forecast suicide risk is no better than flipping a coin (Franklin et al., 2017). Although this is not good news for scientists, recent research has shown that new technologies may help us to take a step forward in this enterprise.

#IASP2019 17-21 Sep 2019, Derry

Today is the first day of the 30th World Congress of the International Association for Suicide Prevention, which will be covered by The Mental Elf, who will be live tweeting, podcasting and video streaming from the event in Derry. You can follow everything live on Twitter using the hashtag #IASP2019.

Follow #IASP2019 on Twitter for all the updates from the 30th World Congress of the International Association for Suicide Prevention.

Smartphones

Among the new possibilities is the use of smartphones (van Ballegooijen, 2019). Through a methodological technique called the Experience Sampling Method (ESM), also known as Ecological Momentary Assessment (EMA), scientists can now sample people’s psychological states across the hours of a day and build an in-depth picture of how those states operate as a function of contextual factors (Kirtley, 2019).
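To make this concrete, here is a minimal, hypothetical sketch (in Python) of how an ESM study might generate a day’s semi-random prompt schedule. The sampling window, number of prompts, and times are illustrative assumptions, not details from any study cited here.

```python
import random
from datetime import datetime, time, timedelta

def esm_schedule(day, n_prompts=6, start_hour=9, end_hour=21, seed=None):
    """Draw one prompt time at random within each equal block of the
    waking window, so prompts are spread across the day but cannot be
    anticipated by the participant."""
    rng = random.Random(seed)
    window_minutes = (end_hour - start_hour) * 60
    block = window_minutes // n_prompts
    start = datetime.combine(day, time(start_hour))
    return [start + timedelta(minutes=i * block + rng.randrange(block))
            for i in range(n_prompts)]

# Example: six prompts between 09:00 and 21:00 on the first day of #IASP2019.
for prompt in esm_schedule(datetime(2019, 9, 17).date(), seed=42):
    print(prompt.strftime("%H:%M"))
```

At each prompt, participants would answer a brief questionnaire about their current psychological state, which is what yields the micro-longitudinal data described next.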

By tracking suicide risk at this micro-longitudinal level, scientists have discovered that suicidal ideation and its risk factors often vary considerably over periods as short as 4 to 8 hours (Kleiman et al., 2017). This important evidence may help in the development of more effective and person-specific interventions.

ESM/EMA data has also been collected through smartwatches, which can also capture other health data such as heart rate, body movement, and body temperature. The closer scientists get to people’s day-to-day lives, the better their understanding of suicide risk, and the better the chances of preventing it.

By tracking suicide risk at a micro-longitudinal level, scientists have discovered that suicidal ideation and its risk factors often vary considerably over a period as short as 4 to 8 hours.

Machine learning

A second application of new technology worth mentioning is machine learning (ML), a subfield of artificial intelligence. ML is the study and application of algorithms that improve their knowledge or performance with experience: they can learn from data, identify patterns within it, and provide meaningful information with minimal human intervention (Linthicum et al., 2019).

In the science of suicide prevention, this technology has generally been used with three aims:

  1. Improve prediction accuracy of suicidal thoughts and behaviours
  2. Identify relevant risk factors and the interaction between those factors
  3. Uncover potential subgroups within the data, specifying their profiles according to the information available (Burke et al., 2019).
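As a flavour of the first of these aims, the sketch below (Python with scikit-learn) trains an off-the-shelf classifier on entirely synthetic data. The features and outcome here are fabricated for illustration only; real suicide research uses far richer clinical and administrative variables, careful validation, and ethical oversight.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for risk-factor features (hypothetical), e.g.
# prior attempts, symptom scores, sleep disturbance, recent life events.
n = 5000
X = rng.normal(size=(n, 4))
# Synthetic outcome loosely dependent on the features, for illustration only.
logits = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-(logits - 2)))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# AUC on held-out data: how well the model ranks higher-risk cases first (aim 1).
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
# Feature importances speak to aim 2: which factors drive the predictions.
print("Importances:", model.feature_importances_)
```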

As the evidence suggests that most suicide attempts and deaths occur among individuals who denied experiencing suicidal thoughts at previous healthcare assessments or appointments (e.g., Louzon et al., 2016), Bernecker and colleagues (2019) investigated whether ML could help to address this issue through a multistage data analytic approach. The researchers examined this in a representative sample of US Army soldiers, 70% of whom denied in a survey having ever thought about suicide. These soldiers were followed up through administrative records for 45 months to gather information on administratively recorded suicide attempts. The models identified 30% of this denying group who accounted for 81.2% of subsequent administratively recorded suicide attempts, and approximately 10% of this high-risk subgroup accounted for around 45% of all suicide attempts in the full sample. The authors suggest this approach could be used to pinpoint soldiers at high risk of suicidal behaviours and provide them with assessments and targeted preventive interventions.
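Risk-concentration figures like these (a 30% subgroup accounting for 81.2% of attempts) can be computed from any model’s predicted risk scores. Here is a small, hypothetical sketch of that calculation on simulated data; the scores and outcomes are made up purely to show the arithmetic.

```python
import numpy as np

def risk_concentration(scores, outcomes, top_fraction):
    """Share of observed events captured by the highest-risk subgroup.

    `scores` are predicted risk scores, `outcomes` are 0/1 observed events,
    `top_fraction` is the size of the highest-risk subgroup (e.g. 0.30).
    """
    scores = np.asarray(scores)
    outcomes = np.asarray(outcomes)
    k = max(1, int(len(scores) * top_fraction))
    top_idx = np.argsort(scores)[::-1][:k]  # indices of highest-risk cases
    return outcomes[top_idx].sum() / outcomes.sum()

# Hypothetical example: given scores from any model, ask what share of all
# recorded events occurs among the 30% rated highest risk.
rng = np.random.default_rng(1)
scores = rng.random(1000)
outcomes = (rng.random(1000) < scores**3).astype(int)  # risk rises with score
print(f"{risk_concentration(scores, outcomes, 0.30):.1%} of events in top 30%")
```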

Research suggests that machine learning could be used to pinpoint soldiers at high risk of suicidal behaviours, to provide them with assessments and targeted preventive interventions.

Virtual reality

Another new technology is virtual reality (VR). In the past decade, VR has been used in the treatment of several psychological disorders with promising results (Valmaggia et al., 2016). However, its use as a research tool in the science of suicide prevention is still in the early stages.

A recent collection of studies by Franklin and colleagues (2019) tested the use of VR in controlled laboratory experiments in which participants were exposed to different VR scenarios, including suicide attempt scenarios. The researchers found that relevant factors among those who completed suicide in the VR scenarios included male sex, suicidal desire, suicidal capability, agitation, and a history of suicidal thoughts and behaviours. Franklin and his collaborators also found that the reasons participants gave for not engaging in VR suicide were similar to the reasons people give for not engaging in actual suicide.

Ethical issues

Despite the interesting and promising results of these new technologies, some important ethical issues should be highlighted and discussed among researchers, policy makers, clinicians, and service users.

Data protection and privacy

Firstly, an important issue that needs to be taken into consideration is an individual’s right to data protection and privacy. This matter is particularly pertinent in the context of the General Data Protection Regulation (GDPR), recently introduced across the European Union to protect individuals’ data and personal information from being used inappropriately. It needs particular consideration given the increased use of mobile phones and other personal devices in suicide research (van Ballegooijen, 2019). ESM/EMA data is collected in quite an intensive manner, and it could be argued that these techniques are intrusive or invasive to participants’ lives. It is important that this is monitored, and that individuals agreeing to take part in this type of research are made fully aware of how their personal data is going to be collected, stored, and used.

As much of the data utilised in ML research is collected routinely, and the individuals concerned may not be aware of its use in research, researchers adopting ML approaches also need to ensure that individuals’ privacy is protected. Indeed, a recent UK survey reported that 63% of the adult population is uncomfortable with allowing personal data to be used to improve healthcare (Fenech et al., 2018). Every individual has a right to non-disclosure, or to not being identified, if that is what they wish, and these techniques may blur those lines slightly.

A recent UK survey reported that 63% of the adult population is uncomfortable with allowing personal data to be used to improve healthcare.

Potential for harm?

A further ethical consideration with the use of these exciting new technologies is the potential for unintended consequences from relatively untested techniques. With ESM/EMA, for example, it may be harmful for participants to be reminded about their suicidal or self-harm thoughts repeatedly over the course of a day, when they may in fact be trying to distract themselves from these thoughts. It should be noted, though, that there is no evidence that being asked about suicide increases risk (Blades et al., 2018). As these techniques are still relatively novel, this is an aspect that requires monitoring, and, as ESM/EMA studies have taught us, it is important to be aware that individuals are different and may respond differently to these study designs.

The use of virtual reality may also have unintended outcomes that need to be considered when designing such experiments. Participants are exposed to visual imagery of suicidal behaviours, and as such may be placed at greater risk: the behaviour could become more accessible, or their repertoire of suicidal behaviours may even be expanded. Indeed, both exposure to suicidal behaviour in others and having mental imagery of death have been shown to differentiate those who think about suicide from those who act on their thoughts, and may therefore represent risk factors for the enactment of suicidal thoughts (Wetherall et al., 2018). Although there is evidence that exposure to suicidal imagery does not in itself increase risk (Cha et al., 2016), the novel context of VR needs careful consideration in relation to suicide and self-harm research.

We need to take time to consider the key ethical considerations relating to the use of new technologies for preventing suicides.

Take home message

Although there are a number of key ethical considerations with the use of these new technologies and techniques, this should not take away from their potential to change the face of suicide prevention research.

The take home message is to use them with care and consideration for the individuals whose experiences we as researchers so keenly want to understand, and hopefully they will be another piece in the puzzle of suicide prevention.

Conflicts of interest

None.

Links

Bernecker, S. L., Zuromski, K. L., Gutierrez, P. M., Joiner, T. E., King, A. J., Liu, H., … & Ursano, R. J. (2019). Predicting suicide attempts among soldiers who deny suicidal ideation in the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). Behaviour Research and Therapy.

Blades, C. A., Stritzke, W. G. K., Page, A. C., & Brown, J. D. (2018). The benefits and risks of asking research participants about suicide: A meta-analysis of the impact of exposure to suicide-related content. Clinical Psychology Review, 64, 1-12.

Burke, T. A., Ammerman, B. A., & Jacobucci, R. (2019). The use of machine learning in the study of suicidal and non-suicidal self-injurious thoughts and behaviors: A systematic review. Journal of Affective Disorders, 245, 869-884.

Cha, C. B., Glenn, J. J., Deming, C. A., D’Angelo, E. J., Hooley, J. M., Teachman, B. A., & Nock, M. K. (2016). Examining potential iatrogenic effects of viewing suicide and self-injury stimuli. Psychological Assessment, 28(11), 1510-1515.

Fenech, M., Strukelj, N., & Buston, O. (2018). Ethical, social and political challenges of artificial intelligence in health. Wellcome Trust & Future Advocacy. Available at: https://wellcome.ac.uk/sites/default/files/ai-in-health-ethical-social-political-challenges.pdf

Franklin, J. C., Huang, X., & Bastidas, D. (2019). Virtual reality suicide: Development of a translational approach for studying suicide causes. Behaviour Research and Therapy, 120, 103360.

Franklin, J. C., Ribeiro, J. D., Fox, K. R., Bentley, K. H., Kleiman, E. M., Huang, X., … & Nock, M. K. (2017). Risk factors for suicidal thoughts and behaviors: A meta-analysis of 50 years of research. Psychological Bulletin, 143(2), 187-232.

Kirtley, O. J. (2019, September 9). Out of the lab and into everyday life: Using Experience Sampling Methods to better understand self-harm and suicide [Blog post]. Retrieved from netECR at https://netecr.wordpress.com/2019/09/09/ema-suicide-research/.

Kleiman, E. M., Turner, B. J., Fedor, S., Beale, E. E., Huffman, J. C., & Nock, M. K. (2017). Examination of real-time fluctuations in suicidal ideation and its risk factors: Results from two ecological momentary assessment studies. Journal of Abnormal Psychology, 126(6), 726-738.

Linthicum, K. P., Schafer, K. M., & Ribeiro, J. D. (2019). Machine learning in suicide science: Applications and ethics. Behavioral Sciences & the Law, 37(3), 214-222.

Louzon, S. A., Bossarte, R., McCarthy, J. F., & Katz, I. R. (2016). Does suicidal ideation as measured by the PHQ-9 predict suicide among VA patients? Psychiatric Services, 67(5), 517-522.

Valmaggia, L. R., Latif, L., Kempton, M. J., & Rus-Calafell, M. (2016). Virtual reality in the psychological treatment for mental health problems: A systematic review of recent evidence. Psychiatry Research, 236, 189-195.

van Ballegooijen, W. (2019, September 9). How smartphones can revolutionise suicide research [Blog post]. Retrieved from netECR at https://netecr.wordpress.com/2019/09/09/smartphones-suicide-research/.

Wetherall, K., Cleare, S., Eschle, S., Ferguson, E., O’Connor, D. B., O’Carroll, R., & O’Connor, R. C. (2018). From ideation to action: Differentiating between those who think about suicide and those who attempt suicide in a national study of young adults. Journal of Affective Disorders, 241, 475-483.
