Mental health apps for people in crisis: helpful or harmful?

When I type mental health-related keywords into well-known app stores, I see endless lists of apps. Most are free to download and several have been downloaded over a million times. It is an encouraging thought that there are so many readily available mental health resources. Many people who experience mental health problems do not seek help, and with an app they at least have something.

From a more sceptical point of view, how can people know which one to pick from these long lists? As a researcher in this field, I recognise a few titles, such as TalkLife and Woebot, from scientific papers or presentations. Without that experience, user reviews would be my only indication of an app’s quality, and those reviews are numerous and point in all directions. For all I know, some of these apps may even be harmful, for example by pointing people in the wrong direction when they are experiencing high levels of distress. But how do you know? Google and Apple offer no guidance.

When it comes to support for self-harm and suicide prevention, I think it is especially important that mental health apps provide helpful information, such as what you can do and where you can find help. One review (Martinengo et al., 2019) examined 69 depression and suicide prevention apps, found by searching the app stores for keywords such as ‘depression’, ‘suicide’ and ‘self-harm’. Of these 69 apps, only five offered all evidence-based suicide prevention strategies recommended by UK, US and World Health Organisation guidelines. Six apps, including two that had been downloaded over a million times, provided an erroneous crisis helpline number. Some of these errors were fixed after Martinengo and colleagues alerted the app developers, but this worries me: if the authors had used different search or selection methods, they would have tested different apps and these errors would have gone unnoticed.

So which sources tell us which apps are useful (or harmful)? One approach is to look at what popular health websites recommend, as suggested in a recent review (Parrish et al., 2021). I tried this myself and found many online articles on mental health apps, each website recommending a list that overlapped only slightly with the others. Nevertheless, this search strategy offers some direction. Parrish and colleagues examined suicide-related language in mental health apps recommended by three popular websites, looking specifically at the crisis support these apps provide. I will summarise their findings below.

There are countless mental health apps widely available, but how do we know which apps are actually helpful in a crisis?

Methods

Parrish and colleagues typed ‘best mental health apps’ into Google in incognito mode and found three popular-press articles that provided lists of mental health-related apps. The paper does not explain why these three articles were chosen, and I can’t replicate the search: two years have passed and the authors are based in the US while I’m in the Netherlands (Google search results depend on both time and location). The authors downloaded the apps recommended in these lists, searched them for suicide-related language, and coded the proportion of apps that included general mental health resources (psychoeducation, an online forum or chat, national mental health organisations, resources for help-seeking) and crisis- or suicide-specific resources (a national suicide support hotline, a national emergency number, or safety planning). They also searched each app’s terms of service agreement, End User Licence Agreement (EULA) and privacy policy for crisis and suicide-related language.

Results

The search strategy identified 38 unique, downloadable apps. Of these, 31 were free and 7 required payment; the in-app resources of the paid apps were not assessed. The apps studied targeted a range of mental health disorders, such as anxiety (n = 24), depression (n = 12) and post-traumatic stress disorder (n = 8). Mental health resources were provided in 14 apps, of which 11 provided at least one crisis-specific resource: ten included a suicide hotline number, seven included advice to call a national crisis number in the event of an emergency, and three provided suicide-specific safety planning resources. These resources were not always easy to find; in some apps they were located several pages away from the home screen. The terms of service agreements and privacy policies of 32 apps were accessible. Four of the terms of service agreements and one of the privacy policies contained crisis language. Most terms of service encouraged users to call for help in the event of a medical emergency rather than for suicidal ideation.

Of 31 mental health apps, 11 provided at least one crisis-specific resource, most commonly a suicide or crisis hotline.

Conclusions

The authors conclude that language discussing suicidal ideation or suicidal behaviour is lacking in mental health apps, with only 11 of 31 apps providing in-app crisis resources. These results highlight the need to create an evidence-based and standardised approach to crisis management for mental health apps.

“These results highlight the need to create an evidence-based and standardised approach to crisis management for mental health apps”

Strengths and limitations

A strength of this study is that it examined apps recommended by articles on popular websites. This is a relevant search strategy, because it reflects what people might actually do when deciding which app to download and install (especially considering that the Google and Apple stores offer no guidance).

A limitation is that little is said about the quality of the apps. The authors list the available resources, but without appraisal. When there’s a hotline, who is answering? This matters a lot. And when some sort of safety planning is included, is there support or detailed guidance for using it? Lastly, in my opinion, providing a national emergency number is not really helpful and shouldn’t be coded as a suicide-specific resource: people know the emergency numbers and don’t need an app for that, and emergency numbers are not specific to mental health crises.

Related to this, I would have liked to see the authors’ opinion on the articles they retrieved the apps from. Were the articles on the right track? Are these indeed recommendable apps?

Further, I don’t see the merit of examining crisis or suicide-related content in the terms of service agreements, EULAs and privacy policies, even though a large part of the paper discusses this. Hardly anyone reads those documents (Chivers, 2020), and even if one did, an app’s terms of service are not the place to look when in crisis. One app’s terms state: ‘If you have suicidal thoughts or need medical help for other reasons, please consult a local doctor or therapist, or in urgent cases an emergency ambulance.’ At best, this might help people make an informed decision when choosing an app, as it suggests the app will not cover these kinds of issues. To be honest, I don’t think this statement is intended as advice for the user; rather, it is included as legal cover for the developer in case a user of the app is seriously injured through self-harm.

Who reads the T’s and C’s? Is there any merit in examining crisis or suicide-related content in the terms of service agreements, End User Licence Agreements and privacy policies?

Implications for practice

The authors recommend a standardised approach to crisis management for mental health apps. I imagine this as a set of rules based on evidence and clinical guidelines: for example, when an app offers a crisis hotline or chat, it should be available 24/7, have sufficient capacity, have adequate privacy policies, and provide evidence-based support by trained volunteers or professionals. Such a set of rules would be useful not only for developing new apps, but also for checking existing ones. I think that suicide and self-harm related information in mental health apps should be examined systematically and continuously. As long as no such systematic examination exists, studies like those of Parrish and colleagues (2021) and Martinengo and colleagues (2019) are very important. Even though they are small-scale initiatives, they offer a much-needed professional look at freely available mental health resources. The results of these studies should be published somewhere easily found by people searching for guidance.

We need systematic examination of mental health apps by professionals, as well as easily accessible guidelines for people searching for in-app crisis support.

Statement of interests

None.

Links

Primary paper

Parrish, E. M., Filip, T. F., Torous, J., Nebeker, C., Moore, R. C., & Depp, C. A. (2021). Are mental health apps adequately equipped to handle users in crisis? Crisis: The Journal of Crisis Intervention and Suicide Prevention.

Other references

Chivers, T. (2020). Privacy Complacency: The Hidden Dangers Lurking Beneath Today’s Surface-Level Data Protection.

Martinengo, L., Van Galen, L., Lum, E., Kowalski, M., Subramaniam, M., & Car, J. (2019). Suicide prevention and depression apps’ suicide risk assessment and management: a systematic assessment of adherence to clinical guidelines. BMC Medicine, 17(1), 1-12.
