# Instagram: Is it dangerous in terms of suicide and self-harm content?


Instagram, a relative latecomer among social media platforms, has been rising in popularity since its online debut in October 2010. The platform focuses on editing and sharing photos, artwork, text images and other interesting, beautiful or gritty aspects of individual life, and it is particularly favoured by young people (The Lancet, 2019).

However, there have been several highly publicised instances in the media of youth suicide linked to Instagram content. This taps into a common fear about social media: that it can circulate images or narratives which “normalise, glamourise or reinforce” self-harm (Lavis & Winter, 2020), which could in turn increase exposure to self-harm and social contagion (Dunlop et al., 2011; Insel & Gould, 2008; Lupariello et al., 2019).

Previous reviews have considered self-harm or suicide-related content across different platforms (Daine et al., 2013; Dyson et al., 2016; Marchant et al., 2017), yet none has focused on Instagram. This scoping systematic review therefore aims to answer the question:

What research has been done on the topic of self-harm or suicide on Instagram, how has it been done, and what are its key findings?

Instagram is a popular social media platform, but there are concerns about its use for seeing or sharing explicit self-harm and suicide content.


Methods

Following PRISMA (Tricco et al., 2018) and Joanna Briggs Institute guidelines (Peters et al., 2015), the authors conducted a literature search (2010 to 5 January 2020) across multiple databases (Scopus, Web of Science, Medline, EBSCOhost, PsycINFO, EMBASE and ProQuest Central).

Inclusion criteria:

  • Peer-reviewed articles
  • Examining suicide, self-harm or non-suicidal self-injury
  • Examining Instagram

Descriptive and relevant information was extracted, and the quality of articles was appraised using Critical Appraisal Skills Programme (CASP) checklists. All articles were reviewed by one author, with two fellow authors each checking half. Discrepancies were discussed and resolved within the team.


Results

Ten articles were identified that met all inclusion criteria; these were published between 2016 and 2019 and were considered high quality. The majority of studies focused on Instagram alone, while two also explored content from Twitter and Tumblr. Objectives and self-harm terminology varied between studies, and study designs offered limited insight into suicidal intention.

Methodology differences

  • A minority of studies considered Instagram users rather than Instagram content: two employed online surveys and one interviewed users.
  • The remaining studies focused on self-harm or suicide-related content (pictures, text images, hashtags, captions or comments) found on Instagram.
  • Thematic or content analysis was frequently used to describe self-harm or suicide content on Instagram, ranging from descriptive techniques and frequencies to prevalence, audience responses and time trends of NSSI (non-suicidal self-injury) posting.

Main findings

  • Self-harm or suicide content was found in 9-66% of examined posts. This content could be images of self-harming behaviour or scars, memes, short videos, quotes, etc. Some content also evidenced suicidal intent through the use of hashtags.
  • Study authors stated that this type of content reflected posters’ distress and struggles, which could be linked to negative emotions or feelings, and mental health difficulties.
  • Instagram users engaged heavily with self-harm content, “liking” posts and often sharing empathetic responses.
  • Hashtags often evolved to avoid being flagged by Instagram. One study adapted artificial intelligence to identify self-harm content from images.
  • Study authors shared concerns that there was potential for contagion. From survey data, 43% of the sample had been exposed to self-harm content on Instagram, of which 33% stated that they had acted in a similar manner after seeing this content.
  • There was limited evidence on managing self-harm content on Instagram. One study found the Instagram reporting tool to be largely unsuccessful, as under 20% of users knew about it.

9-66% of examined Instagram posts included self-harm or suicide content.

This review suggests that there is a significant amount of self-harm or suicide content on Instagram, but it doesn’t tell us much about the impact of this content on the behaviour of Instagram users.


Conclusions

There are multiple ways to engage with self-harm and suicide research when exploring Instagram content, yet the actual prevalence of this content remains unknown. This may reflect users hiding self-harm or suicide content to avoid censorship, and the varying terminology and spellings used for these thoughts and behaviours. Across studies, there were concerns that the availability of self-harm and suicide content on Instagram could normalise and reinforce self-harming behaviours or fuel social contagion. However, little evidence was available on this question, so this review cannot firmly state the impact of such content.

Results from the review mostly describe Instagram content. Further research is necessary to understand how influential self-harm content is to users.

Strengths and limitations

While Instagram is not completely new, having been online since 2010, research specifically focused on this platform is relatively recent. This review offers an interesting first step in synthesising self-harm and suicide research on Instagram, and an overview of Instagram content.

The authors follow tried and tested guidelines: PRISMA (Tricco et al., 2018) and Joanna Briggs Institute (Peters et al., 2015). These guidelines help ensure the review’s methods are rigorous, which is particularly valuable because the methods section is somewhat brief and replicability is therefore limited.

Some papers within the systematic review did not undergo IRB (institutional review board) review. There is ongoing debate about whether ethical approval is needed for studies using online communities and social media as the data is publicly available. However, considering the sensitive nature of self-harm and suicide, and the scientific integrity of researchers, I believe ethical review should be required to move forward with these studies in future. This would protect the researchers and those engaging with the online platforms. For guidance, see the ethics guidelines for internet-mediated research produced by the British Psychological Society (BPS, 2017).

One aspect that was difficult for study authors (and for readers) was terminology around self-harm and suicide, as there are so many phrases that could be used. Heterogeneity hits this review at all levels: Instagram user and hashtag variation; study authors’ search terms to find Instagram content (with potential for human error); and the systematic review’s own search terms. While this is not a weakness of the review, it is an overarching limitation of this kind of social media research, which means content can be missed at different levels.

According to the authors, this is the first scoping systematic review to synthesise self-harm and suicide research on Instagram.

Implications for practice

This review and the included studies have evidenced that self-harm and suicide content is publicly available on Instagram. So far, most studies have described this content but offer limited evidence on its purpose or impact. While authors are concerned about contagion, this remains more a narrative than an explored research question. Future research should build on the solid descriptive evidence base and focus on the meaning such content holds for those who post it and its influence on other users.

Online communities and social media platforms can also offer a safe anonymous space for people who struggle with self-harm and suicide (Williams et al., 2018; Lavis & Winter, 2020). Given that help-seeking is often low for those dealing with self-harm and suicide, utilising these spaces to engage and support these individuals may be a step towards an effective intervention. Inclusion of #chatsafe guidelines on Instagram’s platform may also allow for these online spaces to be protected for young people, while allowing users to share their experiences and respond to others safely (Robinson et al., 2020).

The #chatsafe guidelines developed by Orygen in Australia are the world’s first evidence-based guidelines for young people to communicate safely online about suicide.

Statement of interests



Primary paper

Picardo, J., McKenzie, S. K., Collings, S., & Jenkin, G. (2020). Suicide and self-harm content on Instagram: A systematic scoping review. PloS one, 15(9), e0238603.

Other references

British Psychological Society. (2017). Ethics guidelines for internet-mediated research. Leicester, UK: British Psychological Society.

CASP. (2018). Critical Appraisal Skills Programme, CASP (Qualitative) Checklist.

Daine, K., Hawton, K., Singaravelu, V., Stewart, A., Simkin, S., & Montgomery, P. (2013). The power of the web: a systematic review of studies of the influence of the internet on self-harm and suicide in young people. PloS one, 8(10), e77555.

Doyle, L., Treacy, M. P., & Sheridan, A. (2015). Self‐harm in young people: Prevalence, associated factors, and help‐seeking in school‐going adolescents. International journal of mental health nursing, 24(6), 485-494.

Dyson, M. P., Hartling, L., Shulhan, J., Chisholm, A., Milne, A., Sundar, P., … & Newton, A. S. (2016). A systematic review of social media use to discuss and view deliberate self-harm acts. PloS one, 11(5), e0155813.

Insel, B. J., & Gould, M. S. (2008). Impact of modeling on adolescent suicidal behavior. Psychiatric Clinics of North America, 31(2), 293-316.

Lavis, A., & Winter, R. (2020). #Online harms or benefits? An ethnographic analysis of the positives and negatives of peer‐support around self‐harm on social media. Journal of child psychology and psychiatry.

Lupariello, F., Curti, S. M., Coppo, E., et al. (2019). Self‐harm Risk Among Adolescents and the Phenomenon of the “Blue Whale Challenge”: Case Series and Review of the Literature. Journal of forensic sciences, 64(2), 638-642.

Marchant, A., Hawton, K., Stewart, A., Montgomery, P., Singaravelu, V., Lloyd, K., … & John, A. (2017). A systematic review of the relationship between internet use, self-harm and suicidal behaviour in young people.

Peters, M. D., Godfrey, C. M., Khalil, H., McInerney, P., Parker, D., & Soares, C. B. (2015). Guidance for conducting systematic scoping reviews. JBI Evidence Implementation, 13(3).

Robinson, J., Teh, Z., Lamblin, M., Hill, N. T., La Sala, L., & Thorn, P. (2020). Globalization of the #chatsafe guidelines: Using social media for youth suicide prevention. Early Intervention in Psychiatry.

The Lancet (2019). Social media, screen time, and young people’s mental health. Lancet (London, England), 393(10172), 611.

Tricco, A. C., Lillie, E., Zarin, W., O’Brien, K. K., Colquhoun, H., Levac, D., … & Straus, S. E. (2018). PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Annals of Internal Medicine, 169(7), 467-473.

Williams, A. J., Nielsen, E., & Coulson, N. S. (2018). “They aren’t all like that”: Perceptions of clinical services, as told by self-harm online communities. Journal of Health Psychology, 1359105318788403.
