Changes announced after Instagram and other social media firms met with Britain’s health secretary
Instagram’s chief said they’re working with experts and the wider industry to find ways to support people when they’re most in need. (Damian Dovarganes/Associated Press)
The call for changes was backed by the British government after the family of 14-year-old Molly Russell found material related to depression and suicide on her Instagram account following her death in 2017.
Her father, Ian Russell, said he believes the content Molly viewed on Instagram played a contributing role in her death, a charge that received wide attention in the British press.
The changes were announced after Instagram and other tech firms, including Facebook, Snapchat and Twitter, met with British Health Secretary Matt Hancock and representatives from the Samaritans, a mental health charity that works to prevent suicide.
Instagram is also removing non-graphic images of self-harm from search results.
Facebook, which owns Instagram, said in a statement that independent experts advise that Facebook should “allow people to share admissions of self-harm and suicidal thoughts but should not allow people to share content promoting it.”
Where to get help:
In Quebec (French): Association québécoise de prévention du suicide: 1-866-APPELLE (1-866-277-3553)
Canadian Association for Suicide Prevention: Find a 24-hour crisis centre
If you’re worried someone you know may be at risk of suicide, you should talk to them about it, says the Canadian Association for Suicide Prevention. Here are some warning signs:
Hopelessness and helplessness.