Teen entered ‘dark rabbit hole of suicidal content’ online
Comments left on social media posts as innocent as a picture of a flower can steer vulnerable users toward harmful content, Sarah Lechmere – who has struggled with eating disorders – told the BBC. Social media posts also pointed her to pro-anorexia sites that gave her “tips” on how to self-harm, she said.
This is precisely why UK psychiatrists want to see social media companies forced to hand over their data for research into the harms and benefits of social media use – and taxed to pay for that research. The report, published by the Royal College of Psychiatrists, contains a foreword written by Ian Russell, the father of Molly Russell, a 14-year-old who committed suicide in 2017 after entering what her father called the “dark rabbit hole of suicidal content” online.
Ian Russell describes how social media’s “pushy algorithms” trapped Molly, sequestering her in a community that encourages suffering people not only to self-harm but also to avoid seeking help:
I have no doubt that social media helped kill my daughter. Having viewed some of the posts Molly had seen, it is clear they would have normalized, encouraged and escalated her depression; persuaded Molly not to ask for help and instead keep it all to herself; and convinced her it was irreversible and that she had no hope.
… Online, Molly found a world that grew in importance to her and its escalating dominance isolated her from the real world. The pushy algorithms of social media helped ensure Molly increasingly connected to her digital life while encouraging her to hide her problems from those of us around her, those who could help Molly find the professional care she needed.
Ian Russell backs the report’s findings – particularly its calls for government and social media companies to do more to protect users from harmful content. That means not only sharing data with researchers, but also funding research through a “turnover tax”. The tax would also pay for training clinicians, teachers and others working with children, helping them to identify children struggling with their mental health and to understand how social media might be affecting them.
A new regulator and a 2% tax on big tech companies
Last year, the UK government announced plans to set up an online safety regulator to improve internet safety. The College is calling for that regulator to be empowered to compel social media companies to hand over their data.
As for funding research and self-harm prevention training, the UK has passed the Digital Services Tax. Scheduled to go into effect in April 2020, it will impose a 2% levy on the revenues that search engines, social media platforms and online marketplaces “derive from UK users.” The tax targets only the largest companies – those whose worldwide digital revenues cross a threshold – but the 2% is charged on their UK-derived revenues, not on global turnover.
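As a rough sketch of how that levy would be computed – assuming the parameters announced with the tax (a £500m global digital-revenue threshold, with the first £25m of UK-derived revenue exempt), and using a purely hypothetical example company:

```python
# Simplified sketch of the UK Digital Services Tax, based on the
# announced parameters (assumptions, not a definitive tax model):
#   - applies only to groups with > £500m global digital services revenue
#   - the first £25m of UK-derived revenue is exempt each year
#   - 2% is charged on UK-derived revenue above that allowance
GLOBAL_THRESHOLD = 500_000_000  # £500m global digital revenues
UK_ALLOWANCE = 25_000_000       # first £25m of UK revenue is tax-free
RATE = 0.02                     # the 2% levy

def digital_services_tax(global_revenue: float, uk_revenue: float) -> float:
    """Return the DST owed in pounds for one year (simplified model)."""
    if global_revenue <= GLOBAL_THRESHOLD or uk_revenue <= UK_ALLOWANCE:
        return 0.0
    return RATE * (uk_revenue - UK_ALLOWANCE)

# Hypothetical platform: £10bn global revenue, £400m derived from UK users.
# It owes 2% of (£400m - £25m) = £7.5m.
print(digital_services_tax(10_000_000_000, 400_000_000))  # 7500000.0
```

The point of the structure is that the rate is small and the base is UK-specific: the global threshold only determines who is in scope, not what they pay.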
Dr. Bernadka Dubicka, chair of the child and adolescent faculty at the Royal College of Psychiatrists and co-author of the report, said that she’s seeing more and more children self-harming and attempting suicide as a result of their social media use and online discussions. Whatever social media companies are doing to protect their most vulnerable users, it’s not enough, she said:
Self-regulation is not working. It is time for government to step up and take decisive action to hold social media companies to account for escalating harmful content to vulnerable children and young people.
In November 2019, Facebook included Instagram in its transparency report for the first time. The company says it’s getting better at finding self-harm content before it spreads: since May, it has removed about 845,000 pieces of suicide-related content, 79% of which it found proactively, before users reported it.