Death of Molly Russell leads to major reform
Will the new rules help? Social media sites will do more to keep young people safe from graphic images, after figures showed that teenage suicides in England have risen sharply.
Molly Russell was 14 years old when she took her own life in 2017. After her death, her father Ian Russell found graphic posts about suicide and self-harm on her Instagram account.
Over the last year, he has campaigned tirelessly for Instagram to take responsibility. This week, he won a major victory: the platform pledged to ban all images, drawings, cartoons and memes related to suicide and self-harm.
It came after Russell travelled to Silicon Valley, the home of Instagram and other tech companies, as part of his campaign to rid social media of damaging images.
The new policies are an extension of rules introduced in February, which banned photographs of self-harm. Instagram says it has doubled the number of posts removed since the first quarter of the year. Between April and June, the app removed 834,000 posts — 77% of which had not been reported by users.
“The bottom line is we do not yet find enough of these images before they’re seen by other people,” admits Adam Mosseri, the Instagram chief who took over the platform in October 2018.
In September, the Office for National Statistics (ONS) in the UK announced that suicide rates had risen by 11.8% on the previous year, while the suicide rate among people aged 10 to 24 reached a 19-year high.
As anger and fear over the role of graphic content on social media has grown, the UK Government has pushed for internet companies to accept a duty of care over their young, vulnerable users. In April, it released the Online Harms White Paper calling for the formation of a regulator that would penalise sites that publish harmful or abusive content.
But not everyone is sold on the idea of a duty of care for internet firms. Lawyer Graham Smith worries that the Government’s loose definition of “harm” would leave the new powers open to misuse, potentially threatening freedom of speech online. Should the Government control what appears on social media?
If you are struggling with self-harm or suicidal thoughts, you can find resources and advice under Become An Expert.
Yes, says a huge lobby of child psychologists, mental health activists and parents, who argue that internet companies have a responsibility to protect vulnerable young users, which includes keeping them safe from harmful content. Lindsey Giller, a clinical psychologist, says that to those who have self-harmed, seeing similar images can “definitely” be triggering and even cause them to go back to these behaviours. This is a necessary step.
Not so, argues Smith. For him, an internet regulator enforcing this duty of care would be a “rule-maker, judge and enforcer all rolled into one”, with the power to set standards about what could be said online according to a vague, undefined notion of “harm”. Rather than get to the root of the problem, these government powers could actually make things worse for young people, acting as a useless band-aid on the true mental health crisis.
- What age should people be allowed to start using social media?
- How could our societies work better for young people’s mental health?
- Highlight three facts and three opinions in this article, in different colours.
- Create a poster or leaflet which gives advice about mental health to someone who is a few years younger than you. Use resources from Become An Expert to help.
Some People Say...
“A child’s mental health is just as important as their physical health and deserves the same quality of support.”
Kate Middleton, the Duchess of Cambridge
What do you think?
Q & A
- What do we know?
- One in eight (12.8%) young people aged five to 19 had a mental health disorder in England in 2017, according to the NHS. This included “emotional, behavioural, hyperactivity and other less common disorders”. Among 17 to 19-year-olds, it was one in six, or 16.9%. Of the 11 to 16-year-olds with a mental disorder, a quarter (25.5%) had self-harmed or attempted suicide.
- What do we not know?
- The causes of self-harm among young people — partly because there is never a single cause. However, risk factors that can impact your mental health include quality of life, relationships with families, exploring gender or sexual identity, and pressure from peers and the media. Young people are also more vulnerable to mental health conditions as their brains are still developing.
- Meme
- An image, video or piece of text that is copied by internet users and spreads rapidly online. Usually, they are funny, but can occasionally be more sinister.
- Silicon Valley
- An area in Northern California, near San Francisco Bay, where many of the world’s largest technology and internet businesses are based.
- Took over
- Facebook bought Instagram in 2012. Mosseri joined after the surprise departure of its co-founders Kevin Systrom and Mike Krieger.
- White Paper
- A government report giving information or proposals on an issue.
- Regulator
- A body that has the power to set rules for an industry and can punish infringements.