Death of Molly Russell leads to major reform

Zero tolerance: 77% of the removed images had not been reported by users.

Will the new rules help? Social media sites will do more to keep young people safe from graphic images, after figures showed that teenage suicides in England have risen sharply.

Molly Russell was 14 years old when she took her own life in 2017. After her death, her father Ian discovered that she had viewed graphic content relating to suicide and self-harm on Instagram and other social media sites.

This week he won a major victory: Instagram has pledged to ban all images, drawings, cartoons and memes related to suicide and self-harm.

The new policies are an extension of rules introduced in February, which banned photographs of self-harm. Instagram says it has doubled the number of posts removed since the first quarter of the year. Between April and June, the app removed 834,000 posts — 77% of which had not been reported by users.

In September, the Office for National Statistics (ONS) in the UK announced that suicide rates had risen by 11.8% compared with the previous year.

The UK Government has pushed for internet companies to accept a duty of care over their young, vulnerable users. In April, it released the Online Harms White Paper calling for the formation of a regulator that would penalise sites that publish harmful or abusive content.

But not everyone agrees with a duty of care for internet firms. Lawyer Graham Smith worries that the Government’s loose definition of “harm” potentially threatens freedom of speech online.

Should the Government control what appears on social media?

If you are struggling with self-harm or suicidal thoughts, you can find resources and advice under Become An Expert.

Heads together

Yes, says a huge lobby of child psychologists, mental health activists and parents, who argue that internet companies have a responsibility to protect vulnerable young users, which includes keeping them safe from harmful content.

Not so, argues Smith. Rather than get to the root of the problem, these government powers could actually make things worse for young people, acting as little more than a sticking plaster over a deeper mental health crisis.

You Decide

  1. What age should people be allowed to start using social media?

Activities

  1. Highlight three facts and three opinions in this article, in different colours.

Some People Say...

“A child's mental health is just as important as their physical health and deserves the same quality of support.”

Kate Middleton, Duchess of Cambridge

What do you think?

Q & A

What do we know?
One in eight (12.8%) young people aged five to 19 had a mental health disorder in England in 2017, according to the NHS. This included “emotional, behavioural, hyperactivity and other less common disorders”.
What do we not know?
The causes of self-harm among young people remain unclear, partly because there is never a single cause. It is a difficult problem.

Word Watch

Memes
Images, videos or pieces of text that are copied by internet users and spread rapidly online. They are usually funny, but can occasionally be more sinister.
White Paper
A government report giving information or proposals on an issue.
Regulator
An organisation that could set rules for internet companies and give out punishments, like fines, if they violate the rules.
Penalise
Punish.
