Daily Mirror
Ross Forman, 16

Molly Russell's dad urges tech companies to make internet safer after daughter's death

The father of 14-year-old Molly Russell, who killed herself after viewing suicide images online, says tech giants must take urgent steps to protect young people.

Speaking the day before the 18-month anniversary of his daughter’s death, Ian Russell said the platforms are still hosting harmful content.

Mr Russell has always said the posts Molly had been looking at on Instagram before she died contributed to her death.

Now he has urged: “For the safety of young people, the platforms need to do something quickly to make the internet safer.”

Molly Russell who took her own life in November 2017 (PA)

Last month Health Secretary Matt Hancock met Facebook, Snapchat, Google and Instagram and they agreed to fund the Samaritans to help identify dangerous content and create a best practice guide to tackling it.

Mr Russell, from Harrow, north London, who runs the Molly Rose Foundation in memory of his daughter, believes that this is not moving fast enough.

He warned: “It’s still all too easy to find such dangerous content.”

He added: “In the hours between us saying ‘sleep well’ and the terrible dawning of next day, Molly’s only other influence must’ve come from beyond our house, beyond our love and protection - from the internet.”

Molly's dad Ian says Instagram is partly responsible for his daughter's death (BBC)

The Government will appoint an independent regulator to hand out fines and hold tech bosses personally liable for harmful content.

But legislation might take two years.

Speaking alongside Mr Russell, Andy Burrows, associate head of child safety online with the NSPCC, said: “Until we have legislation passed the Government should monitor whether platforms are playing ball with this interim code of practice [and] name and shame those that drag their heels.”

Tara Hopkins, from Instagram, said: “Our policies have never allowed content that encourages or promotes suicide, self-harm or eating disorders.

“We will remove it as soon as we are made aware of it.

“Many use Instagram to get support or support others, so we do allow content that discusses these topics.

“Our policies no longer allow graphic self-harm content.

"It will take time while we build technologies to find and remove it.”

- Samaritans (116 123) operates a 24-hour service available every day of the year. If you prefer to write down how you’re feeling, or if you’re worried about being overheard on the phone, you can email Samaritans at jo@samaritans.org
