I got an email from Twitter last week that said: “We’re writing to you because a concerned individual has recently alerted us to potentially suicidal or self-harming comments posted on your account.”
The offending comment that triggered this warning was in fact from 2012 when my then-favorite English soccer player Robin van Persie defected from my team Arsenal to our bitter rivals Manchester United, and I responded to this crushing development by tweeting: “I want to die.”
Since then, my plaintive cry has been retweeted 130,000 times, and been “liked” 111,000 times.
In fact, both those numbers have at times been significantly higher, suggesting some people have either come to regret endorsing my apparent desire to end my life, or been urged to reconsider by a concerned Twitter.
My old tweet regularly resurfaces to go viral again whenever I am embroiled in a furor, and those who object to my views, often boasting the hashtag #BeKind in their social media biographies, wish to remind me of the thought process I should apparently be considering.
This in turn prompts Twitter to send me the message of support, which continues: “In difficult times and when you need someone to talk to, it may help to speak to professionals who can assist you in coping with your current circumstances. If you are having thoughts of self-harm, suicide, or depression, we encourage you to please reach out to someone and request help. Please know that there are people out there who care about you, and that you are not alone. Take care, Twitter.”
Of course, I never had any intention of taking my own life.
My tweet was a joke, albeit one delivered with very real raw emotion at the time.
But what if I were feeling suicidal?
More pertinently, what if I were a suicidal teenager without a high profile to ensure my posts drew anyone's attention?
What actual support would tech giants give me then to help dissuade me from such thoughts?
Sadly, we got the grim, tragic, horrifying answer in a London coroner’s courtroom last week in a case that should shock every parent to the very core of their being and make every tech firm executive stare deep into their soul.
As The Post reported, the coroner directly blamed social media for the suicide of a depressed 14-year-old British girl, Molly Russell, who was bombarded with messages and images promoting self-harm from Instagram and Pinterest’s artificial intelligence algorithms.
She was constantly directed, via emails and messages to her feeds, to dark, disturbing content that actively fueled her despair.
Her distraught father, Ian Russell, said that Molly, who was once “full of love and bubbling with excitement for what should have lain ahead in her life,” had been “pushed into a rabbit hole” of depressive content and “the bleakest of worlds.”
From the moment she first engaged with posts about depression and self-harm, the poor girl became victim to a relentless assault on her vulnerable senses that eventually helped drive her to kill herself.
“Everyone is better off without me,” she tweeted just four months before she died. “I don’t fit in this world.”
Staggeringly and shamefully, even after Molly took her life, one social media platform emailed her to point her to suicide-themed messages, including a picture of a girl’s cut thigh captioned, “I can’t tell you how many times I wish I was dead.”
Ian Russell, who monitored his daughter’s email account following her death, was appalled to see subject lines like “10 depression pins you might like” flood into her inbox.
And coroner Andrew Walker was scathing in his verdict, concluding that Molly “died from an act of self-harm whilst suffering from depression and the negative effects of online content … the platforms operated in such a way, using algorithms, as to result in binge periods of images provided without Molly requesting them. It is likely that the material viewed by Molly … contributed to her death in a more than minimal way.”
This is the first time in the world that a court has ruled that a death was caused by social media.
As the father of a girl who just got her first phone, albeit with zero access to any apps and subject to draconian parental review, this dreadful case rendered me simultaneously shocked, heartbroken and enraged.
And it must act as not just a massive wake-up call to a woefully complacent social media industry but also to equally complacent governments that have been disgracefully lax in ordering tech firms to enforce proper safety regulations.
As William, the new Prince of Wales, said in a statement: “No parent should ever have to endure what Ian Russell and his family have been through. Online safety for our children and young people needs to be a prerequisite, not an afterthought.”
The bottom line is that Molly Russell should have never been able to see the horrific content that aggravated her suicidal thoughts. The fact that she was deliberately targeted with it by robotic social media firm systems exploiting her interest in the deadly subject matter is even more disgraceful.
The New York Times reported that some of the material Molly viewed was so awful, one courtroom worker left the room to avoid having to see it. And a child psychologist expert witness said it was so “disturbing” that it caused him to lose sleep for weeks.
Yet executives from Meta, which owns Facebook and Instagram, admitted to the court they hadn’t studied the impact of such suicidal and depressive content on its youngest users.
How can that be for such rich, influential, powerful companies with such advanced technology?
The only explanation is that they didn’t care enough in their pursuit of money.
And as a result, Molly Russell is dead.
In the six months before she died, she looked at 2,100 pieces of content on Instagram alone that related to suicide, self-harm and depression.
But Meta defended its processes as maintaining a balance between free expression and safety.
As Ian Russell retorted: “If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive.”
Yes — she would.
Now that a court has ruled that social media led to a young girl’s death, surely we will see long-overdue legal floodgates open against giant tech firms for putting cash before care, and inevitable legislative reform to stop this merciless exploitation of impressionable young minds?
They need to show the same speedy, diligent empathy and support to teenage kids wanting to kill themselves that they show to TV presenters upset at footballers leaving their clubs.
If you are struggling with suicidal thoughts or are experiencing a mental health crisis and live in New York City, you can call 1-888-NYC-WELL for free and confidential crisis counseling. If you live outside the five boroughs, you can dial the 24/7 National Suicide Prevention hotline at 988 or go to SuicidePreventionLifeline.org.