
Social Media is Toxic for Kids

Updated: Mar 12, 2023

The inquest into Molly Russell’s death should be every parent’s wake up call.


Molly died from "an act of self-harm while suffering from depression and the negative effects of online content".



Long read on Molly and why social media is so dangerous for children below👇


TLDR


Social media is like crack cocaine.

You don’t let your kids take crack cocaine.

Don’t let them use social media.


TLDR action: Don’t give your children smartphones. If they need to stay in touch, give them a dumb phone to call and text.


If you do give them a device:

  • Teach your children about how tech works and its dangers: Guidance here

  • Encourage them to limit sessions on their devices (especially social media apps)

  • Make all bedrooms device-free zones

  • Impose a household device amnesty each night and switch off wifi/data – this applies to you too

  • Turn off all notifications – if their phone pings every time a friend does something they will never put it down and can’t escape the digital world

  • Set all privacy/safety settings both on the device and within all apps to maximum privacy/safety: Guidance here

  • Look at children’s accounts regularly – safety trumps privacy

  • Make sure they are out IRL, engaging with real people and learning how to socialise

  • Make sure they are getting exercise. Fresh air and nature are also good for mental health.

  • Make sure they are getting good sleep (see device-free zones/switching off above) – poor sleep will stop them learning and exacerbate mental health issues


TLDR action: Demand regulation of social media now:


All social media platforms should:

  • Have to formally verify the age of each user (the banks can do it, so can social media)

  • Be responsible for all content posted (you can’t advertise how to commit suicide in the paper, you shouldn’t be able to online)

  • Not be allowed to use AI algorithms for children’s feeds – posts should be served in chronological order only from sources they have opted in to follow (preferably with parental approval)

  • Have their AI algorithms regulated to minimise harm (this is perfectly possible)


The Long Read


The benefits of social media are marginal and the harm it does to us and our societies is enormous. It’s time to abolish social media. That’s my view, but knowing that many people quite like social media, I think we should at least be opting out of toxic platforms and pushing our governments for sweeping regulation. I’m aware that my views seem extreme, but as a technologist who understands the platform mechanics and has monitored this space for many years, I’m convinced that in the near future we will all come to agree.


I’ve been meaning for ages to write a piece aimed at fellow parents grappling with the phone/no-phone and screen-time questions, but I hadn’t taken action. The inquest into poor Molly Russell’s death has spurred me into action. If you haven’t heard about Molly and you are a parent, it’s time to wake up and act on your own children’s social media use.


Molly Russell

Molly, who was from north-west London, took her own life aged 14 in November 2017 after ‘seeing content about suicide and self-harm’ on social media. She was active on YouTube, Pinterest, Instagram and (unbeknownst to her parents) Twitter.


Molly’s family had a protracted struggle to get access to her account data with the coroner eventually demanding the social media firms involved submit the data to the inquest.


The inquest started last month on 21st September and there has been extensive news coverage.


Her father, Ian Russell, looked at some of his daughter’s web history after she died and called it the ‘bleakest of worlds’, telling the inquest into her death that he was shocked that such ‘dark, graphic, harmful material’ was so easy for her, as a child, to see.



Mr Russell said: "Five years ago, Molly's feelings of worthlessness grew and her sense of helplessness deepened, and ending her life seemed to her like a solution - while to us her life seemed very normal.

"It's all too easy to forget the person she really was: someone full of love and hope and happiness, a young person full of promise and opportunity and potential.

"And so, as this inquest starts, we, her family, think it is essential to remember who Molly really was so we can each hold a picture in our minds of a caring individual, full of love and bubbling with excitement for what should have lay ahead in her life."



Molly went down the rabbit hole of social media, sucked in by algorithms intentionally engineered to manipulate her and hold her attention at all costs. It was reported that Molly used her Instagram account up to 120 times a day.


She viewed thousands of self-harm posts.



“The inquest at North London Coroner's Court was told that of the 16,300 posts Molly saved, shared or liked on Instagram in the six months before her death, 2,100 were depression, self-harm or suicide-related.”



Dr Navin Venugopal was asked to look at the material she had viewed. Asked by the coroner whether it would have had any impact on her state of mind, the psychiatrist replied: “I suppose I will start off, I will talk about the effect the material had on my state of mind. I had to see it over a short period of time and it was very disturbing, distressing. There were periods where I was not able to sleep well for a few weeks so bearing in mind that the child saw this over a period of months I can only say that she was – especially bearing in mind that she was a depressed 14-year-old. It would certainly affect her and make her feel more helpless”.


The inquest was the first to call social media executives as witnesses.


Molly was sent emails containing images by Pinterest. The headings included “10 depression pins you might like” and “depression recovery, depressed girl and more pins trending on Pinterest”.


Over time, the content that Pinterest showed Molly changed. At first she was viewing a variety of content, but in the months closer to her death it focused on depression, self-harm and suicide.


Judson Hoffman, Pinterest’s global head of community operations, giving evidence on the platform’s behalf, agreed that the platform was not safe when Molly used it: “There was content that should have been removed that was not removed.” He apologised.


Meanwhile Meta (the parent company of Instagram, Facebook and WhatsApp) offered up Elizabeth Lagone, its unironically titled ‘Head of Health & Wellbeing’. Lagone was shown a series of posts Molly had seen and declared it was “safe for people to be able to express themselves”. When pressed she said: “Yes, it is safe”.


“Instagram's guidelines at the time said users were allowed to post content about suicide and self-harm to "facilitate the coming together to support" other users but not if it "encouraged or promoted" this.”


The coroner questioned why Instagram (Meta) felt it could decide what material was safe for children to see, asking: “So why are you given the entitlement to assist children this way? Who has given you permission to do this? You run a business. There are a great many people who are…trained medical professionals. What gives you the right to make decisions about the material to put before children?”


Videos were shown in court. The coroner said the content: “seeks to romanticise and in some way validate the act of harm to young people.”


It is very clear that the content was not safe.


Lagone refused to accept there was a connection between the content and Molly’s state of mind. She said: “I can’t speak about what Molly may have been thinking”. In which case, if Meta couldn’t know what Molly or other children would think when confronted with this content, and given the risk it might harm them, why was it pushed towards them?



“Asked by the family's lawyer Oliver Sanders KC whether it was obvious it was not safe for children to see "graphic suicide imagery", the executive said: "I don't know… these are complicated issues."”



The attitude and activities of Meta in particular fill me with rage.

The response from Meta typifies their attitude and failure to take responsibility for the huge impact their products and services have on people. Meta is a gargantuan global business which uses over one third of the global population as the raw material for its commercial products. Unfortunately all such businesses have a legal mandate to maximise value for their owners regardless of the impact on the wider world. While some business owners take some responsibility for their role in society, Mark Zuckerberg does not. This absence of any regard for the wider world is why social media platforms which are now part of the infrastructure of global society need governments to regulate them urgently.


The coroner said Molly endured: "binge periods of images, video clips and text, some of which were selected and provided without Molly requesting them".


The inquest concluded that Molly died from "an act of self-harm while suffering from depression and the negative effects of online content".


The circumstances of Molly’s death are heartbreakingly sad.



Why Is Social Media Toxic for Kids?

Putting aside the emotion, it’s important for parents to understand how social media is designed and why it poses such a risk to children.


It is no accident that Molly was fed a growing volume of extremely disturbing content that she couldn’t turn away from. This is the attention economy at work.


Social media systems are designed to maximise profit by manipulating us into giving them our attention for as long as possible, so that it can be sold to the highest advertising bidder. These systems and their owners pay no regard to our wellbeing or our children’s.


In the beginning, engineers designed social media platforms based on research into ‘persuasive tech’, which essentially meant deploying psychological techniques to encourage extended use of the platforms. Thus ‘like’ buttons, a continuous stream of notifications and follower counts were built into all the apps. These affirmations give our brains a feel-good dopamine reward each time we see them. Dopamine rewards are a foundation of addiction: we need more and more to maintain the good feeling.


Once large numbers of us were hooked, the big tech firms started building unprecedentedly large databases about us and our individual habits which were used to further weaponise social media’s addictive power through artificial intelligence algorithms.


The tech firms gave the algorithms the task of maximising the attention of the maximum number of users. The algorithms are trained to do this with a continuous live-stream of our individual behavioural data.


And so social media began the biggest psycho-social experiment ever conducted.


No one, not even the engineers who oversee these algorithms, really knows how they work. Each platform’s algorithms are different, but through measurement by external researchers (because big tech only ever admits what’s happening when a whistleblower reveals its internal research) we know broadly that the more emotive the content, the stickier it is. Making us angry makes us pay attention. Hatred and division make us pay attention. Fear and insecurity make us pay attention. Outrage makes us pay attention. In short, nasty content keeps us hooked. Fluffy kittens do not.


And the algorithms, which personalise our feeds, adapt to our behaviour. They learn what captures our attention and give us more of it. When our attention starts to waver, the algorithm serves more extreme content to keep us hooked. This is why we get sucked into binges and continuously check our phones.
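The feedback loop described above can be sketched as a toy simulation. To be clear, this is a hypothetical illustration, not any platform’s real code: the topics, numbers and dwell-time signal are all invented. It shows only the core dynamic, namely that a ranker optimising for attention alone will, over repeated sessions, fill a feed with whatever the user lingers on longest.

```python
# Toy sketch of an engagement-maximising feed loop (hypothetical, not real
# platform code). A "ranker" learns a per-topic affinity score from how long
# the user dwells on each post, then serves the highest-affinity posts first.
from collections import defaultdict
import random

random.seed(42)

# An invented content pool: equal amounts of three topics.
POSTS = [{"topic": "kittens"}, {"topic": "sport"}, {"topic": "dark"}] * 50

def rank_feed(posts, affinity):
    """Order posts by the user's learned per-topic affinity, highest first."""
    return sorted(posts, key=lambda p: affinity[p["topic"]], reverse=True)

def simulate(sessions=20, feed_size=10):
    # Hypothetical user who dwells three times longer on "dark" content.
    dwell_time = {"kittens": 1.0, "sport": 1.0, "dark": 3.0}
    affinity = defaultdict(lambda: 1.0)  # ranker starts with no preference
    dark_share_per_session = []
    for _ in range(sessions):
        feed = rank_feed(random.sample(POSTS, 30), affinity)[:feed_size]
        for post in feed:
            # Attention is the only signal the ranker optimises for.
            affinity[post["topic"]] += dwell_time[post["topic"]]
        dark_share_per_session.append(
            sum(p["topic"] == "dark" for p in feed) / feed_size
        )
    return dark_share_per_session

shares = simulate()
print(f"share of 'dark' posts: first session {shares[0]:.0%}, "
      f"last session {shares[-1]:.0%}")
```

In this sketch the first feed is a roughly even mix, but because “dark” posts earn the biggest affinity boost each time they are shown, later feeds are dominated by them. No one programmed “show the depressed teenager self-harm content”; the escalation is an emergent property of optimising for attention.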


Different algorithms work in different ways but the clear winner in the attention stakes at this point is TikTok. Hence its preferred status amongst teenagers.


Of course, because these platforms are pervasive in society, mad and bad actors have also worked out how the tech works and further weaponised it for their own ends.


This is why all our children are at risk of being sucked in by a wide array of extreme content be it self-harm or suicide content like Molly or extremism of countless other types from religious fundamentalists, incels and extreme misogynists, homophobes, gender ideologues encouraging girls to chop off their breasts, anti-vaxxers, conspiracy theorists and purveyors of porn. All egged on and stirred up by state actors such as Vladimir Putin and Russia’s Internet Research Agency staff and bots.


All it takes is for a child to accidentally fall into one of these rabbit holes and their whole life can change.


And this is just the risk of harm from the algorithmic feeds. There are numerous other well-documented problems associated with children’s social media use, including constant 24-hour contact with their peer group, unrelenting social pressure, cyber-bullying, trolls, grooming and the unrealistic body and life imagery they are soaked in all day, every day.


To me it’s simple: children don’t need smartphones. Children should not be on social media. It’s down to parents to decide.



Sources


Mounting Evidence (added Feb 2023):

https://jonathanhaidt.substack.com/p/social-media-mental-illness-epidemic


Coverage of the Inquest into the Death of Molly Russell:


Why is Social Media So Toxic for Kids


Practical advice from the Center for Humane Technology who saw the risk early:


Toxic content will be fed to your kids:


BJ Fogg founded the Stanford Persuasive Tech Lab in 1997: https://behaviordesign.stanford.edu/ethical-use-persuasive-technology


Prof. Philip Howard at the Oxford Internet Institute on fake news etc.


Roger McNamee, early Facebook investor, on how Facebook damages society and his regrets

 
 
 


© 2023 by Rachel Harker
