Opinion

The online safety bill leaves schools exposed

Schools can do plenty to teach children and families about online harms, says Diana Young, but government is ultimately responsible for the outcomes

7 Feb 2023, 5:00

As a marketer, school governor and parent with primary-aged children, I am worried that the Online Safety Bill working its way through parliament does not go far enough. It should go one step further by banning access to social media platforms for all under 18-year-olds.

As a consultant, I research the latest trends across TikTok, Instagram and Facebook daily, and the platforms’ algorithms respond by filling my curated ‘For You’ feed with deeply distressing content. Posts from creators suffering the effects of eating disorders, mental health problems, or drug and alcohol abuse appear regularly. Many creators strive for virality, producing distressing content featuring violence, racism, misogyny or even child neglect.

Most shockingly, even after I flag disturbing content as ‘inappropriate’ or ‘not interested’, some platforms’ algorithms continue to serve more of it.

Meanwhile, a BBC report has found that demand for ‘Instagram smiles’ has left people with damage from wearing braces or “aligners” ordered online, while the hashtag #tiktokmademebuyit has amassed 38.7 billion views on TikTok. Social media channels have the power to indoctrinate young children by pushing unsolicited content or viral trends, such as the recent TikTok trend in which creators prepare and eat embryonated eggs, or “balut”, a Filipino delicacy.

Teachers and school leaders know that children are impressionable. Pressure from children demanding mobile phones seems to come at a younger and younger age, and social media channels will not actively deter under-13s from joining their platforms.

In a recent broadcast interview, Kate Winslet spoke out against the negative impact of social media on children’s mental health and urged the government to make social media firms enforce age limits to help tackle the problem. Meanwhile, other platforms such as Roblox offer “users of all ages the ability to socialize and play experiences with others in the community,” despite controversial reports about the moderation of its bathroom roleplay experiences. There are also reports of cyberbullying on WhatsApp, and Snapchat has been cited in numerous child murder investigations.

Schools can do plenty but they shouldn’t have to

For all these reasons and more, the age limit to join social media platforms should be raised to 18 years old, with ID verification required to remove anonymity. Facebook appears to be moving towards this model with periodic requests for government ID to validate accounts.

But aside from government intervention, the role of teachers, school leaders and governors is paramount in safeguarding children and educating them on safe use of the internet. Online safety and social media policies should be updated regularly to reflect a changing landscape.

At Richard Atkins Primary School, where I am a governor, we have bolstered our curriculum with online safety and relationships education through PSHE and the computing curriculum from Year 1 upwards. We also host workshops with learning mentors on ‘gang awareness and social media’ for Year 5 and 6 pupils.

However, schools cannot take responsibility for this alone. Parents are vital allies in monitoring and shaping how children and young people engage online. Girls’ Day School Trust schools, which my children attend, have embraced parent power and regularly host evening workshops with educational specialists such as Emma Gleadhill to assist parents as their children navigate the rapidly changing world of social media with their peer groups. The sessions are well received and enable parents to share experiences.

We have already seen multiple child suicide cases reported globally with parents fighting to hold social media companies accountable. An inquest found content on Pinterest and Instagram contributed to the death of Molly Russell in 2017, while a report by the Samaritans revealed the dangers of social media’s self-harm content.

But how many more school-aged children should die as a result of harmful content on social media sites before the UK government takes firmer action? Schools are already taking preventative measures, and there is plenty they can do. Ultimately, though, they shouldn’t have to, and it is in the government’s power to ensure accountability falls where it should: with the platforms themselves and with families.

Your thoughts

One comment

  1. “It should go one step further by banning access to social media platforms for all under 18-year-olds,” did you expect a round of applause for that? We’re not talking about smoking here. Anonymity is perfectly designed; helps protect whistle-blowers and vulnerable people who don’t want to be identified. Gutting the internet isn’t going to help anyone but governments who wish to control information shared that they don’t agree with, and at the least, find out where it started. Protecting children is great, but if only you understood how the internet works, you wouldn’t be begging for desperate measures.