The internet isn’t a playground: ‘duty of care’ should mean safeguarding
30 May, 2023
6-minute read

In a new blog, Parent Zone founder and CEO Vicki Shotbolt argues that the ‘duty of care’ idea in the Online Safety Bill should extend beyond simple health and safety for users.

Resilience is a theme that runs through all of our work at Parent Zone. 

It’s the dynamic personality trait that allows someone to recover from difficult experiences, navigate and evaluate risks, and learn from them. Dynamic because it ebbs and flows. 

Families who are already coping with the rising cost of living and the impact of unmanageable bills may be at a very low resilience ebb at the moment. Their resources are stretched beyond breaking point. 

At times like this, it’s not just important for the government to step in. It's essential.

A Bill to protect families online

There are circumstances in which it’s not enough to fall back on resilience: times when it’s entirely understandable, and inevitable, that people won’t cope – and shouldn’t be expected to. 

It’s in those circumstances that we need to be able to rely on legislation, support services, companies and professionals to provide appropriate safeguards. The Online Safety Bill will provide the regulation needed to ensure a digital world that is better at designing out harms and more accountable for some of its problems. 

At the time of writing, it seems likely that the Bill will pull back from one of its most controversial provisions – the idea of ‘legal but harmful’. But it will retain the idea of a ‘duty of care’ for platforms. As policymakers consider changes to the Bill, we hope they will look closely at that duty and think about what it could mean for families. 

Health and safety for the digital world

As the Bill stands, the basic premise is that we should treat the digital world as we do the offline world. 

If it were a children’s playground, the builder would need to ensure that it met health and safety regulations. They’d need to do risk assessments to ensure that the equipment didn’t pose unreasonable risks. 

Health and safety legislation is the reason we see soft rubber, rather than gravel, underneath the playground swings. We needed this legislation to nudge playground designers beyond the easiest or cheapest option. 

What health and safety regulation doesn’t do is assist you if your child falls off the swing and breaks their arm. Provided the equipment meets requirements, is well maintained and is not at fault, there is no expectation that the supplier of the playground will take responsibility. 

Clearly it would be unreasonable to expect them to. After all, they don’t supervise the playground or check that children are using it correctly. Health and safety legislation places a reasonable burden on the builders and providers of a playground – but goes no further. 

Now, consider applying the same logic to the digital world. As written, the Online Safety Bill requires companies to do ‘risk assessments’. If they identify that their platform has risks (yet to be defined) that could be designed out, they’ll be legally required to do so. 

If they’re safe enough, and provided people use them as they are supposed to, happy days.

Responsibility for safeguarding

The problem is that the parallels drawn in the Online Safety Bill overlook some fundamental differences between the online and offline worlds. 

The central one is that, unlike a children’s playground, online platforms do have supervisors. 

They have people and AI monitoring their platforms all the time. They are quite literally observing what people are doing on their sites, and so their role is both to provide a space and to supervise it.

Our safeguarding proposal

In 2016, Parent Zone wrote to the Minister for Children and Families to propose that a duty of care be placed on platforms. We suggested that if a platform became aware of a child being at risk of harm, it would be required to take action to safeguard that child. 

This would mean that every platform would have to have a safeguarding officer – and know how to escalate issues to the appropriate services. If a child’s account was accessing self-harm or suicide content, the platform would have a legal duty to react. 

We came to believe this was vital, having worked with Moshi Monsters – a virtual world for children that in its heyday had 80 million users worldwide. 

Their safety team described how distressing it was when they identified a child using the game in a way that suggested unsafe behaviour or even neglect – and they had very limited levers to respond. 

They could close the account, but options for getting a child help were limited to cases that needed to be reported to the police. And even that reporting route is far from simple for platforms.

Moderators need safeguarding training

The challenge with giving platforms this kind of duty of care is that they would need routes for reporting concerns onwards. 

Local safeguarding boards, charities, health services and the police would all need to come together to figure out what happens when a child is crying for help into the internet void. 

The scale of that task was deemed overwhelming. So, instead, the health and safety ‘playground’ option was settled on. The responsibility sits squarely with platforms to make sure their sites are safe enough. 

Lest there be any doubt, that would be progress. But it wouldn’t deal with the child or young person who is searching for, consuming and being sent harmful (but legal) content. 

That young person needs the platform to have a safeguarding duty of care that, at the very least, requires the platform to spot the account and stop sending it that content. 

We’d argue that we need to go further and bring the moderators of these platforms into the community of professionals who have safeguarding training and responsibility. They have unprecedented visibility into children’s lives; their view is unique. 

Giving them the training needed and the responsibility required to identify and protect children at risk of harm seems like common sense. Why would we not?

Let’s protect children properly

For a resilience-based approach to work, we have to have safeguards in place for the circumstances in which resilience isn’t enough. 

As policymakers look afresh at the Online Safety Bill, we hope they will see that their current proposals don’t go far enough to protect children. 

Yes, we need the health and safety approach: unsafe buildings are not OK. But even in safe, well-built buildings, children need safeguarding. 

We expect sports clubs, churches, mosques, schools and many other settings to have safeguarding officers and a direct duty of care. Why are we lowering the bar for children’s digital spaces?
