This article was contributed by The Trust & Safety Professional Association

The Trust & Safety Professional Association (TSPA) is an organisation supporting the global community of professionals who develop and enforce principles and policies that define acceptable behaviour and content online. TSPA is a forum for professionals to connect with a network of peers, find resources for career development, and exchange best practices for navigating challenges unique to the profession.

How content moderation can make online spaces safer for your child

In this Q&A, Adelin Cai and Clara Tsao, co-founders of the Trust & Safety Professional Association, explain how content moderation works and how it helps keep children safer online.

What are online trust and safety and content moderation?

Through a combination of technology and real people manually looking at content, tech companies are doing a lot of work behind the scenes to help make online communities safer for your child. 

Online communities and services have been around for decades, and as they’ve grown, they’ve needed to work out what content and behaviour is appropriate on their platforms, and what’s not. In the tech world, this is broadly called “trust and safety.”

Content moderation is one form of trust and safety work, where professionals define and enforce the rules specifically for people using their platforms. This includes child safety, such as defining guidelines around bullying, or deciding what should be age-restricted. The rules aren't the same across the board: each company and website sets its own policies.

Who does this work?

Today, there is a growing class of professionals working on trust and safety in online communities to help keep you and your child safer online. They work for large online services, for small startups, or as volunteers, and they are based everywhere from Silicon Valley (USA) and Dublin (Ireland) to Manila (Philippines) and Jakarta (Indonesia).

Their roles range from frontline content reviewers assessing hundreds of thousands of pieces of content every day, to team leads responsible for engaging with subject matter experts on specialised topics like self-harm or radicalisation.

How do the different functions work to keep online spaces safer for your child?

In general, the following trust and safety functions exist:

  • Policy teams that set the rules or guidelines for appropriate content or behaviour.
  • Enforcement teams that apply the rules or guidelines through different types of processes, including:
    • reviewing and taking action against inappropriate content or behaviour that has been flagged by community reports or automated detection (a simplified sketch of this review flow appears after this list).
    • responding to people who need additional support as a result of the enforcement action.
    • responding to emergencies and sensitive incidents.
  • Technical teams that build the tools used to prevent and remove online abuse.
  • Research teams that identify new trends in child safety that need to be addressed, such as countries proposing new age-verification systems.
  • Compliance teams that ensure that technology is designed with laws and regulations in mind, such as the Age Appropriate Design Code in the UK (a set of 15 standards that online services should meet to protect children's privacy) and the Children's Online Privacy Protection Act (COPPA) in the US, which places certain requirements on tech platforms providing services to children under 13 years old.
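For readers curious about how the enforcement step above might fit together in practice, the following Python sketch shows one common pattern: automated detection handles clear-cut cases, while borderline or community-reported content is routed to human reviewers. This is a simplified illustration only; the Report class, the classifier_score stand-in, and the thresholds are all hypothetical, not any real platform's system.

```python
from dataclasses import dataclass


@dataclass
class Report:
    """A hypothetical flagged piece of content awaiting triage."""
    content_id: str
    text: str
    source: str  # "community_report" or "automated_detection"


def classifier_score(text: str) -> float:
    """Stand-in for an automated abuse classifier (hypothetical).

    Returns the fraction of words matching a toy blocklist; a real
    system would use a trained model, not keyword matching.
    """
    flagged_terms = {"bullying", "scam"}
    words = text.lower().split()
    return sum(w in flagged_terms for w in words) / max(len(words), 1)


def triage(report: Report,
           auto_remove_threshold: float = 0.9,
           review_threshold: float = 0.2) -> str:
    """Route a flagged item: clear-cut violations are actioned
    automatically; borderline cases go to a human review queue."""
    score = classifier_score(report.text)
    if score >= auto_remove_threshold:
        return "remove"        # automated enforcement action
    if score >= review_threshold or report.source == "community_report":
        return "human_review"  # a trained reviewer makes the final call
    return "no_action"


if __name__ == "__main__":
    report = Report("c1", "this message looks like a scam", "community_report")
    print(triage(report))  # -> "human_review"
```

In practice, platforms tune these thresholds and routing rules continuously, which is part of why the policy, enforcement, technical, and research teams described above work so closely together.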

What challenges do trust and safety professionals face?

They have a really important and difficult job. The people doing this work have to wrestle with thorny issues, day in and day out. They have to consider complex policy questions; navigate problems like election interference, extremism, and harassment; and tackle spam, account takeovers, and fraud. All of this work goes towards making online spaces safer for connecting people, including children.

You can find out more about the Trust & Safety Professional Association on its website.

Further Reading

Supporting your child with reporting unwanted content online
