Image: Internet Watch Foundation
Would you know what to do if you or a member of your family came across an illegal image online? The Internet Watch Foundation runs a hotline for reporting criminal online content in a secure and confidential way. We asked them to explain what they do…
When surfing the web we are all potentially a few clicks away from harmful and illegal imagery. The fact is, you don’t have to be exploring the hidden or ‘dark’ web, or even watching pornography, for something illegal to pop up on your screen.
Luckily, most people will never come across illegal sexual content online. But it’s easier to find than you might imagine and if you do, there is somewhere you can go to report it.
The IWF is the UK Hotline for reporting illegal online sexual content, predominantly online child sexual abuse images and videos.
In 1996, the UK internet industry founded the IWF as a means of ensuring their services were free of illegal content, specifically child sexual abuse material, sometimes wrongly called ‘child porn’. Today the IWF is an international hotline with 13 full-time analysts and a membership base of 117 companies, including Apple, BT, Google, Facebook and Virgin Media. Since 1996, the IWF has processed over 500,000 reports and removed over 150,000 URLs or webpages containing illegal sexual material globally.
Reporting illegal sexual material
Each working day, IWF analysts work through reports made to the hotline, each assessing upwards of 80 different websites a day. In April 2014, the Department for Culture, Media and Sport of the UK government granted the IWF the unique ability to proactively seek out online child sexual abuse imagery, rather than simply take reports. That year, IWF analysts removed over 23,000 webpages containing illegal sexual content found through this proactive searching.
But what are we actually talking about when we say criminal online content, or illegal sexual material? These are the three types of criminal online content the IWF removes:
- Child sexual abuse content hosted anywhere in the world (this can include an explicit or sexual ‘selfie’ which a young person has posted online, if its content is graphic);
- Criminally obscene adult content hosted in the UK.
- Non-photographic content (animation and cartoons) depicting child sexual abuse hosted in the UK.
Statistics published in the 2014 Annual Report tell us that 80% of the victims seen by the IWF Hotline in 2014 were children aged 10 and under, and 43% of the content removed showed sexual activity between adults and children, including rape or sexual torture (classified as Category A).
What happens if a naked or sexual photo of a child is found online?
With the growth of the selfie and the increased ownership of mobile digital devices at a younger age, sexting has become a cause for concern for parents. Once shared, a naked or sexual ‘selfie’ is beyond the sender’s control and risks being posted and shared widely online.
If your child is aged 17 or under and finds that an explicit image of them has been shared, contact Childline to report the image or video. Childline will then complete an IWF referral form and both organisations will work together to safeguard your child and attempt to remove the content.
If you come across any content showing child sexual abuse, remember:
- Anyone can stumble across online child sexual abuse images and videos;
- You can make an anonymous and confidential report to the IWF via www.iwf.org.uk;
- Childline and IWF work together to remove online sexual images or videos of young people. Contact Childline at www.childline.org.uk.
About the IWF
The IWF is the Hotline to report:
- child sexual abuse content hosted anywhere in the world;
- criminally obscene adult content hosted in the UK;
- non-photographic child sexual abuse images hosted in the UK.
For more information please visit www.iwf.org.uk.
The IWF is part of the UK Safer Internet Centre.
- Founding members were internet service providers: BT, Virgin Media and AOL UK.
- The IWF uses the term child sexual abuse content. Other terms (child pornography, child porn and kiddie porn) are considered to legitimise images which are not pornography but records of children being sexually exploited.