The internet and child sexual abuse

On 3 April 2017, it became a criminal offence for an adult (aged 18 or over) to engage in sexual communication with a child under 16. The offence was introduced to stop sexual grooming at an early stage, particularly via social media and mobile messaging.

A staggering 1,316 offences were recorded in the first six months after the offence came into force. Until then, police could not intervene until groomers attempted to meet their targets face to face. Of the cases recorded, the youngest victim was a seven-year-old girl, although girls aged between 12 and 15 were the most likely to be targeted by predators. Facebook, Instagram and Snapchat were the platforms most commonly used by offenders, accounting for 63% of all incidents.

Last month, the Independent Inquiry into Child Sexual Abuse conducted five days of public hearings on the internet and child sexual abuse. The Inquiry heard evidence on the response of law enforcement agencies (the National Crime Agency, its CEOP Command, and the police) to child sexual abuse facilitated by the internet.

The Inquiry heard that Britain’s biggest police force had seen a 700% rise over three years in the number of online child abuse cases referred to it by national investigators.

Following the hearings, the Inquiry will invite further applications for core participant status in relation to other aspects of this investigation, which include:

  • The policies of internet service providers, providers of online platforms, and other relevant software and communication technology companies relating to child sexual abuse, and
  • The adequacy of the existing statutory and regulatory framework applicable to those organisations.

The objective of the investigation is to make practical recommendations that will minimise the opportunities for abuse facilitated by the internet in the future.

The NSPCC has criticised social media companies for not making the most of the technology they already use to enforce the new offence of sexual communication with a child. Algorithms – the sets of rules that tell computers what to do – are already used by social media companies to flag images of child abuse, hate speech and extremist material. The NSPCC argues that the same techniques should be used to pick up “grooming language” and then send an automatic alert to both the child and moderators. This is an example of a possible recommendation open to the Inquiry to make to social media companies.
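Purely by way of illustration – and not as a description of any platform’s actual systems – the sketch below shows, in Python, how the alert flow the NSPCC envisages might work in its simplest rules-based form: a message is checked against indicator phrases and, on a match, a warning goes to both the child’s account and a moderation queue. The pattern list, function names and notification mechanism are all hypothetical assumptions for the example; real platforms would use trained classifiers over far richer signals than a fixed keyword list.

```python
import re
from dataclasses import dataclass
from typing import Optional

# Hypothetical indicator phrases for illustration only.
GROOMING_PATTERNS = [
    r"\bdon'?t tell (your|anyone)\b",                    # requests for secrecy
    r"\b(keep|our) (this |little )?secret\b",
    r"\bhow old are you\b",                              # probing age
    r"\bsend (me )?(a )?(photo|pic)\b",                  # soliciting images
    r"\blet'?s (chat|talk) (somewhere |in )?private\b",  # moving platforms
]

@dataclass
class Alert:
    message: str
    matched_pattern: str

def flag_grooming_language(message: str) -> Optional[Alert]:
    """Return an Alert if the message matches any indicator pattern."""
    for pattern in GROOMING_PATTERNS:
        if re.search(pattern, message, flags=re.IGNORECASE):
            return Alert(message=message, matched_pattern=pattern)
    return None

def notify_child(alert: Alert) -> None:
    # Stand-in for an in-app safety warning shown to the child.
    print(f"[child warning] This message may be unsafe: {alert.message!r}")

def notify_moderators(alert: Alert) -> None:
    # Stand-in for pushing the message to a human review queue.
    print(f"[moderation queue] pattern {alert.matched_pattern!r} matched")

def handle_incoming_message(message: str) -> None:
    # On a match, alert both the child and the moderators, as the
    # NSPCC proposal envisages.
    alert = flag_grooming_language(message)
    if alert:
        notify_child(alert)
        notify_moderators(alert)

if __name__ == "__main__":
    handle_incoming_message("this is our little secret, ok?")
```

The point of the sketch is the dual-alert flow rather than the detection step itself; in practice the keyword check would be replaced by the kind of machine-learning classifiers platforms already apply to abusive images and extremist material.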

This is clearly a challenging issue, and the figures above leave no doubt that it is a large-scale problem and one that is likely to grow. The challenge for the Inquiry in making timely recommendations about behaviour that is happening and evolving now is considerable. Technology providers will need to engage with the Inquiry, but they should also be taking steps now rather than simply waiting for recommendations. Equally, organisations with responsibility for children and young people should be considering what they can do to ensure that the risks posed by social media are known and understood by all.


Written by Catherine Davey, associate at BLM
