Child sexual abuse covers a range of offence types, occurring online, offline or moving between both. It is estimated that there are between 680,000 and 830,000 UK-based adult offenders who pose varying degrees of risk to children, equivalent to 1.3% to 1.6% of the UK adult population. Estimates indicate that one in ten children experienced child sexual abuse before the age of 16; the 2022 Independent Inquiry into Child Sexual Abuse estimates that this applies to one in six girls and one in twenty boys. However, all forms of child sexual abuse remain consistently underreported.
It is estimated that about two thirds of physical sexual abuse takes place within the family environment, where it remains challenging to detect due to the hidden nature of the crime. Understanding physical offending, including group-based offending, remains an ongoing priority for law enforcement. It is likely that a significant proportion of victims and survivors do not recognise themselves as such. Victims and survivors of group-based offending who have previously witnessed or experienced unhealthy relationships are more vulnerable to controlling, violent and otherwise abusive behaviours. This emphasises the need to continue to support and encourage victim disclosure.
Adults remain the primary perpetrators of child sexual abuse, but offences against children committed by other children continue to be reported, with most relating to contact abuse. The NSPCC’s Childline figures for 2021/22 show that, where known, 31% of counselling sessions about child sexual abuse and exploitation recorded a child as responsible for the abuse. Where it can be calculated from police-recorded data on contact and online incidents involving under-18s, the most common age of both those carrying out abuse and their victims is 14.
Self-generated indecent imagery, whether shared consensually between peers or elicited by offenders through manipulation or coercion, is increasing. The Internet Watch Foundation classified 72% of the reports it assessed as containing self-generated indecent imagery. Testimonies from young people indicate that sharing images is increasingly commonplace, with peer norms amplifying the pressure to generate imagery. These online interactions are unlikely to be reported to professionals unless aggravating factors such as blackmail are involved.
Online spaces continue to provide strangers with the opportunity to initiate contact with children, enabling both online and contact offending. Industry decisions which undermine the agreed Safety by Design principles will reduce companies’ ability to protect children on their platforms, and will result in less identification and reporting of offending. For example, the National Center for Missing & Exploited Children estimates that, with the implementation of end-to-end encryption by default, the number of CyberTipline reports will fall by 80%. Therefore, without equivalent moderation and safeguarding mitigations in place on platforms used by significant numbers of children, like Facebook Messenger and Instagram, industry and law enforcement’s ability to protect children will be reduced.
Social media and gaming platforms allowing children to interact with strangers increase the risk of technology assisted grooming. Ofcom reports that 31% of children aged three to fifteen play games online with or against strangers. The interactive nature of gaming provides offenders with additional opportunities to groom children, as they are able to build rapport through shared gaming interests, the sharing of equipment, co-operative gameplay or via offers of in-game currency.
Livestreaming is used by offenders to direct and watch the sexual abuse of children. The integration of livestreaming functionality into social media and gaming platforms allows offenders to interact directly with content being broadcast by children. The instantaneous nature of contact can expose children to inappropriate material without warning, and can permit offenders to elicit self-generated indecent imagery or other sexual abuse following live interactions where children are placed under pressure to respond.
Extended reality technologies, such as augmented and virtual reality, have been identified as evolving threats. This technology provides the ability to manipulate or merge virtual and physical worlds, and is an increasingly standard addition to technology already in use. It is a realistic possibility that extreme or indecent acts conducted within the virtual world could act as an early pathway to further child sexual abuse offending.
Ian Wynter, who admitted being ‘sexually attracted to babies and toddlers’, targeted financially vulnerable individuals in countries such as the Philippines and Indonesia. He would give them explicit directions and pay them to livestream themselves carrying out abuse against young children.
Chat logs and financial records showed that Wynter had paid a man in the Philippines to abuse a two-year-old boy for him via video call. Video recordings of these calls show him directing the abuse, as well as requesting specific sexual acts in the chat box.
Officers recovered thousands of online chat logs which showed Wynter speaking with offenders based all over the world, discussing ‘fantasies’ and advising others on how to carry out abuse without getting caught.
Some 17,000 indecent images of children (6,000 in category A) were stored across Wynter’s devices, many of which he had also shared online. Among these was a naked image of a child that he had taken himself.
He was sentenced on 9 November 2022 to 19 years in prison with an additional six years on licence. He will also be subject to an indefinite Sexual Harm Prevention Order and has been placed on the sex offenders’ register.