Crimes involving child abuse imagery (CAI) have risen by a quarter in the past year, according to figures collated by the NSPCC.
Nearly half of those incidents where the platform was recorded occurred on Snapchat, while Meta’s family of apps, including Facebook, Instagram and WhatsApp, accounted for another 26%.
The figures, compiled from freedom of information requests filed with 35 police forces across the country, show there were more than 33,000 offences involving the collection and distribution of CAI in one year.
“It’s alarming to see online child abuse continue to rise, especially when tech companies should be acting to make their sites safe by design ahead of incoming regulation,” said Peter Wanless, the NSPCC’s chief executive.
“Behind these crimes are children who have been targeted by adults who are able to organise and share sexual abuse with other offenders seamlessly across social media and messaging apps.”
The figures represent a sharp increase over the last five years. The first time the NSPCC carried out the investigation, for 2017-18, forces provided data for just under 18,000 offences, while even last year the total had barely broken 25,000.
The charity said the growing quantity of CAI shared online reflected increased availability of, and demand for, material depicting children suffering sexual abuse. “The people viewing and sharing and distributing this material need to know it is not a victimless crime,” said Susie Hargreaves, the chief executive of the Internet Watch Foundation (IWF), which works with tech platforms and police to remove CAI when it is discovered. “They are real children, suffering real abuse and sexual torture, the effects of which can linger a lifetime.”
In data compiled by the IWF last month, the charity found that 90% of the webpages it discovered with CAI on them contained one or more “self-generated” images – taken by the subject of the image, often under duress. More than 100,000 webpages were discovered containing self-generated CAI of children under 10, although the same image may have appeared on more than one page.
In a statement, a Snapchat spokesperson said: “Child sexual abuse is horrific and has no place on Snapchat. We use cutting-edge detection technology to find and remove this type of content, and work with police to support their investigations. Snapchat also has extra safety features for 13- to 17-year-olds, including pop-up warnings if they’re contacted by someone they don’t know.”
Over the last year, the company detected 98% of content before it was reported, the spokesperson added, up from 94% previously.
The NSPCC warned that the figures for detected CAI were likely to fall in the near future because of Meta’s plans to turn on end-to-end encryption for direct messages on Instagram and Facebook Messenger. It argued that further rollout should be delayed until Ofcom can study Meta’s risk assessment for the plans as part of the new regulatory regime introduced by last year’s passage of the Online Safety Act.
In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800; adult survivors can seek help at Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.