Paedophiles viewing child sex abuse images once every 23 minutes in England and Wales as crimes rocket

The number of sex abuse image offences being recorded by police has risen by almost a quarter, new figures have revealed as the government threatens to regulate web giants if they do not crack down on the phenomenon.

It means paedophiles are committing such crimes at least once every 23 minutes in England and Wales, prompting some senior police officers to call for the government to consider “alternatives” to prosecution to ease the immense strain on the criminal justice system.

More than 22,700 offences were recorded in 2017/18, up 23 per cent on the previous year. They included taking, distributing and possessing indecent images of children.


The NSPCC said every image viewed “represents a real child who has been groomed and abused to supply the demand of this appalling trade”.

Tony Stower, head of the charity’s child safety online division, said: “The lack of adequate protections on social networks has given offenders all too easy access to children to target and abuse. This is the last chance saloon for social networks on whose platforms this abuse is often taking place.

“Our Wild West Web campaign is calling on government to introduce a tough independent regulator to hold social networks to account and tackle grooming to cut off supply of these images at source.” 

On Tuesday, home secretary Sajid Javid said internet companies had only a “matter of weeks” to draw up concrete plans to stop the images being uploaded and spread online.



“We are already drawing up legislation,” he told delegates at a police conference, referring to wider internet laws that are expected to be presented in a white paper around Christmas.

The home secretary has called on technology firms to treat child sex abuse images with the same severity as terrorist propaganda and accused some of refusing to take the issue seriously.

“If web giants do not take more measures to remove this type of content from their platforms, then I won’t be afraid to take action,” he said. “How far we legislate will be informed by the action and attitude that the industry takes.”

Social networks are being used to groom children into sending naked images of themselves that are then passed onwards, sometimes for huge profit, the charity said.

A single offence recorded by police can involve hundreds of indecent images of children. The Internet Watch Foundation identified more than 78,000 URLs containing child sexual abuse images last year. 

Police are increasingly concerned about the rise of “peer-on-peer” image sharing between minors. Last month an NSPCC survey of 40,000 young people revealed an average of one in 50 schoolchildren had sent a nude or semi-clothed image to an adult.

Officials say the rise in recorded crimes does not necessarily reflect a higher prevalence of sexual abuse images, and could be partly explained by better reporting by web companies, police recording, more proactive investigations and greater awareness. 

But the volume of offences is increasing the strain on the National Crime Agency and police forces, which are already grappling with rising violent crime and other demands.

Several senior police officers have called for the government to explore alternatives, including rehabilitation and treatment for people who view “low-level” indecent images but are assessed not to pose a threat of physical abuse.

“This would give us the capacity to deal with the scale and volume of abuse that the police, Crown Prosecution Service and courts are now consistently dealing with,” the National Police Chiefs’ Council lead for child protection, Simon Bailey, said last year. 

“This would enable us to focus our resources on targeting those who are a danger to children with the strongest criminal justice response, while providing a balanced and proportionate approach outside the courts to those who pose little threat.”

Chief Constable Bailey said the police service had “reached saturation point” with an unprecedented volume of reports that is expected to rise further.

He proposed investigating and arresting all paedophiles, but giving those judged not to pose a direct abuse risk a conditional caution requiring them to undergo an approved rehabilitation programme and to sign the sex offenders register so they can be monitored.

In October, West Midlands Police chief constable Dave Thompson told MPs there was a “really big discussion to have in society about how we deal with this that is much more than law enforcement”.

Matthew Falder became one of the UK’s most prolific paedophiles after blackmailing at least 46 victims into sending him humiliating sexual images (NCA)

“It makes us all feel deeply uncomfortable to think that people who have that involvement in those activities should in any shape or form escape punishment, but the scale of it is just absolutely huge,” Mr Thompson added.

And earlier this year the National Crime Agency (NCA) called for a “fundamentally recalibrated approach” to child sex abuse images in light of the exponential increase in referrals from websites.

Officials said technology capable of preventing millions of images listed on international databases from being uploaded already exists, and would allow its officers to focus on the highest-risk paedophiles.

The NCA’s director for vulnerabilities, Will Kerr, said: “It is not sustainable for companies to simply identify indecent images on their servers and report it to law enforcement, when we know that technologically you can prevent it at source.”

Andy Burrows, the NSPCC’s associate head of child safety online, said self-regulation had “demonstrably failed”.

“We have groomers using social networks contacting significant numbers of children with a view of taking them onto an encrypted site,” he told The Independent, calling on companies to detect “really simple” warning signs such as large numbers of friend requests to children with no obvious links to them.

Mr Burrows said social media companies must implement “safety by design” to limit how children can sign up to their sites and be reached by strangers.

“Technology changes so quickly and it’s going to continue to change and we need a regulator that can be sufficiently agile to respond to that,” he added.
