'AI is not being developed fast enough to stop child abuse'

The head of the National Crime Agency (NCA) Lynne Owens Credit: Lewis Whyld

The head of the National Crime Agency (NCA) has challenged the social media giants to explain why they can develop Artificial Intelligence (AI) to target adverts at users but not create AI capable of protecting children from child abuse.

In an exclusive interview with the Telegraph, Lynne Owens said the failure of the social media firms to prevent paedophiles targeting children on the open web was distracting the NCA and police from hunting down the “worst offenders” who were operating on the dark web.

The NCA says there are nearly 2.9 million accounts registered worldwide on the worst child sexual abuse sites on the dark web, of which around five per cent – some 140,000 – are from the UK.

At the same time, the number of referrals of child abuse images to the NCA from the open web has risen nearly tenfold since 2013, from 11,477 to 113,948 in 2018, according to the latest data provided to The Sunday Telegraph.

Each referral can include multiple images and abused children.

“This is my frustration: if you can apply AI to deliver targeted adverts to an individual site, I don't understand why it's so difficult to develop AI to think of all the different ways in which people might choose to abuse children online,” said Ms Owens.

“We must have a much more assertive response from the technology companies on the open web.

“Because all the while that we are dealing with the demand they create – through not designing out child abuse in their systems, not applying their own artificial intelligence in the way we'd like them to – simply thinking a referral to us lets them off the hook.

“That is preventing us from really targeting the worst offenders on the dark web. So I have a plea for technology companies, which is that they really step into this space more than I think we currently see they are doing.

“That will then allow us as law enforcement and with other partners to really interrogate the dark web so that we can intervene, arrest the worst offenders, which enables us to safeguard the children wherever they are in the world.

“And it allows us then to look to take down dark web servers using the most covert techniques where we should and where we can to protect children.”

Ms Owens also backed demands from intelligence agencies and ministers for law enforcement agencies to be provided with a “key” to unlock encrypted communications in order to investigate child abuse.

They are concerned that tighter encryption - as planned for Facebook Messenger - intended to give users greater privacy will thwart investigators and provide paedophiles with unseen channels in which to prey on children.

“We welcome some of the real benefits of encryption - the ability for people to keep their data secure, so they aren't going to be subject to fraud - but in the worst cases, whether that be terrorism or child abuse, there has to be a way for us to ensure people are being kept safe,” she said.

“Now, the argument against that is, well, the minute you have created a key that breaks the encryption, it is available to criminals, too. But I don't think it can be beyond the wit of us to work with technology companies and for them to work with us to find a route through that, all with the appropriate safeguards in law in place.”

Although she welcomed the Government’s plans for a new statutory duty of care on social media firms to do more to protect children, she said it was a “shame” that ministers had to resort to laws to improve safety.

“I don't think it should take legislation for the [social media companies] to respond,” said Ms Owens.

"I think they should see their moral and ethical responsibility.

“We don't accept car makers not fitting alarms to their cars. We don't support housing estates being built without designing out crime.

"And that's part of their planning presumptions. We need to read that across to the online world.

“The reality is now the online world is as much our visible street as those we walk down. So the same sort of rules need to apply.”

The social media firms claim they have developed and aim to improve machine learning to weed out child abuse and terrorism.

However, a series of investigations by this newspaper and others have exposed major flaws that allow child abuse and terror images to continue online.
