How YouTube Plans To Ramp Up Its Fight On Terror


Google has announced new plans to control terrorist content on YouTube, saying there is 'no place for terrorist content on our services'.

In a blog post that has also been published in the Financial Times, the company's general counsel, Kent Walker, has listed four steps it plans to take, including both technical solutions and more human involvement.

First, he says, Google plans to ramp up the capabilities of its video analysis models to automatically identify terrorist imagery - although this is only part of the job. As Walker points out, the same video can be valuable news reporting or a glorification of violence, depending on the context and who has posted it.

However, the company is now planning to devote more engineering resources to apply its most advanced machine learning research to train new content classifiers to help decide what should be taken down.

It's also recruiting many more people to act as 'trusted flaggers' - whose reports, says Walker, are accurate 90% of the time - and will work with 50 more specialist organisations and NGOs.

"Third, we will be taking a tougher stance on videos that do not clearly violate our policies — for example, videos that contain inflammatory religious or supremacist content," says Walker.

"In future these will appear behind an interstitial warning and they will not be monetised, recommended or eligible for comments or user endorsements. That means these videos will have less engagement and be harder to find."

Finally, YouTube is going to ramp up its own propaganda efforts, promoting anti-radicalisation videos. It's using techniques lifted from online targeted advertising to reach people who might be potential IS recruits and redirect them towards anti-terrorist videos.

"In previous deployments of this system, potential recruits have clicked through on the ads at an unusually high rate, and watched over half a million minutes of video content that debunks terrorist recruiting messages," says Walker.

And the company is also, it says, planning to work more closely with Facebook, Microsoft and Twitter to develop and share technology.

YouTube has seen a flurry of activity aimed at controlling terrorist content on its site since a number of major advertisers pulled out earlier this year.

Organisations from the BBC and the British government to L'Oreal and Honda withdrew their entire ad spend after discovering that their ads were showing up next to terrorist content, hate speech and homophobia.

Shortly afterwards, senior executives were hauled in for a meeting with the British Cabinet Office, which demanded that the company 'put this problem right by devoting sufficient resources to ensure that vile and illegal material is removed proactively from your platforms, and that neither you nor those that create these videos profit from hatred.'

And YouTube has come under particular fire in recent weeks following the news that one of the London Bridge attackers had apparently been radicalized by watching YouTube videos of US hate preacher Ahmad Musa Jibril.
