Google set to crack down on terrorism

Joanna Estrada
June 19, 2017

Google counsel Kent Walker wrote in the FT that Google already employs thousands of people around the world to review content, in addition to using image-matching technology that prevents videos from being re-uploaded once they have been removed.
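Walker's reference to image-matching can be illustrated with a minimal sketch. Everything below is hypothetical: it assumes an exact-match fingerprint (a cryptographic hash standing in for the perceptual hashes real systems use, which also match re-encoded or slightly altered copies) checked against a set of fingerprints recorded at takedown time.

```python
import hashlib

# Hypothetical sketch only; names and design are assumptions, not Google's system.
# Fingerprints of removed videos are recorded so identical copies can be blocked.
removed_fingerprints = set()

def fingerprint(video_bytes: bytes) -> str:
    """Derive a content fingerprint; SHA-256 stands in for a perceptual hash."""
    return hashlib.sha256(video_bytes).hexdigest()

def take_down(video_bytes: bytes) -> None:
    """Record the fingerprint of removed content."""
    removed_fingerprints.add(fingerprint(video_bytes))

def allow_upload(video_bytes: bytes) -> bool:
    """Reject uploads whose fingerprint matches previously removed content."""
    return fingerprint(video_bytes) not in removed_fingerprints

clip = b"example removed clip bytes"
take_down(clip)
print(allow_upload(clip))          # False: matches removed content
print(allow_upload(b"new clip"))   # True: no match on record
```

The key design point is that the expensive human review happens once, at takedown; every later upload is checked with a cheap set lookup.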

Following a wave of terrorist attacks in the United Kingdom in recent months, Google's senior vice-president and general counsel, Kent Walker, used one of the country's leading publications to outline a plan to combat terrorists' use of Google's tools. In a blog post on Sunday, June 18, 2017, Google said it will train more workers, called "content classifiers", to identify and remove extremist and terrorism-related content faster.

Google will also increase the number of independent experts in YouTube's Trusted Flagger programme. And through its Creators for Change programme, YouTube will redirect people seeking out extremist content to channels carrying counter-terrorist material, offering them another perspective on the terror menace.

The fourth effort is much broader, and is aimed at counter-radicalisation.

Alphabet Inc.'s Google said it is working with other internet firms, including Facebook, Microsoft and Twitter, to share and develop technology as well as accelerate joint efforts to tackle terrorism online.

These posts come after UK Prime Minister Theresa May criticised the internet for providing a safe space for terrorists to communicate, and after Facebook dodged a lawsuit claiming it had enabled Palestinian terrorists to kill numerous Israelis. Google has already responded by changing the types of videos that can carry advertising, blocking ads on videos with hate speech or discriminatory content. Google also created a system that lets advertisers exclude specific sites and channels on YouTube and Google's display network.

Google will almost double the number of independent experts it uses to flag problematic content and will expand its work with counter-extremist groups to help identify material that may be used to radicalise and recruit extremists. Videos that do not clearly violate policy, and so cannot be taken down, will instead appear behind a warning and will be neither monetised nor open to user endorsements and comments.

"Human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech", said Google in a blog post, adding that these experts were 90 percent accurate in terms of identifying content that violates YouTube's current content policies.

"We cannot and must not pretend that things can continue as they are when it comes to Islamist extremism. And we will keep working on the problem until we get the balance right."

To that end, Walker also mentioned that Google is working with Facebook, Microsoft, and Twitter to create a global forum devoted to combating terrorist activity online. It is a sweeping and complex challenge. "We are committed to playing our part", noted Walker.

Other reports by Click Lancashire
