YouTube – like other social media platforms such as Facebook and Twitter – is used by extremist groups around the world to spread their propaganda unhindered. The service has struggled to control or restrict videos that promote violence or offensive viewpoints but do not infringe the company’s guidelines for video removal. However, the video uploading and viewing platform is now taking stronger steps to counter extremist videos.

On Sunday, June 18, YouTube’s parent company, Google, announced a set of policies designed to restrict extremist videos on its platform.

Google stated that uploaded videos that clearly violate YouTube’s community guidelines will be removed with immediate effect. Many have raised concerns in the past about social media platforms being used as a medium for radicalizing individuals belonging to specific religious or ethnic communities.

YouTube takes steps against extremist videos

Kent Walker, Google’s general counsel, announced in a blog post on June 18 that Google and YouTube are working in tandem with governments, civil society groups, and law enforcement to tackle the spread of violent extremism on the web. In the blog post, he added that YouTube will take four additional steps to restrict “extremist and terrorism-related content” on its platform.

Walker stated that engineers have developed technology that uses image-matching algorithms to prevent re-uploads of known terrorism-related content.
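To give a sense of how image matching can catch re-uploads, here is a minimal sketch using perceptual (average) hashing, assuming keyframes have already been extracted from an uploaded video. The hash values and distance threshold are hypothetical; this illustrates the general technique, not YouTube’s actual system, which has not been made public.

```python
# A minimal sketch of image matching via perceptual (average) hashing.
# Assumes keyframes have already been extracted from the uploaded video.
from PIL import Image

def average_hash(image_path: str, hash_size: int = 8) -> int:
    """Downscale to hash_size x hash_size grayscale, then set one bit
    per pixel depending on whether it is above the mean brightness."""
    img = Image.open(image_path).convert("L").resize(
        (hash_size, hash_size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits where two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist: hashes of previously removed content.
KNOWN_BAD_HASHES = {0x8F3C72A155E09B14}  # placeholder value

def matches_known_content(frame_path: str, threshold: int = 5) -> bool:
    """Flag a frame within `threshold` bits of any known hash; near
    matches survive re-encoding, resizing, and mild edits."""
    h = average_hash(frame_path)
    return any(hamming_distance(h, bad) <= threshold
               for bad in KNOWN_BAD_HASHES)
```

Because the comparison tolerates a few differing bits rather than requiring an exact match, this kind of matching can recognize the same footage even after it has been re-encoded or slightly altered before being uploaded again.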

What are the measures?

First, YouTube will increase its use of technology to help identify terrorism-related and extremist videos uploaded to its platform.

Walker says that over the last six months, engineers at YouTube have detected 50 percent of removed terrorism-related videos with the help of video analysis models. The platform will now apply its most advanced machine learning research to identify such content more quickly and accurately.
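As a rough illustration of the classification approach, the sketch below trains a simple text-based triage model that scores uploads for human review. Everything here is hypothetical: the training examples, labels, and threshold are invented, and YouTube’s real models analyze the video content itself rather than metadata text.

```python
# A minimal sketch of a machine-learning triage model, assuming labeled
# training examples are available. Illustrative only; not YouTube's models.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: video descriptions with labels
# (1 = violent extremist content, 0 = benign).
train_texts = [
    "call to violence against civilians",
    "documentary on the history of a region",
    "recruitment message for a banned group",
    "news report analysing a terror attack",
]
train_labels = [1, 0, 1, 0]

# TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression())
model.fit(train_texts, train_labels)

# Score new uploads; anything above the threshold is queued for
# human review rather than removed automatically.
REVIEW_THRESHOLD = 0.7

def needs_review(text: str) -> bool:
    """Return True if the model's extremism probability crosses the bar."""
    return model.predict_proba([text])[0][1] >= REVIEW_THRESHOLD

print(needs_review("join our cause and take up arms"))
```

Routing high-scoring uploads to human reviewers, rather than removing them outright, matches the article’s point that automated detection narrows the workload while people make the final call.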

Second, YouTube will bring in more independent experts through its Trusted Flagger program to judge whether a video is newsworthy or religious content, or violent extremist content.

Such distinctions cannot be drawn by machines alone. Third, YouTube will be stricter on videos that contain violent content but do not clearly cross its community guidelines. In these cases, it will place the video behind a warning and make it ineligible for recommendations, endorsements, comments, or any type of monetary benefit.

Lastly, YouTube will take its own initiative in counter-radicalization efforts. Walker shared that it will build on the Creators for Change program on YouTube to raise voices against the radicalization and hate that extremist groups propagate.