Back in June, Google announced that it would be taking a tougher stance on terrorist and extremist content on platforms like YouTube, as reported by consumeraffairs.com.
Among the changes, the company said that it would be using machine learning systems to detect and remove terrorist videos, hiring more experts to review problematic content, and cracking down on videos that don’t necessarily violate its policies but still contain offensive, extremist, and inflammatory religious and supremacist messages.
Now, the company has released an update on how those efforts are paying off. In a blog post published on Tuesday, company officials said that progress is being made on all fronts.
“Altogether, we have taken significant steps over the last month in our fight against online terrorism. But this is not the end. We know there is more work to be done,” the company said.
“With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat.”
Making progress
When it comes to its machine learning systems, YouTube says they are faster, more accurate, and more efficient than ever before. The company notes that over 75% of the videos removed for violent extremism over the past month were taken down before receiving a single human flag.
Additionally, the systems have more than doubled both the number of videos removed for violent extremism and the rate at which YouTube has taken those kinds of videos down. Officials say that they are encouraged by the improvements, but that the company will continue to invest in experts and technical resources to improve outcomes.
Speaking of experts, YouTube says that it has added 15 non-governmental organizations (NGOs) and institutions to its Trusted Flagger program to help identify hate speech, radicalization, and terrorism in videos that are used to recruit extremists. Some of the groups include the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue.
“We will…regularly consult these experts as we update our policies to reflect new trends. And we’ll continue to add more organizations to our network of advisors over time,” officials said.
“Limited state”
Finally, YouTube says that it has taken several steps in the past month to impose tougher restrictions on videos that do not technically violate its policies but have been flagged by users for hate speech or violent extremism.
The company says that although these videos will not be taken down from its site, they will be placed in a “limited state.”
“The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes,” the company said, adding that this new treatment will soon be rolled out to videos on desktop versions of YouTube.
Officials say that further updates on the company's progress will be released in the coming months.