
YouTube Says It's Removing Thousands Of Extremist Videos. What Happens To Them?

Extremism experts and researchers want the platform to preserve neo-Nazi and far-right videos to better understand what's driving online hate.

YouTube implemented a new anti-hate policy Wednesday, under which the company says it will remove thousands of white supremacist and far-right extremist videos from its platform. But what will happen to the many hours of footage from some of the most prolific extremists in the United States?

Extremism researchers are concerned about the answer to that question. “It’s good that YouTube is going to deplatform this stuff, if they follow through,” said Heidi Beirich, who leads the Intelligence Project at the Southern Poverty Law Center, an organization that monitors extremist groups. “What’s not OK is if this stuff just disappears into the ether forever.”

In many cases, the videos extremists have uploaded to the platform are the best guide to their actions and rhetoric. But beyond a vague promise of future action, YouTube won’t say what it has planned for the videos.

“We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future,” a YouTube spokesperson told HuffPost.

YouTube did not respond to follow-up questions about whether it retains archived copies of the videos it removes, or whether it would make such videos available to law enforcement agencies that request them as evidence in court cases or investigations. But data researchers say it’s likely that YouTube has held on to the content it has taken down.

“Typically, data like videos and chat history is never really removed,” said Megan Squire, a computer science professor at Elon University who studies online extremism. “That’s the case across most platforms.”

Although platforms holding on to user data indefinitely presents its own set of privacy issues, researchers and civil society groups such as the SPLC say it’s important for YouTube to preserve extremist videos even while keeping them off its platform. The videos are not only useful for analysts and authorities; they’re also necessary for YouTube to teach its own machine-learning algorithms to get better at recognizing extremist content.

One of the biggest criticisms of YouTube’s algorithm is that it can continually push extremist content to users through its recommended video feature. The platform allows users to generate ad revenue from their videos, and its recommendation engine rewards the radical, inflammatory videos that stand out from the pack and draw the most views. The mix of the two turned YouTube into a facilitator of far-right radicalization, directing viewers toward increasingly conspiratorial and racist videos while making stars of their creators.

“If they’re serious about understanding what makes white supremacist content, or how their algorithms to suggest content have gone wrong, then they’ll need to use that bad data as input for their learning algorithms,” Squire said. “They can’t delete it, otherwise they won’t have a way to learn in the future.”

But YouTube, along with other platforms, has a shaky track record of taking down content while carefully preserving it. When the platform removed hundreds of accounts in 2017 that had uploaded thousands of videos of the Syrian civil war, the move prompted an immense backlash from activists, researchers and investigators, who noted that the footage potentially contained evidence of atrocities and war crimes that could be used in future court cases. YouTube ultimately partnered with a Berlin-based nongovernmental organization, Syrian Archive, to preserve and restore many of the videos, but the effort was incredibly difficult and time-consuming, and some videos may have been lost forever in the process. YouTube didn’t pay the Syrian Archive for its work, according to BuzzFeed.

YouTube’s latest crackdown on extremism appears not to have gone exactly to plan either, ensnaring some users who don’t hold extremist views while missing others who do. Scott Allsop, a history teacher based in Romania, checked his email late Wednesday and found that YouTube believed he was sharing extremist content and had removed 15 years’ worth of his videos on subjects ranging from the Battle of Hastings to the Cold War, he told HuffPost. Among the videos was World War II-era Nazi footage, which he believes was the cause of his suspension. YouTube reinstated Allsop’s account upon his appeal ― and after a tweet of his went viral ― but it’s unclear how many other cases and appeals the platform has received in response to the removals.

In addition to making sure the videos it removes come from actual extremists, YouTube could create a program giving select organizations and individuals access to removed videos for study, researchers say. Some platforms, including Facebook, already allow limited access to data for such purposes, but YouTube’s thousands upon thousands of extremist videos present a unique opportunity to learn even more about what is driving online hate. If YouTube is going to remove videos, experts say, it’s the platform’s responsibility to understand why they were made in the first place.

“Platforms have thrown research money at a lot of issues in recent months, and this should not be an exception,” said Dia Kayyali at Witness, an NGO that supports the documentation of human rights abuses. “Platforms should support research into how this content actually affects people.”
