
I warned in 2018 that YouTube was fueling far-right extremism. This is what the platform should be doing


In the fall of 2018, I published an investigative report warning of a growing trend of far-right radicalization on YouTube. Specifically, I identified a loosely connected network of reactionary YouTubers, ranging from mainstream conservatives and libertarians to outspoken white supremacists and neo-Nazis, who were broadcasting their political ideas to a young audience. United by a shared opposition to “social justice warriors” and the mainstream media, they frequently collaborated with each other and amplified one another’s content. In the process, they made it extremely easy for a viewer to drift gradually toward more extreme content.

The following March, I watched in horror, along with much of the rest of the world, as a white supremacist gunman killed 51 people and wounded 40 more at the Al Noor Mosque and the Linwood Islamic Centre in Christchurch, New Zealand. Amid the chaos of that day, researchers analyzed his manifesto and found that, beneath the layers of irony and memes, the message was quite clear. He had been radicalized to believe in the Great Replacement, a white nationalist conspiracy theory claiming that white populations are being deliberately replaced by immigrants (often Muslims).

The shooter’s manifesto clearly laid out his racist and Islamophobic beliefs, but provided little information on how he came to embrace them. On Monday, with the publication of the Royal Commission’s investigation into the attacks, we got a more complete picture: the Christchurch shooter was radicalized on YouTube by many of the propagandists that I and other researchers had warned about. So why didn’t YouTube take action earlier, and what should it do now?

There are a million different ways YouTube could have acted then, and could be acting now. It could enforce its terms of service more aggressively, or make those terms more robust. It could change its recommendation algorithm to stop pushing viewers toward increasingly extreme content. It could downrank the fringe content that acts as a first step toward radicalization. It could refine its content moderation algorithms to catch violating content more effectively. And, in fact, YouTube repeatedly claims that it has done many of these things.

And yet there is often a huge disconnect between the actions YouTube says it is taking and what users and creators actually experience. This is partly because those actions mean little if the platform does not have a clear idea of how to define hate speech, extremism, harassment, or borderline content, and of what values it seeks to uphold through its actions. In fact, YouTube has often cornered itself by trying to stay as “apolitical” as possible and reducing deeply value-laden judgments to the analysis of minor details. In an attempt to avoid accusations of politicized censorship, the platform has frequently tied itself in knots, resting its decisions on the smallest technicalities when determining whether content has violated its terms.

The great irony is that, in trying to stay apolitical, YouTube is constantly making the political choice not to care about or protect vulnerable communities. It can tweak its algorithms and update its policies as much as it likes, but it will not truly address the underlying issues until it firmly commits to protecting Muslim YouTube creators and users and to stopping the spread of Islamophobia on its platform. This does not just mean clearly stating that commitment, although that would be a reasonable first step. (YouTube could, for example, follow the lead of New Zealand’s prime minister, Jacinda Ardern, and apologize for the role it played in facilitating the terror attack.) It would also mean dedicating significant resources to the effort and framing its approach to content accordingly.

A plaque commemorates the victims of the Christchurch shooting at the Al Noor Mosque in 2019. Photograph: Kai Schwörer/Getty Images

Because despite YouTube’s claims that it takes hate speech seriously, Islamophobia is still alive and well on the platform. Ben Shapiro, the conservative pundit who frequently promotes Islamophobic ideas, is thriving on YouTube, with nearly 2.5 million subscribers and an additional 2.4 million at his outlet, The Daily Wire. Steven Crowder, a controversial creator with more than 5 million subscribers, has claimed that “Islamophobia is a perfectly rational ‘phobia’”, among other similar statements. This propaganda comes not only from small fringe creators, but from some of the biggest political commentators on the platform.

In the end, YouTube’s approach strangely mirrors that of the New Zealand government in the run-up to the attack. Members of the Muslim community interviewed for the commission’s report said they had been raising the alarm about rising Islamophobia with the government, but no one listened to them. As one Muslim New Zealander said: “The events of the day were heralded by so many telltale signs of its arrival, all of which were obvious and all of which were ignored by those with the power to act.”

Instead, the government was hyper-focused on potential terrorist threats from Muslim individuals, leading one interviewee to say that “they were looking at us, not looking at our backs.” Similarly, social media platforms like YouTube have consistently taken swift and decisive action against Isis recruiting channels and other threats they see as coming from Muslim extremists, while allowing widespread Islamophobic content to flourish. For YouTube, as for the New Zealand government, the question is whether it can start looking out for Muslims’ backs rather than just looking at them.

  • Becca Lewis is a doctoral candidate at Stanford University and a graduate affiliate at the Center for Information, Technology and Public Life at the University of North Carolina.




www.theguardian.com
