
‘It Let White Supremacists Organize’: The Toxic Legacy of Facebook Groups



Mark Zuckerberg, Facebook’s CEO, announced last week that the platform will no longer algorithmically recommend political groups to users, in an attempt to “cool down” online divisions.

But experts say such policies are hard to enforce, let alone quantify, and the toxic legacy of the Groups feature and the algorithmic incentives that promote it will be hard to erase.

“This is like putting a band-aid on an open wound,” said Jessica J. González, co-founder of the anti-hate speech group Change the Terms. “It does not do enough to combat the long history of abuse that has been allowed to grow on Facebook.”

Groups: a place to create ‘meaningful social infrastructure’

Facebook launched Groups, a feature that allows people with shared interests to communicate in closed forums, in 2010, but began making a more concerted effort to promote the feature around 2017, after the Cambridge Analytica scandal overshadowed the platform’s newsfeed.

In a long blog post in February 2017 called Building Global Community, Zuckerberg argued that there was “a real opportunity” through Groups to create “a meaningful social infrastructure in our lives.”

He added: “More than 1 billion people are active members of Facebook groups, but most don’t search for groups themselves: friends send invitations or Facebook suggests them. If we can improve our suggestions and help connect a billion people with meaningful communities, that can strengthen our social fabric.”

After increasing its group suggestions and publicizing the feature widely, including in a 60-second ad during the 2020 Super Bowl, Facebook did see an increase in usage. In February 2017, 100 million people on the platform were in groups they considered “meaningful.” Today, that number exceeds 600 million.

However, that rapid increase came with little additional oversight and proved difficult to manage. By shifting its focus to Groups, Facebook began to rely more heavily on unpaid moderators to police hate speech on the platform. Groups proved to be a more private place to speak, where conspiracy theories could proliferate and some users could organize real-world violence, all with little oversight from outside experts or moderators.

In 2020, Facebook introduced a number of new rules to “keep Facebook groups safe,” including new consequences for people who break the rules and greater responsibility for group administrators to keep users in line. The company says it has 35,000 people working on safety and security at Facebook, including engineers, moderators and subject matter experts, and has invested in artificial intelligence technology to detect posts that violate its guidelines.

“We apply the same rules to Groups that we apply to any other form of content on the platform,” a Facebook spokesperson said. “When we find Groups that break our rules, we take action – from reducing their privileges, to removing them from recommendations, to taking them down entirely. Over the years, we have invested in new tools and artificial intelligence to find and remove harmful content, and we have developed new policies to combat threats and abuse.”

Researchers have long complained that little is shared publicly regarding how exactly Facebook’s algorithms work, what is privately shared on the platform, and what information Facebook collects about users. The growing popularity of Groups made it even more difficult to track activity on the platform.

“It’s a black box,” González said of Facebook’s policy on Groups. “This is why many of us have been asking for more transparency about content moderation and compliance standards for years.”

Meanwhile, the platform’s algorithmic recommendations drew users further down the rabbit hole. Little is known about exactly how Facebook’s algorithms work, but it is clear that the platform recommends that users join groups similar to the ones they are already in, based on keywords and shared interests. Facebook’s own researchers found in a 2016 internal report that “64% of all extremist group joins are due to our recommendation tools.”

“Facebook has allowed white supremacists to organize and conspiracy theorists to organize across its entire platform and it has failed to contain that problem,” González said. “In fact, it has contributed significantly to the spread of that problem through its recommendation system.”

‘We have to do something to stop these conversations’

Facebook’s own research showed that algorithmic group recommendations may have contributed to the rise in violence and extremism.

On Sunday, the Wall Street Journal reported that internal documents showed that executives were aware of the risks posed by Groups and were repeatedly warned by researchers to address them. In an August 2020 presentation, researchers said that approximately “70% of the top 100 most active US civic groups are considered non-recommendable for issues such as hate, misinformation, bullying and harassment.”

“We need to do something to stop these conversations from happening and growing as quickly as they do,” the researchers wrote, according to the Wall Street Journal, and suggested taking steps to slow the growth of Groups until more could be done to address the issues.

Several months later, Facebook paused algorithmic recommendations of political groups ahead of the US elections, a move that was extended indefinitely with the policy announced last week. The change appeared to be motivated by the January 6 insurrection, which the FBI found was linked to organizing on Facebook.

Responding to the story in the Wall Street Journal, Facebook’s VP of integrity, who oversees content moderation policies on the platform, said the issues were indicative of emerging threats rather than an inability to address long-term problems. “If you had looked at Groups several years ago, you might not have seen the same set of behaviors,” he said.

But researchers say that using Groups to organize and radicalize users is an old problem. Facebook groups had been linked to a series of damaging incidents and movements long before the January violence.

“Political groups on Facebook have always benefited from outsiders and outsiders,” said Joan Donovan, a lead researcher at Data & Society who studies the rise of hate speech on Facebook. “It’s really about reinforcement – the algorithm learns what you’ve clicked on and what you like and tries to reinforce those behaviors. The groups become centers of coordination.”

Facebook was criticized for its inability to rein in terrorist groups such as the Islamic State and al-Qaida, which were using the platform as early as 2016. It was widely used in organizing the Unite the Right rally in Charlottesville in 2017, where white nationalists and neo-Nazis marched violently. Groups including the Proud Boys, the Boogaloo Bois and various militias organized, promoted, and grew their ranks on Facebook. In 2020, officials arrested men who had plotted on Facebook to kidnap the Michigan governor, Gretchen Whitmer. A 17-year-old from Illinois shot three people, killing two, at a protest organized on Facebook.

These same algorithms have allowed the anti-vaccine movement to thrive on Facebook, with hundreds of groups amassing hundreds of thousands of members over the years. A 2019 report by the Guardian found that the majority of search results for the term “vaccination” were anti-vaccine, led by two misinformation groups, “Stop Mandatory Vaccination” and “Vaccination Re-education Discussion Forum,” with more than 140,000 members each. These groups were ultimately linked to harassment campaigns against doctors who support vaccines.

In September 2020, Facebook stopped algorithmically recommending health groups in an effort to curb such misinformation. It also added other rules to stop the spread of misinformation, including barring users from creating a new group if an existing group they had managed has been banned.

The origin of the QAnon movement can be traced back to a message board post in 2017. By the time Facebook banned content related to the movement in 2020, a Guardian report had revealed that Facebook groups dedicated to the dangerous conspiracy theory were spreading on the platform at a rapid pace, with thousands of groups and millions of members.

‘The calm before the storm’

Zuckerberg said in 2020 that the company had removed more than 1 million groups in the previous year, but experts say that action, along with the new policy on group recommendations, falls short.

The platform promised to stop recommending political groups to users ahead of the November elections, and later touted that it had cut political group recommendations in half. But a report from the Markup showed that 12 of the top 100 groups recommended to users in its Citizen Browser project, which tracks links and group recommendations served to a nationwide panel of Facebook users, were political in nature.

In fact, the Stop the Steal groups that emerged to contest the election results, and ultimately contributed to the violent insurrection on January 6, amassed hundreds of thousands of members while Facebook’s algorithmic recommendations of political groups were paused. Many researchers are also concerned that legitimate organizing groups will be swept up in Facebook’s actions against partisan political groups and extremism.

“I’m not very confident that they are going to be able to determine what a political group is or is not,” said Heidi Beirich, formerly of the Southern Poverty Law Center and a member of the Real Facebook Oversight Board, a group of academics and watchdogs who criticize Facebook’s content moderation policies.

“They have allowed QAnon, militias and other groups to proliferate for so long, the remnants of these movements remain all over the platform,” she added. “I don’t think this is something they can solve overnight.”

“It doesn’t really take a mass movement, or a massive sea of bodies, to do the kind of work on the internet that allows small groups to have a massive impact on the public conversation,” added Donovan. “This is the calm before the storm.”


www.theguardian.com
