
New Leaked Facebook Documents Show Danger of Virality



The Patriot Party began recruiting supporters on Facebook after the January 6 assault on the Capitol, aiming to become an alternative to the Republican Party. “We were growing fast, we had a lot of followers,” says the organization’s treasurer. Then Facebook cut the movement off at the head: “We managed to decapitate shoots like the words Patriot Party before their mass adoption,” says an internal Facebook document.

This Friday a group of American outlets, among them the Wall Street Journal, the Washington Post, the New York Times and CNN, began publishing new articles based on additional material that whistleblower Frances Haugen took from the company and also shared with the U.S. Securities and Exchange Commission. The Journal published a long series called “The Facebook Files” in September. Now, at Haugen’s request, the rest of the documents have been shared with a larger group of journalists. The thousands of pages of the original documents have not yet been published in full. Haugen appears before the British Parliament on Monday.

So far, the new documents contain no spectacular revelation that will sink Facebook in one fell swoop. But they offer dozens of new clues that help explain the complexity of, and the controversial decisions behind, a network like Facebook. It has also emerged that Haugen, to avoid detection, captured many of these documents by photographing her computer screen with her phone rather than taking screenshots.

Each outlet has chosen its own approach to the documents. But they all include the story of Carol Smith, a 41-year-old moderately conservative mother from North Carolina. Smith opened a Facebook account in 2019, was interested in “children, parenting, Christianity and community” and followed the official accounts of Fox News and then-President Donald Trump.

On the first day, the algorithm recommended that she follow conservative groups and meme pages. Within a couple of days, the recommendations began to radicalize. By the fifth day, the platform had dragged Smith to QAnon, the conspiracy group that believes the world is run by a pedophile ring and that Trump is the savior. Smith did not follow those recommendations, but her timeline still ended up full of posts linked to conspiracy theories, content that violated the company’s own rules.

Smith, however, was not a real user. She was an internal Facebook experiment, one the company repeated with fictitious progressive users who ended up in similar pits. Such experiments are not new and have been run on other networks, but this one is different: it was the work of researchers inside the company, with full access to its tools and the ability to warn of dangers internally.
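That feedback loop is easy to reproduce in miniature. Below is a minimal sketch in Python, assuming a toy recommender that ranks candidate posts by a hypothetical engagement score that grows with how extreme a post is; the numbers, function names and drift model are illustrative assumptions, not Facebook’s actual ranking system.

```python
# Toy model of the test-account experiment described above: a fresh
# account repeatedly engages with whatever an engagement-maximizing
# recommender serves, and we track how extreme its feed becomes.
# All scores and parameters are illustrative assumptions.

import random

def engagement(extremity: float) -> float:
    # Assumption: engagement grows with extremity, plus some noise.
    return extremity ** 2 + random.uniform(0.0, 0.3)

def recommend(user_interest: float, n_candidates: int = 50) -> float:
    # Generate candidates near the account's current interest level,
    # then serve the one the engagement model scores highest.
    candidates = [
        min(1.0, max(0.0, user_interest + random.gauss(0.0, 0.15)))
        for _ in range(n_candidates)
    ]
    return max(candidates, key=engagement)

interest = 0.1  # start as a mild, mainstream conservative account
for day in range(1, 6):
    for _ in range(20):  # roughly 20 recommendations per day
        shown = recommend(interest)
        # Engaging nudges the account's profile toward what it saw.
        interest += 0.3 * (shown - interest)
    print(f"day {day}: interest drifted to {interest:.2f}")
```

Because the top-scoring candidate is almost always slightly more extreme than the account’s current profile, the loop ratchets upward, producing in miniature the same five-day drift the researchers observed.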

Most outlets have focused on how Facebook celebrated its success in tackling disinformation in the November elections, but then ignored the mess that led to the assault on the Capitol in January. The internal debate on the consequences of virality, however, runs deeper in the documents. So far only the Journal has covered it in depth.

Mark’s friend

Kang-Xing Jin has been friends with Mark Zuckerberg since their days at Harvard University and is one of the company’s earliest employees. According to the documents, he is also one of the people who has warned most insistently about the danger of virality: “There is a growing assortment of research that shows that viral channels are used more for the bad than for the good,” Jin writes in one of the documents, warning of the dangers of the 2020 elections and what he calls “rampant damaging virality.”

In dozens of internal documents reviewed by the Journal, Facebook engineers and data scientists find that viral content favors conspiracy theories, hoaxes and hate speech. The more widely a message was shared, the higher the probability that it was toxic. Users who post very often are especially dangerous, according to those same studies; the algorithm, it seems, rewards constancy and persistence.
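The kind of analysis behind that finding can be sketched simply: group posts by the depth of their reshare chain and compare the share of toxic content at each depth. The log format, field names and toy data below are assumptions for illustration, not Facebook’s internal pipeline.

```python
# Sketch of the correlation the internal studies describe: the deeper
# a post's reshare chain, the more likely it was flagged as toxic.
# The data and field layout are hypothetical.

from collections import defaultdict

posts = [
    # (reshare_depth, was_flagged_toxic) -- toy data for illustration
    (0, False), (0, False), (1, False), (2, True),
    (3, True), (3, False), (5, True), (6, True),
]

counts = defaultdict(lambda: [0, 0])  # depth -> [toxic, total]
for depth, toxic in posts:
    counts[depth][0] += int(toxic)
    counts[depth][1] += 1

for depth in sorted(counts):
    toxic, total = counts[depth]
    print(f"reshare depth {depth}: {toxic / total:.0%} toxic ({total} posts)")
```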

During the 2020 election campaign, Facebook imposed rules limiting the spread of viral content. One engineer went so far as to propose eliminating the share button altogether, without success.
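Short of removing the button, one lever such rules can pull is a cap on reshare-chain depth. Here is a minimal sketch, assuming a hypothetical Post structure with a depth counter; the threshold and names are illustrative, not Facebook’s actual configuration.

```python
# Sketch of a virality-limiting rule: once a post's reshare chain
# exceeds a depth threshold, further resharing is disabled. The cap
# and the Post structure are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional

MAX_RESHARE_DEPTH = 2  # hypothetical cap, not Facebook's real setting

@dataclass
class Post:
    author: str
    reshare_depth: int = 0  # hops from the original post

def try_reshare(post: Post, user: str) -> Optional[Post]:
    # Past the cap, the share option simply disappears for this post.
    if post.reshare_depth >= MAX_RESHARE_DEPTH:
        return None
    return Post(author=user, reshare_depth=post.reshare_depth + 1)

original = Post(author="alice")
hop1 = try_reshare(original, "bob")    # allowed
hop2 = try_reshare(hop1, "carol")      # allowed
hop3 = try_reshare(hop2, "dave")       # None: the chain is cut here
print(hop1, hop2, hop3, sep="\n")
```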

The underlying problem in this debate is that virality is good for business. Users spend longer on the platform, and see more ads, when content appeals to them. As Haugen told Congress, the debate boils down to whether Facebook puts profits before democracy.

The Journal describes this debate in very concrete terms. The words of Jin, Zuckerberg’s college friend, met a tough opponent: ad boss John Hegeman. “If we remove a small percentage of shares from the timeline of users, they decide to return less to Facebook,” he wrote in a document now released. Visiting Facebook less means spending more time on other platforms, or simply seeing fewer Facebook ads, either of which limits the company’s business. Hegeman also argued that most viral content is acceptable.

The documents show that the company consistently sided with Hegeman, except during the election campaign, when it wanted to avoid a disaster like that of 2016. That initial success, however, was complicated by the assault on the Capitol.

After that assault, the company began investigating 700,000 supporters of Stop the Steal, the digital movement that claims, without evidence, that Joe Biden won thanks to fraud. With that group of “susceptible users” mapped, Facebook had the option of watching for new trouble and serving them a safer kind of content: pull them out of politics and give them kittens. Such details show the enormous power a platform as popular as Facebook has to steer tastes and interests.

Facebook researchers have also identified so-called “information corridors”: networks of accounts and pages that distribute harmful content. These make it possible to track who sees what and to spot the most dangerous nodes.
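What mapping such a corridor might look like can be sketched with a simple reshare graph: count how much flagged content flows through each account and rank them. The edge list, account names and scoring below are illustrative assumptions, not Facebook’s internal tooling.

```python
# Sketch of mapping "information corridors": build a list of reshare
# edges for posts flagged harmful, then rank accounts by how much
# flagged content passes through them. Toy data for illustration.

from collections import Counter

# (source_account, resharing_account) pairs for flagged posts
reshare_edges = [
    ("pageA", "user1"), ("pageA", "user2"), ("user2", "user3"),
    ("pageB", "user2"), ("user2", "user4"), ("user3", "user5"),
]

# Accounts that both receive and forward flagged content sit on a
# corridor; count each account's traffic in either direction.
throughput = Counter()
for src, dst in reshare_edges:
    throughput[src] += 1
    throughput[dst] += 1

for account, flow in throughput.most_common(3):
    print(f"{account}: involved in {flow} flagged reshares")
```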

The alternative to acting against specific entities such as the Patriot Party is to impose limits on virality for everyone. That would make it harder for organizations built on hatred or lies to emerge simply because their inventions go viral. But Facebook prefers to target specific misbehavior, at the risk of hitting groups that are harmless or less dangerous than they appear. The power this strategy confers on Facebook is enormous.

Facebook has also acted against the German organization Querdenken [lateral thinking], which does not consistently violate Facebook’s policies and promotes non-violence, but which German authorities monitor. Such decisions ultimately lead to impossible questions, like the one a researcher poses in a document: there are movements that follow Facebook’s community rules but distribute content that “is inherently harmful and violates the spirit of our standards. What do we do when authentic movements speak of hatred or delegitimize free elections?”

You can follow EL PAÍS TECNOLOGÍA on Facebook and Twitter, or sign up here to receive our weekly newsletter.



