A small subset of Facebook users is reportedly responsible for the majority of content expressing or encouraging skepticism about Covid-19 vaccines, according to the first results of an internal Facebook study.
The study, first reported by the Washington Post, confirms what researchers have long argued about how the echo chamber effect can amplify certain beliefs within social media communities. It also shows how speech that falls short of vaccine misinformation, which is banned on Facebook, can still contribute to vaccine hesitancy.
The Washington Post obtained a document describing the study, which has not been made public. Facebook researchers divided users, groups and pages into 638 “population segments” and studied them for “vaccine-hesitant beliefs,” according to the Post. This could include language like “I’m worried about getting the vaccine because it’s so new” or “I don’t know if a vaccine is safe,” rather than outright misinformation.
Each “segment” could have up to 3 million people, meaning the study could examine the activity of more than 1 billion people, less than half of Facebook’s roughly 2.8 billion monthly active users, the Post reported. The massive study also underscores how much information can be gleaned from Facebook’s user base and how the company is using this trove of data to examine public health outcomes.
The Post reported that the study found that in the population segment with the highest incidence of vaccine hesitancy, just 111 users were responsible for half of all flagged content within that segment. It also showed that only 10 of the 638 population segments contained 50% of all vaccine-hesitancy content on the platform.
Facebook’s research on vaccine hesitancy is part of an ongoing effort to aid public health campaigns during the pandemic, spokeswoman Dani Lever said, and is one of several studies Facebook is conducting.
“We routinely study things like voting, bias, hate speech, nudity and Covid, to understand emerging trends and to be able to build, refine and measure our products,” Lever said.
Meanwhile, Facebook, in the last year, partnered with more than 60 global health experts to provide accurate information on Covid-19 and vaccines. It announced in December 2020 that it would ban all misinformation about vaccines, suspend users who break the rules, and ultimately ban them if they continue to violate the policies.
The study is just the latest to illustrate the outsized effect that a few players can have on the online information ecosystem. It comes on the heels of an Election Integrity Partnership study that found that a handful of right-wing “super-spreaders” on social media were responsible for most of the election misinformation in the run-up to the Capitol attack. In that report, the experts outlined a number of recommendations, including the removal of “super-spreader” accounts.
The Facebook study also found that there may be significant overlap between users exhibiting anti-vaccination behavior on Facebook and supporters of QAnon, an unfounded conspiracy theory that a “deep state” cabal of Democrats and Hollywood celebrities is involved in pedophilia and sex trafficking.
The overlap shows another long-term effect of QAnon’s rise, which has also been linked to the insurrection at the Capitol in January. Many far-right actors, including QAnon supporters and advocates, understand how to manipulate social media algorithms to reach a wider audience, said Sophie Bjork-James, a professor of anthropology at Vanderbilt University who researches the white nationalist movement in the United States.
“QAnon is now a threat to public health,” said Bjork-James. “Over the past year, QAnon has spread widely within the online anti-vaccination community and, by extension, the alternative health community. The Facebook study shows that we will likely face the consequences of this for some time to come.”