Monday, November 29

Facebook Accused of Deflecting Blame Over COVID-19 Vaccine Misinformation


Facebook has been accused of deflecting blame for the spread of misinformation about the COVID-19 vaccine on its platform.

This week, the social network said it had removed dozens of pages linked to “superspreaders” of anti-vaccine content.

A March report by the Center for Countering Digital Hate (CCDH) and Anti-Vax Watch found that just twelve anti-vax accounts were responsible for spreading around 65% of misinformation about vaccines online.

Although Facebook took action against the accounts, it has questioned the methodology behind the report and its findings.

But the CCDH has accused Facebook of “seriously misrepresenting” the investigation and refusing to take responsibility for dealing with misinformation on its platform.

Other social networks, such as Twitter and YouTube, have also come under fire for allowing vaccine misinformation to spread around the world.

US President Joe Biden has even said that platforms like Facebook are “killing” people by allowing false information about vaccines to circulate online.

Facebook has said it will continue to promote reliable vaccine information to users.

“We continue to work with external experts and governments to ensure that we are addressing these issues in the right way and making adjustments if necessary,” the blog post added.

The “disinformation dozen”

According to the CCDH report, twelve people had shared up to 73% of the anti-vax content that existed on Facebook in February and March.

The so-called “disinformation dozen” included US attorney Robert F. Kennedy Jr. and US physician Joseph Mercola.

In the wake of the report, Facebook representatives testified in Congress about the extent of misinformation about vaccines on their platform.

And on Wednesday, the social network confirmed in a blog post that it had taken action against a series of accounts linked to the twelve prominent anti-vax voices.

The company said more than three dozen pages, groups and accounts on both Facebook and Instagram had been removed.

Further “sanctions” were imposed on nearly two dozen additional accounts within the same network: their content will not be recommended to users and will appear less prominently in people’s feeds on Facebook.

Facebook said that the remaining content that still exists on the platform and is linked to the network does not violate its rules.

“Facebook has refused to take real responsibility”

Facebook stated that reports of vaccine misinformation had created a “flawed narrative” that twelve people were responsible for most of the vaccine misinformation online.

“There is no evidence to support this claim,” Facebook said in its blog post.

“Additionally, focusing on such a small group of people distracts us from the complex challenges we all face in addressing misinformation about COVID-19 vaccines.”

The social network has said that the 12 named accounts were responsible for “only 0.05%” of all views of vaccine-related content on the platform.

“That said, any amount of misinformation about the COVID-19 vaccine that violates our policies is too much for our standards,” the company added.

“Since the beginning of the pandemic, we have removed more than 3,000 accounts, pages and groups across our platform for repeatedly violating our rules against the spread of COVID-19 and vaccine misinformation, and have removed more than 20 million pieces of content for violating these rules.”

But CCDH executive director Imran Ahmed has said Facebook’s response to the study has demonstrated a lack of accountability.

“This blog post is simply the continuation of its strategy to deny, divert and delay,” Ahmed said in a statement on Twitter.

“Facebook has refused to take real responsibility and continually resists calls for transparency [on how they moderate content],” he added.

“Facebook has an obligation to hand over its data immediately in the interest of public health.”



