Monday, November 29

Facebook is obstructing our work on disinformation. Other researchers could be next | Laura Edelson and Damon McCoy


Last week, Facebook deactivated our personal accounts, obstructing the research we conduct at New York University to study the spread of disinformation on the company's platform. The move has already compromised our work, forcing us to suspend our investigations into Facebook's role in amplifying misinformation about vaccines, sowing mistrust in our elections, and fueling the violent riot at the United States Capitol on January 6.

But even more important than the effect on our work is what Facebook's hostility to outside scrutiny means for the many other researchers and journalists attempting to study Facebook's effects on society. We have already heard from other researchers whose plans for similar projects are now being shelved. If Facebook has its way, there will be no independent research into its platform.

Our dispute with Facebook centers on a research tool called Ad Observer. Ad Observer is a web browser extension that Facebook users can choose to install to share with us limited and anonymous information about the ads that Facebook shows them. The data they share with us includes the categories advertisers chose when targeting them, such as "married women," "interested in dating," or "lives in Florida."

Using the data collected through Ad Observer, along with data collected through the transparency tools Facebook makes available to researchers, we have been able to help the public understand how the platform fails to deliver on its promises, and to shed light on how it sells its users' attention to advertisers.

In a forthcoming article, we show that Facebook has failed to include more than 100,000 ads that meet its own criteria for political and social-issue ads in its public archive. For example, it failed to include ads supporting Joe Biden before the 2020 elections; Amazon ads about the minimum wage; and an anti-mask ad targeted at conservatives, run by a group called Reopen USA, whose Facebook page posts anti-vaccine and anti-mask memes.

We have also shown how highly partisan and misleading news sources get far more engagement on Facebook than trustworthy news sources do, and we will publish an expanded version of this analysis in a forthcoming article.

But we are not the only researchers using Ad Observer data. For the past three years, we have made the information we collect through Ad Observer and through Facebook's own tools available to other researchers and journalists so they can conduct their own investigations.

The Markup has used our data to report on how ads containing QAnon extremist content and merchandise for militia groups have slipped past Facebook's filters, despite bans on both. The Markup also used the data to show how corporate advertisers such as ExxonMobil and Comcast promote seemingly contradictory messages on hot-button topics to different audiences. Reporters from Florida to Kentucky to New Mexico have used it to report on trends in political advertising in their states ahead of the 2020 elections.

In disabling our accounts last week, Facebook claimed that we were violating its terms of service, that we were compromising user privacy, and that it had no choice but to shut us down because of an agreement it has with the Federal Trade Commission. All of these claims are wrong. Ad Observer collects information only about advertisers, not about our volunteers or their friends, and the FTC has stated that our research does not violate its consent decree with Facebook.

Unfortunately, Facebook's campaign against us is part of a broader pattern of hostility toward outside scrutiny. Last month, the New York Times reported that Facebook, after an internal controversy, was dismantling the team working on CrowdTangle, its marquee transparency tool for researchers who want to see how unpaid Facebook posts spread and gain engagement on the platform. The paper reported this week that the White House itself was having so much trouble getting a straight answer from Facebook about vaccine misinformation that officials asked to speak directly to the platform's data scientists rather than its lobbyists.

Social Science One, launched in 2018 to great fanfare, was supposed to provide researchers with secure access to Facebook user data. But the data on offer turned out to be far less useful than expected, to the point that the project's funders dropped out. In our own work, we have shown how Facebook's transparency tools fail to deliver on their promises.

We cannot let Facebook decide unilaterally who gets to study the company and what tools they can use. The stakes are too high. What happens on Facebook affects public confidence in our elections, the course of the pandemic, and the nature of social movements. We need the greater understanding that researchers, journalists, and public scrutiny provide. If Facebook will not allow this access voluntarily, it is time for lawmakers to require it.


www.theguardian.com
