Tuesday, August 3

Facebook leak underscores strategy to operate in repressive regimes


Facebook users can praise mass murderers and “violent non-state actors” in certain situations, according to internal guidelines that underscore how the tech corporation strives to operate in repressive regimes.

The leak also reveals that Facebook maintains a list of “recognized crimes” and instructs its moderators to distinguish between those and “crimes not recognized by FB” when applying company rules.

The list is designed to avoid helping countries where criminal law is considered incompatible with basic human rights. “We only recognize crimes that cause physical, financial or mental harm to people,” such as “theft, robbery and fraud, murder, vandalism [and] non-consensual sexual touching,” the guidelines say.

Crimes not recognized by Facebook include “claims about sexuality”, “peaceful protests against governments” and “discussing historical events / controversial issues like religion”. Facebook argues that this is the only way it can work in countries where the rule of law is unstable.

But the revelation of its explicit decision to place itself above national law may cause friction with governments around the world. “One of the biggest problems is that Facebook has entered every country on the planet with no idea of the impact,” said Wendy Via, co-founder and president of the US-based Global Project Against Hate and Extremism.

She described Facebook, which was founded in the US, as having “little language ability and zero cultural competence,” adding: “You can’t build secret rules if you don’t understand the situation.”

Even in the US, Facebook has struggled to cope with cultural changes. The documents give insight into how Facebook struggled to define and act against the far-right QAnon movement. Accounts associated with QAnon were banned from the platform in October 2020 after years of growing popularity and controversy.

By defining QAnon as a “violence-inducing conspiracy network,” the company now prohibits non-state actors that “are organized under a name, sign, mission statement or symbol; and promote theories that attribute violent or dehumanizing behavior to people or organizations that have been discredited by credible sources; and have advocated for incidents of real-world violence to draw attention to or redress the alleged harms promoted in those discredited theories,” according to leaked guidelines for moderators dating from December 2020.

The leak also highlights Facebook’s battle to function in some of the most autocratic regimes in the world. An exception to the guidelines on terrorist content, for example, allows users to praise mass murderers and “violent non-state actors,” a term describing designated militant groups engaged in civil wars that do not target civilians.

According to Facebook sources, the company acknowledged that in complex conflicts some violent non-state actors provided key services and entered into negotiations with governments. Praise or support for violence by these groups is not allowed.

In a reference to Myanmar, the Middle East and North Africa in the guidelines, Facebook’s global escalation team, a more senior group of moderators who are typically employed directly by the company rather than outsourced, is told: “Allow content that praises violent non-state actors, unless it contains an explicit reference to violence.” Content “that discusses mass murders” is also allowed as long as it is “a discussion of the events that led to a mass murder, even when such discussion takes a favorable position towards the act or its perpetrator.”

An example given of a legitimate comment is: “Where were the police? Why did this guy have to try to restore order?” The policy was first implemented in Myanmar and Syria and is now understood to be applied globally.

Facebook’s decision to set its own standard of behavior above countries’ criminal law also applies in other areas of its moderation guidelines. In telling moderators how to police discussions about “regulated goods,” for example, Facebook applies a single international set of restrictions to items on which national and local laws differ.

Cannabis cannot be bought, sold or traded on Facebook despite being legal in various regions, but users can “admit” to smoking it and “promote” it. The same restrictions apply to alcohol, although alcohol shops, unlike cannabis retailers, can advertise to those aged 18 and over. All “non-medical drugs” are strictly restricted on Facebook, overriding local law.

Facebook sources said that the medical field had become more accepting of cannabis use to treat diseases, while “non-medical drugs” such as cocaine and heroin had no identified medical use.

Facebook’s policy of selectively enforcing national laws has been reflected in its previous public actions. In February, the BBC reported on the use of Facebook Marketplace to facilitate illegal sales of Amazonian land. According to the broadcaster, Facebook decided that “trying to deduce which sales are illegal would be too complex a task for it to carry out itself, and should be left to the local judiciary and other authorities.”

In the United States, the company spent four years deflecting criticism for treating Donald Trump differently from other users and other world leaders. In the summer of 2020, Facebook refused to take action against the US president over a post stating that “when the looting starts, the shooting starts.” Its chief executive, Mark Zuckerberg, said he viewed the post as a warning of state action rather than a threat that breached Facebook’s rules.

The same allegations of lax enforcement were made over how Facebook handled the QAnon movement. Until the platform announced its ban on QAnon in October, the community had been largely untouched by Facebook’s guidelines. Even after the ban, numerous QAnon-linked pages remained accessible on the site.

In 2017, a separate leak of moderator documents to the Guardian revealed that the company only enforced rules against Holocaust denial in the subset of countries where such misinformation was illegal.

Rather than deciding based on the potential harm caused, Facebook focused on the places where it was most likely to face prosecution or lawsuits. In 2020, Facebook updated its policies to ban Holocaust denial from its platform worldwide.

A Facebook spokesperson said: “We do not allow anyone to praise violent actions and we remove content that represents or supports organizations that we ban under our policies. We recognize that in conflict zones some violent non-state actors provide key services and negotiate with governments, so we allow praise of those non-violent activities, but we do not allow praise of violence by these groups.

“We also maintain a list of crimes that we apply these policies to, but instead of breaking them down by country or region, they are crimes that are recognized globally. Since we are a global platform, we have a single set of policies on what we allow, which apply to every country and region. While we have made progress in our enforcement, we know there is always more to do.” The company added that it has a process for national governments and courts to report content they believe violates local laws.


www.theguardian.com
