Thursday, December 2

Facebook whistleblower Frances Haugen calls for urgent external regulation | Facebook

Mark Zuckerberg “has unilateral control over 3 billion people” due to his unassailable position at the top of Facebook, whistleblower Frances Haugen told MPs as she called for urgent external regulation to curb the tech company’s management and reduce damage to society.

Haugen, a former Facebook employee who leaked tens of thousands of pages of damaging internal documents about the company’s inner workings, traveled to London from the US for a parliamentary hearing and gave a qualified endorsement of the UK government’s proposals to regulate social media platforms and make them take some responsibility for the content on their sites.

The company’s internal culture prioritized profitability over its impact on the wider world, Haugen said, and “there is no will at the top to make sure these systems are run in a sufficiently safe manner.” She added: “Until we incorporate a counterweight, these things will be operated in the interest of the shareholders and not in the public interest.”

She cautioned that Instagram, owned by Facebook and used by millions of children around the world, may never be safe for tweens.

Addressing a committee of MPs and peers on Monday, Haugen said much of the blame for the world’s increasingly polarized politics lies with social media and the radicalizing impact of services like Facebook Groups.

These can encourage small, intense communities that spawn conspiracy theories, she said. “I am deeply concerned that they have created a product that can alienate people from their real communities and isolate them in these rabbit holes and filter bubbles. What you find is that when misinformation directed at a community is sent to people, it can make it difficult to reintegrate into mainstream society because they no longer have shared facts.”

Amid growing concerns about Instagram’s impact on teens’ mental health and body image, Haugen said Facebook’s own research compared young users of the app to addicts who feel unable to walk away from a service that makes them unhappy.

“The last thing they see at night is someone who is cruel to them. The first thing they see in the morning is a statement of hatred, and that is much worse.” She said the company’s research found that Instagram was more dangerous than other social networks such as TikTok and Snapchat because the platform focuses on “social comparison about bodies, about people’s lifestyles, and that’s what ends up being worse for children.”

She added: “I am deeply concerned that it is not possible to make Instagram safe for a 14-year-old, and I sincerely doubt that it is possible to make it safe for a 10-year-old.”

The whistleblower also urged Facebook to make it more difficult to share material, in order to curb the spread of hate and misinformation, while pushing more content from family and friends into users’ news feeds: “Moving to human-scale systems is the safest way to design social media. We liked social media before we had an algorithmic feed.”

One of her particular concerns is how Facebook can “mislead” the public into thinking that it prioritizes the fight against disinformation outside the English-speaking world, and she highlighted its impact on social divisions in Myanmar and Ethiopia. She suggested that tools designed to reduce harm in English-language posts may be less effective in the UK because they were trained on American English.

Ownership of Facebook is structured so that Zuckerberg, as the founder of the company, holds a special class of shares, meaning that ultimately he alone controls the business. This gives him enormous control over the social network of the same name, as well as Instagram and WhatsApp, both owned by Facebook.

Haugen said the company was full of “good, kind and conscientious people,” but they were working with bad incentives set by management and a requirement to maximize financial returns for shareholders. “Facebook has been unwilling to accept even small amounts of profit being sacrificed for safety. And that is not acceptable.”

She said there was little incentive within the company to flag flaws and deal with the side effects of its business model. “Facebook never set out to prioritize polarizing and divisive content; it just turned out to be a side effect of the decisions they made.”

A Facebook spokesperson said: “At the heart of these stories is a premise that is false. Yes, we are a company and we make a profit, but the idea that we do it at the expense of people’s safety or wellbeing misrepresents where our own business interests lie. The truth is, we’ve invested $13 billion and we have over 40,000 people doing one job: keeping people safe on Facebook.”
