
Regulating Big Tech will require pluralism and institutions | View


The need to regulate large technology companies like Facebook, Google or Amazon is now daily news. This week's scandal involving Facebook is just the latest chapter in the saga.

From the news battle between Google, Facebook and Australia to the Trump/Parler dispute, and from new proposals to regulate digital companies in the US to mounting pressure from competition authorities in Europe, Big Tech firms are in the spotlight because of the power they wield.

Appearances before the United States Congress or before European and national parliaments are increasingly frequent. On both sides of the Atlantic, a general regulatory momentum is building. The EU has taken the first steps with a Commission proposal to address the market power of technology companies (through the Digital Markets Act) and their internal moderation structures (through the Digital Services Act).

All of this is unlike anything we’ve seen in the past. There have been and still are private entities that de facto regulate markets – think of sports federations – or companies so critical to a given economy that they also have substantial political power. But never before have we had market-dominant companies whose goal is to create a community of ideas.

Previous monopolists were not involved in regulating discourse or spreading the knowledge that shapes public opinion. But for Big Tech companies, fostering a large community, similar to a public sphere, is key to the business model.

Moderating content and keeping users engaged is the end goal. As advertising-dependent businesses, Big Tech firms act as digital gatekeepers, carefully curating the content displayed to users. They are the editors of that public sphere: both the vehicles of speech and its controllers.

Precisely because Big Tech is different, we can’t expect the old remedies to work. Classic antitrust or competition law will help, but it is not sufficient. The reason is simple: to regulate a public sphere, it is necessary to address more than just the market.

All constitutional states know that freedom of expression is the baseline, and that ideas then need an institutional process to become knowledge and form the basis for decision-making. If regulators want to create a healthy digital environment for users, they must ensure two things: a) pluralism and b) institutions.

An algorithmic market

We need to ensure that users can share ideas, contrast views, discuss and debate. By now it is clear that Big Tech, preoccupied with network growth, ad sales and rapid expansion, has neglected this. Their algorithms work to maximize attention, free of editorial concerns.

Online platforms generate filter bubbles that segregate the public sphere and divide communities, the better to target like-minded people with advertising. These companies have built their business model on this “grouping”.

If companies can unilaterally control the algorithm that selects content, and that algorithm is designed solely to ‘bundle’ users and expand the network for advertising purposes, it will reduce pluralism rather than promote it. And that has consequences for democracy.

We therefore suggest ‘algorithmic pluralism’ as a possible solution: the creation of a real algorithmic market in which different players can create and sell algorithmic options to users.

Imagine a scenario where people can change the content they see in their social feeds by choosing one of several available algorithms. A world where people can acquire algorithms and install them on their networks, not merely switch the default off, as some platforms such as Twitter already allow. A world where companies compete to offer us more responsible and pluralistic algorithms.
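To make the idea concrete, here is a minimal sketch, in TypeScript, of what such a pluggable feed could look like. Everything in it is an assumption for illustration: the FeedAlgorithm interface, the Post and UserProfile types and the sample providers stand in for the plug-in pattern described here, not for any platform’s actual API.

    // A minimal, purely illustrative sketch of "algorithmic pluralism" as a
    // plug-in architecture. All names (FeedAlgorithm, Post, UserProfile) are
    // hypothetical; no platform exposes such an interface today.

    interface Post {
      id: string;
      topics: string[];
      postedAt: Date;
    }

    interface UserProfile {
      id: string;
      followedTopics: string[];
    }

    // The contract a ranking provider would implement.
    interface FeedAlgorithm {
      name: string;
      rank(candidates: Post[], user: UserProfile): Post[];
    }

    // Provider 1: a simple reverse-chronological feed.
    const chronological: FeedAlgorithm = {
      name: "chronological",
      rank: (candidates) =>
        [...candidates].sort((a, b) => b.postedAt.getTime() - a.postedAt.getTime()),
    };

    // Provider 2: a topic-focused feed, like the sports-oriented
    // algorithm imagined below.
    const topicFocused = (topic: string): FeedAlgorithm => ({
      name: `topic:${topic}`,
      rank: (candidates) => candidates.filter((p) => p.topics.includes(topic)),
    });

    // The platform's only job is to honour the user's choice.
    function buildFeed(posts: Post[], user: UserProfile, algo: FeedAlgorithm): Post[] {
      return algo.rank(posts, user);
    }

The point of the sketch is the separation it enforces: the platform keeps the data, while the ranking logic becomes an interchangeable component, which is exactly what an algorithmic market would require.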

When I turn off the ad-oriented algorithm, a constant stream of people agreeing with my views suddenly turns into a whole new world of disagreement. When I switch to a sports-oriented one, my feed becomes a sports blog drawing on a range of different voices. My literature-oriented diet becomes a debate of contrasting opinions and positions on writing and art.

My ad algorithm sends all kinds of product-related content my way for the shopping days. My privacy algorithm puts my data first. Across all of them, I see the myriad points of view, commercial and political, religious and agnostic, artistic and literary, that the world has to offer, and I can jump from one community to another.

Big Tech would have to offer this algorithmic market itself, or else allow an intermediary market to emerge. Different companies could develop creative ways of tailoring content, which could then be sold to platforms or to individual users.
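Continuing the hypothetical sketch above, the intermediary market could be as simple as a registry on the platform’s side, where third-party providers publish algorithms and users select one by name. Again, the names and mechanics here are assumptions for illustration, not a description of any existing system.

    // A hypothetical platform-side registry for third-party ranking
    // algorithms, building on the FeedAlgorithm sketch above.
    class AlgorithmRegistry {
      private providers = new Map<string, FeedAlgorithm>();

      // A third-party vendor registers its algorithm with the platform.
      register(algo: FeedAlgorithm): void {
        this.providers.set(algo.name, algo);
      }

      // A user picks an algorithm by name; the platform applies it.
      select(name: string): FeedAlgorithm {
        const algo = this.providers.get(name);
        if (!algo) throw new Error(`No algorithm registered under "${name}"`);
        return algo;
      }
    }

    // Usage: a user switches from the default to a sports-oriented feed.
    const registry = new AlgorithmRegistry();
    registry.register(chronological);
    registry.register(topicFocused("sports"));

    const me: UserProfile = { id: "u1", followedTopics: ["sports"] };
    const posts: Post[] = [
      { id: "1", topics: ["sports"], postedAt: new Date() },
      { id: "2", topics: ["books"], postedAt: new Date() },
    ];
    const myFeed = buildFeed(posts, me, registry.select("topic:sports"));

Because each provider is a named, swappable component, a failure can be traced to a specific algorithm, which is what would make the transparency and competition described next possible.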

This would bring greater transparency, as companies would be incentivized to show how their algorithms outperform those of competitors, and they could be held more directly accountable for failures. If an algorithm is flawed or opaque, a competitor will take its place.

People could thus choose privacy-oriented providers that show them diverse content while still using the platforms they love. It would avoid the costly exercise of migrating from one platform to a new (often depopulated) network. It would empower consumers to choose.

Independent monitoring

However, this alone will not be enough. We must also ensure that the regulatory power Big Tech firms wield over discourse in the virtual public sphere is subject to institutions that, as in our democracies, work to transform contrasting views into real, shareable knowledge.

This is often the role of constitutions in liberal states: they function as frameworks that establish the rules of the game. They make our disagreements possible while rationalizing them. We suggest the adoption of similar quasi-constitutional principles within Big Tech companies to encourage healthier exchanges of ideas.

This means imposing due-process obligations on companies, as the EU’s proposed Digital Services Act (DSA) intends to do (albeit imperfectly) through notice-and-takedown procedures, so that users can check the power of the platforms for themselves.

It means increasing transparency obligations and fostering independent quasi-judicial bodies (the Facebook Oversight Board is actually a good start) as appellate bodies, but with broader supervisory powers extending even to the algorithms themselves.

It means creating within such companies the quasi-constitutional bureaucracies that are always necessary to prevent power from being exercised irresponsibly.

We recognize the ambition of these proposals. But creative ambition on that scale is needed to match the power of Big Tech without empowering a police state.

If we want to keep our democratic values intact, we must ensure that the democratic tools that limit state power are also applied to Big Tech. This means not only fostering plural markets for ideas, but reinforcing them with institutional tools designed to act as a brake on power.

Miguel Poiares Maduro is a former Portuguese development minister and executive president of the European Digital Media Observatory (EDMO), a digital fact-checking and anti-disinformation project. Francisco de Abreu Duarte is a law researcher at the European University Institute.

