After more than 16 hours of negotiations, the European Union has reached an agreement on the new Digital Services Act (DSA), a sweeping regulation of the digital world with the potential to change the Internet as we know it. These are the changes that companies like Google, Meta or Twitter will have to implement, and some of the consequences that implementing them may have.
An ambitious and pioneering law to regulate online content. The DSA is legislation that aims to regulate practically every aspect of digital content, from algorithms to disinformation, including advertising and app stores. The first draft was announced at the end of 2020, and an agreement has now been reached, although the final text has not yet been published. The next step is formal approval by the European Parliament and the Council, with the timetable set for the law to come into effect around early 2024.
Europe moves ahead of the United States and sets the terms of the global debate. “With the DSA, the time when the big online platforms behaved as if they were too big to care is coming to an end,” summarized Thierry Breton, Commissioner for the Internal Market. The new legislation will mean a major change for large technology companies: those that fail to implement the changes face fines of up to 6% of their worldwide turnover.
While Europe moves to codify online rules, the United States has not yet made any regulatory move of its own, although a debate is on the table about reforming Section 230, which governs information service providers and what may be published on the Internet.
The entry into force of the DSA could push the United States to consider changes of its own, likely along similar lines. Another question is how companies will apply it. The Digital Services Act only applies in Europe, but for economic reasons these companies could end up rolling out the changes worldwide.
“What is illegal offline will also be illegal online.” These are the changes that are coming. The DSA will prohibit targeted online advertising based on characteristics such as religion, sexual orientation or ethnic origin. Minors may not be targeted with personalized advertising either. Dark patterns, those web design tactics that try to trick users into taking certain actions, will also be banned: unsubscribing from a service must be as simple as signing up.
The DSA does not specify which content is illegal; that remains in the hands of the individual countries. But it does require platforms to explain why content has been removed and to provide tools for users to appeal those removals.
Online marketplaces will have to trace the users who sell illegal services and products on them, while the large platforms will also be obliged to produce reports and make them available to researchers and auditors to “analyze how online risks evolve”; in other words, to make it possible to study whether hate speech is growing or whether certain disinformation is being combated.
Algorithms will be made public, and users will have the option of a feed not based on their interests. Among the transparency measures, the large platforms (those with more than 45 million monthly users in the European Union) must disclose their content and product recommendation algorithms. This will affect platforms such as YouTube or Netflix, which will have to explain what criteria they use to decide what content to display.
In addition, users must be offered a recommendation system that is not based on profiling: a feed not tied to their interests, such as a chronological one. Instagram has brought its chronological feed back; Twitter is in the middle of a debate about its algorithm; and in the case of TikTok, it remains to be seen how a feed not based on the user's profile would be implemented.
Greater responsibility also means greater power to censor. Europe wants the big platforms to take responsibility for the content published on them, fighting disinformation and hate speech. The DSA sets out a whole series of rules in that direction, but while these companies are being asked to act faster, they are also being given more power to do so.
“In practice, it gives social networks carte blanche to censor whatever they consider harmful, even if it is not illegal, including what they consider disinformation,” points out Borja Adsuara, Doctor of Law and expert in digital communication.
A dangerous last-minute addition: “special measures in times of crisis.” The European Commission has decided to incorporate a section that was not in the initial proposal of a year ago. In the context of the war in Ukraine, the DSA adds a new crisis response mechanism.
1/4 🚨Times of crisis must NOT be an excuse for undermining human rights!
In the closed-door #DigitalServicesAct negotiations, the @EU_Commission proposed a “crisis response mechanism” that would give itself the powers to unilaterally declare an EU-wide state of emergency. pic.twitter.com/HFd19pZ27F
– EDRi (@edri) April 12, 2022
“When a crisis occurs, such as a threat to public health or safety, the Commission can require large platforms to limit any urgent threats.” In other words, a section that greatly expands the power to censor in the name of fighting disinformation, under cover of emergency situations.
The DSA limits this mechanism to a period of three months, but it is worrying enough that 38 civil society organizations have signed a joint manifesto warning about its potential for abuse.
Image | OutreachPete
George is Digismak’s reporter and editor with 13 years of experience in journalism.