European regulation of online disinformation may be a “game changer” in 2022
After several years of asking the tech giants to regulate themselves on mis/disinformation and a range of other topics, the European Union is expected to issue new laws by mid-2022, some of which officials say will be “game changers.”
Europe is facing the same kind of loss of local news as the US, although European governments have done far more to support journalism than the US has, with countries like France and Denmark giving emergency grants, tax credits and other kinds of support for quality information. The growth of tech platforms has also led to increased online disinformation and misinformation even as trusted sources of information shrink and disappear.
Negotiations are continuing on different aspects of the new regulations, but the French government, which will hold the rotating EU presidency from January to July 2022, is determined to get the Digital Services Act (DSA) and Digital Markets Act (DMA) passed before the 2022 elections in France, in part because President Macron wants to run on a record of regulating Big Tech and working well with Europe.
There are many aspects to the new laws and a number of different players and interests involved. The Digital Markets Act is focused on the market power of the platforms. It will allow regulators to carry out ex-ante regulation of the “scalable gatekeepers,” so that regulators can step in before a Google or Facebook grows so large that it destroys a market. The principle has been established. Now the argument in the European Parliament is over which companies will be included. The center-right parliamentarians in the European People’s Party (EPP) want to limit this to just the GAFA companies (Google, Apple, Amazon and Facebook). The Socialists and Democrats want to keep definitions looser so that in the future more companies can be considered gatekeepers and thus fall under the new regulations.
Many officials consider the second piece of legislation, the Digital Services Act, to be a “game changer” when it comes to online mis/disinformation. After years of allowing Facebook and Google to regulate themselves, the DSA will now require them to conduct regular assessments of the systemic risks they create for societies, show their plans to address those risks—including harms from mis/disinformation—and allow regulators to audit their algorithms. This focus on the governance of speech and on harms created by disinformation and hate speech on platforms is similar to the UK’s Online Safety Bill, which is likely to be approved by parliament in 2022.
One key difference, however, is that the DSA is focused on societal risks and the OSB on risks created for the individual. The emphasis on systemic risks is intended to address the tech giants’ amplification of disinformation online. By contrast, the OSB takes more of a whack-a-mole approach to content, in that it pushes for deletion of content that can harm individuals.
Both bills have the potential to become global standard setters, largely because they will be the first comprehensive regulations of their kind to be passed by democratic governments. The US has lagged in regulating Big Tech, failing to require more transparency in political advertising or to seriously modify Section 230 in order to make the platforms liable for the harmful content they disseminate. The Europeans are trying something different. The French compare the DSA provisions to banking regulations—checking for systemic risk generally and then conducting spot checks to see whether adequate prevention measures are in place. They say, too, that France’s long tradition of regulating broadcasting makes French authorities well versed in this type of regulation.
“The DSA is an elegant solution. It allows us to mitigate harm without censoring content. Hate and stupidity are eternal. The problem is algorithmic propagation,” said France’s digital ambassador Henri Verdier.
In addition to addressing algorithmic amplification of mis/disinformation, the DSA includes provisions to stop advertisements from appearing next to false content.
“The DSA is a big bouquet of various measures. We expect the platforms to take measures to mitigate risk, not to ensure that each and every piece of disinformation disappears. In any event, we do not see removal as THE solution to address disinformation,” said Krisztina Stump, head of the European Commission unit on Media Convergence and Social Media, pointing to the Guidance on the Code of Practice currently in place.
A ban on microtargeted advertising is also being discussed. This seems less likely to be included because of the rush to pass the bill, and some say it would not be needed if the General Data Protection Regulation (GDPR), an EU regulation implemented in 2018, had been properly enforced. Proponents of the ban note that microtargeting makes online mis/disinformation more personalized and therefore more dangerous, and that it gives the platforms an advantage over traditional media outlets that have lost their ad revenue to Big Tech.
“We are trying to break the business model of Google and Facebook,” Paul Tang, a Dutch Member of the European Parliament (MEP) said. “If we can’t break them up then we will break them open through interoperability that creates competition for core services.”
Other sticking points in the negotiations: the European Federation of Journalists (EFJ), the largest organization of journalists in Europe, representing over 320,000 journalists through unions and associations across 45 countries, wants to make sure that journalists’ content will not be unilaterally and arbitrarily removed by the platforms. The EFJ has come out in favor of a strong DSA, particularly the clauses that require full transparency of advertising, automated content moderation procedures and decisions about free speech. “We need safeguards to protect media freedom while not giving media throughout Europe a blanket exemption,” said Renate Schröder, the Director of the EFJ.
For example, some analysts are worried that exemptions for media will give a free pass to organizations like RT and Sputnik and other captured media as well as cover for the big tech platforms that want to keep amplifying disinformation and can use free expression protections as an excuse. At the same time, regulators are wary of censorship and overblocking. “We can’t prevent fake news or disinformation online. That is just not possible in a democracy without overblocking. The problem is the amplification and how it’s spread. Very few people see the Russian content, but they see a lot of it,” said Manuel Geier, an assistant at the European Parliament.
The balancing act between freedom of speech and regulation means different things to different people. Germany is worried that the DSA is weaker than its own NetzDG, which fines Google, Facebook and other large companies that have been repeatedly warned about illegal content (such as hate speech) on their platforms and have a pattern of not removing it. The Germans would like to see a timeline for removal of illegal content added to the DSA.
Enforcement is another key question. Just as Luxembourg hosts Amazon, Ireland undercut the rest of the EU by encouraging Apple to have nominal headquarters there and then undercharging it for taxes. If enforcement is done at the national level, what is to stop Ireland from lax enforcement of the DSA? Smaller EU member countries, meanwhile, may not have the staffing and expertise to regulate Big Tech. For these reasons, France and others want enforcement to be done by the EU, which will mean staffing up the regulatory bodies and hiring a new generation of policy makers who understand how algorithms work. The question plays into what are known as “Country of Origin” rules, which have proved contentious.
What is clear is that this is only a start. Whatever is passed is likely to be refined over decades. As Verdier puts it: “Twenty years ago the first money laundering laws were that you could not bring a suitcase of dollar bills to a bank. Today, it’s more sophisticated. We consider regulation to be an ongoing conversation between the companies and the regulators.”