Disinformation: Brussels shows its teeth, the Gafam show their good faith

A new code for a fresh start. After a year of tense discussions, the platforms, civil society and the European Commission have agreed on a new version of the “code of practice” against online disinformation, first launched in 2018. It has three objectives: to open the code up to actors other than the platforms themselves, such as messaging services; to finally make the signatories’ actions measurable (country by country); and, above all, to crack down, via the Digital Services Act (DSA), on very large platforms that do not honour their commitments.

A whole program, and a whole change of era: faced with the scale and the damage of disinformation, the self-regulation of platforms, which has shown its limits, gives way to a form of supervised co-regulation. Admittedly, some will point out that the code is not stricto sensu binding, nor accompanied by sanctions, which may be a weakness; but it is a guide to what Brussels actually expects under the DSA, which itself provides for heavy fines for internet giants that are too passive in the face of online manipulation.

Numerical indicators

To measure the move from words to action, the new code, which is due to come into force next December, provides for no fewer than 150 quantitative and qualitative indicators, primarily focused, like the DSA, on very large platforms, those with more than 45 million users in the EU.

The subject is highly sensitive: Brussels is thereby imposing transparency on part of the platforms’ algorithms and artificial intelligence systems. At this stage, the precise choice of the information that platforms will have to share so that each indicator can be properly monitored has been delegated to a working group, which is expected to deliver its conclusions in early 2023.

The hunt for advertising revenue

The forty or so commitments made by signatories such as Google or Meta (Facebook) bear witness to the extent of the scourge and the scale of the task. Brussels’ main weapon is to dry up the advertising revenue generated by disinformation.

“From Brexit to the Russian war in Ukraine, in recent years, well-known social networks have allowed disinformation and destabilization strategies to spread without restraint, and have even profited from them financially,” denounced the Commissioner for the Internal Market, Thierry Breton, for whom “platforms should no longer receive a single euro from the dissemination of disinformation”. Under the code, players involved in advertising placement, such as Google, undertake to fight the dissemination of ads pushing conspiracy theories.

Also on the program: better support for researchers and fact-checkers, the fight against fake accounts and the artificial amplification of messages (bots), the hunt for impersonation (including “deepfakes”, doctored videos) and the shaming of influencers who do not label their sponsored posts.

“It’s time to act”

The signatories will have to offer internet users clear and simple systems for reporting contentious content, as well as transparency on political advertising, an issue that is also the subject of a dedicated European regulation currently under discussion. Finally, the platforms undertake to build a rapid response system for crises and elections, which includes providing specific data at the Commission’s request.

Unlike illegal content (incitement to hatred, child pornography, etc.), the philosophy of the code is not to impose the removal of fake news, which would run up against the principle of freedom of expression, but to promote reliable sources of information in order to break the bubbles in which internet users are trapped.

The code notably mentions the standard set up by the Journalism Trust Initiative (JTI), at the initiative of Reporters Without Borders (RSF). “It is time to act, and today we are acting. We now have very important commitments to reduce the impact of disinformation and much more robust tools to measure their implementation,” says Vera Jourova, Vice-President of the European Commission.

Thirty-three signatories but not Telegram

To date, the code has been signed by thirty-three players: platforms and social networks such as Meta, Google, Twitter, Microsoft, TikTok and Twitch, as well as advertising professionals who already took part in the previous code, joined this time by fact-checkers and NGOs such as RSF. But not by platforms like Telegram, which are nevertheless regularly singled out for spreading disinformation…

The application of the code of conduct will be closely monitored by the Commission, which will chair the dedicated task force, including the network of European media regulators.
