As lawmakers, we not only have to decide whether to regulate technological advancements, but also how to keep up with them. This is rarely an easy task.
The landmark Digital Services Act is a prime example. Having just marked its first anniversary in force, the DSA has become a target of Elon Musk and his campaign to interfere with European democracy. That should surprise no one, as DSA rules apply to social media platforms such as the one he controls.
He may rail against the DSA as a tool of censorship, but that couldn't be further from the truth.
The DSA obliges Very Large Online Platforms (VLOPs), such as Musk's X, to be transparent about the methods they use for content moderation. This includes users' right to know the main parameters behind the platform's recommender system.
That means the DSA empowers users by letting them know why they see the content they do. It is designed to protect users from the whims of VLOPs, helping them make informed decisions.
There is nothing unusual about this kind of oversight. The EU’s Audiovisual Media Services Directive gives viewers the right to be clearly informed about product placement. Citizens deserve the same respect online. Disinformation and manipulation of algorithms cause damage to our civic discourse.
The DSA also requires providers to diligently identify systemic risks, such as the dissemination of illegal content or actual and foreseeable negative effects on the exercise of fundamental rights, including on electoral processes. Providers are obliged to take steps to mitigate these risks, for example by testing and adapting their algorithms.
The bigger problem is when the platform, or its owner, is the systemic risk itself. In the case of X, Musk poses the additional threat of political influence, given his quasi-official role in the administration of US President Donald Trump.
By allegedly manipulating algorithms and spreading disinformation on a large social media platform, Musk is undermining his users' free choice. This threatens democracy.
If we can protect viewers against opaque product placement on TV, then we can be just as diligent when it comes to transparency online. This need not be controversial or political: when tobacco products appear in traditional media, no one tells viewers to turn off their TVs; they are simply informed. The same principle applies on the internet.
Despite the vast power Musk wields, we should not forget that we, as European lawmakers, have powers, too. We control the regulatory framework that promotes fair competition and protects European businesses and citizens.
To underscore this, DSA investigations have been launched against Meta, while proceedings against TikTok came to a close after the company agreed to make relevant changes to comply with the legislation.
X met the Commission's 15 February deadline to submit internal documentation on its recommender systems and any recent changes made to them, with the company showing a willingness to cooperate.
The DSA is just one piece of crucial legislation that protects our fundamental rights. We must make the most of it to ensure that European democracy is a level playing field for the freedom of speech — online and off.