The Parliament Magazine: After the vote on the amendments, and the adoption of the report by plenary, are you satisfied that all expectations have been met?
Christel Schaldemose: I am very happy with the result. After all, it is not unusual that when a report goes to plenary it will be adjusted a little bit. And that makes sense, because if we always knew that the outcome would be exactly as proposed, then we would not have to take a report to plenary.
Now the whole parliament stands behind it, with a very strong majority. So, I am really very happy, and we have a chance to improve the legislation when we are negotiating with the Council, starting Monday (31 January).
PM: What do you expect from Council? They agreed their mandate in November already, where do you see the biggest differences emerging between your position and theirs?
CS: That’s a good question, actually. We always have discussions about enforcement, because that is the main interest of the Council, as the Member States have to deliver on it. And then they will probably address our ban on targeted advertising aimed at children, as well as the consumer protection and compensation provisions.
They had agreed on their negotiating mandate on the basis of the European Commission proposal and that did not foresee a ban on targeted ads.
The Council tends to say, ‘well, your additions are not what we have been discussing’ and sometimes declare themselves unwilling to negotiate on those grounds. I expect that they will try to do that, but we brought in a lot of additional aspects, and we believe they are all important.
Consumer compensation for violations of the targeted advertising restrictions, for example, will improve safety and trust online. So, I expect that we will have a tough fight over it, but I am also confident that we can deliver a good result.
"What we are proposing is that these platforms have to check the effects of their algorithm, and if there is evidence that there is negative impact, for instance on mental health, then they will have to mitigate for that and change the algorithm"
PM: One aspect Parliament paid more attention to is the transparency of algorithm use. I assume that was perhaps partly motivated by the testimony of Facebook whistleblower Frances Haugen to Parliament, when she remarked that regulating for it doesn’t work very well, because the process of legislating takes time and the engineers working on algorithms for the big platforms will always be ahead. Do you think that your proposals take that into account?
CS: We tried to find ways to address the concerns Frances Haugen raised on this. Instead of regulating what they are not allowed to do with algorithms, we are saying that the very large online platforms have to conduct risk assessments, looking not only at illegal content but also at other kinds of content that could be harmful to the user.
One example Haugen mentioned was the way that girls looking for help on nutritional issues were directed by Instagram to recommendations about self-harm and anorexia. So, what we are proposing is that these platforms have to check the effects of their algorithm, and if there is evidence that there is negative impact, for instance on mental health, then they will have to mitigate for that and change the algorithm.
In other words, we have placed the obligation with the platforms. They cannot just wait for us to do something by regulating in three years’ time, but they themselves have to check the consequences.
Of course, the Commission, vetted researchers and NGOs will have access to this process, too, so we are, in fact, opening what is known as the ‘black box’ of algorithm use, in real time and all the time, whenever the algorithm is changed.
We made a long list of issues the platforms need to consider, including health, fundamental rights, and the effects on democracy. I believe that this will work and change the negative sides of social media platforms for the better.
"There is no doubt that the Big Tech lobby will keep on trying because they want to decide for themselves what they can do. But we as politicians have an obligation to write the rule book and decide in a democratic way, to take back control of the internet"
PM: Some observers are predicting that the big platforms are going to use the so-called ‘trade secrets’ argument to circumvent the opening of their algorithm ‘black box’. How confident are you that this will not scupper your efforts?
CS: Very confident. They don’t have to open it up to their competitors but to the Commission, vetted researchers and NGOs. So, in my opinion, there should be no discussion about trade secrets. The platforms have to make sure that their algorithms are not causing harm to society, also by showing the parameters of their recommender systems, but they don’t have to reveal their algorithms.
PM: Thierry Breton, European Commissioner for the Internal Market, made a lot of what he described as beating back the lobbying efforts of Big Tech, not least in a funny clip he published on social media of the showdown scene in “The Good, the Bad and the Ugly”, with ‘good’ DSA as Clint Eastwood confronting ‘bad’ hate speech and disinformation, and ‘ugly’ Big Tech lobbying. He is not overly optimistic about that, is he?
CS: There is no doubt that the Big Tech lobby will keep on trying because they want to decide for themselves what they can do. But we as politicians have an obligation to write the rule book and decide in a democratic way, to take back control of the internet.
Will the companies comply? We don’t know yet, but if they don’t, they will be at risk of being fined rather substantially and of facing compensation claims.
Some of them will perhaps try and continue to make easy money and circumvent the rules but I believe the rules will be good enough to tackle that and companies will comply even if they don’t like it. Their immense lobbying efforts also showed us how much is at stake for all stakeholders.