Position of the European Parliament on the Digital Services Act

Acceptable with reservations

Last month, the European Parliament voted on the Saliba report on the Digital Services Act. Overall, it’s an acceptable compromise that takes on board many of the suggestions I tabled. Having said that, some parts of the text could be improved. Let’s take a look at the individual components of the approved report.

What it means for the future of the Internet

From the users’ point of view, numerous aspects are crucial, especially given that their fundamental rights will be affected by the act. The Parliament’s text validates some of the most important points that will need to be embodied in the future legislation:

Decision on illegality: Decisions taken by platforms can only be provisional, since the final decision on illegality has to remain with the courts. This is especially important for speech that can be sensitive depending on the context (e.g. criticism, news reporting). In such cases, the different rights and interests need to be balanced.

Notice and action: The Digital Services Act needs to set out precise processes and timeframes for what needs to be done by whom. Users will then know how to notify the provider when they find illegal content on the Internet, and platforms will know when and how to react.

Safeguards against abuse of notices: Abuse of notices will have to be penalized. It will no longer be possible for organizations to get their competitors de-ranked in listings, or to notify copyright infringement for content they don’t own (good faith declaration, obligation to tackle repeated abuse of notices).

Notices and counter-notices: The Parliament introduced the right to be notified and the right to issue a counter-notice. This is a great start to ensure that people are able to make their voices heard in case their content gets blocked or deprioritized. Platforms will have to notify the content provider, who will then have the opportunity to appeal.

Dispute settlement: In case of disagreement over a removal, users will have access to an independent dispute settlement mechanism that is fair, objective, impartial, transparent, and efficient. If the redress or counter-notice establishes that legal content was removed, the platform will be obliged to restore it. However, changing the outcome through a court decision always remains possible.

Transparency of the notice and action system: Companies will have to disclose who issued notices, as well as how often removed content turned out to be legal or contentious.
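To make the notice and action points above a little more concrete, here is a purely illustrative sketch of the kind of structured record a platform could keep so that notices, counter-notices, and their outcomes remain traceable for transparency reporting. The field names and statuses are my own assumptions for illustration, not anything prescribed by the report.

```python
# Illustrative only: field names and statuses are assumptions, not taken from the report.
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional


class NoticeStatus(Enum):
    RECEIVED = "received"                # notice submitted, not yet assessed
    CONTENT_REMOVED = "content_removed"  # provisional decision by the platform
    REJECTED = "rejected"                # notice found unsubstantiated or abusive
    RESTORED = "restored"                # legal content put back after counter-notice or redress


@dataclass
class Notice:
    content_url: str                # precise location of the allegedly illegal content
    reason: str                     # why the notifier considers it illegal
    notifier: str                   # who issued the notice (feeds transparency reports)
    good_faith_declaration: bool    # safeguard against abusive notices
    submitted_at: datetime
    status: NoticeStatus = NoticeStatus.RECEIVED
    counter_notice: Optional[str] = None   # the content provider's appeal, if any
    decided_at: Optional[datetime] = None  # when the provisional decision was taken
```

Keeping this kind of record per notice is what would make the transparency obligations verifiable: one can count who notified what, and how often removed content had to be restored.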

No general monitoring obligation: Platforms shouldn’t be subject to a general monitoring obligation.

Interoperability: The Parliament finally called for a separate instrument within the Digital Services Act package. Basically, it should address the issue of technology companies in a monopolistic position. The future legislation will have to ensure an appropriate level of interoperability. This will require systemic operators to share tools, data, expertise, and resources in order to limit the risk of lock-in. For consumers, this solves a real problem. Nowadays, they have to switch between communication platforms such as WhatsApp, Facebook Messenger, Matrix, etc. When this idea is put into practice, they will be able to communicate with their friends on all these platforms within one interface. A big achievement is that, as part of the measures on the table, the Commission will also have to explore open standards.
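To give a feel for what building on an open standard looks like in practice, here is a minimal sketch that sends a message over the openly specified Matrix client-server API. The homeserver URL, room ID, and access token below are placeholders for illustration; this is not something the report itself prescribes.

```python
# A minimal sketch: sending a message through the open Matrix client-server API
# (https://spec.matrix.org). Homeserver, room ID, and token are placeholders.
import uuid
import requests

HOMESERVER = "https://matrix.example.org"  # placeholder homeserver
ROOM_ID = "!abcdef:example.org"            # placeholder room ID
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"         # placeholder access token


def send_message(text: str) -> str:
    """Send a plain-text message and return the resulting event ID."""
    txn_id = uuid.uuid4().hex  # client-generated transaction ID for idempotency
    url = (
        f"{HOMESERVER}/_matrix/client/v3/rooms/{ROOM_ID}"
        f"/send/m.room.message/{txn_id}"
    )
    resp = requests.put(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"msgtype": "m.text", "body": text},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["event_id"]


if __name__ == "__main__":
    print(send_message("Hello from an interoperable client!"))
```

Because the protocol is publicly specified, any vendor can implement a compatible client or bridge against it, which is exactly the property that limits the risk of lock-in.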

What the Commission should do in addition

While the report is a great start when it comes to fundamental rights, it missed the opportunity to address some of the fundamental defects in the current practice of many platforms: the report acknowledges and also encourages the use of automated filters by digital platforms. Given that these tools are prone to error, legal content will still be regularly removed, leaving users no option but the complaint mechanism. We need better incentives to keep legal content on-line and to ban the use of automated technologies.

If we really want to address illegal content on-line, follow-up by law enforcement authorities is essential. We should therefore incentivize transparency about what action was taken in response to content removal.
