How should terrorist content online be handled? European lawmakers are about to answer that question: the next trilogue on the regulation setting rules for the removal of terrorist content online is scheduled for tomorrow, December 10.
Because of its far-reaching measures, the regulation has been heavily criticized by NGOs. The International Commission of Jurists (ICJ), the EU Fundamental Rights Agency, and the UN Special Rapporteur all point to the same problem: without effective safeguards, the regulation could lead to overreaching suppression of content and could therefore undermine fundamental rights, especially freedom of expression.
Position of the European Parliament
On December 10, the Parliament and the Council are about to finalize negotiations. In light of its original mandate, the Parliament made a package proposal back in February (see the leaked 4-column document). What would this proposal mean for European citizens?
- It would give a clear NO to (re-)upload filters, which could block legitimate content from being shared.
- An exception for educational, artistic, journalistic, and research purposes would be maintained.
- Authorities in the Member State of the hosting service provider would be involved, not just informed, in the execution of removal orders.
- Unfortunately, the removal time would still be 60 minutes. However, there would be a clear exception for technical or operational reasons.
Some of the largest Member States, such as France, are pressing for a fast conclusion of the negotiations, mainly because they have been notably affected by recent terrorist attacks. The Council pursues a regulation that would entail stricter rules for platforms and fast cross-border removal orders. As a result, the Council refused to compromise on some of the most important points and instead made a counter-proposal, which includes the following:
- Material disseminated by journalists, researchers, and academics would have to be evaluated on a case-by-case basis by competent authorities and hosting service providers. They would have to verify every piece of content to determine whether it constitutes a “genuine exercise” of fundamental rights and can therefore be exempted from the regulation. The dangers of this approach are well explained by Wikimedia.
- Member States would be able to order hosting providers in another jurisdiction to remove content.
- Possibilities for the Member State where the hosting provider is established to scrutinize such removal orders would be limited to cases where the order contains a “manifest error” or constitutes a “serious breach of fundamental rights”. Even then, that country would have to signal its wish to exercise its right of scrutiny to the issuing authority within 72 hours. And even when such scrutiny finds the order flawed and the issuing authority has to withdraw it, the hosting provider may still decide, based on its terms and conditions, not to reinstate the content.
- The one-hour removal deadline would apply to all operators, with no exception for small providers.
- Even though the text excludes a general obligation to use automated tools to detect content, their use is not forbidden in specific cases. In fact, the text encourages their use by listing them among the options available to providers to protect their services.
Call for action
This is the last chance to join the call of Liberties and protest against cross-border removal orders and upload filters before the negotiations.