Dozens of human rights organizations, including Amnesty International, the Center for Democracy & Technology (CDT), the Electronic Frontier Foundation (EFF), and Human Rights Watch, have released a letter strongly opposing the adoption of a proposed EU Regulation on preventing the dissemination of terrorist content online (the Proposed Regulation).
The EU's initial counter-terrorism efforts began in 2013, when it identified a number of areas for action, including information exchange and criminal justice reform. In 2015, voluntary frameworks and partnerships were launched to reduce the accessibility of terrorist content online. Following a series of attacks dating back to 2015, the European Commission adopted measures in 2018 to tackle terrorist content online more effectively, and these measures have served as the basis for the Proposed Regulation.
Under the Proposed Regulation, hosting service providers would be obligated to take down “illegal terrorist content” upon receiving a removal order issued by an administrative or judicial authority. The Proposed Regulation defines “illegal terrorist content” as “information which is used to incite and glorify the commission of terrorist offences, encouraging the contribution to and providing instructions for committing terrorist offences as well as promoting participation in terrorist groups.”
The human rights organizations' primary argument is that the Proposed Regulation will set “a dangerous precedent for online content regulation worldwide” and pose a serious threat to fundamental rights. A central concern is that the Proposed Regulation would grant EU Member States the unilateral power to order sites such as Google, Facebook, and Twitter to remove online content hosted in another Member State, without any mechanism for judicial review or any obligation to consider the rights of the individuals who posted that content. Such orders would apply to content hosted anywhere in the EU and would have to be complied with within one hour.
In their letter, the organizations further claim that:
The proposal continues to incentivise online platforms to use automated content moderation tools, such as upload filters… [that] are characterised by a profound lack of transparency and accuracy of automated decision making. […] We urge the European Parliament to reject this proposal, as it poses serious threats to freedom of expression and opinion, freedom to access information, the right to privacy, and the rule of law. Moreover, it will set a dangerous precedent for any future EU legislation regulating the digital ecosystem by distorting the law enforcement framework under the pretext of strengthening the Digital Single Market. Therefore, the proposed Regulation on addressing the dissemination of terrorist content online as it stands now has no place in EU law.
The Proposed Regulation is headed for a final vote in April 2021, and civil rights groups are strongly urging Members of the European Parliament to vote against its adoption.