Filtering fundamental rights
The European Union's balancing of intellectual property and the freedom to receive information
The legislative process of the “Directive on Copyright in the Digital Single Market” has been accompanied by many unusual events: a – later removed – communication by the Commission insulting opponents of the reform; a video defending the draft, published on the Parliament’s social media account before any actual vote by the Parliament and created by one of the staunchest lobby groups for the directive; the attempt to reschedule the vote to the beginning of March to preempt the demonstrations planned for March 23; a bomb threat against the Parliament’s rapporteur; and the claim that thousands of protesters against the planned directive were paid to demonstrate. The vote on the resulting and highly controversial Article 17 (formerly known as Art. 13), as well as on the directive as a whole, is scheduled for tomorrow in the European Parliament.
The Provision in Question
Two questions are heatedly discussed regarding the Article: Does it oblige content providers (such as YouTube) to set up systems that filter all uploaded content for possible copyright violations (upload filters)? And if so: are such filters in compliance with fundamental rights?
Art. 17 of the draft of the copyright directive reads as follows:
[…] (4) If no authorisation [by the copyright holders] is granted, online content-sharing service providers [as defined in Art. 2 (6)] shall be liable for unauthorised acts of communication to the public, including making available to the public, of copyright-protected works and other subject matter, unless the service providers demonstrate that they have: […]
(b) made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information […]
(c) […] and made best efforts to prevent their future uploads in accordance with point (b) […]
(8) The application of this Article shall not lead to any general monitoring obligation. […]
While the provision does not contain an explicit obligation to introduce upload filters (and the relationship between paras. 4 and 8 remains unclear), many experts and civil society organizations have pointed out that the “best efforts to ensure the unavailability of specific works” demanded by the provision – and in particular the prevention of their future uploads – will, given the providers’ liability for any copyright-infringing work uploaded by their users, inevitably lead to preventive filtering and over-blocking (blocking in cases of doubt) of uploaded content. This will particularly affect works that (legally) use copyrighted material, for example by reviewing, citing or parodying it.
Such use is explicitly allowed by Art. 17 para. 7 of the proposal, but considering that, for example, more than 400 hours of video material are uploaded to YouTube every minute, any evaluation would be carried out not by humans but by algorithms. The odds that such algorithms – even in times of artificial intelligence – will be able to analyze complex human communication correctly and unmistakably recognize the lawful use of copyright-protected material are slim at best.
The Human Rights Framework
Legislation within the European Union is embedded in a complex system of different levels of human rights protection. Besides the Union’s obligation to respect the Charter of Fundamental Rights of the European Union (which indirectly binds it to the European Convention on Human Rights (ECHR), see inter alia Art. 52 (3) of the Charter), the International Covenant on Civil and Political Rights (ICCPR) applies to the EU as customary international law.
Consequently, the United Nations Special Rapporteur on the right to freedom of expression, David Kaye, has raised serious concerns on several occasions regarding the compatibility of the copyright reform (in particular, but not limited to, Art. 17) with the right to freedom of expression. He pointed out that any system of proactive monitoring and filtering of content carries a risk of censorship, which constitutes a disproportionate restriction of the freedom of expression and is therefore inconsistent with Art. 19 (3) ICCPR. Furthermore, Kaye points to the unclear wording of the provision, which seems problematic in view of the requirement that any restriction be “provided by law”.
Jurisprudence on Filter Systems
When evaluating the compatibility of upload filters with the Charter, Arts. 7 (right to private life), 8 (protection of personal data) and 11 (freedom of expression and information) have to be taken into consideration on the one hand; on the other hand, Art. 17 (2) protects the rights of copyright holders.
Firstly, the introduction of upload filters would interfere with Arts. 7 and 8. Their effect on the right to privacy and the protection of personal data lies in the profound analysis of the circumstances of an upload that is necessary to check whether an exemption allowing the use of copyrighted material applies. Indicators that an upload falls outside the scope of the copyright directive could themselves be personal data, such as the profession or the political opinion of an individual (imagine an artist creating caricatures or parodies).
Secondly, the (wrongful) blocking of such content would interfere with the freedom of expression of the artist as well as with the freedom of the recipients of these works to receive information, as laid down in Art. 11 of the Charter. Even though the analysis and blocking are not directly performed by organs of the Union, the Charter applies: the actual interference with the rights guaranteed in the Charter lies not in the blocking of content by a private actor but in the legal obligation to take such a measure.
The ECJ has already issued two judgments on the balance between these rights and the right to intellectual property. In Scarlet Extended it stated (remarkably far-sighted in light of today’s discussion) that the obligation to install a filtering system “could potentially undermine freedom of information since that system might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications” and therefore does not strike the “fair balance […] between the right to intellectual property, on the one hand, […] the freedom to receive or impart information, on the other” (paras. 52 and 54). These findings were confirmed a couple of months later in SABAM (paras. 44-52).
The Bigger Picture
Beyond that, the slippery slope entered by introducing such a filter system must be taken into account. In the end, the legal obligation for social media companies (most of which are content-sharing service providers within the meaning of the directive) to introduce upload filters would lead to constant, preventive surveillance of uploaded content and thereby of its users. Bearing in mind the hate speech debate, the extension of such systems to search not only for copyrighted material but also for insulting, pornographic or terrorist content seems to be only a question of time.
In fact, a proposal for such a law – in the form of the regulation on preventing the dissemination of terrorist content online (see Art. 6) – has already been introduced by the Commission and is, despite the backlash it faces from the United Nations, civil society organizations and the EU Agency for Fundamental Rights, making its way through the legislative process. Similar developments are taking place at the national level. In Germany, for example, the controversial Network Enforcement Act (NetzDG) introduced high fines for social media networks that fail to delete or block insulting or defamatory comments within an adequate amount of time.
The Vote on Tuesday
Considering all this, the introduction of mandatory upload filters does not comply with the Charter. The only way out would be to interpret the current proposal of the copyright directive, with its “best efforts to prevent […] future uploads” clause, as merely requiring the swift removal of content upon notice. However, such an interpretation hardly seems compatible with the wording of the proposal. It thus seems inevitable that the proposal, if it enters into force, will unduly encroach on fundamental rights. The members of the European Parliament, voting on this Article on Tuesday, should take their responsibility seriously and prevent the entry into force of a provision that would be contrary to the Charter of Fundamental Rights of the European Union.
Erik Tuchtfeld studies law at the Heidelberg University Faculty of Law. He is a research assistant at the Max Planck Institute for Comparative Public Law and International Law.
Cite as: Erik Tuchtfeld, “Filtering fundamental rights – The European Union’s balancing of intellectual property and the freedom to receive information”, Völkerrechtsblog, 25 March 2019, doi: 10.17176/20190325-151801-0.