The EU copyright directive and its potential impact on cultural diversity on the internet
The EU Directive on Copyright in the Digital Single Market (DSM Directive) entered into force on June 6, 2019, following heated discussions of Articles 15 (formerly 11) and 17 (formerly 13) in particular. In Germany, tens of thousands of people took to the streets to demonstrate against the planned legislation in the lead-up to the vote in the European Parliament in March 2019. Article 17 imposes much stricter liability on platforms such as YouTube. In the future, for example, these platforms will have to obtain permission from copyright holders for music videos uploaded by users. If they fail to do so, they will have to ensure that the content in question is not available on their service. The directive still needs to be transposed into the national legislation of the member states of the European Union by June 7, 2021.
(This post was originally published on the Kluwer Copyright Blog, divided into part I and part II.)
The debates surrounding this EU copyright reform were heated. Article 17 in particular was fiercely contested. Some argued the legislation would guarantee that creators could make money from their works. Others predicted the end of the internet, or at least significant threats to fundamental freedoms on the net. Some politicians denigrated anxious citizen protestors as bots and paid Google supporters. In reaction to the public uproar, the EU's copyright rapporteur, Axel Voss (CDU/EPP), was quick to announce that upload filters should be avoided when implementing the directive.
Indeed, many serious mistakes could be made during the implementation of the directive into national law. Some of this legislation’s major flaws can be corrected, however, and urgently need to be. Otherwise, the cultural diversity of the internet is at risk of being seriously impaired.
What is it all about? The host provider privilege
A fundamental principle concerning liability for rights infringements is that everyone is responsible for their own actions, and that one can only be held liable to a limited extent for the actions of others. This general rule also applies to the internet. Therefore, users are primarily liable if they post illegal content on platforms and hosting services. The providers of such services, on the other hand, are largely not responsible. Since they do not make, copy or upload any content, but only provide the technical facilities for the users to do so, they do not infringe copyright.
Platforms thus bear only a limited secondary responsibility for legal violations committed by their users. Above all, they have one duty: if they are alerted to infringements, they must stop them. Providers must block or remove illegal content when asked to do so. Their responsibility is therefore limited in principle to reactive behavior. This rule is called the host provider privilege or ‘safe harbor’, and it applies in a similar form throughout the liberal world. It was introduced in Europe in 2000 by the e-Commerce Directive (Article 14) and in the USA in 1998 by the Digital Millennium Copyright Act.
The effect of this allocation of responsibility is that service providers do not have to proactively check user-generated content for infringements. The operators of user-generated content platforms, for example, have no reason to legally assess users’ contributions before they are uploaded. Quite the contrary: under the existing rules, the less they know about the content in question, the lower their liability. If they do not check the uploaded content, they are only obliged to react to complaints submitted ex post facto by rights holders. Thus, they do not have to block or remove content on their own initiative.
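To make the contrast with Article 17 concrete, the reactive logic of the safe harbor can be pictured in a few lines of code. This is a deliberately minimal sketch, not any real platform’s implementation; all names here (`Notice`, `HostingService`) are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Notice:
    """A rights holder's complaint about a specific piece of content."""
    content_id: str
    claimant: str

@dataclass
class HostingService:
    """Safe-harbor model: content goes online without ex-ante review;
    the provider acts only once a complaint arrives."""
    published: dict = field(default_factory=dict)

    def upload(self, content_id: str, data: bytes) -> None:
        # No legal assessment before publication.
        self.published[content_id] = data

    def handle_notice(self, notice: Notice) -> None:
        # Reactive duty: block or remove the content complained about.
        self.published.pop(notice.content_id, None)

service = HostingService()
service.upload("video-1", b"<bytes>")
service.handle_notice(Notice("video-1", "ExampleRightsHolder"))
assert "video-1" not in service.published
```

The point of the sketch is what is absent: there is no check of any kind between upload and publication.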
According to Article 17 of the DSM Directive, however, platform providers shall be directly liable for copyright infringements committed by their users. The first paragraph states that when copyright-protected material is uploaded by a user, providers are held responsible for actively, i.e. of their own volition, making this material available. For this (fictitious) act of use, providers need to obtain their own permission (a license). If not in possession of such a license, a provider can be sued the very moment the material goes online.
This primary liability massively increases the legal risk for platforms. If they are directly liable for any and all illegal uploads, they must check all content before publication and block it if they consider it illegal.
This new EU approach is based on various false assumptions, which are discussed in detail below. To follow the argument, it is important to note that the decision in favor of Article 17 was based exclusively on a very specific constellation. The aim was to oblige very large and powerful platforms, in particular YouTube, to conclude license agreements with equally large and powerful companies from the entertainment industry, as well as collective rights management organisations (CMOs). The approach was mainly promoted by the music industry and CMOs in the music sector.
When exceptional circumstances are seen as the general rule
Ironically, this is the one constellation in which content filters and licenses have been common practice for years. YouTube has long concluded contracts with music companies and collecting societies. Its Content ID system gives rights owners the ability to manage their content, allowing them to ensure that it is either blocked or approved and monetized. A closer look thus reveals that Article 17 is not about making YouTube pay the music industry and collecting societies at all. It is about making YouTube pay more.
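YouTube has not disclosed the internals of Content ID, but the policy options it publicly offers rights holders (block, monetize, track) can be modeled roughly as follows. Everything in this sketch – the fingerprints, the names, the lookup table – is invented for illustration.

```python
from enum import Enum

class Policy(Enum):
    BLOCK = "block"        # keep the matched upload offline
    MONETIZE = "monetize"  # leave it online, route ad revenue to the claimant
    TRACK = "track"        # leave it online, collect viewing statistics

# Invented reference data: fingerprint of a registered work -> (holder, policy)
reference_db = {
    "fp-song-123": ("ExampleMusicLabel", Policy.MONETIZE),
    "fp-film-456": ("ExampleFilmStudio", Policy.BLOCK),
}

def apply_rights_holder_policy(upload_fingerprint: str) -> str:
    """Match an upload against registered references and apply whatever
    policy the rights holder selected in advance."""
    claim = reference_db.get(upload_fingerprint)
    if claim is None:
        return "publish"  # no reference registered, no claim raised
    holder, policy = claim
    return f"{policy.value} (claimed by {holder})"

print(apply_rights_holder_policy("fp-song-123"))  # monetize (claimed by ExampleMusicLabel)
```

Note that this model only works where rights holders actively register their repertoire in advance – which is exactly what the music industry does and most other rights holders do not.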
Focusing only on this particular constellation, the European legislator devoted itself to the following basic assumptions:
(1) It is possible for service providers to conclude licenses for any content that users publish on their platforms.
(2) It is always in the interest of the author or copyright holder to prevent any use that does not comply with copyright law (especially those for which they have not granted a license).
Both assumptions may apply to the alleged standard case of “YouTube versus music industry.” However, Article 17 is by no means limited to this scenario. Rather, it comprehensively addresses the question of how to deal with copyright-relevant user content on all platforms. Looking at the context from this holistic point of view, both assumptions prove to be wrong and dangerous.
In principle, Article 17 requires platform providers to obtain licenses for user content, usually in exchange for a royalty. By acquiring a license, a provider can protect itself against liability risks. However, these licenses must be available before the licensed content is used, i.e. before it is made accessible on the platform.
The myth of comprehensive licensing for user-generated content
No provider can achieve comprehensive, preemptive clearance of rights, for this would require foreseeing every piece of content users could possibly upload. Obviously, in view of the immense number of copyright-protected works, this is per se impossible.
In addition, many rights cannot be cleared at all for practical reasons. If there are no central clearing houses where all the necessary rights can be obtained (such as a CMO), licensing costs increase immeasurably. Contrary to the assumptions of the EU legislator, music is anything but representative in this respect. For the vast majority of content, there are neither CMOs nor central licensing bodies that could grant all the necessary rights. This applies, for example, to texts, films, photos and computer games – in short, to billions of copyright-protected works.
In the absence of a central “one-stop shop” for licensing, platform providers would have to conclude individual contracts with thousands, perhaps millions, of authors and rights holders in order to clarify all conceivable rights applicable to user-uploaded content. And even if this effort could conceivably be undertaken, there would be countless cases where the rights could not be obtained for other reasons, e.g. because it is not clear who owns them, because the rights holder no longer exists (publishers go bankrupt or close for other reasons), because the author cannot be found, etc.
In other words, all-encompassing licensing for all user content on platforms is an unobtainable ideal, a myth. In countless millions of cases no rights can be obtained, even with the most complex licensing efforts.
Illegal content for the common good
The logic of the DSM Directive effectively requires that platform providers prevent non-licensed content from ever being put online. This content must be filtered and blocked. Since they otherwise expose themselves to incalculable liability risks through legal proceedings, providers have no other choice. Because manual checks of mass user content are simply impossible, “upload filters” must be used. This refers to algorithms that distinguish between legitimate and illegal uploads. What they do not recognize as licensed or at least obviously legal they will block or delete.
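The incentive structure just described can be reduced to a simple decision rule. The sketch below is purely illustrative: the matching functions are placeholders for whatever recognition technology a platform might deploy, and the comments mark where tolerated or borderline uses end up.

```python
LICENSED_CATALOG = {"fp-licensed-track-1", "fp-licensed-track-2"}  # invented fingerprints

def matches_license(fingerprint: str) -> bool:
    # Stand-in for fingerprint matching against the licensed repertoire.
    return fingerprint in LICENSED_CATALOG

def is_obviously_legal(fingerprint: str) -> bool:
    # Stand-in for detection logic. Real filters cannot reliably recognize
    # quotation, parody, incidental inclusion or tolerated fan uses.
    return False

def upload_filter(fingerprint: str) -> str:
    """Liability-driven default: whatever the system cannot positively
    clear as licensed or obviously legal gets blocked."""
    if matches_license(fingerprint) or is_obviously_legal(fingerprint):
        return "publish"
    return "block"  # borderline and tolerated uses all end up here
```

The decisive design choice is the last line: under primary liability, the rational default for anything unrecognized is to block.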
This reality is embedded into Article 17’s DNA. The approach is based on the second of the above misconceptions. The European legislator appears to assume that copyright infringements are always undesirable and that there is always an interest in stopping them. Again, this may be true as regards the relationship between the music industry and YouTube. However, from a holistic point of view, this assumption also proves to be wrong.
Copyright law is very extensive. It often prohibits even the most minor uses of third-party material in one’s own publications. This legal situation very often contradicts the interests not only of the user but also of the rights holder himself.
The host provider privilege as a safeguard for a diverse online culture
It is simply a fact that the internet is rich in content that is either illegal or has an unclear legal status under copyright law. Very often, the reason that such content is not removed is that authors or rights holders do not object to its publication. There can be many reasons for this, but a widespread reality is that many copyright infringements do not harm the rights owners but rather benefit them. Examples include fan content, remixes, tributes with small excerpts or karaoke videos. In some cases, these uses may be covered by exceptions or limitations; in many others they will not.
Even if they are illegal, minor violations are usually tolerated, such as user videos in which protected material (e.g. a painting on the wall) can be seen, or podcasts in which someone recites a poem. Because such casual, minor copyright infringements generally cause no damage, the effort required to negotiate and conclude license agreements for these kinds of use is completely disproportionate.
But there is more to it: Even substantial subsequent uses are often useful rather than harmful to the rights holder. Remixes or mash-ups, sound collages or memes, which spread widely because of their quality or sheer popularity, and garner many social-media “likes,” often exert significant advertising effects on behalf of original rights owners. A vivid fan culture, for example, increases the awareness and popularity of music, films or video games immensely. Even if rights are technically infringed, such cultural expressions do not result in any disadvantages; generally net benefits accrue to authors and publishers.
Accordingly, this kind of online content is valued, tolerated and often even promoted. If extensive rights had to be cleared in every case, it usually would not exist, at least not on the internet.
The current liability regime for platform and host providers safeguards the resulting cultural diversity of the internet. It ensures that copyright-infringing content is not deleted or blocked as long as the rights holder does not object to it. When rights owners tolerate such uses, content remains available even if technically illegal. This serves the common good. Everybody benefits: users, authors and rights holders, and the general public, which has a genuine interest in a diverse online culture.
The effect of “private judges” on the availability of tolerated illegal content
The implementation of Article 17 could fundamentally change this situation if national legislators do not pay close attention to these circumstances. Platform providers are neither users nor rights holders. They cannot judge, nor are they entitled to decide, whether illegal content is desirable, tolerated or in the public interest. If they are threatened with cease-and-desist proceedings or lawsuits for any copyright infringements committed by their myriad users, they must do everything in their power to minimize their liability. They can hardly sit back and “hope” that infringements will be tolerated by the copyright holder.
The responsibility they have under Article 17 to prevent copyright infringements in the first place inevitably results in a much tighter and stricter control than could ever be exercised by all authors or producers combined.
This situation demonstrates well that the general tendency in platform and internet regulation to involve intermediaries in legal disputes between other actors is dangerous. The approach may seem obvious at first glance: Due to their technical sovereignty over their systems, providers are in fact in the best position to eliminate conflicts and infringements. Because they are in the best position to enforce regulation, so the argument seems to go, they should make the decisions themselves. And they should be liable for any mistakes.
Including an uninvolved third party undermines two fundamental legal principles. First, it collides with the principle of “where there is no plaintiff, there is no judge.” This (German) idiom refers to the basic idea that legal violations are generally only prosecuted if the injured party takes action against them. This is especially true in civil law.
Because of their own responsibility and the resulting liability exposure, however, platform and host providers are now forced to intervene as third parties in the conflict of interest between rights holders and the users. They are to judge whether user content can be posted. And they must take a decision irrespective of whether the rights holder would even want to pursue the potentially illegal use. In short, in such cases there would be a judge without a plaintiff.
This is problematic because, among other things, the decisions made by service providers will often not be based on the same considerations that rights holders would have. After all, the interests of platform providers and rights holders are not the same.
Host and platform providers are not courts!
This circumstance points to the second fundamental problem with including third parties in legal conflicts: resolving them is normally left to the courts. Online service providers, however, are not courts. They are neither democratically legitimized nor neutral, nor are they well suited for this role. Unlike the courts, they neither serve the public interest nor have a constitutional mandate.
It is obvious that very few disputes surrounding the application of Article 17 will ever be decided by a court. GEMA (a CMO for music in Germany) or Sony Music may go to court against YouTube. But the final decision on the filtering or blocking of individual content – i.e. millions and millions of individual cases in total – will usually be made by the platform provider. The fact that a platform will in most cases leave this decision to an algorithm should make the explosive nature of this issue clear.
Ways to cure major flaws for national legislators
In implementing Article 17, national legislators still have considerable influence over the final outcome. In transposing this legislation on the national level, they can mitigate excessive and collateral effects through sensible and carefully designed solutions. It appears that the creators of the DSM Directive were at least aware that Article 17 entails very considerable risks, the scope of which nobody has yet been able to assess. This explains why back doors have been provided, the use of which may significantly reduce the negative impact of this new regulatory approach.
First and foremost, it would be useful and important to keep the scope of Article 17 as narrow as possible when transposing it. There are many indications in the recitals of the Directive that only very large host and platform providers should be affected. They imply that Article 17 should only apply to platforms that seriously compete with large licensed streaming services (such as Spotify or Netflix). Startups are partially excluded up to a certain age or size. The same applies to non-commercial sites and certain forms of services such as online marketplaces or scientific repositories.
In short, Article 17 is clearly designed as a special rule, as an exception. When it is implemented, the scope must be limited to ensure that the exception does not become the general rule.
Restrict licensing obligations based on feasibility
It is also important to ensure that the licensing obligations for service providers are not overwhelming. According to Article 17, para 4, platforms are exempted from primary liability if they have made all reasonable efforts to obtain a license. If the provider has made this effort, it only has to block or remove content that has been reported by the rights holder.
In other words, once the licensing requirements have been met, liability rules apply that are broadly in line with the current regime. How these requirements are defined is therefore of fundamental importance to the scope of Article 17. Hopeless or disproportionate licensing efforts, amounting to the individual clearance of thousands of rights, should not be required. The obligations should also be graduated according to the capacity of service providers to meet them, and they should take into account whether central licensing bodies exist. Furthermore, the emergence of more – and more effective – central licensing bodies should be promoted.
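Read this way, the liability cascade of Article 17, para 4 can be restated as a short checklist. The sketch below captures only the simplified cascade described here – the actual provision contains further conditions, such as “stay-down” obligations – and its two boolean inputs stand in for “reasonableness” thresholds whose definition is precisely what is left to national legislators.

```python
def provider_is_liable(made_reasonable_licensing_efforts: bool,
                       acted_on_rights_holder_notices: bool) -> bool:
    """Simplified reading of Article 17, para 4, as described above:
    a provider that has met the licensing requirements falls back into
    a notice-based regime close to the current safe harbor."""
    if not made_reasonable_licensing_efforts:
        return True   # primary liability for every unlicensed upload
    return not acted_on_rights_holder_notices  # liable only if notices are ignored
```

How generously or strictly the first input is defined decides whether Article 17 behaves like the old regime or like a filtering mandate.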
Bottom line
Article 17 of the DSM Directive was designed with tunnel vision and without foresight. For the “problem” it primarily targets, it is obsolete: YouTube and the music industry reached agreements even without an Article 17. The vast majority of the other situations affected by Article 17 are not comparable with this supposed standard context. In most cases, Article 17 is manageable neither for authors and rights holders nor for providers. If this fact is neglected in the transposition process, culture on the internet will suffer considerable damage.
Although it may seem grotesque, the primary goal in implementing Article 17 must be to keep its scope as narrow as possible. This is the only way to prevent massive collateral damage to cultural diversity online. Should that damage materialize, everyone would lose: creative people whose content is blocked, authors and rights holders who benefit from illegal but tolerated uses, and of course the users, who will constantly see the message: “This video is not available in your country.”
Text released under Creative Commons BY 3.0 de. This license does not apply to external content that is referred to.