This guide aims to familiarise readers with Article 17 of the DSM Directive, which regulates the use of copyrighted content in certain online platforms. The section was authored by Teresa Nobre and Paul Keller. It represents the views of Communia on the implementation of that provision.
<aside> ⚠️ This guide is still a work in progress and is only partially complete. We are still working on the implementation scenarios and the model language sections that can be found in our other guides. If you require additional information or have questions, please get in touch via [email protected].
</aside>
Article 17 changes the liability rules for most for-profit content-sharing platforms. They are now deemed to be “sharing” the content uploaded by their users themselves. As a consequence, they are directly liable for any uploads that infringe copyright. This means that they may need to license all content available on their platforms or employ automatic content recognition technologies to filter all user uploads for potential infringements. This may result in a significant limitation of users’ freedom of expression, since not all protected content uploaded by users is licensable and, on the other hand, automated filters cannot recognize many perfectly legal uses of protected content (e.g. parodies).
Taking into account that users’ rights will be at greater risk if platforms rely on filters than if they obtain authorization to communicate their users’ uploads, national lawmakers should fully explore legal mechanisms for granting those authorizations and limit, to the extent possible, the application of filtering technologies. Turning the exclusive right granted by Article 17 into a remuneration right, or into a copyright exception or limitation subject to remuneration, would be the ideal solutions. Member States should also add strong users’ rights safeguards and ensure they have broad and robust copyright exceptions in place.
Article 17 is supposed to address a so-called "value gap" identified by the music industry. According to this industry, online platforms such as YouTube and Facebook make huge profits by selling advertisements alongside copyrighted content uploaded by their users, all without adequately rewarding the copyright owners. However, making platforms liable for copyright infringements committed by their users will negatively impact users' rights. Due to its potential impact on freedom of expression, the Polish government filed an action for annulment of Article 17, which means the CJEU still has a say on the fate of this controversial provision. In the meantime, EU countries have until June 2021 to implement it.
Before the new Copyright Directive, online platforms that host content uploaded by users relied on Article 14 of the E-Commerce Directive to provide their services. Under this provision, platforms can host uploaded content without risk of liability, as long as they remove such content once they receive information that it infringes someone else’s rights. This limitation of liability for copyright infringements committed by users provided the legal foundation for the development of a wide range of online platforms that allow user uploads. A large number of platforms relied on this safe harbour (although the CJEU never clarified whether big platforms such as YouTube were indeed eligible for this protection); in addition, many platforms also entered into licensing agreements with copyright owners to ensure the continued availability of copyrighted content on their platforms and to show advertisements alongside that content.
The new Copyright Directive, on the one hand, excludes certain for-profit content-sharing platforms from the above-mentioned liability protections and, on the other hand, makes them liable for content uploaded by their users that infringes someone else’s copyright. As a result, they have two options: (a) obtain authorizations from copyright owners to communicate such content or, if no authorization is granted, (b) take a set of steps to be exempted from liability for such infringing content, such as actively searching for infringing content through filtering or other mechanisms.
The first option is preferable from a users’ rights perspective, but it will only be effective if Member States do not rely on individual licensing to grant authorizations to platforms for every piece of content available on their services. The second option may require the use of automated filtering technology and thus result in widespread over-blocking of users’ uploads, interfering with uses made under copyright exceptions and with fundamental rights and freedoms, such as freedom of expression and data protection.
Let’s have a look at the Article in detail: