Robots as the new judges: Copyright, Hate Speech, and Platforms

Authors: Favale, M.

Journal: European Intellectual Property Review (EIPR)

Volume: 44

Issue: 8

Pages: 461-471

https://eprints.bournemouth.ac.uk/37685/

ISSN: 0142-0461

Abstract:

On 16 October 2020, a middle-school teacher, Samuel Paty, was beheaded by a terrorist who would not have known of his existence but for a number of videos posted on social media, against which Mr Paty had filed a defamation complaint with the local police. 1 Earlier that year, on 13 May, a law against publishing heinous content online had been approved in France. 2 But in June the Constitutional Council struck down the article requiring the incriminated content to be taken down within 24 hours, on the basis that it would unduly restrict freedom of expression. 3 That decision has sparked heated political debate in the light of the recent gruesome events. The liability of internet intermediaries has never been so contentious. Internet platforms have long enjoyed immunity (known as the Safe Harbour) both in European Union (EU) law and overseas. More recently, in 2019, a new Copyright Directive 4 entered into force, to be implemented by Member States by June 2021. This piece of legislation prompted criticism because it imposes enhanced responsibility on internet platforms that do not remove illegal content from their services quickly enough. 5 But how quick must a takedown be to count as "quick enough" (i.e. expeditious)? ISPs (Internet Service Providers) have argued that, as "mere" intermediaries, they cannot control the content their subscribers publish online and therefore cannot be held responsible for unlawful activities taking place on their platforms. Rights holders have responded that intermediaries often benefit from infringing activities; their provision of services cannot therefore be considered wholly neutral, and they should be held accountable. Currently, two new pieces of legislation are under way to streamline platforms' filtering duties horizontally (the Digital Services Act and the Digital Markets Act). 6 However, much remains to be done to define the contours of these new norms, notably as regards different types of illegal content and whether they deserve different treatment. This paper discusses filtering obligations ("robots" as opposed to judges) in relation to copyright infringement versus defamation/hate speech. It argues that it is not legally viable to apply the same norms to such different areas of law, because the consequences of infringing these norms are incomparable.


https://uk.westlaw.com/Document/ID141D8E0062A11EDB093B11A590BB5CD/View/FullText.html

Source: BURO EPrints