CJEU Advocate General in favor of far-reaching obligations of Facebook to remove illegal content

While there are legal remedies for victims of hate comments or other legal violations on Facebook and similar platforms, an action against the original infringer often comes too late. It is in the nature of social networks that in an instant content can be “shared” to an unlimited number of accounts. Once out there, the infringer himself is no longer in a position to get rid of his creation. Only the operator of the social network has the technical possibilities to do so. But how far does the obligation of the operator to identify and delete illegal content go?

The Austrian Supreme Court referred this question to the CJEU in the case of Eva Glawischnig-Piesczek v Facebook Ireland Ltd (C-18/18). The decision of the CJEU is still pending. However, the Advocate General, whose opinions the CJEU follows in many cases, has now delivered his opinion: According to it, national courts should be able to order a social network, by means of an injunction, to locate and delete not only the offending post itself, but also all content identical to it, regardless of its source (i.e. not only from the same user). Going further, the network may also be obliged to delete equivalent posts of the same user, but not those of other users. Moreover, national courts are not precluded from giving these obligations worldwide effect.


In 2016, a Facebook user posted an article entitled “Green Party: Minimum security for refugees should remain” [free translation]. The preview of the article (a so-called thumbnail) showed a picture of the then Green party leader Eva Glawischnig. The user also added a degrading accompanying text in which he described Glawischnig, among other things, as corrupt and a “bad traitor to the people”.

As Facebook refused to delete the post even after a complaint, Glawischnig applied for an injunction. The Vienna Commercial Court granted the injunction and held that the post was both defamatory (Section 1330 General Civil Code) and a violation of the right to one’s own image (Section 78 Copyright Act). Reliance on the right to freedom of expression failed in the absence of any connection to a political debate. Facebook blocked access to the content, but only from Austria.

In the second instance, the Vienna Higher Regional Court confirmed Facebook’s obligation to delete all identical content and rejected Facebook’s request to restrict the order to Austria. However, according to the Higher Regional Court, Facebook should only have to delete equivalent statements after third parties report them.

Subsequently, the Supreme Court referred to the CJEU the question of whether a host provider such as Facebook may be obliged not only to delete infringing content of which it is already aware (e.g. because of a user complaint), but also (i) to delete identical content from the same or other users (and thus monitor for such content), or even (ii) to delete equivalent information from the same or other users, and (iii) whether this obligation could apply worldwide.

The E-Commerce Directive  

According to the E-Commerce Directive (ECD), the liability of host providers such as Facebook is limited. Host providers are platforms that essentially only store content entered by users and do not play an active role that would give them knowledge of or control over that content. They are not responsible for content as long as they are not aware of it (Art 14 ECD, implemented in Austria in Section 16 E-Commerce Act (ECA)). Host providers may not be obliged to monitor all content or to actively search for circumstances indicating illegal activity (Art 15 ECD).

A general obligation for host providers to monitor all comments preventively for hateful content would therefore be contrary to EU law. However, monitoring obligations are permissible in specific cases by court order (Recital 47 and Art 14 para 3 ECD; see also CJEU C-324/09 – L’Oréal v eBay).

The central problem is therefore the distinction between permissible specific and inadmissible general monitoring obligations.

The Opinion of the Advocate General

Identical contents

The Advocate General first states that a host provider may be required to delete any further identical violation of the same user. The obligation to monitor the content of an individual user for a specific violation is not a general monitoring obligation.

This also applies to content posted by other users with the same wording. Admittedly, the host provider would then have to monitor all content on its platform. However, this monitoring remains limited to identical content and thus to a specific infringement.

The obligation to delete identical content also strikes a balance between the competing fundamental rights: such monitoring does not require elaborate technical aids and is therefore not an excessive burden on Facebook, while monitoring and deletion are necessary to ensure effective protection of privacy and personality rights.

Equivalent contents

For equivalent content, by contrast, i.e. posts which, for example, differ in spelling or punctuation, the Advocate General considers a monitoring obligation permissible only for content from the original user. A monitoring obligation covering equivalent content from other users would be an excessive burden, as it would require technically advanced solutions. In addition, the host provider would abandon its role as a passive intermediary and actively participate in shaping content, thereby exercising a kind of censorship and possibly violating freedom of expression.

Worldwide validity of the injunction

Since the ECD does not regulate the territorial scope of the obligation to remove content, the Advocate General considers that it is for the Member States to determine its scope. Provisional injunctions can therefore potentially apply worldwide.


If the CJEU were to follow the opinion of the Advocate General, this would strengthen the rights of victims of hate comments, which is welcome in principle. However, the boundary between inadmissible defamation and admissible free expression of opinion is fluid and always depends on the individual case. Each case should therefore be closely scrutinized and seen in perspective. In particular, the call for an obligation to delete content worldwide raises sensitive questions: what is considered permitted criticism in one country may be prohibited in another.

An exciting question for the future is whether the principles developed here by the Advocate General for text comments also apply to “more modern” forms of defamation, for example memes (pictures with short text) or deepfakes. The latter are deceptively realistic images or videos produced with the help of artificial intelligence; such technologies can be used, for example, to make politicians appear to give invented speeches. The fact that searching for identical images and videos requires more technical effort than searching for text speaks against extending the principles developed by the Advocate General, who repeatedly relied on the low burden of text searches as a justification. However, given the ever-increasing capability of image recognition software, this is probably a problem that can be overcome.

Nevertheless, only in very limited cases can the distinction between permitted criticism, which may also take the form of satire or parody, and unlawful defamation succeed without competent human analysis. A proper and legally justifiable assessment of such facts is in many cases only possible in context and requires precise knowledge of the relevant case law as well as social, cultural and political background knowledge.

Furthermore, it is not yet entirely clear whether Facebook is to be classified as a host provider at all. The Advocate General could not rule on this question, as it was not part of the referred questions, but suggests that it is doubtful (para. 30 of the Advocate General’s opinion). Facebook, after all, does not reproduce user-generated content “neutrally”, but sorts content from sources the user already “follows” according to a complex formula and continuously recommends new profile pages and groups. The CJEU will, however, soon address a very similar question: both the German Federal Supreme Court (C-682/18) and the Austrian Supreme Court (4 Ob 74/19i) have asked whether the video platform YouTube has lost its position as a host provider as a result of similar active interventions. Should the CJEU affirm this, it would only be a small step to denying Facebook host provider status as well.