
UPDATE – CJEU follows Advocate General and goes even further: Facebook can be obliged to delete hate postings of equivalent meaning

If a court declares a hate posting illegal, it can require a social network (here Facebook) to remove not only that posting but also all postings with an equivalent meaning, even if they were posted by other users. EU law does not preclude this. The CJEU ruled to this effect in the case Eva Glawischnig-Piesczek v Facebook (C-18/18). Contrary to what is often reported, this obligation only applies once a national court has declared the content unlawful. It does not follow from the decision that social networks must pre-emptively remove content before a court order is issued.

We have already reported here on the background of this decision. In a nutshell: the former leader of the Austrian Green Party, Eva Glawischnig, took legal action against Facebook because the company refused to delete a defamatory comment about her. In these proceedings, the Austrian Supreme Court ultimately referred to the CJEU the question of how far deletion obligations imposed on Facebook may go, in particular whether they may go beyond the specific posting of the individual user. Under the E-Commerce Directive (ECD), host providers must not be obliged to monitor all content or to actively search for facts or circumstances indicating illegal activity (Art 15 ECD). General and preventive monitoring obligations are therefore taboo. A court may, however, impose specific monitoring obligations.

In the present proceedings, the Supreme Court assumed that Facebook acted as a host provider. The question of when a platform operator becomes a content provider is, however, currently in a state of flux in the case law (see the requests for a preliminary ruling of the German Supreme Court (C-682/18) and the Austrian Supreme Court (4 Ob 74/19i), both concerning the video platform YouTube).

Obligation to remove equivalent comments

The CJEU draws the limits of permissible monitoring obligations even wider than the Advocate General (see here): it allows an obligation to delete not only equivalent postings by the original user, but also equivalent postings by other users. However, this still does not apply preventively, but only if ordered by a national court in an individual case.

In addition, the CJEU defines “information with an equivalent meaning” more broadly than the Advocate General. Information is equivalent if it conveys a message whose content remains essentially unchanged; the illegality of information does not arise from the use of specific terms but from the message they convey (paragraph 39 et seq. of the decision). The Advocate General understood equivalent information to mean only essentially identical postings with a different spelling or punctuation. According to the CJEU, however, equivalent information must contain specific elements (e.g. the name of the defamed person) that allow the host provider to use automated search tools and technologies. The host provider must not be obliged to make an autonomous assessment in each individual case (paragraph 45 et seq. of the decision).

This offers potential for further dispute. It is difficult for automated search programs to detect information that has the same meaning but not the same wording. Machines cannot yet truly understand the meaning of a text, and “specific elements” can carry a completely different meaning in different contexts. The example chosen by the CJEU, the name of the defamed person, makes this particularly clear: the name of a well-known politician may be used thousands of times every day on social networks, and it will be difficult to rely on such “specific elements” to distinguish legitimate criticism from unlawful defamation.
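To make this concrete, the following minimal sketch (purely illustrative and not drawn from the decision; all names and terms are hypothetical) shows how a filter keyed only on such “specific elements” flags legitimate criticism alongside genuinely equivalent postings:

```python
# Illustrative sketch only: a naive filter that flags postings containing the
# defamed person's name together with terms from the original posting.
# It cannot distinguish defamation from legitimate criticism or neutral reporting.
# All names and terms below are hypothetical.

NAME = "jane doe"                   # hypothetical "specific element" (the person's name)
TERMS = {"corrupt", "traitor"}      # hypothetical wording taken from the original posting

def flags_posting(text: str) -> bool:
    """Flag a posting if it mentions the name together with any of the listed terms."""
    lowered = text.lower()
    return NAME in lowered and any(term in lowered for term in TERMS)

postings = [
    "Jane Doe is a corrupt traitor.",                           # equivalent to the original posting
    "Calling Jane Doe corrupt without any evidence is wrong.",  # legitimate criticism, flagged anyway
    "Jane Doe presented her party's tax plan today.",           # neutral report, not flagged
]

for posting in postings:
    print(flags_posting(posting), "-", posting)
```

A purely keyword-based approach of this kind is exactly what the CJEU’s reference to automated tools seems to contemplate, and it illustrates why the boundary between a “specific” and a “general” monitoring obligation will matter in practice.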

Social networks therefore do not have an easy task. Some voices already fear for freedom of expression and see Facebook pushed into the role of judge and jury over what counts as acceptable criticism. However, it should be stressed that Facebook is not and cannot be obliged to make an “autonomous assessment” of what is allowed and what is not. That decision must be taken by a court, which is itself bound by the legal framework. This suggests that “information with an equivalent meaning” is to be understood more narrowly than one might think at first glance. It therefore remains to be seen how the Austrian Supreme Court will ultimately formulate Facebook’s cease-and-desist obligation.

Worldwide scope of the obligation to remove comments?

The CJEU’s finding that the ECD does not prevent Member States from ordering a host provider to delete a posting worldwide has also been criticised. According to the critics, standards of freedom of expression differ widely around the world, and even authoritarian states could demand worldwide removal.

In principle, however, the CJEU shares these concerns. Only in September, the CJEU held that the so-called “right to be forgotten” on the Internet should at most lead to de-referencing within the EU, because the balance between the right to privacy and freedom of expression may be struck differently in each country (even within the EU) (see C-507/17 – Google (Portée territoriale du déréférencement)).

In the present decision, the CJEU merely stated that Union law contains no provision on which territory is covered by the removal obligations at issue. It is therefore up to the Member States to regulate this question. The criticism of the CJEU must accordingly be viewed against the background that, in the absence of a Union-law rule on territorial scope, the court’s hands were tied in the “Glawischnig” case. This is the difference from the “Google” case, which was decided under the GDPR, a regulation that does contain a rule on its territorial scope.

It is now for the Austrian Supreme Court to rule on the territorial scope of the removal obligation.