In a seeming contradiction of its recent decision in Google v. CNIL, the Court of Justice of the European
Union (CJEU or Court) has ruled that European Union (EU) law allows EU nations
to order platforms that host content, like Facebook in this case, to remove
illegal content, or any identical or equivalent illegal content, in the EU and
possibly throughout the world. This outcome seems contrary to the ruling
in the Google v. CNIL case, but it must be stressed that the Google case
was interpreting a provision of the General Data Protection Regulation (GDPR).
This case interprets an older provision of EU law, Directive 2000/31/EC of the
European Parliament and of the Council of 8 June 2000 on certain legal aspects
of information society services, in particular electronic commerce, in the
internal market (aka the “Directive on electronic commerce”). Consequently,
under this EU law, EU nations may enact and enforce laws to allow their courts
to order multi-national platforms to remove unlawful information, copies of
such information, and equivalent information. The CJEU added that an EU nation
could also order “a host provider to remove information covered by the
injunction or to block access to that information worldwide within the
framework of the relevant international law.” However, the CJEU stopped short of
imposing a duty on a company to conduct a comprehensive and exhaustive search
of its platforms for all such illegal information; rather, only information
deemed to be identical or equivalent would need to be taken down.
This case concerns “a message [on Facebook] containing statements harmful to the reputation of Ms Glawischnig-Piesczek…a member of the Nationalrat (National Council, Austria), chair of the parliamentary party ‘die Grünen’ (The Greens) and federal spokesperson for that party.” The CJEU explained that
On 3 April 2016, a Facebook Service user shared on that user’s personal page an article from the Austrian online news magazine oe24.at entitled ‘Greens: Minimum income for refugees should stay’, which had the effect of generating on that page a ‘thumbnail’ of the original site, containing the title and a brief summary of the article, and a photograph of Ms Glawischnig-Piesczek. That user also published, in connection with that article, a comment which the referring court found to be harmful to the reputation of the applicant in the main proceedings, and which insulted and defamed her. This post could be accessed by any Facebook user.
The CJEU related that “[b]ecause Facebook Ireland did not withdraw the comment in question, Ms Glawischnig-Piesczek brought an action” that eventually reached Austria’s Supreme Court (the Oberster Gerichtshof) which subsequently asked the CJEU for a preliminary ruling on EU law, specifically:
(1) Does Article 15(1) of Directive [2000/31] generally preclude any of the obligations listed below of a host provider which has not expeditiously removed illegal information, specifically not just this illegal information within the meaning of Article 14(1)(a) of [that] directive, but also other identically worded items of information:
- in the relevant Member State;
- of the relevant user worldwide;
- of the relevant user in the relevant Member State?
(2) In so far as Question 1 is answered in the negative: does this also apply in each case for information with an equivalent meaning?
(3) Does this also apply for information with an equivalent meaning as soon as the operator has become aware of this circumstance?
As noted, this case explicates a provision of EU law apart from the GDPR. The CJEU explained that, in substantive part, Article 15(1) of Directive [2000/31] provides that “[m]ember States shall not impose a general obligation on providers, when providing the services covered by Articles 12, 13 and 14 (i.e. “mere conduit,” “caching,” and “hosting”), to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.”
The CJEU held that
Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’), in particular Article 15(1), must be interpreted as meaning that it does not preclude a court of a Member State from:
- ordering a host provider to remove information which it stores, the content of which is identical to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information;
- ordering a host provider to remove information which it stores, the content of which is equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, provided that the monitoring of and search for the information concerned by such an injunction are limited to information conveying a message the content of which remains essentially unchanged compared with the content which gave rise to the finding of illegality and containing the elements specified in the injunction, and provided that the differences in the wording of that equivalent content, compared with the wording characterising the information which was previously declared to be illegal, are not such as to require the host provider to carry out an independent assessment of that content, and
- ordering a host provider to remove information covered by the injunction or to block access to that information worldwide within the framework of the relevant international law.
There are some important differences to note between the Facebook and Google decisions. First, the grounds upon which the challenges were brought differ. In the Google case, the CJEU was looking at the so-called Right To Be Forgotten under the GDPR, and in the Facebook case, the Court, as noted, was examining the applicability of Directive [2000/31].

Second, the information in Facebook was found to be defamatory and hence illegal, while in the Google case, the French data protection authority (DPA) was acting generally when it informed Google that “when granting a request from a natural person for links to web pages to be removed from the list of results displayed following a search conducted on the basis of that person’s name, it must apply that removal to all its search engine’s domain name extensions.” Google objected to this statement of policy and then was fined.

Finally, the CJEU made clear in Google that while “EU law does not currently require that the de-referencing granted concern all versions of the search engine in question, it also does not prohibit such a practice.” The Court added that “[a]ccordingly, a supervisory or judicial authority of a Member State remains competent to weigh up, in the light of national standards of protection of fundamental rights…a data subject’s right to privacy and the protection of personal data concerning him or her, on the one hand, and the right to freedom of information, on the other, and, after weighing those rights against each other, to order, where appropriate, the operator of that search engine to carry out a de-referencing concerning all versions of that search engine.”