4 October 2019 | By Michael Thaidigsmann
Online platforms may be compelled by courts to remove defamatory content world-wide, ECJ rules
The European Court of Justice has issued an important ruling on hate speech. Host providers such as Facebook are obliged to delete, or block access to, content deemed illegal by a court – even if a comment is not identical to the original post.
Online service providers and social media platforms may be compelled to block or take down illegal content from all of their platforms world-wide, the European Court of Justice has ruled. This applies to cases where content was found to be unlawful by a court in the EU and where the relevant post is either “identical” or “equivalent” in nature to the content previously declared to be unlawful.
Currently, social media platforms such as Facebook or Twitter are not liable for stored information if they have no knowledge of its illegal nature or if they act “expeditiously” to remove it as soon as they are made aware of it.
The case was referred to the judges in Luxembourg by Austria’s Supreme Court. Eva Glawischnig-Piesczek, a former MP as well as leader of the Austrian Green party, sued Facebook – which has its European headquarters in Dublin – and sought an injunction to force Facebook Ireland to remove a user comment the court had deemed defamatory.
In interpreting the E-Commerce Directive, the EU’s highest court found that a host provider can be ordered to take down content that is “essentially unchanged” compared with the original content that “gave rise to the finding of illegality and containing the elements specified in the injunction,” noting that the “provider may have recourse to automated search tools and technologies.”
The Facebook user in question had shared on his page an article from the Austrian online news magazine oe24.at entitled ‘Greens: Minimum income for refugees should stay’. That had the effect of generating on that page a ‘thumbnail’ of the original site, containing the title and a brief summary of the article, and a photograph of the politician.
A comment accompanying the article, which labelled her as a “lousy traitor,” “corrupt oaf” and member of a “fascist party,” was found by an Austrian court to be insulting and harmful to Glawischnig-Piesczek’s reputation.
Although the social media platform geo-blocked access to the post for users based in Austria, it remained accessible to Facebook users in the rest of the world. In her lawsuit, Glawischnig-Piesczek demanded that Facebook not only take down the specific post she had identified but also look for and delete “identical” and “equivalent” posts everywhere.
In principle, the EU’s E-Commerce Directive prohibits member states from imposing general monitoring obligations on online service providers. However, according to the ECJ, a court may also order the removal of content that is “equivalent” (but not identical) to the original post, provided that the differences in the wording of that content, compared with the wording characterising the information that had been declared illegal by a court, “are not such as to require the host provider to carry out an independent assessment of that content.”
In other words, where automated search tools and technologies capable of identifying content equivalent to that deemed illegal are at their disposal, social media and other online providers must take down such information.
The justices in Luxembourg also ruled that providers can be forced by national courts to remove, or block access to, objectionable posts from their platforms worldwide, “within the framework of the relevant international law”, and not merely restrict their measures to the country in question.
Ted Shapiro, a solicitor at the UK law firm Wiggin, was quoted by the Guardian newspaper as saying: “This provides more clarity for platforms as to what they must do to comply with their obligations under the law.
“The court recognised that simply requiring a platform to take down a specific piece of defamatory content and content identical to that would be too easy to circumvent. […] This, the court said, does not constitute an excessive burden on the platform, particularly where it has recourse to automated search tools and technologies.”
However, campaigners for free speech lambasted the ruling.
The court was “presuming a level of technological sophistication and degree of specificity that simply do not, and likely never will, exist,” American law professor Jennifer Daskal wrote, adding: “If platforms do exactly what the court suggests — take down any post with a combination of particular words of concern — it would almost certainly sweep in large quantities of totally harmless and legitimate speech. We are talking about global censorship on a potentially broad scale, and a severe ossification of public debate and discourse as a result.”
Thomas Hughes, director of Article 19, an NGO that promotes free speech, said: “This judgment has major implications for online freedom of expression around the world.” It “also means that a court in one EU member state will be able to order the removal of social media posts in other countries, even if they are not considered unlawful there. This would set a dangerous precedent where the courts of one country can control what internet users in another country can see. This could be open to abuse, particularly by regimes with weak human rights records.”