
When must search engines concede the “right to be forgotten”?

A new ruling from Italy helps clarify when public interest prevails.

FBI surveillance photograph.

One of the most important new concepts to emerge in the world of privacy in recent years is the European Union's "right to be forgotten." Although the idea was first proposed during the revision of the EU's data protection rules in 2012, it was a judgment handed down in May 2014 by Europe's highest court, the Court of Justice of the European Union (CJEU), that gave it legal force.

According to that ruling (PDF), the results of search engines operating in the EU are subject to the data protection laws there, since they frequently contain personal data. One consequence is that people have a right to demand that certain kinds of information should be deleted from those search results—but not necessarily from the sites they link to. As the CJEU's press release explained: "even initially lawful processing of accurate data may, in the course of time, become incompatible with the directive where, having regard to all the circumstances of the case, the data appear to be inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed and in the light of the time that has elapsed."

Strictly speaking, then, the "right to be forgotten" is more a "right to be remembered correctly." However, that right is not absolute: it has to be weighed against the public interest. Links to outdated material that the person concerned believes is no longer relevant do not need to be removed if "there are particular reasons, such as the role played by the data subject in public life, justifying a preponderant interest of the public in having access to the information when such a search is made."

That places a considerable burden on search engines, which are required to weigh the competing interests of the data subject and the public when requests to remove links are made. As Google explains in its FAQ about European Privacy in Search: "Our removals team has to look at each page individually and base decisions on the limited context provided by the requestor and the information on the webpage. Is it a news story? Does it relate to a criminal charge that resulted in a later conviction or was dismissed?" According to the latest Google Transparency Report, of the 856,294 links that people requested be removed under the EU's right to be forgotten ruling, 41 percent were deleted.
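As a rough illustration of the scale involved, and assuming the 41 percent figure applies across the full 856,294 links (the report's exact denominators shift as requests are processed), the arithmetic works out to roughly 351,000 links delisted and about 505,000 left in search results:

    # Back-of-the-envelope calculation using the Transparency Report figures
    # cited above; the report's exact totals change over time, so these are
    # illustrative only.
    requested = 856_294        # links people asked Google to delist in the EU
    removal_rate = 0.41        # approximate share of links actually removed

    removed = round(requested * removal_rate)
    retained = requested - removed

    print(f"Approx. links delisted: {removed:,}")    # ~351,081
    print(f"Approx. links retained: {retained:,}")   # ~505,213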

Given that the CJEU decision is less than a year old, it is not surprising that all parties affected by it—EU data protection authorities, Internet companies, privacy lawyers, and individuals—are still trying to understand exactly what it means in practice. In its EU privacy FAQ, Google mentions that it was invited to discuss its own "practical implementation" of the ruling with the Article 29 Working Party. This is "an independent advisory body on data protection and privacy, set up under Article 29 of the [EU's] Data Protection Directive," and is made up of representatives from the national data protection authorities of the EU Member States, the European Data Protection Supervisor, and the European Commission.

In November last year, the Article 29 Working Party issued its "common interpretation of the [CJEU] ruling." This includes the controversial view that "limiting de-listing to EU domains on the grounds that users tend to access search engines via their national domains cannot be considered a sufficient means to satisfactorily guarantee the rights of data subjects according to the ruling. In practice, this means that in any case de-listing should also be effective on all relevant domains, including .com." In other words, the Article 29 Working Party believes that search engine companies should be required to remove links in their results worldwide, not just in Europe.

Although those general guidelines are valuable, specific cases are needed to help flesh out the right to be forgotten ruling, and the Italian Privacy Authority this week issued a decision (original in Italian) that provides new guidance in this area.

A complaint had been made to the Authority about Google's refusal to remove a link to an article reporting on a judicial inquiry involving the complainant. The Authority noted that the events in question were recent and of strong public interest, and ruled that, in this case, freedom of the press won out over the right to be forgotten.

On another issue, the Authority found in favor of the individual concerned. It held that the automatically generated "snippets" typically included with search results must accurately reflect the information to which they refer. In this case, the snippet associated the complainant's name with crimes more serious than those at issue in the investigation. Google had in fact already corrected the problem before the Authority required it to, but the ruling will be useful for future cases.
