Google On Complexity Of ‘Right To Be Forgotten’
As previously reported, Google (as well as Microsoft and Yahoo) attended a meeting last week with EU regulators to discuss the “right to be forgotten” ruling and the search engines’ approach to handling it.
Each of the companies was given a questionnaire (via The New York Times), asking about various aspects of their practices related to complying with the ruling. Google’s has been made publicly available, and in it, the company discusses complications it faces.
Asked about criteria used to balance the company’s own economic interest and/or the interest of the general public in having access to info versus the right of the data subject to have search results delisted, Google said:
The core service of a search engine is to help users find the information they seek, and thus it is in a search engine’s general economic interest to provide the fastest, most comprehensive, and most relevant search results possible. Beyond that abstract consideration, however, our economic interest does not have a practical or direct impact on the balancing of rights and interests when we consider a particular removal request.
We must balance the privacy rights of the individual with interests that speak in favour of the accessibility of information, including the public’s interest in access to information, as well as the webmaster’s right to distribute information. When evaluating requests, we will look at whether the search results in question include outdated or irrelevant information about the data subject, as well as whether there’s a public interest in the information.
In reviewing a particular removal request, we will consider a number of specific criteria. These include the individual (for example, whether an individual is a public figure), the publisher of the information (for example, whether the link requested to be removed points to material published by a reputable news source or government website), and the nature of the information available via the link (for example, if it is political speech, if it was published by the data subject him- or herself, or if the information pertains to the data subject’s profession or a criminal conviction).
Each criterion, the company continued, has its own “potential complications and challenges”. It then proceeded to list these examples:
- It is deemed to be legitimate by some EU Member States that their courts publish rulings that include the full names of the parties, while courts in other Member States anonymise their rulings before publication.
- The Internet has lowered the barrier to entry for citizen journalists, making it more difficult to precisely define a reputable news source online than in print or broadcast media.
- It can be difficult to draw the line between significant political speech and simple political activity, e.g. in a case where a person requests removal of photos of him- or herself picketing at a rally for a politically unpopular cause.
As previously assessed, it’s a real mess.
Google says in the document that it has not considered sharing delisted search results with other search engines, adding, “We would note that sharing the delisted URLs without further information about the request would not enable other search engine providers to make informed decisions about removals, but sharing this information along with details or a copy of the complaint itself would raise concerns about additional disclosure and data processing.”
For some reason, I’m reminded of that time Google accused Bing of stealing its search results.
You can read Google’s full questionnaire responses here.
As of July 18th, Google had received over 91,000 removal requests involving over 328,000 URLs. Earlier this week, Google announced dates for presentations to its Advisory Council, which are aimed at advancing the public conversation around the ruling and informing the company’s ongoing strategy.
Image via Google