Google must purge search results on demand, says EU court
A European court has ruled that people have a "right to be forgotten," and thus Google and other search engines must prune their search results in accordance with an individual's request for privacy. Specifically, the court said that Google "must delete inadequate, irrelevant or no longer relevant" data upon request. Privacy advocates cheered the decision, but others have doubts. "This move may sound reassuring for individuals and their personal freedom; however, it also looks difficult to enforce on a large scale, and may be very disruptive for the functioning of search engines going forward," says Luca Schiovani, an analyst at Ovum.
The Guardian has a good article on the particulars surrounding the court's decision, and I recommend you give it a read. For the moment, suffice it to say that this issue isn't entirely settled. Consider this excerpt from that article:
"Google said: 'This is a disappointing ruling for search engines and online publishers in general. We are very surprised that it differs so dramatically from the advocate general's opinion and the warnings and consequences that he spelled out. We now need to take time to analyze the implications.'
The British justice secretary, Chris Grayling, has been a leading opponent of Reding's proposals for an explicit EU 'right to be forgotten.'
The Ministry of Justice has estimated that the European commission's proposals could cost British businesses, which include many leading data and tech firms, £360m a year. The information commissioner has called the 'right to be forgotten' proposals 'a regime that no one will pay for.'"
Ovum analyst Schiovani explains the problem this way: "Policy makers in the EU have long advocated for the introduction of a clear 'right to be forgotten,' which is included in the draft of the new Data Protection regulation under discussion in the EU Parliament and Council. However, these provisions should only apply to the direct controllers of personal data (e.g. a social network complying with the request to fully delete information related to an account); involving search engines for something they are not directly responsible for is likely to entail a burdensome cost, especially if the amount of requests of erasure should escalate in the future."
The EU court sees the situation differently, however. "The Court has argued that, while a search engine cannot be considered as the 'controller' of personal data in a third party website, it is a 'controller' of the index of the search engine which links key words to the relevant URL addresses; it may technically block certain search results. As such, it has to comply with the obligations of a 'data controller' set out in the EC's Directive on Data Protection of 1995," explains Schiovani.
Solving privacy issues is a tricky affair. There is no easy path to an acceptable balance between privacy and public benefit, and it's a difficult process to figure out precisely where to draw the privacy line. You can expect attempts to solve the problem to be clunky and messy for a while.
For example, consider this scenario: given that the U.S. now defines corporations as people too, could future regulations intended to protect individuals' right to privacy be used by corporations to erase past transgressions from Internet searches, so that researchers, journalists, attorneys and others cannot find that information? And what of politicians? Could they cover up criminal arrests, investigations and other information voters should know on the grounds that such information is "inadequate, irrelevant or no longer relevant"?
So I ask again: where exactly should we draw the line on privacy, and can we even do that in the U.S. when even a business is a "person"? I would appreciate it if you shared your thoughts in the comments below.
See The Guardian's article for the full details of the ruling.