Searching for the right balance
July 11th, 2014 | Published in Google Blog
In May, the Court of Justice of the European Union established a “right to be forgotten.” Today, we published an op-ed by David Drummond, senior vice president of corporate development and chief legal officer, in the U.K.’s The Guardian, Germany’s Frankfurter Allgemeine Zeitung, France’s Le Figaro and Spain’s El País, discussing the ruling and our response. We’re republishing the op-ed in full below. -Ed.
When you search online, there’s an unwritten assumption that you’ll get an instant answer, as well as additional information if you need to dig deeper. This is all possible because of two decades’ worth of investment and innovation by many different companies. Today, however, search engines across Europe face a new challenge—one we’ve had just two months to get our heads around. That challenge is figuring out what information we must deliberately omit from our results, following a new ruling from the European Court of Justice.
In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).
We’ve taken this approach because, as Article 19 of the Universal Declaration of Human Rights states: “Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.”
But the European Court found that people have the right to ask for information to be removed from search results that include their names if it is “inadequate, irrelevant or no longer relevant, or excessive.” In deciding what to remove, search engines must also have regard to the public interest. These are, of course, very vague and subjective tests. The court also decided that search engines don’t qualify for a “journalistic exception.” This means that The Guardian could have an article on its website about an individual that’s perfectly legal, but we might not legally be able to show links to it in our results when you search for that person’s name. It’s a bit like saying the book can stay in the library, but it cannot be included in the library’s card catalogue.
It’s for these reasons that we disagree with the ruling. That said, we obviously respect the court’s authority and are doing our very best to comply quickly and responsibly. It’s a huge task, as we’ve had over 70,000 take-down requests covering 250,000 webpages since May. So we now have a team of people individually reviewing each application, in most cases with limited information and almost no context.
The examples we’ve seen so far highlight the difficult value judgments search engines and European society now face: former politicians wanting posts removed that criticize their policies in office; serious, violent criminals asking for articles about their crimes to be deleted; bad reviews for professionals like architects and teachers; comments that people have written themselves (and now regret). In each case, someone wants the information hidden, while others might argue it should be out in the open.
When it comes to determining what’s in the public interest, we’re taking into account a number of factors. These include whether the information relates to a politician, celebrity, or other public figure; whether the material comes from a reputable news source, and how recent it is; whether it involves political speech; whether it raises questions of professional conduct that might be relevant to consumers; whether it involves criminal convictions that are not yet “spent”; and whether the information is being published by a government. But these will always be difficult and debatable judgments.
We’re also doing our best to be transparent about removals: for example, we’re informing websites when one of their pages has been removed. But we cannot be specific about why we have removed the information because that could violate the individual’s privacy rights under the court's decision.
Of course, only two months in, our process is still very much a work in progress. It’s why we incorrectly removed links to some articles last week (they have since been reinstated). But the good news is that the ongoing, active debate will inform the development of our principles, policies and practices—in particular about how to balance one person’s right to privacy with another’s right to know.
That’s why we've also set up an advisory council of experts, the final membership of which we're announcing today. These external experts from the worlds of academia, the media, data protection, civil society and the tech sector are serving as independent advisors to Google. The council will be asking for evidence and recommendations from different groups, and will hold public meetings this autumn across Europe to examine these issues more deeply. Its public report will include recommendations for particularly difficult removal requests (like criminal convictions); thoughts on the implications of the court’s decision for European Internet users, news publishers, search engines and others; and procedural steps that could improve accountability and transparency for websites and citizens.
The issues at stake here are important and difficult, but we’re committed to complying with the court’s decision. Indeed, it’s hard not to empathize with some of the requests we’ve seen—from the man who asked that we not show a news article saying he had been questioned in connection with a crime (he’s able to demonstrate that he was never charged) to the mother who requested that we remove news articles that appear for her daughter’s name, as she had been the victim of abuse. It’s a complex issue, with no easy answers. So a robust debate is both welcome and necessary, as, on this issue at least, no search engine has an instant or perfect answer.