Katie Engelhart attends the public hearing of Google’s Advisory Council, set up in response to a European Court of Justice judgement.
Imagine this: A man is murdered. Decades later, his widow is living in Italy—and finds that whenever she searches on Google for her name, an old article about her husband’s murder appears in the search results. She is horrified. Imagine now that this woman could go to Google lawyers and say: ‘remove the link from my search results!’ The original article would still exist in the online archives of whatever news site published it. The article would also appear in search results for the dead man’s name. But the woman’s own Google search would be cleansed of all references to the murder.
Such a murder took place. And such a request of Google was indeed made by a woman in Italy. As a result, Google erased the link to the decades-old news article—or in other words, allowed its search engine to “forget” the crime.
At a public hearing in central London in October 2014, Google Executive Chairman Eric Schmidt discussed similar real-life examples: of individuals who have recently approached Google with de-link requests. There was a convicted pedophile who wanted recent data about his conviction removed from searches of his name. (Google said No.) And a prior victim of violent crime who wanted links to news articles about the assault removed. (Google said Yes.)
These requests are possible because of a controversial ruling made by the European Court of Justice (ECJ) in May 2014 on the so-called “right to be forgotten.” The court ruled that Google and other search engines must entertain requests by individuals to erase (or “forget”) links that appear on Internet searches of their names—in cases when the linked-to information is “inadequate, irrelevant or excessive.” Google was ordered to balance “sensitivity for the individual’s private life and the interest of the public in having access to that information”—while also upholding “fundamental rights, such as the freedom of expression and of the media.”
That’s a pretty tall order for a search engine. Between May and October, Google fielded requests to de-link some half a million URLs; it has removed about 58% of those requested.
The ruling means that, in Europe, Google will have to go beyond existing legislation on copyright, privacy and defamation. Under the “right to be forgotten,” Google may now have to erase links to data that is legal, factual and already in the public domain.
Reaction to the ruling was frenzied—and, in some cases, hysterical. Martin Clarke, publisher of Mail Online, said that erasing links was “the equivalent of going into libraries and burning books you don’t like.” Free speech activists argued that the decision heralds the death of the free and fair internet in Europe. At the same time, proponents insisted that the ruling preserves the notion of “dignity” online.
In news reporting on the so-called “right to be forgotten debate”, a facile cultural narrative emerged, pitting an ostensibly free speech-loving America against a regulation-worshiping and privacy-obsessed Europe. California v. Brussels.
But some advocates on both sides of the line have come together to raise a common point: that these debates should be happening in European courtrooms, not open-concept Google campuses in Silicon Valley. They argue that the ECJ has effectively cast Google—a for-profit, California-based internet company—in the role of philosopher king: responsible for determining what constitutes the public interest in Europe.
Google itself (responsible for over 90% of internet searches in Europe) was none too pleased. The company has long insisted that it is a neutral purveyor of data rather than a “data controller” in terms of data protection law. Google never asked to be a “decision maker”, Chairman Schmidt charged. But the court has forced the company’s hand, and it must now take on a more explicitly editorial role.
In response to this mighty task, Google put together an “Advisory Council on the Right to be Forgotten,” which includes an Oxford ethics philosopher, a former German justice minister, a Belgian lawyer and Wikipedia founder Jimmy Wales. And then, the company launched a two-month-long road trip: seven town hall-style meetings held in major cities across Europe.
The London event was stop six of seven. The meeting—free and open to the public—was held near King’s Cross Station in a building shared by The Guardian. The crowd was surprisingly well-heeled: mostly 20- and 30-somethings in dark-coloured business casual—rather than unkempt Wikipedia editors or sneaker-clad teenage hackers, as I might have expected. Nobody spoke out of turn or interrupted. And attendees nibbled neatly on fancy, free sandwiches of red pepper hummus on focaccia and “32 days hung Northumbrian beef with red cabbage slaw.”
Some have dismissed the Google tour as a PR stunt, suggesting that the company is engaging in debate only to demonstrate how murky and logistically untenable the ECJ’s ruling is—and thus, to drum up opposition to the concept of a “right to be forgotten”. If the allegations are true, well, mission accomplished.
In his address, Schmidt presented several cases that, he said, gave Google lawyers pause. There was the case of a convicted criminal whose conviction is spent and who now wants links to articles about his/her conviction deleted. And the case of an individual who wants sensitive information about him/her erased, even though it was that very individual who published the data in the first place.
Emma Carr of the campaign group Big Brother Watch argued that the ECJ ruling should have applied to publishers rather than search engines. “The problem should be tackled at the source”, she said, “with the websites carrying the information being required to remove [it]”. Indeed, this would be a more complete kind of “forgetting”, since the data itself, rather than just some links to it, would be erased. But Luciano Floridi, an Oxford philosopher who sits on Google’s Advisory Council, wondered if the existing set-up wasn’t better, since it offered “a compromise” between digital remembering and forgetting. Several representatives from the media, including David Jordan, director of editorial policy and standards at the BBC, agreed that erasure decisions should be left to publishers, but cautioned that information should only be removed in extreme circumstances, lest websites start “rewriting the past and altering history”. Jordan also criticized “the lack of a formal appeal process” whereby publishers could appeal a Google de-link decision.
Carr also raised the issue of how we draw a “line between public and private individuals” in an era when those concepts often appear blurred. Jordan proposed a hypothetical case: a voluntary school board member who evaluates the quality of school lunches. Is this man a “public figure”? And thus, does all his data belong in the public domain? This is especially complicated since private/public classifications are dynamic. Evan Harris, a former British MP and now associate director of the Hacked Off campaign, pointed out that individuals could purge links about prior fraud before running for office. Should those links be reinstated when the subject enters public life? And by extension, is everybody’s personal data of public interest, on the grounds that we are all potential future public figures?
What about criminality? A number of requests have come from convicted criminals whose convictions are spent. Should Google “forget” references to their convictions? Or does information about criminality belong in the public domain? And anyway, what should Google policy be, given that the ECJ ruling covers 28 EU member states with widely varying criminal laws?
The issue of “harm”, and what constitutes “harm” was also debated. Oxford University’s Floridi noted that “embarrassment comes in degrees. Social embarrassment becomes social stigma becomes losing your job… Do we have a way of understanding when embarrassment, discomfort and unpleasantness become harm?” Perhaps the boundaries are different, depending on the individual and the context. And perhaps they should be adjusted when, say, the person in question is a child, or has a mental health issue.
Another contentious term was “relevant”, borrowed from the ECJ ruling. The court advised Google to consider the age of the data in question. But how old must data be before it loses relevance? And does it ever? Might ostensibly irrelevant information become relevant at some future date? “For historians, even trivial facts can become relevant over time”, argued Gabrielle Guillemin, senior legal officer at the non-profit ARTICLE 19. Alan Wardle, head of policy and public affairs at NSPCC, countered: “most people don’t want to be of interest to historians twenty years down the line. They want to get on with their lives now.”
Philosophical musings aside, Google’s review process promises to be enormously complicated and costly. The company reportedly hired dozens of lawyers and paralegals to deal with requests on a case-by-case basis. “It’s not obvious to me that this can ever be automated”, said Schmidt. “We would if we could. We like to automate things.”
Katie Engelhart is a London-based writer and reporter. @katieengelhart www.katieengelhart.com
It does seem here as though the European Court of Justice is attempting to have its cake and eat it. If it has decided that individuals have the right to have search results modified, and that Google is bound by that ruling, why does it then trust Google to implement these decisions satisfactorily?
It seems a counter-intuitive move to suggest that one dominant data provider holds unwieldy power over individual lives, but then to ask that same giant to act as final arbiter of what appears on our screens. Essentially, the ruling takes moral decisions away from Google’s oblivious algorithms and entrusts them to its human actors.
In this instance the obvious solution is for the European Union to put its money where its gavel is and create an independent body to adjudicate these cases for all search engines, not just Google.