Can Google’s algorithm slander a politician’s wife?

Type ‘Bettina Wulff’, the name of a former German president’s wife, into Google and the autocomplete function will add ‘escort’. Is this algorithmic addition a form of defamation? Sebastian Huempfer explores the case.

The case

When Google users search for Bettina Wulff in English or German, the autocomplete function suggests they add “escort” or the German expressions for “prostitute” and “red-light” to the end of their query. These suggestions reflect widespread but unfounded rumours about the former German president’s wife, which were initially propagated by her husband’s political rivals.

On 8 September 2012 Wulff filed a lawsuit against Google for defamation, accusing the company of “destroying [her] reputation”. Wulff also sued the well-known German TV host Günther Jauch for talking about the rumours and issued cease and desist letters to 34 publications in Germany and other countries. In a settlement, Jauch agreed to stop mentioning the rumours and courts ordered several publications to pay damages to Wulff.

Google has won five similar lawsuits in Germany, and the company has so far refused to change these automatic suggestions, arguing that “search terms in Google autocomplete reflect the actual search terms of all users” and are determined algorithmically, with no editorial decisions involved. Google has, however, changed autocomplete suggestions following defamation and intellectual property lawsuits in Japan, France and the UK. Autocomplete is also permanently disabled for many search terms, including “cocaine” and “schoolgirl”, to avoid displaying offensive or illegal suggestions.

On its support site, Google says autocomplete allows users to “rest [their] fingers” and easily “repeat a favourite search”. The company adds that it applies “a narrow set of removal policies for pornography, violence, hate speech, and terms that are frequently used to find content that infringes copyrights”. Autocomplete suggestions can be, and in Wulff’s case are, very different from the actual top search results.
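To make the mechanics concrete, here is a minimal, purely hypothetical sketch in Python of a frequency-ranked autocomplete with a keyword blacklist. The query counts, the blacklist and the suggest function are invented for illustration; Google’s actual system and data are not public.

# A toy sketch of frequency-based autocomplete with a removal policy.
# Purely illustrative: the query log and blacklist below are invented.
from collections import Counter

# Hypothetical aggregated query log (counts of past searches).
QUERY_LOG = Counter({
    "bettina wulff escort": 5200,
    "bettina wulff biography": 3100,
    "bettina wulff book": 1900,
    "cocaine prices": 800,
})

# Hypothetical removal policy: suggestions containing these terms are never shown.
BLACKLISTED_TERMS = {"cocaine", "schoolgirl"}

def suggest(prefix: str, limit: int = 3) -> list[str]:
    """Return the most frequent past queries starting with `prefix`,
    minus anything caught by the blacklist."""
    prefix = prefix.lower()
    candidates = (
        (query, count)
        for query, count in QUERY_LOG.items()
        if query.startswith(prefix)
        and not any(term in query for term in BLACKLISTED_TERMS)
    )
    ranked = sorted(candidates, key=lambda item: item[1], reverse=True)
    return [query for query, _ in ranked[:limit]]

if __name__ == "__main__":
    print(suggest("bettina wulff"))
    # ['bettina wulff escort', 'bettina wulff biography', 'bettina wulff book']

The sketch shows why a popularity-ranked list surfaces a rumour precisely because many people search for it, and why a fixed keyword blacklist is a blunt instrument: it either misses the defamatory phrase or blocks legitimate queries as well, a difficulty the author returns to below.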

Author opinion

Our seventh draft principle states: “We must be able to protect our privacy and to counter slurs on our reputations, but not prevent scrutiny that is in the public interest.” Assuming a slur is a slur even if the perpetrator is an algorithm rather than a person, I think Google should yield to Bettina Wulff’s demands, as it has in other cases.

Autocomplete has undoubtedly damaged Wulff’s reputation, and public interest does not justify this slander. Saving people who genuinely want to type “Bettina Wulff escort” a few keystrokes does not justify the spread of unfounded rumours. Anyone who wants information about these rumours does not need autocomplete to find it, so the free flow of information is not in danger. Removing the suggestions would therefore not violate our second draft principle, which prohibits only illegitimate encroachments on the internet.

That being said, I do not think Google has a responsibility to preemptively remove undesirable autocomplete suggestions. To do so would be impossible. If Wulff had become famous for campaigning against prostitution, the suggestion “Bettina Wulff prostitution” would not be a slur. So simply blacklisting certain words will hardly work. In fact, I would argue that existing preemptive blacklists already go too far: autocomplete will, for example, never suggest “Neonazi”.

The question remains whether there is any point in removing these suggestions after the damage has been done. There may be more difficult cases than Bettina Wulff’s, and I don’t pretend to know where to draw the line. But in this case, I think it is quite clear that autocomplete should be reined in, because there would be no detriment to free speech involved.

- Sebastian Huempfer

Comments (3)


  1. A great piece!
    Dominic’s comment reminds me of Michael Bloomberg’s recent attempt to prevent just that by buying around 400 domains containing his name, including unfavourable ones like MichaelBloombergisaWeiner.nyc or BloombergistooRich.nyc.
    Unsurprisingly, this backfired as media outlets began reporting on the full list of bought domains…

  2. If a person has a legitimate demand concerning a company’s product, the company should listen to what its users say. Every internet company needs that attitude.

    After that, the company ought to consider the effects on the public, and then think carefully about how to balance personal privacy against the fair flow of information.

  3. This is a great article and a tough question to argue either way.

    It gives me an idea: if I ever run for political office, I will get my campaign team to produce many websites saying what a nice guy I am. Then, when people search my name, it will autocomplete to “Dominic Burbidge nice guy”. Who knows, maybe my political opponents will then sue Google.

